Title :
Proximal alternating-direction message-passing for MAP LP relaxation
Author :
Guoqiang Zhang ; Richard Heusdens
Author_Institution :
Dept. of Intell. Syst., Delft Univ. of Technol., Delft, Netherlands
Abstract :
Linear programming (LP) relaxation for MAP inference over (factor) graphical models is one of the fundamental problems in machine learning. In this paper, we propose a new message-passing algorithm for the MAP LP relaxation based on the proximal alternating-direction method of multipliers (PADMM). At each iteration, the new algorithm performs two layers of optimization: node-oriented optimization and factor-oriented optimization. In contrast, the recently proposed augmented primal LP (APLP) algorithm, which is based on the ADMM, has to perform three layers of optimization per iteration. Our algorithm simplifies the APLP algorithm by removing one layer of optimization, thereby reducing the computational complexity and accelerating convergence. We refer to the new algorithm as the proximal alternating-direction (PAD) algorithm. Experimental results confirm that the PAD algorithm indeed converges faster than the APLP method.
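For context, the following is a minimal sketch, in standard notation rather than the paper's own, of the MAP LP relaxation over the local marginal polytope and of a generic proximal ADMM iteration of the kind the abstract refers to; the symbols $\theta$, $\mu$, $\rho$, $P$, $Q$ and the splitting into $f$ and $g$ are illustrative assumptions, not taken from the paper.
\[
\max_{\mu \ge 0} \; \sum_{i \in V} \sum_{x_i} \theta_i(x_i)\,\mu_i(x_i) \;+\; \sum_{f \in F} \sum_{x_f} \theta_f(x_f)\,\mu_f(x_f)
\]
subject to the local consistency constraints
\[
\sum_{x_f \setminus x_i} \mu_f(x_f) = \mu_i(x_i) \quad \forall f,\; i \in f,\; x_i,
\qquad
\sum_{x_i} \mu_i(x_i) = 1 \quad \forall i .
\]
A generic proximal ADMM for $\min_{x,z} f(x) + g(z)$ subject to $Ax + Bz = c$ iterates
\[
x^{k+1} = \arg\min_x \Big( f(x) + \tfrac{\rho}{2}\|Ax + Bz^k - c + u^k\|^2 + \tfrac{1}{2}\|x - x^k\|_P^2 \Big),
\]
\[
z^{k+1} = \arg\min_z \Big( g(z) + \tfrac{\rho}{2}\|Ax^{k+1} + Bz - c + u^k\|^2 + \tfrac{1}{2}\|z - z^k\|_Q^2 \Big),
\]
\[
u^{k+1} = u^k + Ax^{k+1} + Bz^{k+1} - c,
\]
where the proximal terms $\|\cdot\|_P^2$ and $\|\cdot\|_Q^2$ distinguish PADMM from plain ADMM. Under this reading, the two block minimizations plausibly correspond to the node-oriented and factor-oriented layers described in the abstract, though the paper's exact updates may differ.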
Keywords :
computational complexity; computer graphics; learning (artificial intelligence); linear programming; message passing; APLP algorithm; linear programming relaxation; MAP LP relaxation; MAP inference; PAD algorithm; PADMM; augmented primal LP algorithm; computational complexity; convergence rate; factor-oriented optimization; graphical models; machine learning; node-oriented optimization; proximal alternating-direction message-passing; proximal alternating-direction method of multipliers; Algorithm design and analysis; Approximation methods; Convergence; Graphics; Linear programming; Minimization; Optimization; ADMM; LP relaxation; MAP; PADMM; graphical models; message-passing;
Conference_Titel :
Acoustics, Speech and Signal Processing (ICASSP), 2013 IEEE International Conference on
Conference_Location :
Vancouver, BC, Canada
DOI :
10.1109/ICASSP.2013.6638289