Title :
Convergence of a neural network for sparse approximation using the nonsmooth Łojasiewicz inequality
Author :
Balavoine, Aurele ; Rozell, Christopher J. ; Romberg, Justin
Author_Institution :
School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA, USA
Abstract :
Sparse approximation is an optimization program that produces state-of-the-art results in many applications in signal processing and engineering. To deploy this approach in real time, it is necessary to develop solvers that are faster than those currently available on digital platforms. The Locally Competitive Algorithm (LCA) is a dynamical system designed to solve this class of sparse approximation problems in continuous time. However, before implementing this network in analog VLSI, it is essential to provide performance guarantees. This paper presents new results on the convergence of the LCA neural network. Using recently developed methods based on the Łojasiewicz inequality for nonsmooth functions, we prove that the output and state trajectories converge to a single fixed point. This improves on previous results by guaranteeing convergence to a singleton even when the optimization program has infinitely many, non-isolated solution points.
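For readers unfamiliar with the LCA dynamics mentioned in the abstract, the sketch below simulates the standard LCA state equation with a soft-threshold activation, the pairing associated with an l1-regularized sparse approximation cost. This is a minimal illustration rather than code from the paper; the dictionary, step size, threshold, and time constant are assumed values chosen only for the example.

import numpy as np

def soft_threshold(u, lam):
    # Soft-thresholding activation a = T_lambda(u), matching an l1 penalty.
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca(y, Phi, lam=0.1, tau=0.01, dt=1e-3, n_steps=5000):
    # Forward-Euler simulation of the LCA state dynamics
    #   tau * du/dt = -u(t) + Phi^T y - (Phi^T Phi - I) a(t),
    # with output a(t) = T_lambda(u(t)).
    n = Phi.shape[1]
    b = Phi.T @ y                    # feed-forward drive
    G = Phi.T @ Phi - np.eye(n)      # lateral inhibition between nodes
    u = np.zeros(n)                  # internal (membrane) state
    for _ in range(n_steps):
        a = soft_threshold(u, lam)   # output of the active nodes
        u += (dt / tau) * (-u + b - G @ a)
    return soft_threshold(u, lam)

# Illustrative usage: recover a 5-sparse vector from a random dictionary.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((64, 256)) / np.sqrt(64)
x_true = np.zeros(256)
x_true[rng.choice(256, size=5, replace=False)] = 1.0
a_hat = lca(Phi @ x_true, Phi)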
Keywords :
approximation theory; convergence of numerical methods; neural nets; optimisation; signal processing; LCA neural network; dynamical system; locally competitive algorithm; neural network convergence; nonsmooth Łojasiewicz inequality; nonsmooth functions; optimization program; output trajectory; signal engineering; signal processing; sparse approximation; state trajectory; Approximation methods; Convergence; Cost function; Linear programming; Neural networks; Trajectory; Vectors;
Conference_Title :
The 2013 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Dallas, TX
Print_ISBN :
978-1-4673-6128-6
DOI :
10.1109/IJCNN.2013.6706832