Title :
A semismooth Newton method for adaptive distributed sparse linear regression
Author :
Dmitriy Shutin; Boris Vexler
Author_Institution :
Institute of Communications and Navigation, German Aerospace Center (DLR), Münchner Str. 20, 82234 Wessling, Germany
Abstract :
This work studies the application of the semismooth Newton (SSN) method to accelerate the convergence of distributed quadratic programming LASSO (DQP-LASSO), a consensus-based distributed sparse linear regression algorithm. DQP-LASSO uses the alternating direction method of multipliers (ADMM) to reduce a global LASSO problem to a series of local (per-agent) LASSO optimizations, whose outcomes are then appropriately combined. The SSN algorithm enjoys superlinear convergence and thus permits implementing these local optimizations more efficiently. In some cases, however, SSN can experience convergence issues. Here it is shown that the regularization inherent in ADMM also suffices to stabilize the SSN algorithm, thus ensuring stable convergence of the whole scheme. Additionally, the structure of the SSN algorithm permits an adaptive implementation of distributed sparse regression. This allows for the estimation of time-varying sparse vectors and reduces storage requirements for processing streams of data.
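To illustrate the interplay described in the abstract, the following is a minimal sketch (not the authors' implementation) of a semismooth Newton solver for an ADMM-style regularized local LASSO subproblem, min_x 0.5||Ax-b||^2 + lam*||x||_1 + (rho/2)||x-z||^2. The names `ssn_lasso`, `rho`, and `z` are illustrative assumptions; the point of the sketch is that the rho-term inherited from the ADMM splitting keeps the Hessian positive definite, which is what stabilizes the Newton steps.

```python
import numpy as np

def ssn_lasso(A, b, lam, rho, z, max_iter=50, tol=1e-10):
    """Semismooth Newton sketch for the regularized local LASSO subproblem
        min_x 0.5||Ax - b||^2 + lam*||x||_1 + (rho/2)||x - z||^2.
    The rho-term (hypothetically inherited from the ADMM consensus step)
    makes H = A^T A + rho*I positive definite, stabilizing the iteration."""
    n = A.shape[1]
    H = A.T @ A + rho * np.eye(n)              # regularized Hessian
    c = np.linalg.norm(H, 2)                   # scaling constant for the fixed point
    x = np.zeros(n)
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b) + rho * (x - z)
        u = x - grad / c
        # Soft-thresholding (prox of the l1 term)
        shrink = np.sign(u) * np.maximum(np.abs(u) - lam / c, 0.0)
        F = x - shrink                         # semismooth fixed-point residual
        if np.linalg.norm(F) < tol:
            break
        # Generalized (Clarke) derivative of soft-thresholding: 0/1 diagonal
        D = (np.abs(u) > lam / c).astype(float)
        J = np.eye(n) - D[:, None] * (np.eye(n) - H / c)
        x = x + np.linalg.solve(J, -F)         # Newton step on F(x) = 0
    return x
```

In a full consensus scheme, each agent would solve such a subproblem per ADMM round with its own (A, b) and the consensus variable z; the superlinear (here effectively active-set) convergence of the inner solve is what accelerates the overall method.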
Keywords :
"Optimization","Convergence","Artificial neural networks","Linear regression","Acceleration","Conferences","Estimation"
Conference_Titel :
2015 IEEE 6th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)
DOI :
10.1109/CAMSAP.2015.7383829