DocumentCode :
3106244
Title :
Stationary point variational Bayesian attribute-distributed sparse learning with ℓ1 sparsity constraints
Author :
Shutin, Dmitriy ; Kulkarni, Sanjeev R. ; Poor, H. Vincent
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., Princeton, NJ, USA
fYear :
2011
fDate :
13-16 Dec. 2011
Firstpage :
277
Lastpage :
280
Abstract :
The paper proposes a new variational Bayesian algorithm for ℓ1-penalized multivariate regression with attribute-distributed data. The algorithm combines a variational Bayesian version of the SAGE algorithm, which trains individual agents in a distributed fashion, with sparse Bayesian learning (SBL) based on hierarchical sparsity prior modeling of the agent weights. SBL imposes sparsity constraints on the weights of individual agents, thereby reducing overfitting and removing or suppressing poorly performing agents in the ensemble estimator. The ℓ1 constraint is introduced using a product of a Gaussian and an exponential probability density function, with the resulting marginalized prior being a Laplace pdf. This hierarchical formulation of the prior permits computation of the stationary points of the variational update expressions for the prior parameters, as well as derivation of conditions that ensure convergence to these stationary points. Experiments with synthetic data demonstrate that the proposed algorithm performs very well in terms of achieved MSE and outperforms other algorithms in its ability to sparsify non-informative agents, while at the same time allowing distributed implementation and flexible agent update protocols.
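The hierarchical ℓ1 prior described in the abstract is the standard Gaussian-exponential scale mixture representation of the Laplace distribution: each weight gets a zero-mean Gaussian prior with variance γ, and γ gets an exponential prior; integrating γ out yields a Laplace marginal. The sketch below is a minimal Monte Carlo check of that identity in Python, not the authors' implementation; the rate parameter lam and the sample size are illustrative choices.

```python
import numpy as np

# Scale-mixture form of the Laplace prior used for the l1 constraint:
#   w | gamma ~ N(0, gamma),   gamma ~ Exponential(rate = lam^2 / 2)
# Integrating gamma out gives the marginal p(w) = (lam/2) * exp(-lam * |w|).
rng = np.random.default_rng(0)
lam = 2.0          # illustrative rate parameter of the sparsity prior
n = 1_000_000      # Monte Carlo sample size

# Sample the hierarchy: a variance from the exponential, then a Gaussian weight.
gamma = rng.exponential(scale=2.0 / lam**2, size=n)  # NumPy uses scale = 1/rate
w = rng.normal(loc=0.0, scale=np.sqrt(gamma))

# Compare the empirical density of w with the analytic Laplace pdf.
hist, edges = np.histogram(w, bins=np.linspace(-3.0, 3.0, 61), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
laplace = 0.5 * lam * np.exp(-lam * np.abs(centers))
print("max abs deviation from Laplace pdf:", np.max(np.abs(hist - laplace)))
```

Because the exponential mixing density admits closed-form expectations, the variational updates for the prior parameters have stationary points that can be computed analytically, which is what the paper exploits.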
Keywords :
Gaussian processes; distributed processing; learning (artificial intelligence); multi-agent systems; probability; regression analysis; variational techniques; Gaussian product; Laplace pdf; MSE; SAGE algorithm; SBL; agent weights; attribute-distributed data; distributed fashion; distributed implementation; ensemble estimator; exponential probability density function; flexible agent update protocols; hierarchical formulation; hierarchical sparsity prior modeling; marginalized prior; multivariate regression; noninformative agents; poorly performing agents; prior parameters; sparse Bayesian learning; sparsity constraints; stationary point variational Bayesian attribute-distributed sparse learning; stationary points; synthetic data; variational Bayesian algorithm; variational Bayesian version; variational update expressions; Bayesian methods; Convergence; Inference algorithms; Optimization; Prediction algorithms; Probability density function; Training; Attribute-distributed learning; sparse Bayesian learning; variational Bayesian inference;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2011 4th IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)
Conference_Location :
San Juan, Puerto Rico
Print_ISBN :
978-1-4577-2104-5
Type :
conf
DOI :
10.1109/CAMSAP.2011.6136003
Filename :
6136003