• Title of article

    Sparse Bayesian Learning for Basis Selection

  • Author/Authors

D. P. Wipf and B. D. Rao

  • Issue Information
    Journal issue, serial year 2004
  • Pages
    12
  • From page
    2153
  • To page
    2164
  • Abstract
    Sparse Bayesian learning (SBL) and specifically relevance vector machines have received much attention in the machine learning literature as a means of achieving parsimonious representations in the context of regression and classification. The methodology relies on a parameterized prior that encourages models with few nonzero weights (an illustrative sketch of this kind of procedure follows the record below). In this paper, we adapt SBL to the signal processing problem of basis selection from overcomplete dictionaries, proving several results about the SBL cost function that elucidate its general behavior and provide solid theoretical justification for this application. Specifically, we show that SBL retains a desirable property of the ℓ0-norm diversity measure (i.e., the global minimum is achieved at the maximally sparse solution) while often possessing a more limited constellation of local minima. We also demonstrate that the local minima that do exist are achieved at sparse solutions. We then provide a novel interpretation of SBL that gives valuable insight into why it is successful in producing sparse representations. Finally, we include simulation studies comparing sparse Bayesian learning with basis pursuit and the more recent FOCal Underdetermined System Solver (FOCUSS) class of basis selection algorithms. These results indicate that our theoretical insights translate directly into improved performance.
  • Keywords
    sparse Bayesian learning , sparse representations , linear inverse problems , basis selection , diversity measures
  • Journal title
    IEEE TRANSACTIONS ON SIGNAL PROCESSING
  • Serial Year
    2004
  • Record number

    403605
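
  • Illustrative sketch

    The abstract describes a parameterized Gaussian prior whose per-weight hyperparameters are driven toward zero for irrelevant dictionary columns. The paper's own derivations and update rules are not reproduced in this record; the sketch below only illustrates a standard EM-style SBL iteration (gamma_i <- mu_i^2 + Sigma_ii) under an assumed Gaussian likelihood with known noise variance. The function name sbl_basis_selection and all parameter names are hypothetical, not the authors' implementation.

        import numpy as np

        def sbl_basis_selection(Phi, t, sigma2=1e-4, n_iters=200, prune_tol=1e-6):
            """Minimal EM-style sparse Bayesian learning loop (illustrative only).

            Assumed model: t = Phi @ w + noise, noise ~ N(0, sigma2 * I),
            with an independent zero-mean Gaussian prior w_i ~ N(0, gamma_i).
            Hyperparameters gamma_i that collapse toward zero prune the
            corresponding dictionary columns, yielding a sparse representation.
            """
            N, M = Phi.shape
            gamma = np.ones(M)
            for _ in range(n_iters):
                G = np.diag(gamma)
                # Posterior moments of w, written in a form that stays stable
                # as individual gamma_i approach zero.
                C = sigma2 * np.eye(N) + Phi @ G @ Phi.T
                mu = G @ Phi.T @ np.linalg.solve(C, t)                # posterior mean
                Sigma = G - G @ Phi.T @ np.linalg.solve(C, Phi @ G)   # posterior covariance
                # EM update of the hyperparameters: gamma_i <- mu_i^2 + Sigma_ii
                gamma = mu ** 2 + np.diag(Sigma)
            support = np.flatnonzero(gamma > prune_tol)
            return mu, support

        # Toy usage: recover a 3-sparse vector from a random 20 x 50 dictionary.
        rng = np.random.default_rng(0)
        Phi = rng.standard_normal((20, 50))
        w_true = np.zeros(50)
        w_true[[3, 17, 41]] = [1.0, -2.0, 0.5]
        t = Phi @ w_true + 0.01 * rng.standard_normal(20)
        mu, support = sbl_basis_selection(Phi, t)
        print(sorted(support))  # typically the true support {3, 17, 41}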