• DocumentCode
    12350
  • Title
    The Connection Between Bayesian Estimation of a Gaussian Random Field and RKHS
  • Author
    Aravkin, Aleksandr Y.; Bell, Bradley M.; Burke, James V.; Pillonetto, Gianluigi
  • Author_Institution
    IBM T. J. Watson Res. Center, Yorktown Heights, NY, USA
  • Volume
    26
  • Issue
    7
  • fYear
    2015
  • fDate
    July 2015
  • Firstpage
    1518
  • Lastpage
    1524
  • Abstract
    Reconstruction of a function from noisy data is key in machine learning and is often formulated as a regularized optimization problem over an infinite-dimensional reproducing kernel Hilbert space (RKHS). The solution suitably balances adherence to the observed data and the corresponding RKHS norm. When the data fit is measured using a quadratic loss, this estimator has a known statistical interpretation. Given the noisy measurements, the RKHS estimate represents the posterior mean (minimum variance estimate) of a Gaussian random field with covariance proportional to the kernel associated with the RKHS. In this brief, we provide a statistical interpretation when more general losses are used, such as the absolute value, Vapnik, or Huber loss. Specifically, for any finite set of sampling locations (including those where the data were collected), the maximum a posteriori estimate of the signal samples is given by the RKHS estimate evaluated at the sampling locations. This connection establishes a firm statistical foundation for several stochastic approaches used to estimate unknown regularization parameters. To illustrate this, we develop a numerical scheme that implements a Bayesian estimator with an absolute value loss. This estimator is used to learn a function from measurements contaminated by outliers.
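    The quadratic-loss case stated in the abstract — the RKHS estimate equals the posterior mean of a Gaussian random field whose covariance is the kernel — can be checked numerically. The sketch below is illustrative only and not from the paper; the RBF kernel, length scale, noise variance, and data are all assumed. It computes the GP posterior mean in closed form and, separately, solves the regularized least-squares problem from the representer theorem, then compares the two curves.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.linspace(0.0, 1.0, 8)                    # sampling locations (assumed)
    y = np.sin(2.0 * np.pi * X) + 0.1 * rng.standard_normal(8)
    sigma2 = 0.01                                   # noise variance (assumed)

    def rbf(a, b, ell=0.2):
        """Gaussian (RBF) kernel: covariance of the random field / RKHS kernel."""
        return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2.0 * ell ** 2))

    n = len(X)
    K = rbf(X, X) + 1e-10 * np.eye(n)               # jitter for the Cholesky factor
    xs = np.linspace(0.0, 1.0, 50)                  # evaluation grid

    # (1) GP posterior mean: m(x) = k(x, X) (K + sigma2 * I)^{-1} y
    m = rbf(xs, X) @ np.linalg.solve(K + sigma2 * np.eye(n), y)

    # (2) RKHS estimate via the representer theorem: minimize
    #     ||y - K a||^2 + sigma2 * a' K a
    # over the coefficients a, posed as an augmented least-squares
    # problem using K = L L' so that a' K a = ||L' a||^2.
    L = np.linalg.cholesky(K)
    A = np.vstack([K, np.sqrt(sigma2) * L.T])
    b = np.concatenate([y, np.zeros(n)])
    a, *_ = np.linalg.lstsq(A, b, rcond=None)
    f = rbf(xs, X) @ a                              # f(x) = sum_i a_i k(x, X_i)

    print(np.max(np.abs(m - f)))                    # the two estimates coincide
    ```

    The two routes are computed independently (a linear solve versus a least-squares minimization), yet return the same function on the grid, which is exactly the quadratic-loss equivalence the brief generalizes to non-quadratic losses.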
  • Keywords
    Bayes methods; Gaussian processes; Markov processes; Monte Carlo methods; learning (artificial intelligence); maximum likelihood estimation; Bayesian estimation; Gaussian random field; Huber loss; RKHS estimation; Vapnik loss; absolute value loss; function reconstruction; infinite-dimensional reproducing kernel Hilbert space; machine learning; maximum a posteriori estimation; minimum variance estimation; posterior mean; quadratic loss; regularization parameter estimation; sampling location; statistical interpretation; Kernel; Loss measurement; Noise; Noise measurement; Pollution measurement; Vectors; Markov chain Monte Carlo (MCMC); kernel-based regularization; regularization networks; representer theorem; reproducing kernel Hilbert spaces (RKHSs); support vector regression
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks and Learning Systems
  • Publisher
    IEEE
  • ISSN
    2162-237X
  • Type
    jour
  • DOI
    10.1109/TNNLS.2014.2337939
  • Filename
    6871416