Title :
Information Consistency of Nonparametric Gaussian Process Methods
Author :
Seeger, Matthias W. ; Kakade, Sham M. ; Foster, Dean P.
Author_Institution :
Max Planck Inst. for Biol. Cybern., Tübingen
Date :
5/1/2008
Abstract :
Bayesian nonparametric models are widely and successfully used for statistical prediction. While posterior consistency properties are well studied in quite general settings, results have been proved using abstract concepts such as metric entropy, and they come with subtle conditions that are hard to validate and not intuitive when applied to concrete models. Furthermore, convergence rates are difficult to obtain. By focusing on the concept of information consistency for Bayesian Gaussian process (GP) models, consistency results and convergence rates are obtained via a regret bound on cumulative log loss. These results depend strongly on the covariance function of the prior process, thereby giving a novel interpretation to penalization with reproducing kernel Hilbert space norms and to commonly used covariance function classes and their parameters. The proof of the main result employs elementary convexity arguments only. A theorem of Widom is used in order to obtain precise convergence rates for several covariance functions widely used in practice.
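The regret bound on cumulative log loss mentioned above takes, in the standard GP regression setting, roughly the following shape (a hedged sketch of this type of bound, not a verbatim quotation from the paper; the kernel matrix K_n, noise variance σ², and RKHS norm notation are assumptions):

```latex
% Sketch (notation assumed): cumulative log-loss regret of the Bayesian GP
% predictor against any fixed f in the RKHS of the covariance function K,
% with kernel matrix K_n = (K(x_i, x_j))_{i,j \le n} and noise variance \sigma^2:
\sum_{t=1}^{n} -\log p_{\mathrm{Bayes}}(y_t \mid y_{<t})
  \;-\; \sum_{t=1}^{n} -\log p_{f}(y_t)
  \;\le\; \tfrac{1}{2}\,\|f\|_{K}^{2}
  \;+\; \tfrac{1}{2}\,\log\det\!\bigl(I + \sigma^{-2} K_n\bigr)
```

The log-determinant term is governed by the eigenvalue decay of the covariance operator, which is where Widom-type eigenvalue asymptotics can yield explicit convergence rates for specific covariance functions.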
Keywords :
Bayes methods; Gaussian processes; Hilbert spaces; covariance analysis; Bayesian nonparametric model; Gaussian process method; covariance function; cumulative log loss; reproducing kernel Hilbert space norm; metric entropy; statistical prediction; convergence; eigenvalues and eigenfunctions; entropy; kernel; predictive models; statistical distributions; Bayesian prediction; eigenvalue asymptotics; information consistency; nonparametric statistics; online learning; posterior consistency; regret bound
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2007.915707