• Title of article

    Fast learning from α-mixing observations

  • Author/Authors

    Hang, H. and Steinwart, I.

  • Issue Information
    Biannual journal with consecutive issue numbering, year 2014
  • Pages
    16
  • From page
    184
  • To page
    199
  • Abstract
    We present a new oracle inequality for generic regularized empirical risk minimization algorithms learning from stationary α-mixing processes. Our main tool to derive this inequality is a rather involved version of the so-called peeling method. We then use this oracle inequality to derive learning rates for some learning methods such as empirical risk minimization (ERM), least squares support vector machines (SVMs) using given generic kernels, and SVMs using the Gaussian RBF kernels for both least squares and quantile regression. It turns out that for i.i.d. processes our learning rates for ERM and SVMs with Gaussian kernels match, up to some arbitrarily small extra term in the exponent, the optimal rates, while in the remaining cases our rates are at least close to the optimal rates.
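  • Illustrative sketch (least squares SVM with Gaussian RBF kernel)
    A minimal sketch, in Python/NumPy, of one of the learning methods the abstract refers to: a least squares SVM with a Gaussian RBF kernel, i.e. regularized kernel least squares, fitted on a toy stationary AR(1) sample (which is α-mixing for |phi| < 1). The names and parameter values (lam, gamma, phi) are hypothetical illustrations, not the paper's choices, and the sketch does not reproduce the paper's analysis.

    import numpy as np

    def rbf_kernel(X, Z, gamma):
        # Gaussian RBF kernel matrix: k(x, z) = exp(-gamma * ||x - z||^2)
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-gamma * d2)

    def fit_ls_svm(X, y, lam=1e-2, gamma=1.0):
        # Regularized least squares in the RKHS: solve (K + n*lam*I) a = y
        n = X.shape[0]
        K = rbf_kernel(X, X, gamma)
        return np.linalg.solve(K + n * lam * np.eye(n), y)

    def predict(X_train, alpha, X_test, gamma=1.0):
        # Decision function f(x) = sum_i alpha_i k(x, x_i)
        return rbf_kernel(X_test, X_train, gamma) @ alpha

    # Toy usage: stationary AR(1) observations (alpha-mixing for |phi| < 1)
    rng = np.random.default_rng(0)
    phi, n = 0.5, 200
    z = np.zeros(n + 1)
    for t in range(1, n + 1):
        z[t] = phi * z[t - 1] + rng.normal()
    X, y = z[:-1, None], np.sin(z[1:]) + 0.1 * rng.normal(size=n)

    alpha = fit_ls_svm(X, y, lam=1e-2, gamma=0.5)
    print(predict(X, alpha, X[:5], gamma=0.5))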
  • Keywords
    Non-parametric classification and regression, Support vector machines (SVMs), Empirical risk minimization (ERM), Alpha-mixing processes
  • Journal title
    Journal of Multivariate Analysis
  • Serial Year
    2014
  • Record number

    1566690