• DocumentCode
    2492478
  • Title
    On a training scheme based on orthogonalization and thresholding for a nonparametric regression problem
  • Author
    Hagiwara, Katsuyuki
  • Author_Institution
    Fac. of Educ., Mie Univ., Tsu, Japan
  • fYear
    2010
  • fDate
    18-23 July 2010
  • Firstpage
    1
  • Lastpage
    8
  • Abstract
    In previous work, we proposed a training scheme for a nonparametric regression problem that is based on orthogonalization and thresholding, in which a machine is assumed to be a weighted sum of fixed basis functions. In this scheme, the vectors of basis function outputs are orthogonalized, and the coefficients of the orthogonalized vectors are estimated instead of the weights. A coefficient is set to zero if it is smaller than a predetermined threshold level, where the threshold levels are assigned componentwise to the individual coefficients. The resulting weight vector is then obtained by transforming the thresholded coefficients back into the original domain (a minimal illustrative sketch of this procedure follows this record). In this article, we present the theoretical details that support the threshold levels applied in the training scheme. For a simple situation, we also give an upper bound on the generalization error of the training scheme. As an implication of the bound, we find that the increase in generalization error is of O(log n/n) when the target function has a sparse representation in an orthogonal domain. In implementing the training scheme, either an eigendecomposition or a Gram-Schmidt procedure is employed for the orthogonalization; the corresponding training methods are referred to as HTED and HTGS. Modified versions of HTED and HTGS, referred to as HTED2 and HTGS2 respectively, are also proposed to reduce the estimation bias. We examine the performance of these training methods on real benchmark datasets, on which HTED2 and HTGS2 exhibit relatively good generalization performance. In addition to its generalization performance, HTGS2 is found to obtain a sparse representation of the target function in terms of the basis functions.
  • Keywords
    computational complexity; eigenvalues and eigenfunctions; learning (artificial intelligence); nonparametric statistics; regression analysis; Gram-Schmidt procedure; HTED; HTGS; eigendecomposition; fixed basis functions; generalization error; nonparametric regression problem; orthogonalization; thresholding; training scheme; Elevators
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Title
    The 2010 International Joint Conference on Neural Networks (IJCNN)
  • Conference_Location
    Barcelona
  • ISSN
    1098-7576
  • Print_ISBN
    978-1-4244-6916-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.2010.5596649
  • Filename
    5596649
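
As referenced in the abstract, the following is a minimal, self-contained sketch of the orthogonalize-threshold-transform procedure, written in Python with NumPy. It is an illustration under stated assumptions, not the paper's HTED/HTGS implementation: the Gaussian basis, the threshold rule (a fixed multiple of an assumed noise level, applied componentwise), and all function and variable names are assumptions introduced here.

    # Sketch of the orthogonalize-threshold-transform idea from the abstract.
    # The basis, the threshold levels, and all names below are illustrative
    # assumptions, not the paper's exact HTED/HTGS definitions.
    import numpy as np

    def fit_orthogonal_threshold(X, y, tau):
        """Estimate weights w for y ~ X @ w by hard thresholding in an
        orthogonalized domain.

        X   : (n, m) matrix of basis function outputs (one column per basis)
        y   : (n,) vector of targets
        tau : (m,) componentwise threshold levels (assumed given)
        """
        # Orthogonalize the basis-output vectors; the reduced QR factorization
        # is numerically equivalent to Gram-Schmidt on the columns of X.
        Q, R = np.linalg.qr(X)
        # Coefficients of the orthogonalized vectors, estimated in place of
        # the weights.
        c = Q.T @ y
        # Componentwise hard thresholding: keep a coefficient only if its
        # magnitude exceeds its own threshold level.
        c_thr = np.where(np.abs(c) > tau, c, 0.0)
        # Transform the thresholded coefficients back into a weight vector.
        w = np.linalg.solve(R, c_thr)
        return w, c_thr

    # Toy usage (purely illustrative): a Gaussian basis, a noisy sine target.
    rng = np.random.default_rng(0)
    n, m = 100, 20
    x = np.linspace(0.0, 1.0, n)
    centers = np.linspace(0.0, 1.0, m)
    X = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / 0.1) ** 2)
    y = np.sin(2.0 * np.pi * x) + 0.1 * rng.standard_normal(n)
    sigma = 0.1                                   # assumed known noise level
    tau = sigma * np.sqrt(2.0 * np.log(n)) * np.ones(m)  # assumed rule
    w, c_thr = fit_orthogonal_threshold(X, y, tau)
    print("retained orthogonal coefficients:", np.count_nonzero(c_thr))

Replacing the QR step with an eigendecomposition of X.T @ X, orthogonalizing via its eigenvectors, would play the role of the eigendecomposition-based variant (HTED) mentioned in the abstract, while the QR/Gram-Schmidt route sketched above corresponds to HTGS.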