DocumentCode
3487225
Title
Survey of two selected superlinear learning techniques
Author
Géczy, Peter; Usui, Shiro
Author_Institution
RIKEN Brain Sci. Inst., Saitama, Japan
Volume
1
fYear
2002
fDate
18-22 Nov. 2002
Firstpage
472
Abstract
The article surveys theoretical and practical aspects of two superlinear learning algorithms. The presented algorithms feature a novel solution to the line search subproblem, simplified to a single-step calculation of the appropriate values of the step length and/or momentum term. This markedly reduces the computational complexity and simplifies the implementation of the line search subproblem without compromising the stability of the methods. The algorithms are theoretically proven to be convergent and universal within the proposed classification framework, and are capable of reaching superlinear convergence rates on an arbitrary task. The performance of the proposed algorithms is extensively evaluated on five data sets and compared to relevant standard first-order optimization techniques.
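The paper's own update rules are not reproduced in this record, so the Python sketch below is only a hedged illustration of the general idea the abstract describes: replacing an iterative line search with a single-step, closed-form choice of the step length, combined with a momentum term. A Barzilai–Borwein style step is used here purely as a well-known stand-in for such a single-step rule; the function name `train`, the BB step, and all parameter values are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only (assumed, not the paper's algorithm):
# first-order updates with a momentum term, where the step length is
# obtained in one closed-form calculation (Barzilai-Borwein style)
# instead of an iterative line search.
import numpy as np

def train(grad, w, n_steps=100, beta=0.9, alpha0=1e-2):
    """Gradient descent with momentum; step length set by a single-step BB-style rule."""
    v = np.zeros_like(w)              # momentum (velocity) term
    g_prev, w_prev = None, None
    alpha = alpha0                    # initial step length before the rule applies
    for _ in range(n_steps):
        g = grad(w)
        if g_prev is not None:
            s = w - w_prev            # change in parameters
            y = g - g_prev            # change in gradients
            denom = float(y @ y)
            if denom > 1e-12:
                # Single-step length computation: alpha = |s.y| / y.y
                alpha = abs(float(s @ y)) / denom
        v = beta * v - alpha * g      # momentum update
        w_prev, g_prev = w.copy(), g
        w = w + v                     # weight update
    return w

# Usage example on a toy quadratic f(w) = 0.5 * w^T A w - b^T w
A = np.array([[3.0, 0.2], [0.2, 1.0]])
b = np.array([1.0, -2.0])
w_star = train(lambda w: A @ w - b, np.zeros(2))
```

The point of the sketch is only the structural one made in the abstract: the step length comes from a single calculation per iteration rather than from a nested line search loop, so each iteration stays at roughly the cost of a plain first-order update.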
Keywords
computational complexity; convergence; learning (artificial intelligence); search problems; arbitrary task; classification framework; data sets; line search subproblem; momentum term; single step calculation; standard first order optimization techniques; step length; superlinear convergence rates; superlinear learning algorithms; Ear; Jacobian matrices; Least squares methods; Neural networks; Optimization methods; Polynomials; Search methods; Stability
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002
Print_ISBN
981-04-7524-1
Type
conf
DOI
10.1109/ICONIP.2002.1202215
Filename
1202215