DocumentCode
2174138
Title
Online learning with kernels: Overcoming the growing sum problem
Author
Singh, Abhishek ; Ahuja, Narendra ; Moulin, Pierre
Author_Institution
Beckman Inst. for Adv. Sci. & Technol., Univ. of Illinois at Urbana-Champaign, Urbana, IL, USA
fYear
2012
fDate
23-26 Sept. 2012
Firstpage
1
Lastpage
6
Abstract
Online kernel algorithms have an important computational drawback: their computational complexity grows linearly over time. This makes them difficult to use in real-time signal processing applications that must continuously process data over prolonged periods. In this paper, we present a way of overcoming this problem by approximating kernel evaluations using finite-dimensional inner products in a randomized feature space. We apply this idea to the Kernel Least Mean Square (KLMS) algorithm, which has recently been proposed as a non-linear extension of the famed LMS algorithm. Our simulations show that the proposed method achieves constant computational complexity with no observable loss in performance.
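The core idea in the abstract can be sketched as follows: replace each kernel evaluation with an inner product of finite-dimensional random Fourier features, then run ordinary LMS in that feature space, so the per-sample cost stays constant instead of growing with the number of past samples. This is a minimal illustrative sketch, not the paper's implementation; the feature dimension `D`, kernel width `sigma`, step size `mu`, and the toy target `sin(3x)` are all assumed values chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (not from the paper): feature dimension D,
# input dimension d, Gaussian-kernel width sigma, LMS step size mu.
D, d, sigma, mu = 200, 1, 1.0, 0.2

# Random Fourier features approximating the Gaussian kernel
# k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
W = rng.normal(0.0, 1.0 / sigma, size=(D, d))  # frequencies sampled from the kernel's spectrum
b = rng.uniform(0.0, 2.0 * np.pi, size=D)      # random phase offsets

def z(x):
    """Finite-dimensional feature map with z(x) @ z(y) ~= k(x, y)."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

# LMS in the randomized feature space: each update costs O(D),
# whereas classic KLMS sums over an ever-growing set of past samples.
w = np.zeros(D)

def update(x, y_true):
    """One online step: predict, compute the error, adjust the weights."""
    global w
    e = y_true - w @ z(x)   # instantaneous prediction error
    w = w + mu * e * z(x)   # standard LMS weight update
    return e

# Toy usage: learn y = sin(3x) online from streaming samples.
errors = []
for _ in range(2000):
    x = rng.uniform(-1.0, 1.0, size=d)
    errors.append(abs(update(x, np.sin(3.0 * x[0]))))
```

Because the weight vector `w` has fixed length `D`, memory and per-sample computation remain constant no matter how long the stream runs, which is the property the abstract claims for the proposed method.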
Keywords
approximation theory; computational complexity; learning (artificial intelligence); least mean squares methods; signal processing; computational drawback; finite dimensional inner product; growing sum problem; kernel evaluation approximation; kernel least mean square algorithm; nonlinear extension; online kernel algorithm; online learning; randomized feature space; Approximation algorithms; Kernel; Least squares approximation; Signal processing algorithms; Training; Vectors
fLanguage
English
Publisher
ieee
Conference_Titel
Machine Learning for Signal Processing (MLSP), 2012 IEEE International Workshop on
Conference_Location
Santander
ISSN
1551-2541
Print_ISBN
978-1-4673-1024-6
Electronic_ISBN
1551-2541
Type
conf
DOI
10.1109/MLSP.2012.6349811
Filename
6349811
Link To Document