Title :
Scalable Nonparametric Low-Rank Kernel Learning Using Block Coordinate Descent
Author :
En-Liang Hu; James T. Kwok
Author_Institution :
Dept. of Math., Yunnan Normal Univ., Kunming, China
Abstract :
Nonparametric kernel learning (NPKL) is a flexible approach that learns the kernel matrix directly without assuming any parametric form. It can be naturally formulated as a semidefinite program (SDP), which, however, is not very scalable. To address this problem, we propose the combined use of low-rank approximation and block coordinate descent (BCD). Low-rank approximation avoids the expensive positive semidefinite constraint in the SDP by replacing the kernel matrix variable with V^T V, where V is a low-rank matrix. The resulting nonlinear optimization problem is then solved by BCD, which optimizes each column of V sequentially. The proposed algorithm is shown to have good convergence properties and low computational complexity. Experiments on a number of real-world data sets show that it outperforms state-of-the-art NPKL solvers.
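The abstract's key step, replacing the kernel matrix K with V^T V and updating one column of V at a time, can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: the objective used here (graph-Laplacian smoothness plus a squared hinge on must-link/cannot-link pairs) is one common NPKL form, and the function name, step size, and gradient-step block update are assumptions made purely for illustration.

    import numpy as np

    def npkl_bcd(L, constraints, rank=5, C=1.0, n_sweeps=20, lr=0.1, seed=0):
        """Illustrative low-rank NPKL via block coordinate descent (BCD).

        L           : (n, n) graph Laplacian (smoothness regularizer tr(KL)).
        constraints : list of (i, j, t), t = +1 for must-link, -1 for cannot-link.
        Returns V of shape (rank, n), so the learned kernel is K = V.T @ V.
        """
        n = L.shape[0]
        rng = np.random.default_rng(seed)
        V = 0.1 * rng.standard_normal((rank, n))
        for _ in range(n_sweeps):
            for i in range(n):                    # one block = one column v_i of V
                g = 2.0 * V @ L[:, i]             # grad of tr(V^T V L) w.r.t. v_i
                for (a, b, t) in constraints:     # squared hinge on K_ij = v_i^T v_j
                    if i in (a, b):
                        j = b if a == i else a
                        margin = 1.0 - t * (V[:, i] @ V[:, j])
                        if margin > 0.0:
                            g -= 2.0 * C * margin * t * V[:, j]
                V[:, i] -= lr * g                 # gradient step on this block
        return V

    # Toy usage: 3-point chain Laplacian, one must-link and one cannot-link pair.
    L = np.array([[ 1., -1.,  0.],
                  [-1.,  2., -1.],
                  [ 0., -1.,  1.]])
    V = npkl_bcd(L, [(0, 1, +1), (0, 2, -1)])
    K = V.T @ V                                   # learned PSD kernel matrix

Because K = V.T @ V is positive semidefinite by construction, no explicit PSD constraint is needed, which is the scalability advantage the abstract describes.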
Keywords :
approximation theory; computational complexity; convergence; learning (artificial intelligence); matrix algebra; optimisation; BCD; NPKL solvers; block coordinate descent; computational complexities; convergence properties; low-rank approximation; low-rank matrix; nonlinear optimization problem; scalable nonparametric low-rank kernel learning; Approximation methods; Closed-form solutions; Eigenvalues and eigenfunctions; Fasteners; Kernel; Manifolds; Optimization; Block coordinate descent (BCD); clustering analysis; kernel learning; semidefinite programming (SDP)
Journal_Title :
IEEE Transactions on Neural Networks and Learning Systems
DOI :
10.1109/TNNLS.2014.2361159