Title :
On speeding up computation in information theoretic learning
Author :
Seth, Sohan; Principe, José C.
Author_Institution :
Comput. NeuroEng. Lab., Univ. of Florida, Gainesville, FL, USA
Abstract :
With the recent progress in kernel-based learning methods, computation with Gram matrices has received immense attention. However, computing the entire Gram matrix has quadratic complexity in the number of samples. Therefore, a considerable amount of work has focused on extracting the relevant information from the Gram matrix without accessing all of its elements. Most of these methods exploit the positive definiteness and the rapidly decaying eigenstructure of the Gram matrix. Although information theoretic learning (ITL) is conceptually different from kernel-based learning, several ITL estimators can be written in terms of Gram matrices. A key difference, however, is that a few ITL estimators involve a special type of matrix that is neither positive definite nor symmetric. In this paper we discuss how the techniques developed for kernel-based learning can be applied to reduce the computational complexity of ITL estimators involving both Gram matrices and these other matrices.
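The low-rank remedy the abstract alludes to is concrete enough to sketch. Below is a minimal Python sketch of pivoted incomplete Cholesky, one standard technique that exploits the positive definiteness and fast eigendecay of a Gram matrix K to build a factorization K ≈ GGᵀ while evaluating only r of its n columns; the Gaussian kernel, the bandwidth sigma, and all function names here are illustrative assumptions, not taken from the paper.

import numpy as np

def incomplete_cholesky(kernel_column, diag, tol=1e-6, max_rank=None):
    """Pivoted incomplete Cholesky of a PSD Gram matrix K, never formed fully.

    kernel_column(i) returns the i-th column of K (one O(n) kernel pass);
    diag is the precomputed diagonal of K. Returns G with K ~= G @ G.T,
    where the rank is chosen adaptively from the decay of the residual.
    """
    n = len(diag)
    max_rank = max_rank or n
    G = np.zeros((n, max_rank))
    d = np.asarray(diag, dtype=float).copy()   # residual diagonal of K
    for j in range(max_rank):
        i = int(np.argmax(d))                  # pivot on largest residual
        if d[i] <= tol:                        # fast eigendecay => early stop
            return G[:, :j]
        nu = np.sqrt(d[i])
        col = kernel_column(i)                 # the only column actually computed
        G[:, j] = (col - G[:, :j] @ G[i, :j]) / nu
        d -= G[:, j] ** 2
    return G

# Toy check with a Gaussian kernel on 1-D samples (sigma is an assumed bandwidth).
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
sigma = 1.0
kcol = lambda i: np.exp(-(x - x[i]) ** 2 / (2 * sigma ** 2))
G = incomplete_cholesky(kcol, np.ones_like(x), tol=1e-6)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))  # full matrix, for the error check only
print(G.shape[1], np.linalg.norm(K - G @ G.T))  # rank << n, small Frobenius error

Once G is available, Gram-matrix-based ITL quantities follow cheaply: for instance, the information potential estimator (1/n²) 1ᵀK1 reduces to (1/n²) ‖Gᵀ1‖², computable in O(nr) instead of O(n²).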
Keywords :
computational complexity; eigenvalues and eigenfunctions; learning (artificial intelligence); matrix algebra; Gram matrices; computational complexity; eigenstructure; information extraction; information theoretic learning; kernel based learning method; quadratic matrix; Computer networks; Data mining; Euclidean distance; Kernel; Learning systems; Matrix decomposition; Mutual information; Neural networks; Pervasive computing; Symmetric matrices;
Conference_Titel :
2009 International Joint Conference on Neural Networks (IJCNN 2009)
Conference_Location :
Atlanta, GA, USA
Print_ISBN :
978-1-4244-3548-7
ISSN :
1098-7576
DOI :
10.1109/IJCNN.2009.5178933