Title :
An introduction to information theoretic learning
Author :
Principe, Jose C.; Xu, Dongxin
Author_Institution :
Computational NeuroEngineering Laboratory, University of Florida, Gainesville, FL, USA
Abstract :
Learning from examples has traditionally been done with correlation or with the mean square error (MSE) criterion, even though learning is intrinsically related to the extraction of information from examples. The difficulty is that Shannon's (1948) information entropy, while resting on a sound theoretical foundation, is not easy to implement in a learning-from-examples scenario. In this paper, Rényi's entropy definition (1976) is used instead and integrated with a nonparametric estimator of the probability density function (the Parzen window). Experimental results on blind source separation confirm the theory. Although the work is preliminary, the "information potential" method is rather general and should find many applications.
Keywords :
entropy; learning by example; neural nets; probability; Parzen window; blind source separation; example-based learning; information entropy; information extraction; information potential method; information theoretic learning; nonparametric estimator; probability density function; Data mining; Density functional theory; Energy measurement; Information entropy; Laboratories; Machine learning; Mean square error methods; Mutual information; Neural engineering; Signal processing algorithms
Conference_Title :
International Joint Conference on Neural Networks (IJCNN '99), 1999
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-5529-6
DOI :
10.1109/IJCNN.1999.832648