DocumentCode
337547
Title
Training MLPs layer-by-layer with the information potential
Author
Xu, Dongxin ; Principe, Jose C.
Author_Institution
Comput. NeuroEng. Lab., Florida Univ., Gainesville, FL, USA
Volume
2
fYear
1999
fDate
15-19 Mar 1999
Firstpage
1045
Abstract
In the area of information processing, one fundamental issue is how to measure the relationship between two variables based only on their samples. In a previous paper, the idea of the information potential, formulated from the so-called quadratic mutual information, was introduced and successfully applied to problems such as blind source separation and pose estimation of SAR (synthetic aperture radar) images. This paper shows how the information potential can be used to train an MLP (multilayer perceptron) layer by layer, which provides evidence that the hidden layer of an MLP serves as an "information filter" that tries to best represent the desired output in that layer in the statistical sense of mutual information.
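The information potential referred to in the abstract is, in the information-theoretic-learning literature, the Parzen-window estimate of the integral of the squared density, from which the quadratic Renyi entropy follows as its negative logarithm. The sketch below is illustrative only (the function names and the choice of an isotropic Gaussian kernel are assumptions, not taken from the paper itself):

```python
import numpy as np

def information_potential(samples, sigma=1.0):
    """Parzen-window estimate of the information potential
    V(X) = (1/N^2) * sum_i sum_j G(x_i - x_j; 2*sigma^2),
    i.e. the average of a Gaussian kernel over all sample pairs.
    (Convolving two Gaussian kernels of variance sigma^2 gives one
    of variance 2*sigma^2, hence the 4*sigma^2 in the exponent.)
    """
    x = np.asarray(samples, dtype=float).reshape(len(samples), -1)
    diffs = x[:, None, :] - x[None, :, :]   # pairwise differences
    sq = np.sum(diffs ** 2, axis=-1)        # squared pairwise distances
    d = x.shape[1]                          # dimensionality of samples
    g = np.exp(-sq / (4.0 * sigma ** 2)) / ((4.0 * np.pi * sigma ** 2) ** (d / 2))
    return g.mean()

def renyi_quadratic_entropy(samples, sigma=1.0):
    """Quadratic Renyi entropy: H2(X) = -log V(X)."""
    return -np.log(information_potential(samples, sigma))
```

Tightly clustered samples yield a large potential (low entropy), while spread-out samples yield a small one, which is what makes the potential usable as a differentiable training criterion.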
Keywords
entropy; learning (artificial intelligence); multilayer perceptrons; signal sampling; MLP; hidden layer; information filter; information potential; information processing; layer-by-layer training; multilayer perceptron; quadratic mutual information; Area measurement; Blind source separation; Electric variables measurement; Gain measurement; Information entropy; Information processing; Laboratories; Mutual information; Neural engineering; Synthetic aperture radar;
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 1999)
Conference_Location
Phoenix, AZ
ISSN
1520-6149
Print_ISBN
0-7803-5041-3
Type
conf
DOI
10.1109/ICASSP.1999.759922
Filename
759922
Link To Document