Title :
Entropy minimization algorithm for multilayer perceptrons
Author :
Erdogmus, Deniz ; Principe, Jose C.
Author_Institution :
Computational NeuroEngineering Laboratory, University of Florida, Gainesville, FL, USA
Abstract :
In previous work (2000), we proposed the use of quadratic Rényi's error entropy, estimated with a Parzen density estimator with Gaussian kernels, as an alternative optimality criterion for supervised neural network training, and showed that it yields better performance on test data than the mean square error (MSE). The error entropy criterion minimizes the average information content of the error signal rather than simply its energy, as MSE does. We have also developed a nonparametric estimator for Rényi's entropy that admits any entropy order and any suitable kernel function in the Parzen density estimate. The new estimator reduces to the previously used one for the special choice of Gaussian kernels and quadratic entropy. In this paper, we briefly present the new criterion and show how to apply it to MLP training. We also address the issue of global optimization by controlling the kernel size in the Parzen window estimate.
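The quadratic Rényi entropy criterion mentioned in the abstract can be illustrated with a short sketch. This is not the authors' code; it is a minimal NumPy implementation, under the standard assumptions from the Parzen-window literature, of the estimator H2(e) = -log V(e), where the "information potential" V(e) is the mean of Gaussian kernels evaluated at all pairwise error differences (the kernel width `sigma` is a free parameter chosen here for illustration):

```python
import numpy as np

def renyi_quadratic_entropy(errors, sigma=1.0):
    """Estimate quadratic Renyi entropy of error samples via a Parzen
    density estimate with Gaussian kernels.

    H2(e) = -log V(e), with the information potential
    V(e) = (1/N^2) * sum_i sum_j G(e_i - e_j; 2*sigma^2),
    where G is a zero-mean Gaussian density whose variance doubles
    because the quadratic form convolves two kernels of width sigma.
    """
    e = np.asarray(errors, dtype=float).ravel()
    n = e.size
    diffs = e[:, None] - e[None, :]        # all N^2 pairwise differences
    var = 2.0 * sigma ** 2                 # variance of the convolved kernel
    gauss = np.exp(-diffs ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    v = gauss.sum() / n ** 2               # information potential V(e)
    return -np.log(v)
```

Since the logarithm is monotonic, minimizing H2 is equivalent to maximizing V, so training can backpropagate the gradient of the pairwise kernel sums through the MLP weights; annealing `sigma` from large to small is the kernel-size control for global optimization referred to in the abstract.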
Keywords :
Gaussian processes; estimation theory; learning (artificial intelligence); minimum entropy methods; multilayer perceptrons; optimisation; Parzen density estimator; Rényi error entropy; entropy minimization; learning; multilayer perceptrons; nonparametric entropy estimator; optimization; quadratic entropy; Convolution; Entropy; Kernel; Minimization methods; Multilayer perceptrons; Mutual information; Performance analysis; Signal processing algorithms; Supervised learning; Testing;
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.938856