Title :
Optimization criteria and nonlinear PCA neural networks
Author_Institution :
Lab. of Comput. & Inf. Sci., Helsinki Univ. of Technol., Espoo, Finland
Date :
27 Jun-2 Jul 1994
Abstract :
In this paper, the relationships between various nonlinear extensions of principal component analysis (PCA) and optimization are considered. Standard PCA arises as the optimal solution to several different information-representation problems; it is argued that this follows essentially from the fact that the PCA solution utilizes second-order statistics only. If the optimization problems are generalized to nonquadratic criteria, so that higher-order statistics are taken into account, their solutions will in general differ. These solutions define, in a natural way, several nonlinear extensions of PCA and give them a solid foundation. The respective gradient-type neural algorithms are discussed.
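The abstract refers to gradient-type neural algorithms for nonlinear PCA. A minimal sketch of one such update rule is given below, assuming a nonlinear PCA subspace rule of the form W ← W + μ (x − W g(y)) g(y)ᵀ with y = Wᵀx; the choice of nonlinearity g (here tanh) and learning rate μ, as well as the synthetic data, are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def nonlinear_pca_step(W, x, mu=0.01, g=np.tanh):
    """One stochastic update of a nonlinear PCA subspace rule:
    W <- W + mu * (x - W g(y)) g(y)^T, where y = W^T x.
    An odd nonlinearity g brings higher-order statistics into the criterion."""
    y = W.T @ x
    gy = g(y)
    return W + mu * np.outer(x - W @ gy, gy)

rng = np.random.default_rng(0)
# Synthetic zero-mean data with variance concentrated in the first coordinates.
X = rng.standard_normal((2000, 4)) * np.array([3.0, 2.0, 0.5, 0.1])

W = rng.standard_normal((4, 2)) * 0.1   # weight matrix: 4 inputs, 2 outputs
for x in X:
    W = nonlinear_pca_step(W, x)

# After training, the rows corresponding to high-variance input
# directions should carry most of the weight mass.
print(np.round(np.abs(W), 2))
```

The update reduces to the linear PCA subspace rule when g is the identity; the nonquadratic criterion enters only through g.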
Keywords :
learning (artificial intelligence); neural nets; optimisation; statistical analysis; gradient-type neural algorithms; information representation; nonlinear PCA neural networks; nonquadratic criteria; optimization criteria; principal component analysis; second-order statistics; higher-order statistics; signal processing algorithms;
Conference_Title :
Neural Networks, 1994 IEEE International Conference on (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL, USA
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374363