Title :
Constrained optimization of neural network architecture
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., CA, USA
Abstract :
By introducing a well-motivated information-theoretic metric and new convex optimization algorithms, the architecture of a neural network is designed to enhance its supervised learning capability. We formulate two optimization frameworks that admit efficient algorithms for large numbers of variables and accommodate a variety of practical constraints on the structural randomness of neural networks. Convex optimization is also applied to independent component analysis (ICA) and to the computation of multi-antenna fading channel capacity.
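The abstract refers to convex optimization under practical constraints but does not specify an algorithm. As a hypothetical illustration only (not the paper's method), the sketch below applies projected gradient descent, a standard approach for constrained convex problems, to a small nonnegatively constrained least-squares instance; all names and parameters are assumptions.

```python
import numpy as np

# Hypothetical sketch (not the paper's algorithm): projected gradient
# descent for a small constrained convex problem,
#   minimize ||A x - b||^2  subject to  x >= 0.
def projected_gradient(A, b, steps=2000, lr=None):
    # Step size 1/L, where L = 2 * ||A^T A||_2 is the Lipschitz
    # constant of the gradient of the quadratic objective.
    if lr is None:
        lr = 1.0 / (2.0 * np.linalg.norm(A.T @ A, 2))
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = 2.0 * A.T @ (A @ x - b)          # gradient of ||Ax - b||^2
        x = np.maximum(x - lr * grad, 0.0)      # project onto x >= 0
    return x

A = np.eye(2)
b = np.array([3.0, -2.0])
x = projected_gradient(A, b)
# With A = I, the minimizer is the projection of b onto the
# nonnegative orthant: x = [3, 0].
```

Each iteration takes an unconstrained gradient step and then projects back onto the feasible set; for simple sets such as the nonnegative orthant, the projection is a cheap componentwise operation, which is what makes this family of methods scale to large numbers of variables.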
Keywords :
convex programming; neural net architecture; optimisation; ICA; constrained optimization; convex optimization algorithms; independent component analysis; information theoretic metric; multi-antenna fading channel capacity; neural network architecture; structural randomness constraints; supervised learning capability; Constraint optimization; Entropy; Fading; Independent component analysis; Mutual information; Neural networks; Neurons; Pattern recognition; Probability distribution; Supervised learning;
Conference_Titel :
The 2000 IEEE Asia-Pacific Conference on Circuits and Systems (APCCAS 2000)
Conference_Location :
Tianjin
Print_ISBN :
0-7803-6253-5
DOI :
10.1109/APCCAS.2000.913508