Title :
Analyzing the structure of a neural network using principal component analysis
Author_Institution :
Dept. of Comput. Sci., Montana Univ., Missoula, MT, USA
Abstract :
When learning from data, one often attaches a penalty term to a standard error term in an attempt to prefer simple models and thus prevent overfitting. Current penalty terms for neural networks, however, often do not take weight interaction into account. This is a critical drawback, since the effective number of parameters in a network often differs dramatically from the total number of possible parameters. In this paper we present a penalty term that uses principal component analysis to detect functional redundancy in a neural network. Results show that our new algorithm gives a much more accurate estimate of network complexity than standard approaches. As a result, our new term should be able to improve techniques that can make use of a penalty term, such as weight decay, weight pruning, feature selection, Bayesian methods, and prediction-risk techniques.
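To illustrate the idea behind the abstract (not the paper's exact algorithm): when hidden units are linearly redundant, the matrix of hidden-layer activations has fewer effective dimensions than units, and PCA (via the singular values) reveals this. The sketch below, with hypothetical function and variable names, estimates the effective number of components by counting singular values above a numerical tolerance.

```python
# Hedged sketch: estimate the effective dimensionality of a hidden layer
# by counting significant principal components of its activation matrix.
# This illustrates the general PCA-based redundancy idea, not the paper's
# specific penalty term.
import numpy as np

def effective_components(activations, tol=1e-8):
    """Count principal components of `activations` (samples x units)
    whose singular values exceed `tol` times the largest one."""
    centered = activations - activations.mean(axis=0)
    s = np.linalg.svd(centered, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

# Example: 6 hidden units, but two are linear combinations of the others,
# so only 4 directions carry independent information.
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 4))
redundant = base @ rng.normal(size=(4, 2))  # functionally redundant units
acts = np.hstack([base, redundant])
print(effective_components(acts))  # -> 4, not 6
```

In this toy case the network "has" 6 hidden units, but the activation matrix has rank 4, which is the kind of gap between nominal and effective complexity the abstract describes.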
Keywords :
directed graphs; learning (artificial intelligence); neural net architecture; performance evaluation; redundancy; statistical analysis; directed acyclic graphs; functional redundancy; neural network; penalty term; principal component analysis; structure analysis; weight decay; weight pruning; Bayesian methods; Computer errors; Computer science; Neural networks; Predictive models; Principal component analysis; Redundancy; Size measurement; Training data;
Conference_Titel :
International Conference on Neural Networks (ICNN), 1997
Conference_Location :
Houston, TX
Print_ISBN :
0-7803-4122-8
DOI :
10.1109/ICNN.1997.611674