Title :
Network simplification through oracle learning
Author :
Menke, Joshua ; Peterson, Adam ; Rimer, Mike ; Martinez, Tony
Author_Institution :
Dept. of Comput. Sci., Brigham Young Univ., Provo, UT, USA
Date :
2002
Abstract :
Often, the best artificial neural network to solve a real-world problem is relatively complex. However, with the growing popularity of smaller computing devices (hand-held computers, cellular telephones, automobile interfaces, etc.), there is a need for simpler models with comparable accuracy. This paper presents evidence that using a larger model as an oracle to train a smaller model on unlabeled data results in (1) a simpler, acceptably accurate model and (2) improved results over standard training methods on a similarly sized smaller model. On automated spoken-digit recognition, oracle learning produced an artificial neural network of half the size that (1) maintained accuracy comparable to the larger neural network and (2) obtained up to a 25% decrease in error over standard training methods.
Keywords :
errors; learning (artificial intelligence); learning automata; neural nets; speech recognition; artificial neural network simplification; automated spoken digit recognition; error reduction; model accuracy; network size; oracle learning; small computing devices; training methods; unlabeled data; Application software; Artificial neural networks; Bagging; Computer science; Data mining; Handheld computers; Neural networks; Power system modeling; Telephony; Training data;
Conference_Title :
Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
Conference_Location :
Honolulu, HI
Print_ISBN :
0-7803-7278-6
DOI :
10.1109/IJCNN.2002.1007532
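The oracle-learning procedure described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions: the "oracle" here is a stand-in function rather than a trained large network, and the "smaller model" is a linear least-squares fit rather than the smaller neural network trained by backpropagation in the paper. The idea being demonstrated is only the training flow: the large model labels unlabeled inputs, and the small model is fit to those labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "oracle": stands in for a large, already-trained ANN.
def oracle(x):
    return np.tanh(2.0 * x[:, 0] - x[:, 1])

# Unlabeled data: oracle learning needs only inputs, not true targets.
X_unlabeled = rng.normal(size=(500, 2))

# Step 1: let the oracle label the unlabeled data.
y_oracle = oracle(X_unlabeled)

# Step 2: fit the smaller model to mimic the oracle's outputs.
# (Here: linear least squares with a bias column, for brevity.)
X_aug = np.hstack([X_unlabeled, np.ones((len(X_unlabeled), 1))])
w, *_ = np.linalg.lstsq(X_aug, y_oracle, rcond=None)

def small_model(x):
    return np.hstack([x, np.ones((len(x), 1))]) @ w

# The small model now approximates the oracle on held-out inputs.
X_test = rng.normal(size=(100, 2))
mse = np.mean((small_model(X_test) - oracle(X_test)) ** 2)
```

Note that no ground-truth labels are used anywhere: the small model's targets come entirely from the oracle, which is what lets the method exploit unlabeled data.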