Title :
On memory size occupied by a multilayer perceptron
Author :
Liang, Xun ; Xia, Shaowei
Author_Institution :
Dept. of Autom., Tsinghua Univ., Beijing, China
Abstract :
When a multilayer perceptron (MLP) is trained by the backpropagation (BP) algorithm, it is difficult to estimate Q1, the memory size occupied by the weights and thresholds of the MLP, because the number of hidden layers and the number of neurons in each hidden layer are unpredictable. By introducing a training difficulty factor, the paper overcomes this difficulty and gives an estimate of Q1. Comparing Q1 with Q2, where Q2 denotes the memory size occupied by the pattern pairs themselves, shows that only real pattern pairs can be compressed by MLPs.
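The comparison between Q1 and Q2 can be illustrated with a minimal sketch. The paper's training difficulty factor and its actual estimate of Q1 are not reproduced here; the layer sizes, pattern count, and 4-byte-per-value storage assumption below are hypothetical, chosen only to show how the two memory sizes would be tallied and compared for a fully connected MLP:

```python
# Illustrative sketch (not the paper's formula): compare the memory
# occupied by an MLP's parameters (Q1) with the memory occupied by the
# training pattern pairs themselves (Q2).

def mlp_param_count(layer_sizes):
    """Number of weights and thresholds (biases) in a fully connected MLP."""
    weights = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
    thresholds = sum(layer_sizes[1:])  # one threshold per non-input neuron
    return weights + thresholds

def memory_sizes(layer_sizes, n_patterns, bytes_per_value=4):
    """Return (Q1, Q2) in bytes: parameter memory vs. pattern-pair memory."""
    q1 = mlp_param_count(layer_sizes) * bytes_per_value
    # each pattern pair stores one input vector and one target vector
    q2 = n_patterns * (layer_sizes[0] + layer_sizes[-1]) * bytes_per_value
    return q1, q2

# Hypothetical 8-4-8 network storing 1000 pattern pairs.
q1, q2 = memory_sizes([8, 4, 8], n_patterns=1000)
print(q1, q2, q1 < q2)  # the MLP "compresses" the pairs only when Q1 < Q2
```

The last line prints the two sizes and whether the network is smaller than the data it encodes, which is the sense of "compression" the abstract's Q1 < Q2 comparison addresses.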
Keywords :
backpropagation; content-addressable storage; multilayer perceptrons; hidden layers; memory size; neural nets; pattern pairs; thresholds; weights; Automation; Information management; Laboratories; Management training; Neurons
Conference_Titel :
Proceedings of 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN :
0-7803-1421-2
DOI :
10.1109/IJCNN.1993.716983