Title :
An incremental learning method using weighted magnitude of interference
Author :
Yamaguchi, Nobuhiko ; Minagawa, Youichi ; Yamauchi, Koichiro ; Ishii, Naohiro
Author_Institution :
Dept. of Intelligence & Comput. Sci., Nagoya Inst. of Technol., Japan
Abstract :
Almost all neural network models need to relearn past training patterns when memorizing new patterns incrementally, in order to avoid forgetting past memories. This is because each parameter of the neural network contributes to memorizing several training patterns, so the change in each parameter caused by learning interferes with past memories. In this paper, we propose a new incremental learning method that omits the relearning process to reduce the cost of incremental learning. The method uses an objective function, minimized during learning, called the "weighted magnitude of interference." This function combines the change in the output function of the neural network caused by the modification of its parameters with the squared error on the new patterns.
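The abstract describes an objective that trades off the squared error on the new pattern against a penalty on the change in the network's output on earlier inputs. The sketch below is a minimal illustration of that idea, not the authors' exact formulation: all function names, the radial-basis feature map, the choice of reference inputs, and the weighting `lam` are illustrative assumptions.

```python
import numpy as np

def phi(x):
    # Gaussian radial-basis features at fixed centers (illustrative choice,
    # not the network model from the paper).
    centers = np.linspace(-1.0, 1.0, 5)
    return np.exp(-((x - centers) ** 2) / 0.5)

def incremental_update(w_old, x_new, y_new, ref_inputs, lam=1.0, lr=0.1, steps=200):
    """Learn a new pattern (x_new, y_new) for f(x) = w . phi(x) while
    penalizing the change in output on ref_inputs, i.e. minimize
        (f(x_new) - y_new)^2 + lam * sum_i (f(x_i) - f_old(x_i))^2
    by gradient descent on w. The second term plays the role of an
    interference penalty weighted by lam."""
    f_old = np.array([phi(x) @ w_old for x in ref_inputs])  # outputs to preserve
    w = w_old.copy()
    for _ in range(steps):
        err_new = phi(x_new) @ w - y_new
        grad = 2.0 * err_new * phi(x_new)
        for x, fo in zip(ref_inputs, f_old):
            grad += 2.0 * lam * (phi(x) @ w - fo) * phi(x)
        w -= lr * grad
    return w
```

For example, starting from a zero parameter vector and preserving the outputs at x = -1 and x = 1, the update fits the new pattern at x = 0 while leaving the reference outputs essentially unchanged. The paper's actual method weights the interference term differently; this sketch only shows the structure of such an objective.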
Keywords :
learning (artificial intelligence); neural nets; incremental learning; learning patterns; neural network; objective function; Artificial neural networks; Computational complexity; Computer science; Costs; Intelligent networks; Interference; Learning systems; Neural networks; Radio access networks; Resource management;
Conference_Titel :
Industrial Electronics Society, 2000. IECON 2000. 26th Annual Conference of the IEEE
Conference_Location :
Nagoya
Print_ISBN :
0-7803-6456-2
DOI :
10.1109/IECON.2000.972291