Title :
Generalization and C-information
Author :
Kamimura, Ryotaro ; Nakanishi, Shohachiro
Author_Institution :
Inf. Sci. Lab., Tokai Univ., Kanagawa, Japan
Date :
27 Jun-2 Jul 1994
Abstract :
We attempt to show that C-information is directly proportional to the generalization error of neural networks. This means that, to improve generalization performance, a network must retain as little information as possible about the input patterns, under the condition that it can still produce the targets correctly. To confirm this hypothesis of minimum information (the minimum information principle), two kinds of experiments on a language acquisition problem were performed: 1) networks were trained to infer the correct regular past-tense forms of various new verb stems; and 2) in addition to inferring regular past-tense forms, networks were trained to infer irregular forms and to judge whether given strings were well formed. In both cases, we could clearly see that the information was directly proportional to the generalization error. These results suggest that to improve generalization we must minimize the information, more precisely the C-information, and that some methods previously developed to improve generalization performance can be explained as the minimization of C-information.
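As a rough illustration of the minimum information principle described in the abstract, the sketch below trains a tiny network to fit its targets while penalizing the information its hidden layer stores about the input patterns. It is a minimal sketch only: the record does not give the exact definition of C-information, so the information function (log M minus the entropy of normalized hidden activations), the penalty weight LAMBDA, and the XOR stand-in task are all illustrative assumptions, not the authors' published formulation.

# Minimal sketch: fit targets while penalizing information stored
# in the hidden layer.  The penalty form and LAMBDA are assumptions,
# not the authors' definition of C-information.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def information(hidden):
    # Per-pattern "information" in the hidden layer: log(M) - H(p),
    # where p is the activation vector normalized to sum to one.
    # Zero when all M hidden units fire equally; largest when a
    # single unit carries all of the activity.
    p = hidden / (hidden.sum(axis=1, keepdims=True) + 1e-12)
    entropy = -(p * np.log(p + 1e-12)).sum(axis=1)
    return np.log(hidden.shape[1]) - entropy

# Toy task (XOR) standing in for the past-tense inference problems.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
LAMBDA = 0.01  # assumed trade-off between fitting targets and information

def unpack(theta):
    W1, b1 = theta[:8].reshape(2, 4), theta[8:12]
    W2, b2 = theta[12:16].reshape(4, 1), theta[16:]
    return W1, b1, W2, b2

def loss(theta):
    W1, b1, W2, b2 = unpack(theta)
    h = sigmoid(X @ W1 + b1)     # hidden activations
    y = sigmoid(h @ W2 + b2)     # network outputs
    mse = ((y - T) ** 2).mean()  # "produce targets correctly"
    return mse + LAMBDA * information(h).mean()

# Finite-difference gradient descent keeps the sketch dependency-free.
theta = rng.normal(scale=0.5, size=17)
for _ in range(3000):
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = 1e-5
        grad[i] = (loss(theta + e) - loss(theta - e)) / 2e-5
    theta -= 0.5 * grad

The design point mirrors the abstract's claim: the objective trades off task error against stored information, so among networks that produce the targets correctly, training prefers those whose hidden representation carries as little information about the inputs as possible.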
Keywords :
generalisation (artificial intelligence); grammars; information theory; minimisation; natural languages; neural nets; C-information; generalization errors; grammar; input patterns; language acquisition problem; minimization; minimum information principle; natural language processing; strings; Entropy; Error correction; Minimization methods; Natural languages; Neural networks
Conference_Titel :
Neural Networks, 1994 IEEE International Conference on (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.374231