Title :
Can artificial neural networks discover useful regularities?
Author :
Stone, J.V. ; Thornton, C.J.
Author_Institution :
Sussex Univ., Brighton, UK
Abstract :
Argues that the success of backpropagation artificial neural networks (BP ANNs) depends on input representations which recode input parameters in terms of low-order input statistics. We provide an analysis of how BP ANNs can take advantage of two types of low-order statistical effects to learn a given problem. First, BP ANNs utilise the spurious low-order statistics associated with almost any 'natural' problem. These statistics provide partial information about the underlying mapping, and therefore do not, in general, permit accurate generalisation. Second, it is always possible to hand-craft inputs so that a parameter which was originally coded by relations between input components is coded by a single input component. Such a recoding enables ANNs to perform tasks on which they would otherwise fail. We conjecture that many 'natural' problems represent sparse codings of some underlying relational mapping. If true, this suggests that the ability of BP ANNs to solve such problems is more apparent than real, because BP ANNs rely upon informative, but essentially spurious, correlations between input and output variables.
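The recoding described in the abstract's second point can be illustrated with parity, a purely relational mapping in which no single input component (and no low-order statistic) predicts the target. The sketch below is a hypothetical illustration of the idea, not code from the paper: appending the parity of the input bits as an explicit extra component turns the relational parameter into a single input component, which makes the mapping trivially learnable.

```python
# Hypothetical sketch (not from the paper): 3-bit parity is coded by the
# relations between all input components; recoding appends that relational
# parameter as a single, directly observable component.

def recode(bits):
    """Append the parity of `bits` as an explicit extra input component."""
    parity = 0
    for b in bits:
        parity ^= b
    return bits + [parity]

# Original 3-bit parity problem: the target depends on all bits jointly.
patterns = [[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)]
targets = [p[0] ^ p[1] ^ p[2] for p in patterns]

# After recoding, the final component alone equals the target, so the
# mapping is linearly separable and easy for a network to learn.
recoded = [recode(p) for p in patterns]
assert all(r[-1] == t for r, t in zip(recoded, targets))
```

The assertion at the end checks the key property: once the relational parameter is hand-coded into a single component, the input-output mapping no longer requires the network to discover the underlying relation.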
Keywords :
backpropagation; correlation theory; generalisation (artificial intelligence); neural nets; pattern recognition; statistics; backpropagation artificial neural networks; generalisation; input component relations; input parameter recoding; input representations; low-order input statistics; natural problems; relational mapping; sparse codings; spurious input-output variable correlations; useful regularities discovery;
Conference_Titel :
Fourth International Conference on Artificial Neural Networks, 1995
Conference_Location :
Cambridge
Print_ISBN :
0-85296-641-5
DOI :
10.1049/cp:19950554