DocumentCode :
285415
Title :
Exploiting fractalness of error surfaces: New methods for neural network learning
Author :
Kahng, Andrew B.
Author_Institution :
Dept. of Comput. Sci., California Univ., Los Angeles, CA, USA
Volume :
1
fYear :
1992
fDate :
10-13 May 1992
Firstpage :
41
Abstract :
Learning in neural networks can be formulated as global optimization of a multimodal error function over a high-dimensional space of connection weights. A general scaling model that describes the error surface as high-dimensional fractional Brownian motion (FBM), i.e., as a class of random fractals, is developed. The parameter of the FBM can be extracted by spectral analysis of the error profile recorded along a random walk in weight space. Scaling structure within the error surface has important implications for stochastic optimizations such as Boltzmann learning. Experimental data that confirm the fractalness of error surfaces for a wide range of problems and connection topologies are reviewed, and the implications of these results are discussed.
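The following is a minimal sketch, not the paper's code, of the measurement procedure the abstract describes: record the error profile along a random walk in weight space, compute its power spectrum, and estimate the FBM spectral exponent from the log-log slope. The toy error function, walk length, step size, and the relation H = (beta - 1)/2 for an fBm-like spectrum S(f) ~ 1/f^beta are illustrative assumptions, not details taken from the paper.

    # Sketch: estimate an FBM-like spectral exponent from the error profile
    # of a random walk in weight space (toy surface, illustrative parameters).
    import numpy as np

    rng = np.random.default_rng(0)

    def error(w):
        # Toy multimodal error surface standing in for a real network's error.
        return np.sum(w ** 2) + np.sum(np.sin(3.0 * w))

    dim, steps, step_size = 50, 4096, 0.01
    w = rng.standard_normal(dim)
    profile = np.empty(steps)
    for t in range(steps):
        w += step_size * rng.standard_normal(dim)   # unbiased random walk step
        profile[t] = error(w)

    # Power spectrum of the (mean-removed) error profile along the walk.
    spec = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
    freq = np.fft.rfftfreq(steps)

    # Fit the log-log slope over low/mid frequencies; assuming S(f) ~ 1/f^beta
    # with beta = 2H + 1 for fractional Brownian motion, H = (beta - 1) / 2.
    mask = (freq > 0) & (freq < 0.1)
    beta = -np.polyfit(np.log(freq[mask]), np.log(spec[mask]), 1)[0]
    H = (beta - 1.0) / 2.0
    print(f"spectral exponent beta ~ {beta:.2f}, Hurst parameter H ~ {H:.2f}")

A near-linear log-log spectrum over a broad frequency range is the kind of evidence the abstract refers to as confirming fractal (FBM-like) scaling of the error surface.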
Keywords :
Boltzmann machines; fractals; learning (artificial intelligence); Boltzmann learning; connection weights; error profile; error surfaces; fractalness; fractional Brownian motion; global optimization; multimodal error function; neural network learning; scaling model; spectral analysis; stochastic optimizations; weight space; Brownian motion; Computer errors; Computer science; Cost function; Data mining; Fractals; Network topology; Neural networks; Optimization methods; Stochastic processes;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1992 IEEE International Symposium on Circuits and Systems (ISCAS '92), Proceedings
Conference_Location :
San Diego, CA
Print_ISBN :
0-7803-0593-0
Type :
conf
DOI :
10.1109/ISCAS.1992.230019
Filename :
230019