DocumentCode
1365689
Title
An analytical framework for local feedforward networks
Author
Weaver, Scott ; Baird, Leemon ; Polycarpou, Marios M.
Author_Institution
Dept. of Electr. & Comput. Eng., Cincinnati Univ., OH, USA
Volume
9
Issue
3
fYear
1998
fDate
5/1/1998
Firstpage
473
Lastpage
482
Abstract
Interference in neural networks occurs when learning in one area of the input space causes unlearning in another area. Networks that are less susceptible to interference are referred to as spatially local networks. To better understand these properties, we develop a theoretical framework consisting of a measure of interference and a measure of network localization. These measures incorporate not only the network weights and architecture but also the learning algorithm. Using this framework to analyze sigmoidal multilayer perceptron (MLP) networks trained by backpropagation on the quadratic cost function, we address the familiar misconception that single-hidden-layer sigmoidal networks are inherently nonlocal: given a sufficiently large number of adjustable weights, there exist single-hidden-layer sigmoidal MLPs that are arbitrarily local yet retain the ability to approximate any continuous function on a compact domain.
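The interference phenomenon described in the abstract can be illustrated with a minimal sketch: a single backpropagation step at one input point changes the network's output at a distant, untrained point. The network size, learning rate, and the before/after output difference used as an interference proxy are assumptions for this illustration, not the paper's formal measures.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single-hidden-layer sigmoidal MLP: 1 input, H hidden units, 1 linear output.
H = 8
W1 = rng.normal(size=(H, 1)); b1 = np.zeros(H)
W2 = rng.normal(size=(1, H)); b2 = np.zeros(1)

def forward(x):
    h = sigmoid(W1 @ np.array([x]) + b1)
    return (W2 @ h + b2).item(), h

def backprop_step(x, target, lr=0.5):
    """One gradient step on the quadratic cost (y - target)^2 / 2 at input x."""
    global W1, b1, W2, b2
    y, h = forward(x)
    err = y - target
    gW2 = err * h[None, :]                  # dC/dW2
    gb2 = np.array([err])                   # dC/db2
    dh = err * W2.ravel() * h * (1.0 - h)   # backprop through sigmoid
    gW1 = dh[:, None] * np.array([[x]])     # dC/dW1
    gb1 = dh                                # dC/db1
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

x_a, x_b = -2.0, 2.0                 # train at x_a, probe at x_b
y_b_before, _ = forward(x_b)
backprop_step(x_a, target=1.0)       # learning occurs only at x_a
y_b_after, _ = forward(x_b)

# Interference at x_b: the output there moved even though no training
# occurred at x_b. A spatially local network keeps this change small.
print(abs(y_b_after - y_b_before))
```

A localization measure in the spirit of the paper would aggregate such cross-point output changes over the whole input domain rather than at a single probe point.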
Keywords
backpropagation; feedforward neural nets; multilayer perceptrons; analytical framework; backpropagation learning algorithm; local feedforward networks; network localization; quadratic cost function; sigmoidal multilayer perceptron networks; single-hidden-layer sigmoidal networks; spatially local networks; Algorithm design and analysis; Backpropagation algorithms; Cost function; Feedforward neural networks; Gradient methods; Interference; Learning systems; Multi-layer neural network; Multilayer perceptrons; Neural networks
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/72.668889
Filename
668889
Link To Document