Title :
Perceptron-based learning algorithms
Author :
Gallant, Stephen I.
Author_Institution :
Coll. of Comput. Sci., Northeastern Univ., Boston, MA, USA
Date :
6/1/1990
Abstract :
A key task for connectionist research is the development and analysis of learning algorithms. Several supervised learning algorithms for single-cell and network models are examined. The heart of these algorithms is the pocket algorithm, a modification of perceptron learning that makes perceptron learning well-behaved with nonseparable training data, even if the data are noisy and contradictory. Features of these algorithms include: speed, i.e. the algorithms are fast enough to handle large sets of training data; network scaling, i.e. network methods scale up almost as well as single-cell models when the number of inputs is increased; analytic tractability, i.e. upper bounds on classification error are derivable; online learning, i.e. some variants can learn continually, without referring to previous data; and winner-take-all groups or choice groups, i.e. the algorithms can be adapted to select one out of a number of possible classifications. These learning algorithms are suitable for applications in machine learning, pattern recognition, and connectionist expert systems.
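For illustration, a minimal Python sketch of the pocket algorithm as described in the abstract: run ordinary perceptron learning, but keep "in the pocket" the weight vector that has achieved the longest run of consecutive correct classifications. This is an assumption-based reading of the abstract, not the paper's implementation; the bipolar (+1/-1) labels, the bias input folded into the weight vector, the random example selection, and names such as pocket_train and pocket_w are illustrative choices only.

import numpy as np

def pocket_train(X, y, epochs=100, rng=None):
    """Perceptron learning that keeps the best weights seen so far 'in the pocket',
    so it stays well-behaved even when the training data are not separable."""
    rng = np.random.default_rng(rng)
    X = np.hstack([X, np.ones((X.shape[0], 1))])   # append a constant bias input
    w = np.zeros(X.shape[1])                        # current perceptron weights
    pocket_w = w.copy()                             # best weights found so far
    run, pocket_run = 0, 0                          # consecutive-correct counters

    for _ in range(epochs * len(X)):
        i = rng.integers(len(X))                    # pick a random training example
        if np.sign(X[i] @ w) == y[i]:               # correctly classified:
            run += 1                                #   extend the current run
            if run > pocket_run:                    #   longer run than the pocket's?
                pocket_w, pocket_run = w.copy(), run
        else:                                       # misclassified:
            w = w + y[i] * X[i]                     #   standard perceptron update
            run = 0                                 #   and reset the run counter
    return pocket_w

def predict(w, X):
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.sign(X @ w)

Called as pocket_train(X, y) on a matrix of feature rows and a vector of +1/-1 labels, it returns the pocketed weight vector, which predict then applies to new data; unlike plain perceptron learning, the returned weights do not depend on where training happened to stop.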
Keywords :
learning systems; neural nets; connectionist expert systems; learning algorithms; machine learning; network scaling; pattern recognition; perceptron; single-cell models; training data; Algorithm design and analysis; Classification algorithms; Heart; Hybrid intelligent systems; Machine learning; Machine learning algorithms; Pattern recognition; Supervised learning; Training data; Upper bound;
Journal_Title :
Neural Networks, IEEE Transactions on