Title :
Constructive neural-network learning algorithms for pattern classification
Author :
Parekh, Rajesh ; Yang, Jihoon ; Honavar, Vasant
Author_Institution :
Allstate Res. & Planning Center, Menlo Park, CA, USA
Date :
3/1/2000
Abstract :
Constructive learning algorithms offer an attractive approach for the incremental construction of near-minimal neural-network architectures for pattern classification. They help overcome the need for ad hoc and often inappropriate choices of network topology in algorithms that search for suitable weights in a priori fixed network architectures. Several such algorithms have been proposed in the literature and shown to converge to zero classification errors (under certain assumptions) on tasks that involve learning a binary-to-binary mapping (i.e., classification problems involving binary-valued input attributes and two output categories). We present two constructive learning algorithms, MPyramid-real and MTiling-real, that extend the pyramid and tiling algorithms, respectively, to learning real-to-M-ary mappings (i.e., classification problems involving real-valued input attributes and multiple output classes). We prove the convergence of these algorithms and empirically demonstrate their applicability to practical pattern classification problems. Additionally, we show how the incorporation of a local pruning step can eliminate several redundant neurons from MTiling-real networks.
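To make the constructive idea behind the abstract concrete, the following Python sketch illustrates a tiling-style growth step (add threshold units to a layer until its representation is "faithful", i.e., no two examples from different classes share the same layer output) followed by a simple local pruning pass that drops units not needed for faithfulness. This is only an illustration under simplifying assumptions (a single layer, binary labels, perceptron-trained threshold units); it is not the authors' MTiling-real implementation, and helper names such as grow_layer and prune_layer are hypothetical.

import numpy as np

def train_threshold_unit(X, y, epochs=100, lr=0.1):
    """Perceptron-style training of one threshold unit; y in {0, 1}."""
    w = np.zeros(X.shape[1] + 1)                      # last entry is the bias weight
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0
            w += lr * (yi - pred) * xi
    return w

def layer_outputs(X, weights):
    """Binary output pattern of the whole layer for each example."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (Xb @ np.array(weights).T > 0).astype(int)

def is_faithful(outputs, y):
    """Faithful: no two examples with different labels share a layer output."""
    seen = {}
    for out, label in zip(map(tuple, outputs), y):
        if seen.setdefault(out, label) != label:
            return False
    return True

def grow_layer(X, y, max_units=10):
    """Add ancillary units until the layer is faithful or a budget is reached."""
    weights = [train_threshold_unit(X, y)]           # master unit trained on class labels
    while not is_faithful(layer_outputs(X, weights), y) and len(weights) < max_units:
        # Pick one group of examples that share an output but disagree on the
        # label, and train an ancillary unit on just that group to split it.
        outs = layer_outputs(X, weights)
        groups = {}
        for i, out in enumerate(map(tuple, outs)):
            groups.setdefault(out, []).append(i)
        idx = next(g for g in groups.values() if len(set(y[g])) > 1)
        weights.append(train_threshold_unit(X[idx], y[idx]))
    return weights

def prune_layer(X, y, weights):
    """Local pruning: remove any unit whose removal keeps the layer faithful."""
    kept = list(weights)
    for i in range(len(kept) - 1, 0, -1):            # never remove the master unit
        trial = kept[:i] + kept[i + 1:]
        if is_faithful(layer_outputs(X, trial), y):
            kept = trial
    return kept

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = (X[:, 0] * X[:, 1] > 0).astype(int)          # XOR-like, not linearly separable
    layer = prune_layer(X, y, grow_layer(X, y))
    print("units after pruning:", len(layer))

In this sketch, growth stops once the layer separates the training set into label-pure groups, and pruning scans units in reverse order, keeping the master unit; the paper's actual criteria, unit-training procedure, and pruning rule differ in detail.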
Keywords :
convergence; learning (artificial intelligence); minimisation; neural net architecture; pattern classification; MPyramid-real; MTiling-real; binary-valued input attributes; constructive neural-network learning algorithms; incremental construction; local pruning step; near-minimal neural-network architectures; pyramid algorithm; redundant neuron elimination; tiling algorithm; zero classification errors; Backpropagation algorithms; Classification algorithms; Function approximation; Network topology; Neural networks; Neurons; Space exploration; Training data
Journal_Title :
IEEE Transactions on Neural Networks