Abstract:
The problem of approximating nonlinear mappings (especially continuous mappings) is considered. Regularization theory and a theoretical framework for approximation (based on regularization techniques) that leads to a class of three-layer networks called regularization networks are discussed. Regularization networks are mathematically related to radial basis functions, which have mainly been used for strict interpolation tasks. Learning as approximation and learning as hypersurface reconstruction are discussed. Two extensions of the regularization approach are presented, along with the approach's connections to splines, regularization, Bayes formulation, and clustering. The theory of regularization networks is generalized to a formulation that includes task-dependent clustering and dimensionality reduction. Applications of regularization networks are discussed.
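To make the stated relationship to radial basis functions concrete, the sketch below implements the simplest form of such a network: an expansion f(x) = Σ_i c_i G(x, x_i) whose coefficients solve the regularized linear system (G + λI)c = y. This is a minimal illustrative sketch; the Gaussian kernel and the particular values of σ and λ are assumptions, not choices prescribed by the abstract.

```python
# Minimal sketch of a regularization network in its radial-basis-function form:
# f(x) = sum_i c_i G(x, x_i), with coefficients from (G + lambda*I) c = y.
# Gaussian kernel, sigma, and lam are illustrative assumptions.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian radial basis function values between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularization_network(X, y, lam=1e-2, sigma=1.0):
    # Solve the regularized system for the expansion coefficients c.
    G = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(G + lam * np.eye(len(X)), y)

def predict(X_train, c, X_new, sigma=1.0):
    # Evaluate f(x) = sum_i c_i G(x, x_i) at the new points.
    return gaussian_kernel(X_new, X_train, sigma) @ c

# Example: approximate a noisy one-dimensional mapping.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
c = fit_regularization_network(X, y, lam=1e-2, sigma=0.5)
X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
print(predict(X, c, X_test, sigma=0.5))
```

Setting λ = 0 recovers strict interpolation of the data; a positive λ trades data fit for smoothness, which is the regularization view of learning from noisy examples.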
Keywords:
approximation theory; learning systems; neural nets; Bayes formulation; approximation; clustering; dimensionality reduction; hypersurface; interpolation; neural networks; nonlinear mapping; regularization networks; splines; three-layer networks; Approximation methods; Artificial intelligence; Associative memory; Backpropagation algorithms; Contracts; Network synthesis; Network topology; Neural networks; Prototypes; System identification