Title :
Improved generalization and network pruning using adaptive Laplace regularization
Author_Institution :
Sussex Univ., UK
Abstract :
Neural networks designed for regression or classification need to be trained using some form of stabilization or regularization if they are to generalize well beyond the original training set. This means finding a balance between the complexity of the network and the information content of the data. This paper examines a type of formal regularization in which the penalty term is proportional to the logarithm of the L_p norm of the weight vector, log((Σ_j |ω_j|^p)^(1/p)), for p ⩾ 1. The specific choice p=1 simultaneously provides both stabilization and radical pruning, leading to greatly improved generalization.
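The penalty term described in the abstract can be sketched in a few lines; this is a hypothetical illustration of the formula, not the authors' implementation. It uses the identity log((Σ_j |ω_j|^p)^(1/p)) = (1/p)·log(Σ_j |ω_j|^p):

```python
import numpy as np

def lp_log_penalty(weights, p=1.0):
    """Penalty proportional to the log of the L_p norm of the weight
    vector: log((sum_j |w_j|^p)^(1/p)), with p >= 1.

    For p = 1 this corresponds to the (log) Laplace penalty that the
    abstract reports as yielding both stabilization and pruning.
    """
    if p < 1.0:
        raise ValueError("the abstract restricts the penalty to p >= 1")
    w = np.asarray(weights, dtype=float)
    # log((sum |w|^p)^(1/p)) == log(sum |w|^p) / p
    return np.log(np.sum(np.abs(w) ** p)) / p
```

For example, `lp_log_penalty([3.0, 4.0], p=2.0)` returns log(5), since the L_2 norm of (3, 4) is 5.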
Keywords :
estimation theory; learning (artificial intelligence); neural nets; optimisation; Bayesian estimation; adaptive Laplace regularization; complexity; generalization; information content; network pruning; penalty term; weight vector;
Conference_Titel :
Third International Conference on Artificial Neural Networks, 1993
Conference_Location :
Brighton
Print_ISBN :
0-85296-573-7