Title :
Extending the FPE and the effective number of parameters to neural estimators
Author_Institution :
Dipartimento di Elettronica, Politecnico di Milano, Italy
Abstract :
In this paper we deal with the problem of optimally dimensioning a neural model of regression type. The ultimate goal is to tune its complexity to the information present in the data set and to validate its performance without needing fresh data for cross-validation. The approximating function is not required to belong to the model family; this introduces a structural bias in the neural estimator which must be accounted for. A generalisation of the final prediction error (FPE) to biased estimators is provided, which extends the one suggested by Moody (1992). It is shown that the effective number of parameters of a model a priori differs from the number of free parameters, depending also on the degree of biasing.
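As background to the abstract: the classical Akaike FPE for an unbiased linear model with p free parameters is FPE = MSE_train · (N + p)/(N − p), and Moody-style extensions replace p with an effective parameter count. A minimal sketch of these two standard ingredients (function names, and the regularised linear smoother used to illustrate an effective parameter count below p, are our own illustration, not taken from the paper):

```python
import numpy as np

def fpe(mse_train, n_samples, p_eff):
    """Akaike-style final prediction error, with an effective
    parameter count p_eff in place of the raw parameter count."""
    return mse_train * (n_samples + p_eff) / (n_samples - p_eff)

def effective_parameters(X, lam=0.0):
    """Effective number of parameters of a (possibly regularised)
    linear smoother, computed as the trace of the hat matrix
    H = X (X^T X + lam*I)^{-1} X^T.  With lam = 0 and full-rank X
    this equals the number of columns (free parameters); lam > 0
    shrinks it below that count."""
    xtx = X.T @ X
    hat = X @ np.linalg.solve(xtx + lam * np.eye(X.shape[1]), X.T)
    return np.trace(hat)
```

For example, an unregularised full-rank design with 3 columns has exactly 3 effective parameters, while adding ridge regularisation yields fewer; a biased estimator can likewise have an effective count differing from the nominal one, which is the phenomenon the paper quantifies for neural estimators.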
Keywords :
error analysis; function approximation; learning (artificial intelligence); neural nets; optimisation; parameter estimation; prediction theory; Moody criterion; dimensioning; final prediction error; generalisation; neural estimators; neural model; regression; structural bias
Conference_Titel :
1996 IEEE International Conference on Neural Networks
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-3210-5
DOI :
10.1109/ICNN.1996.548894