Title :
Invariance of reparametrization in model selection of neural networks
Author :
Yang, Jian ; Luo, Si-Wei
Author_Institution :
Sch. of Comput. & Inf. Technol., Beijing Jiaotong Univ., China
Abstract :
The goal of model selection is to identify the model that generated the data. The goodness of a model is measured by its generalization ability, which balances two opposing pressures: goodness of fit and model complexity. In this paper we take the neural network as an example and use the concept of curvature from differential geometry to explore the intrinsic model complexity, which is invariant under reparametrization. Through theoretical analysis we then show that the future residual, which is qualified to measure generalization, can be expressed in terms of the model's intrinsic curvature array. From this we derive a new model selection criterion that not only accounts for factors such as the number of parameters, the sample size, and the functional form, but also admits a clear and intuitive geometric interpretation of model selection.
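For orientation only, the sketch below illustrates the fit-versus-complexity balance the abstract describes: it trains one-hidden-layer networks of several widths on toy data and scores each with a generic fit-plus-penalty criterion. The training routine, the AIC-like parameter penalty, and all constants are placeholder assumptions, not the curvature-array criterion derived in the paper.

```python
# Minimal sketch of penalty-based model selection for neural networks.
# The criterion form (fit term + complexity penalty) is generic; the
# specific penalty below is a placeholder, NOT the intrinsic-curvature
# criterion proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)

def fit_mlp(X, y, hidden, epochs=2000, lr=0.05):
    """Train a one-hidden-layer tanh network by plain gradient descent."""
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden, 1))
    b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)          # hidden activations
        yhat = H @ W2 + b2                # network output
        err = yhat - y                    # residuals on the training set
        gW2 = H.T @ err / n
        gb2 = err.mean()
        gH = (err @ W2.T) * (1 - H ** 2)  # backprop through tanh
        gW1 = X.T @ gH / n
        gb1 = gH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    predict = lambda Xq: np.tanh(Xq @ W1 + b1) @ W2 + b2
    k = W1.size + b1.size + W2.size + 1   # number of free parameters
    return predict, k

# Toy data: a smooth target plus Gaussian noise.
X = rng.uniform(-2, 2, size=(60, 1))
y = np.sin(2 * X) + 0.1 * rng.normal(size=(60, 1))
n = len(X)

scores = {}
for hidden in (1, 2, 4, 8, 16):
    model, k = fit_mlp(X, y, hidden)
    mse = float(np.mean((model(X) - y) ** 2))
    # Placeholder criterion: log of the fit term plus an AIC-like penalty.
    scores[hidden] = n * np.log(mse) + 2 * k

best = min(scores, key=scores.get)
print(scores, "-> selected hidden units:", best)
```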
Keywords :
computational complexity; computational geometry; differential geometry; generalisation (artificial intelligence); neural nets; curvature array; generalization; geometric complexity; geometric curvature; model complexity; model selection reparametrization; neural networks; residual error; Bayesian methods; Computer networks; Electronic mail; Geometry; Information technology; Intelligent networks; Neural networks; Size measurement; Solid modeling; Temperature; Curvature Array; Future Residual Error; Geometric Complexity; Geometric Curvature; Model Selection; Solution Locus;
Conference_Titel :
Proceedings of the 2005 International Conference on Machine Learning and Cybernetics
Conference_Location :
Guangzhou, China
Print_ISBN :
0-7803-9091-1
DOI :
10.1109/ICMLC.2005.1527660