Title :
A Hybrid Methodology for Improving Generalization Performance of Neural Networks
Author_Institution :
Sch. of Inf. Sci. & Technol., Guangdong Univ. of Bus. Studies, Guangzhou
Abstract :
A new hybrid approach to improving the generalization of feedforward neural networks is presented. The hybrid approach combines several efficient methods: feature extraction via the wavelet transform, construction of an optimal network architecture via fast cascade-correlation, dynamic optimization of learning parameters via simultaneous determination, and avoidance of overfitting via fast cross-validation. Experimental results show that the hybrid approach can automatically design optimal neural networks with good generalization capability, small network size, and short training time in comparison with other approaches.
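The sketch below is only a simplified illustration of the kind of hybrid pipeline the abstract describes, not the authors' implementation: wavelet coefficients serve as compact features, hidden units are added one at a time (a crude stand-in for fast cascade-correlation), and a held-out validation set stops growth before overfitting (standing in for fast cross-validation). The toy data, wavelet choice (db4), and unit-growth loop are all illustrative assumptions.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

def wavelet_features(signals, wavelet="db4", level=3):
    """Compress each raw signal into its coarse wavelet approximation coefficients."""
    return np.array([pywt.wavedec(s, wavelet, level=level)[0] for s in signals])

# Hypothetical toy task: recover the amplitude of a noisy sine segment.
rng = np.random.default_rng(0)
amps = rng.uniform(0.5, 2.0, 300)
X_raw = np.array([np.sin(np.linspace(0, 4 * np.pi, 128)) * a + rng.normal(0, 0.1, 128)
                  for a in amps])
y = amps

X = wavelet_features(X_raw)                      # feature-extraction step
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

# Grow the hidden layer unit by unit; keep the smallest network whose
# validation error still improves (validation-based stopping against overfitting).
best_err, best_units = np.inf, 1
for units in range(1, 16):
    net = MLPRegressor(hidden_layer_sizes=(units,), max_iter=2000, random_state=0)
    net.fit(X_tr, y_tr)
    err = np.mean((net.predict(X_val) - y_val) ** 2)
    if err < best_err - 1e-4:
        best_err, best_units = err, units
print(f"selected {best_units} hidden units, validation MSE {best_err:.4f}")
```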
Keywords :
feedforward neural nets; learning (artificial intelligence); optimisation; wavelet transforms; dynamic optimization; fast cascade-correlation; feature extraction; feedforward neural network; learning parameter; optimal neural network; simultaneous determination; wavelet transform; Convergence; Cybernetics; Electronic mail; Feature extraction; Feedforward neural networks; Frequency; Information science; Machine learning; Neural networks; Optimization methods; Wavelet transforms; Bottom-up; Cascade-Correlation; Cross-Validation; Generalization; Learning Parameters Optimization; Neural Networks; Wavelet Transform;
Conference_Title :
Machine Learning and Cybernetics, 2006 International Conference on
Conference_Location :
Dalian, China
Print_ISBN :
1-4244-0061-9
DOI :
10.1109/ICMLC.2006.258412