DocumentCode :
3130723
Title :
Improving neural network learning through output vector scaling
Author :
Altun, H. ; Curtis, K.M.
Author_Institution :
Dept. of Electr. & Electron. Eng., Nottingham Univ., UK
Volume :
2
fYear :
1997
fDate :
2-4 Jul 1997
Firstpage :
723
Abstract :
We show that output vector scaling can improve the estimation performance of a neural network for any training pattern. The network achieves a smaller estimation error when it is trained with a sigmoid-like scaled output vector than when it is trained with a linearly or nonlinearly scaled one. The most populated region of the output vector domain is identified by investigating the nature of the problem to be solved. Through a vector scaling technique that incorporates this knowledge, more resolution can be assigned to the more populated region of the output vector domain. A neural network trained with scaled output vectors therefore estimates the output vector more accurately, with increased resolution, when recall or generalisation is carried out. Results indicate that, if the nature of the problem is suitable, the technique can also speed up convergence. The technique is applied to estimate the control parameters of an articulatory speech synthesizer.
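A minimal sketch of the output-vector scaling idea described in the abstract. The function names, the choice of a logistic (sigmoid-like) mapping, the steepness value, and the use of the median to locate the most populated region are illustrative assumptions, not the authors' exact formulation.
```python
import numpy as np

def sigmoid_scale(y, centre, steepness=4.0):
    """Map raw targets y through a sigmoid-like curve centred on the densest
    region of the output domain, so that region receives finer resolution."""
    return 1.0 / (1.0 + np.exp(-steepness * (y - centre)))

def sigmoid_unscale(t, centre, steepness=4.0):
    """Invert the scaling after recall/generalisation to recover estimates
    in the original output units."""
    t = np.clip(t, 1e-6, 1.0 - 1e-6)   # guard against log(0)
    return centre + np.log(t / (1.0 - t)) / steepness

# Hypothetical example: control-parameter targets cluster around 0.2, so the
# sigmoid is centred there before the network is trained on the scaled targets.
targets = np.random.normal(loc=0.2, scale=0.05, size=1000)
centre = np.median(targets)             # crude estimate of the most populated region
scaled_targets = sigmoid_scale(targets, centre)

# After training, network outputs in (0, 1) are mapped back to the original units.
recovered = sigmoid_unscale(scaled_targets, centre)
assert np.allclose(recovered, targets)
```
In this sketch the scaled targets would replace the raw ones during backpropagation training; the inverse mapping is applied to the network's outputs at recall time.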
Keywords :
backpropagation; neural nets; parameter estimation; speech synthesis; backpropagation; learning; neural network; output vector scaling; parameter estimation; speech synthesizer; Backpropagation algorithms; Convergence; Cost function; Estimation error; Neural networks; Parallel processing; Parameter estimation; Redundancy; Speech synthesis; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1997 13th International Conference on Digital Signal Processing (DSP 97) Proceedings
Conference_Location :
Santorini
Print_ISBN :
0-7803-4137-6
Type :
conf
DOI :
10.1109/ICDSP.1997.628454
Filename :
628454