Title :
Towards a more analytical training of neural networks and neuro-fuzzy systems
Author :
Ruano, António E. ; Cabrita, Cristiano L. ; Ferreira, Pedro M.
Author_Institution :
Centre for Intell. Syst., IST, Portugal
Abstract :
When used for function approximation purposes, neural networks belong to a class of models whose parameters can be separated into linear and nonlinear ones, according to their influence on the model output. In this work we extend this concept to the case where the training problem is formulated as the minimization of the integral of the squared error over the input domain. With this approach, gradient-based nonlinear optimization algorithms require the computation of two kinds of terms: terms that depend only on the model and on the input domain, and terms that are the projection of the target function onto the basis functions and onto their derivatives with respect to the nonlinear parameters. The latter terms can be computed numerically from the available data. This functional approach brings at least two advantages over the standard training formulation: first, savings in computational complexity, as some terms are independent of the size of the data set and matrix inverses or pseudo-inverses are avoided; second, as the performance surface obtained with this approach is closer to the one defined by the true (typically unknown) function, gradient-based training algorithms are more likely to find models that better fit the underlying function.
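As a minimal sketch of the decomposition described in the abstract (the symbols J, u, v, phi, D and f are our own notation, not necessarily the authors'), the functional criterion for a model that is linear in the parameters u and nonlinear in v can be expanded as:

J(\mathbf{u},\mathbf{v}) \;=\; \int_{D}\big(f(\mathbf{x}) - \boldsymbol{\varphi}^{T}(\mathbf{x},\mathbf{v})\,\mathbf{u}\big)^{2}\,d\mathbf{x}
\;=\; \int_{D} f^{2}(\mathbf{x})\,d\mathbf{x} \;-\; 2\,\mathbf{u}^{T}\!\int_{D} f(\mathbf{x})\,\boldsymbol{\varphi}(\mathbf{x},\mathbf{v})\,d\mathbf{x} \;+\; \mathbf{u}^{T}\Big(\int_{D}\boldsymbol{\varphi}(\mathbf{x},\mathbf{v})\,\boldsymbol{\varphi}^{T}(\mathbf{x},\mathbf{v})\,d\mathbf{x}\Big)\mathbf{u}

Under this reading, the last integral depends only on the basis functions and on the input domain D, so it can be evaluated independently of the data-set size, while the middle term, the projection of the target f onto the basis functions (and, in the gradient, onto their derivatives with respect to v), is the part that must be estimated numerically from the available samples.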
Keywords :
computational complexity; function approximation; fuzzy neural nets; fuzzy systems; gradient methods; learning (artificial intelligence); nonlinear programming; computational complexity savings; function approximation; gradient-based nonlinear optimization algorithms; gradient-based training algorithms; neural network training; neuro-fuzzy system training; squared error integral minimization; target function; Analytical models; Availability; Complexity theory; Computational modeling; Jacobian matrices; Neural networks; Training; Neural networks training; functional back-propagation; parameter separability;
Conference_Title :
2011 IEEE 7th International Symposium on Intelligent Signal Processing (WISP)
Conference_Location :
Floriana
Print_ISBN :
978-1-4577-1403-0
DOI :
10.1109/WISP.2011.6051695