DocumentCode
1046563
Title
A Fault-Tolerant Regularizer for RBF Networks
Author
Leung, Chi-Sing ; Sum, John Pui-Fai
Author_Institution
City Univ. of Hong Kong, Kowloon
Volume
19
Issue
3
fYear
2008
fDate
3/1/2008
Firstpage
493
Lastpage
507
Abstract
Classical training methods for handling node open faults require considering many potential faulty networks. When the multinode fault situation is considered, the space of potential faulty networks becomes very large, so the objective function and the corresponding learning algorithm are computationally complicated. This paper uses the Kullback-Leibler divergence to define an objective function for improving the fault tolerance of radial basis function (RBF) networks. Under the assumption that the output data contain a Gaussian distributed noise term, a regularizer in the objective function is identified. Finally, the corresponding learning algorithm is developed. In our approach, both the objective function and the learning algorithm are computationally simple. Compared with some conventional approaches, including weight-decay-based regularizers, our approach achieves better fault tolerance. Moreover, our empirical study shows that our approach can improve the generalization ability of a fault-free RBF network.
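As a concrete illustration of the kind of regularized RBF training the abstract describes, the Python/NumPy sketch below fits the output weights by regularized least squares with a diagonal, activation-weighted penalty. The penalty matrix D, the Gaussian basis width, and the hyperparameter lam are illustrative assumptions, not the regularizer derived in the paper.

```python
# Illustrative sketch (not the paper's exact algorithm): RBF output-weight
# training with a diagonal, activation-weighted ridge penalty intended to
# make the network less sensitive to individual hidden-node open faults.
import numpy as np

def gaussian_rbf(X, centers, width):
    """Hidden-node activations phi_j(x) = exp(-||x - c_j||^2 / width^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / width**2)

def train_fault_tolerant_rbf(X, y, centers, width, lam):
    """Closed-form regularized least squares for the output weights.

    Solves (Phi^T Phi + lam * D) w = Phi^T y, where D is a diagonal matrix
    built from each hidden node's average squared activation (an assumed
    stand-in for the paper's regularizer, chosen only for illustration).
    """
    Phi = gaussian_rbf(X, centers, width)      # N x M design matrix
    G = Phi.T @ Phi                            # M x M Gram matrix
    D = np.diag(np.diag(G)) / X.shape[0]       # diagonal penalty (assumption)
    w = np.linalg.solve(G + lam * D, Phi.T @ y)
    return w

def predict(X, centers, width, w):
    return gaussian_rbf(X, centers, width) @ w

# Toy usage: fit a noisy sine curve with Gaussian output noise.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 200)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
centers = np.linspace(-3, 3, 15)[:, None]
w = train_fault_tolerant_rbf(X, y, centers, width=0.8, lam=0.5)
y_hat = predict(X, centers, width=0.8, w=w)
```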
Keywords
Gaussian distribution; Gaussian noise; fault tolerance; learning (artificial intelligence); radial basis function networks; Gaussian distributed noise term; Kullback-Leibler divergence; fault-tolerant regularizer; learning algorithm; multinode fault situation; objective function; radial basis function network; node open fault; regularization; Algorithms; Computer Simulation; Decision Support Techniques; Humans; Learning; Models, Statistical; Neural Networks (Computer)
fLanguage
English
Journal_Title
Neural Networks, IEEE Transactions on
Publisher
ieee
ISSN
1045-9227
Type
jour
DOI
10.1109/TNN.2007.912320
Filename
4439295
Link To Document