  • DocumentCode
    2755409
  • Title
    Fault tolerant learning using Kullback-Leibler divergence
  • Author
    Sum, John ; Leung, Chi-Sing ; Hsu, Lipin

  • Author_Institution
    Nat. Chung Hsing Univ., Taichung
  • fYear
    2007
  • fDate
    Oct. 30 2007-Nov. 2 2007
  • Firstpage
    1
  • Lastpage
    4
  • Abstract
    In this paper, an objective function for training a fault tolerant neural network is derived based on the idea of Kullback-Leibler (KL) divergence. The new objective function is then applied to a radial basis function (RBF) network subject to multiplicative weight noise. Simulation results demonstrate that the RBF network trained with the new objective function has better fault tolerance than one trained with explicit regularization. As KL divergence is related to Bayesian learning, the proposed objective function is also discussed in relation to other Bayesian-type objective functions.
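    The abstract's setting can be illustrated with a minimal sketch (the function, weight values, and noise variance below are hypothetical, not taken from the paper). Under multiplicative weight noise, a trained weight w is perturbed to w(1 + e) with e ~ N(0, s2), so the faulty weight is distributed as N(w, s2 * w^2). The closed-form KL divergence between univariate Gaussians is the basic quantity a KL-based objective of this kind builds on:

    ```python
    import math

    def gaussian_kl(mu_p, var_p, mu_q, var_q):
        # Closed-form KL(P || Q) for univariate Gaussians
        # P = N(mu_p, var_p), Q = N(mu_q, var_q).
        return 0.5 * (math.log(var_q / var_p)
                      + (var_p + (mu_p - mu_q) ** 2) / var_q
                      - 1.0)

    # Hypothetical trained weight and multiplicative-noise variance.
    w, s2 = 0.8, 0.04

    # Distribution of the faulty weight under multiplicative noise:
    # w * (1 + e), e ~ N(0, s2)  =>  N(w, s2 * w**2).
    faulty_var = s2 * w ** 2

    # A larger weight yields a wider faulty-weight distribution, which is
    # one intuition for why penalizing large weights aids fault tolerance.
    wider_var = s2 * (2 * w) ** 2
    print(gaussian_kl(w, faulty_var, 2 * w, wider_var))
    ```

    Note this only sketches the Gaussian KL computation itself; the paper's actual objective function for RBF training is derived in the full text.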
  • Keywords
    Bayes methods; fault tolerant computing; radial basis function networks; Bayesian type objective function; Kullback-Leibler divergence; fault tolerant neural network; multiplicative weight noise; radial basis function network; Additive white noise; Bayesian methods; Biomedical engineering; Electronic commerce; Fault tolerance; Multilayer perceptrons; Neural network hardware; Neural networks; Radial basis function networks; Signal to noise ratio;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    TENCON 2007 - 2007 IEEE Region 10 Conference
  • Conference_Location
    Taipei
  • Print_ISBN
    978-1-4244-1272-3
  • Electronic_ISBN
    978-1-4244-1272-3
  • Type
    conf
  • DOI
    10.1109/TENCON.2007.4429073
  • Filename
    4429073