• DocumentCode
    2348898
  • Title
    A Comparative Study of Different Learning Rate In Radial Basis Function
  • Author
    Kapoor, Richa; Kumar, Jay; Dhubkarya, D.C.; Nagariya, Deepak

  • Author_Institution
    SIT, Mathura, India
  • fYear
    2010
  • fDate
    26-28 Nov. 2010
  • Firstpage
    612
  • Lastpage
    616
  • Abstract
    This paper presents the implementation of the radial basis function algorithm in very high speed integrated circuit hardware description language (VHDL) using perceptron learning. Neural network hardware is usually defined as devices designed to implement neural architectures and learning algorithms. The radial basis function (RBF) network is a two-layer network whose output units form a linear combination of the basis functions computed by the hidden units, where each hidden unit function is a Gaussian. The radial basis function has a maximum of 1 when its input is 0; as the distance between the weight vector and the input decreases, the output increases. Thus, a radial basis neuron acts as a detector that produces 1 whenever the input is identical to its weight vector.
  • Keywords
    hardware description languages; learning (artificial intelligence); radial basis function networks; RBF; different learning rate; integrated circuit hardware description language; learning algorithms; neural architectures; neural network hardware; perceptron learning; radial basis function; FPGA; block RAM; training algorithm; weight
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Computational Intelligence and Communication Networks (CICN), 2010 International Conference on
  • Conference_Location
    Bhopal
  • Print_ISBN
    978-1-4244-8653-3
  • Electronic_ISBN
    978-0-7695-4254-6
  • Type
    conf
  • DOI
    10.1109/CICN.2010.121
  • Filename
    5702044
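
  The abstract above describes a Gaussian radial basis neuron whose output peaks at 1 when the input coincides with its weight (centre) vector, and a two-layer RBF network whose output units form a linear combination of the hidden-unit activations. The following is a minimal numerical sketch of that computation in Python, not the paper's method: the paper targets a VHDL hardware implementation with perceptron learning, and the function names, centre positions, output weights and spread parameter beta here are illustrative assumptions only.

    import numpy as np

    def rbf_neuron(x, centre, beta=1.0):
        # Gaussian radial basis neuron: outputs 1 when x is identical to its
        # centre (weight) vector; the output increases as the distance decreases.
        return np.exp(-beta * np.sum((x - centre) ** 2))

    def rbf_network(x, centres, output_weights, beta=1.0):
        # Two-layer RBF network: a hidden layer of Gaussian units followed by
        # a linear combination of the hidden activations.
        hidden = np.array([rbf_neuron(x, c, beta) for c in centres])
        return output_weights @ hidden

    # Illustrative usage with made-up centres and output weights.
    centres = np.array([[0.0, 0.0], [1.0, 1.0]])
    output_weights = np.array([0.5, -0.3])
    x = np.array([0.0, 0.0])
    print(rbf_neuron(x, centres[0]))                # 1.0: input equals the centre
    print(rbf_network(x, centres, output_weights))  # weighted sum of hidden outputs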