• DocumentCode
    315237
  • Title
    New bounds for correct generalization
  • Author
    Mattera, Davide; Palmieri, Francesco

  • Author_Institution
    Dipartimento di Ingegneria Elettronica, Naples Univ., Italy
  • Volume
    2
  • fYear
    1997
  • fDate
    9-12 Jun 1997
  • Firstpage
    1051
  • Abstract
    The theory of Vapnik and Chervonenkis on the minimization of the empirical risk provides a theoretical approach to determining the number of training examples required for a neural network architecture. We report a new bound on the joint probability that the approximation error between the binary function learned from the input/output examples and the target binary function is larger than ε while the empirical error on the examples is smaller than a fixed non-null fraction of ε. The given bounds are independent of the probability distribution on the input space and improve some existing results on the generalization abilities of an adaptive binary function.
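    For orientation, distribution-free results of this kind in the Vapnik-Chervonenkis framework are usually stated in the following general form (a sketch of the classical bound that results of this type refine; the constants $c_1$, $c_2$ and the growth-function notation $\Pi_F$ are illustrative and are not taken from the paper):

    \[
      \Pr\Big[\exists f \in F :\; R(f) > \varepsilon \;\wedge\; R_{\mathrm{emp}}(f) \le (1-\alpha)\,\varepsilon \Big]
      \;\le\; c_1\, \Pi_F(2m)\, \exp\!\big(-c_2\, \alpha^{2} \varepsilon\, m\big),
      \qquad 0 < \alpha \le 1,
    \]

    where $F$ is the class of binary functions realizable by the architecture, $m$ is the number of training examples, $R(f)$ is the true error of the hypothesis $f$, and $R_{\mathrm{emp}}(f)$ is its empirical error on the examples.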
  • Keywords
    error statistics; generalisation (artificial intelligence); learning (artificial intelligence); minimisation; neural net architecture; probability; Vapnik-Chervonenkis theory; approximation error; binary function; bounds; empirical risk; error probability; generalization; minimization; neural network architecture; Error correction; Error probability; Frequency measurement; Neural networks; Probability distribution; Risk analysis
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Conference on Neural Networks, 1997
  • Conference_Location
    Houston, TX
  • Print_ISBN
    0-7803-4122-8
  • Type
    conf
  • DOI
    10.1109/ICNN.1997.616173
  • Filename
    616173