• DocumentCode
    1629667
  • Title
    Optimal discrete perceptrons for graded learning
  • Author
    Elder, John F., IV
  • Author_Institution
    Dept. Syst. Eng., Virginia Univ., Charlottesville, VA, USA
  • fYear
    1992
  • Firstpage
    375
  • Abstract
    Perceptrons, the original artificial neural network structure, are finite in number for a given discrete-valued problem and can be exhaustively enumerated. The great benefit of exhaustive enumeration is that one has a complete distribution of empirical results: the global optimum is identified, any competitors or multiple solutions are known, and the unusualness of any solution can be assessed. Because the complete sample distribution of candidate models is available, model selection, inference, and prediction can be performed with a low level of supervision, that is, by graded learning. The enumeration procedure is described. Although the enumeration problem is NP-complete, the number of distinct perceptrons is tractable for a small number of inputs. An example application of the method is demonstrated for the task of learning investment rules for the US Treasury Bond market, with encouraging results.
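    The enumeration idea described in the abstract can be illustrated with a small, hypothetical sketch (not the paper's actual algorithm): sweep a grid of integer weights and biases for a threshold unit on binary inputs and collect the distinct truth tables it realizes. The weight range chosen here is an assumption for illustration only.

```python
from itertools import product

def distinct_perceptrons(n_inputs, weight_range=range(-2, 3)):
    """Enumerate the distinct Boolean functions realizable by a
    discrete perceptron (integer weights and bias) on binary inputs.

    This is an illustrative sketch: the weight/bias grid is an
    assumption, not the enumeration procedure from the paper.
    """
    inputs = list(product([0, 1], repeat=n_inputs))
    seen = set()
    # Sweep every integer weight/bias combination in the grid; each
    # setting yields one truth table, and duplicates collapse in the set.
    for wb in product(weight_range, repeat=n_inputs + 1):
        *w, b = wb
        table = tuple(
            int(sum(wi * xi for wi, xi in zip(w, x)) + b > 0)
            for x in inputs
        )
        seen.add(table)
    return seen

# For two binary inputs, 14 of the 16 Boolean functions are linearly
# separable (XOR and XNOR are not), so the enumeration finds 14 models.
tables = distinct_perceptrons(2)
```

    Because the candidate set is finite, every realizable model is ranked against every other, which is what makes the complete empirical distribution (and hence the global optimum) available.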
  • Keywords
    learning (artificial intelligence); neural nets; NP-complete problem; US Treasury Bond market; discrete-valued problem; graded learning; inference; investment rules; model selection; optimal discrete perceptrons; prediction; Artificial neural networks; Bonding; Investments; Logistics; Machine intelligence; Machine learning; Neural networks; Nonlinear equations; Predictive models; Systems engineering and theory;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1992 IEEE International Conference on Systems, Man and Cybernetics
  • Conference_Location
    Chicago, IL
  • Print_ISBN
    0-7803-0720-8
  • Type
    conf
  • DOI
    10.1109/ICSMC.1992.271746
  • Filename
    271746