• DocumentCode
    1809776
  • Title

    Discretization methods for encoding of continuous input variables for Boolean neural networks

  • Author

    Linneberg, Christian ; Jørgensen, Thomas Martini

  • Author_Institution
    Intellix A/S, Frederiksberg, Denmark
  • Volume
    2
  • fYear
    1999
  • fDate
    36342
  • Firstpage
    1219
  • Abstract
    RAM-based neural networks are normally based on binary input variables, and a thermometer code or a so-called CMAC-Gray code is most often used when encoding real-valued variables. The number of intervals and the interval boundaries are normally set from ad hoc principles. With this approach, many intervals are typically needed to provide sufficient resolution. This leads to large variable codes, which in turn complicates the learning problem. Instead of selecting more or less arbitrary interval boundaries, it can be expected to be beneficial to use discretization techniques in which the split values are selected using information measures. We report on the results that can be obtained by applying local and global discretization techniques together with enhanced schemes of the so-called n-tuple classifier, which is the simplest type of RAM neural net. The enhanced n-tuple nets have proven competitive on a large set of benchmark data sets. By making proper use of the discretization boundaries, improved performance can be obtained. The local discretization algorithms are closely connected with the learning principle used for decision trees, and we show how such schemes can be used as variable selectors for the RAM-based neural nets.
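    The abstract's central idea, choosing thermometer-code split values by an information measure rather than ad hoc, can be illustrated with a minimal sketch. The function names below are hypothetical, and this is a generic entropy-based single-split selection (as used for decision trees), not the paper's exact algorithm:

    ```python
    import math

    def entropy(labels):
        """Shannon entropy (bits) of a list of class labels."""
        n = len(labels)
        counts = {}
        for y in labels:
            counts[y] = counts.get(y, 0) + 1
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def best_split(values, labels):
        """Pick the threshold between adjacent sorted values that
        maximizes information gain (decision-tree style)."""
        pairs = sorted(zip(values, labels))
        base = entropy(labels)
        best_t, best_gain = None, -1.0
        for i in range(1, len(pairs)):
            if pairs[i - 1][0] == pairs[i][0]:
                continue  # no valid threshold between equal values
            t = (pairs[i - 1][0] + pairs[i][0]) / 2
            left = [y for v, y in pairs if v <= t]
            right = [y for v, y in pairs if v > t]
            gain = (base
                    - len(left) / len(pairs) * entropy(left)
                    - len(right) / len(pairs) * entropy(right))
            if gain > best_gain:
                best_t, best_gain = t, gain
        return best_t

    def thermometer_encode(x, boundaries):
        """Binary thermometer code: one bit per interval boundary."""
        return [1 if x > b else 0 for b in sorted(boundaries)]
    ```

    For example, `best_split([0.1, 0.2, 0.9, 1.1], [0, 0, 1, 1])` selects the midpoint 0.55 that cleanly separates the classes, and `thermometer_encode(0.7, [0.55])` then yields the one-bit code `[1]` for the RAM net's input.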
  • Keywords
    decision trees; encoding; learning (artificial intelligence); neural nets; Boolean neural networks; RAM-based neural networks; continuous input variables; discretization boundaries; enhanced n-tuple nets; global discretization techniques; information measures; local discretization techniques; n-tuple classifier; variable selectors; Binary codes; Classification tree analysis; Decision trees; Encoding; Frequency; Input variables; Laboratories; Neural networks; Ribs; Sampling methods;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Neural Networks, 1999. IJCNN '99. International Joint Conference on
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-5529-6
  • Type
    conf

  • DOI
    10.1109/IJCNN.1999.831134
  • Filename
    831134