• DocumentCode
    315238
  • Title

    D-entropy minimization: integration of mutual information maximization and minimization

  • Author

    Kamimura, Ryotaro

  • Author_Institution
    Inf. Sci. Lab., Tokai Univ., Kanagawa, Japan
  • Volume
    2
  • fYear
    1997
  • fDate
    9-12 Jun 1997
  • Firstpage
    1056
  • Abstract
    In this paper, we propose a D-entropy minimization method that aims to unify information maximization and minimization methods. The D-entropy is defined as the difference between Rényi's entropy and Shannon's entropy. D-entropy minimization corresponds to both mutual information maximization and minimization; thus, the method can be used to interpret internal representations explicitly and to improve generalization. D-entropy minimization was applied to two problems: the recognition of six alphabetic characters and the inference of the well-formedness of artificial strings. Experimental results confirmed that by minimizing the D-entropy, a small number of principal hidden units can be detected and generalization performance can be improved.
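    The abstract defines the D-entropy as the difference between Rényi's entropy and Shannon's entropy. A minimal sketch of that quantity for a discrete distribution is below; the Rényi order `alpha` is a free parameter here (the paper's exact choice of order and sign convention is not given in this record, so this is an illustration, not the author's implementation).

    ```python
    import numpy as np

    def renyi_entropy(p, alpha):
        # Rényi entropy of order alpha (alpha != 1): H_a(p) = log(sum_i p_i^alpha) / (1 - alpha)
        p = np.asarray(p, dtype=float)
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    def shannon_entropy(p):
        # Shannon entropy: H(p) = -sum_i p_i log p_i (zero-probability terms dropped)
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def d_entropy(p, alpha=2.0):
        # D-entropy per the abstract's definition: Rényi entropy minus Shannon entropy.
        # alpha=2.0 is an assumed default for illustration only.
        return renyi_entropy(p, alpha) - shannon_entropy(p)
    ```

    For a uniform distribution both entropies equal log n, so the D-entropy vanishes; it departs from zero as the distribution becomes non-uniform, which is what makes it usable as a minimization target over hidden-unit activation patterns.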
  • Keywords
    character recognition; generalisation (artificial intelligence); information theory; learning (artificial intelligence); minimisation; minimum entropy methods; neural nets; probability; D-entropy minimization; Rényi entropy; Shannon entropy; generalization; information maximization; information minimization; neural learning; Artificial neural networks; Character recognition; Entropy; Information science; Laboratories; Minimization methods; Mutual information; Neural networks; Noise reduction; Uncertainty
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Conference on Neural Networks, 1997
  • Conference_Location
    Houston, TX
  • Print_ISBN
    0-7803-4122-8
  • Type
    conf
  • DOI
    10.1109/ICNN.1997.616174
  • Filename
    616174