Title :
General statistical inference by an approximate application of the maximum entropy principle
Author :
Yan, Lian ; Miller, David J.
Author_Institution :
Dept. of Electr. Eng., Pennsylvania State Univ., University Park, PA, USA
Abstract :
We propose a learning method for building a general statistical inference engine, operating on discrete feature spaces. Such a model allows inference on any feature given values for the other features (or for a feature subset). Bayesian networks (BNs) are versatile tools that possess this inference capability. However, while the BN's explicit representation of conditional independencies is informative, this structure is not so easily learned. Typically, learning methods for BNs use (suboptimal) greedy search techniques. There is also a difficult issue of overfitting in these models. Alternatively, Cheeseman (1983) proposed finding the maximum entropy (ME) joint probability mass function (pmf) consistent with arbitrary lower order probability constraints. This approach has some potential advantages over BNs. However, the huge complexity required for learning the joint pmf has severely limited the use of this approach until now. Here we propose an approximate ME method which also allows incorporation of arbitrary lower order constraints, while retaining quite tractable learning complexity. The new method approximates the joint feature pmf (during learning) on a subgrid of the full feature space grid. Experimental results on the UC-Irvine repository reveal significant performance gains over two BN approaches: Chow and Liu's (1968) dependence trees and Herskovits and Cooper's (1991) Kutato. Several extensions of our approach are indicated.
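The core idea referenced from Cheeseman (1983) — the maximum-entropy joint pmf consistent with lower-order marginal constraints — can be illustrated with iterative proportional fitting (IPF), a standard algorithm for this problem. The sketch below is not the authors' approximate method (which operates on a subgrid of the feature space); it is a minimal toy example on three binary features, with made-up marginals, showing what the exact ME solution under pairwise constraints looks like:

```python
import numpy as np

# Hedged sketch: iterative proportional fitting (IPF) computes the
# maximum-entropy joint pmf over three binary features consistent with
# given pairwise marginals. The marginals here are illustrative toy
# data, not taken from the paper.

rng = np.random.default_rng(0)

# A reference joint pmf on a 2x2x2 grid, used only to generate a
# mutually consistent set of lower-order constraints.
ref = rng.random((2, 2, 2))
ref /= ref.sum()

# Lower-order constraints: the three pairwise marginals of `ref`.
m01 = ref.sum(axis=2)   # P(x0, x1)
m02 = ref.sum(axis=1)   # P(x0, x2)
m12 = ref.sum(axis=0)   # P(x1, x2)

# Start from the uniform pmf (the unconstrained ME solution) and
# rescale toward each marginal constraint in turn.
p = np.full((2, 2, 2), 1 / 8)
for _ in range(200):
    p *= (m01 / p.sum(axis=2))[:, :, None]
    p *= (m02 / p.sum(axis=1))[:, None, :]
    p *= (m12 / p.sum(axis=0))[None, :, :]

# At convergence, p satisfies all three pairwise constraints, and any
# conditional pmf (inference on one feature given the others) follows
# by slicing and renormalizing p.
assert np.allclose(p.sum(axis=2), m01, atol=1e-6)
```

Note that the fixed point `p` generally differs from `ref`: among all joint pmfs matching the constraints, IPF selects the one of maximum entropy. The complexity barrier the abstract mentions is visible here — the table `p` grows exponentially with the number of features, which is what the paper's subgrid approximation addresses.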
Keywords :
learning (artificial intelligence); maximum entropy methods; probability; UC-Irvine repository; conditional independencies; discrete feature spaces; general statistical inference; joint probability mass function; learning complexity; learning method; maximum entropy principle; statistical inference engine; Bayesian methods; Combinatorial mathematics; Computer applications; Diseases; Engineering profession; Engines; Entropy; Fault detection; Learning systems; Predictive models;
Conference_Titel :
Neural Networks for Signal Processing IX, 1999. Proceedings of the 1999 IEEE Signal Processing Society Workshop.
Conference_Location :
Madison, WI
Print_ISBN :
0-7803-5673-X
DOI :
10.1109/NNSP.1999.788129