• Title of article

    Three Information-Theoretical Methods to Estimate a Random Variable

  • Author/Authors

    Lind, Niels C.

  • Issue Information
    Journal with serial number, year 1997
  • Pages
    9
  • From page
    43
  • To page
    51
  • Abstract
    Three information-theoretical methods to estimate a continuous univariate distribution are proposed for estimation when the distribution type is uncertain, when data are scarce, or when extremes are important. The first is a new version of Jaynes' MaxEnt method. The second, minimizing Shannon's information measure, yields a minimally informative estimate. The third analysis produces a pair of distributions that together minimize the relative entropy (cross-entropy), satisfying the principle of least information: one distribution (p) provides minimal information among all chosen candidates about these events, while the other (q) minimizes the information among all distributions that satisfy the sample rule, an essential constraint. Model p is also identical to the "maximum product of spacings" estimate. Distribution q is determined from p by simple algebra. The principle of least information yields a unique solution (p, q) when other methods fail. The algorithm is the same for all types of distributions; the estimation process introduces a minimum of information, approaching objectivity, and the solution is invariant under monotonic variable transformations. All three methods are computationally simple but involve optimization. There are also several approximate information-theoretical methods that retain some of the advantages cited and are computationally simpler.
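  • Illustrative sketch
    The abstract states that model p coincides with the "maximum product of spacings" (MPS) estimate. The short Python sketch below illustrates the MPS idea only; it assumes a normal candidate family and uses SciPy's Nelder-Mead optimizer, so the family, starting values, and function names are illustrative assumptions rather than the article's own algorithm, which is stated to be the same for all distribution types.

    # Minimal MPS sketch (illustrative; the normal candidate family is an
    # assumption, not the article's distribution-free algorithm).
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def mps_objective(params, data):
        """Negative mean log-spacing for a normal candidate CDF."""
        mu, log_sigma = params
        sigma = np.exp(log_sigma)                 # keep sigma positive
        x = np.sort(data)
        # CDF values padded with 0 and 1 so the spacings span the whole line
        u = np.concatenate(([0.0], norm.cdf(x, loc=mu, scale=sigma), [1.0]))
        spacings = np.clip(np.diff(u), 1e-300, None)   # guard against log(0)
        return -np.mean(np.log(spacings))

    def mps_fit(data):
        """Maximize the product of spacings over (mu, sigma)."""
        start = np.array([np.mean(data), np.log(np.std(data))])
        res = minimize(mps_objective, start, args=(data,), method="Nelder-Mead")
        return res.x[0], np.exp(res.x[1])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        sample = rng.normal(loc=5.0, scale=2.0, size=25)   # scarce-data setting
        print(mps_fit(sample))                             # estimated (mu, sigma)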
  • Keywords
    Information, Distribution, Entropy, Sample, Estimation
  • Journal title
    Journal of Environmental Management
  • Serial Year
    1997
  • Record number

    1568408