• DocumentCode
    739335
  • Title
    Justification of Logarithmic Loss via the Benefit of Side Information
  • Author
    Jiao, Jiantao; Courtade, Thomas A.; Venkat, Kartik; Weissman, Tsachy

  • Author_Institution
    Department of Electrical Engineering, Stanford University, Stanford, CA, USA
  • Volume
    61
  • Issue
    10
  • fYear
    2015
  • Firstpage
    5357
  • Lastpage
    5365
  • Abstract
    We consider a natural measure of relevance: the reduction in optimal prediction risk in the presence of side information. For any given loss function, this relevance measure captures the benefit of side information for performing inference on a random variable under this loss function. When such a measure satisfies a natural data processing property, and the random variable of interest has alphabet size greater than two, we show that it is uniquely characterized by the mutual information, and the corresponding loss function coincides with logarithmic loss. In doing so, our work provides a new characterization of mutual information, and justifies its use as a measure of relevance. When the alphabet is binary, we characterize the only admissible forms the measure of relevance can assume while obeying the specified data processing property. Our results naturally extend to measuring the causal influence between stochastic processes, where we unify different causality measures in the literature as instantiations of directed information.
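    The abstract's central claim can be checked numerically: under logarithmic loss, the reduction in optimal prediction risk afforded by side information equals the mutual information. The sketch below uses a made-up joint distribution (not from the paper) and verifies that H(X) − H(X|Y), the risk reduction, coincides with I(X;Y).

    ```python
    import numpy as np

    # Hypothetical joint distribution p(x, y): |X| = 3, |Y| = 2 (illustrative only)
    p_xy = np.array([[0.20, 0.10],
                     [0.05, 0.25],
                     [0.30, 0.10]])

    p_x = p_xy.sum(axis=1)  # marginal of X
    p_y = p_xy.sum(axis=0)  # marginal of Y

    def entropy(p):
        """Shannon entropy in bits, skipping zero-probability atoms."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Optimal prediction risk under log loss:
    #   without side information, the best predictor is p(x), with risk H(X);
    #   with side information Y, the risk is H(X|Y) = sum_y p(y) H(X | Y=y).
    h_x = entropy(p_x)
    h_x_given_y = sum(p_y[j] * entropy(p_xy[:, j] / p_y[j])
                      for j in range(len(p_y)))

    # The benefit of side information is the risk reduction H(X) - H(X|Y),
    # which should equal the mutual information I(X; Y).
    reduction = h_x - h_x_given_y
    mi = sum(p_xy[i, j] * np.log2(p_xy[i, j] / (p_x[i] * p_y[j]))
             for i in range(p_xy.shape[0]) for j in range(p_xy.shape[1])
             if p_xy[i, j] > 0)

    print(f"H(X) - H(X|Y) = {reduction:.6f} bits")
    print(f"I(X;Y)        = {mi:.6f} bits")
    ```

    The two printed quantities agree, illustrating the identity the paper axiomatizes: among data-processing-respecting relevance measures (for alphabet size greater than two), this is the unique one, and it is induced by logarithmic loss.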
  • Keywords
    Convex functions; Data processing; Entropy; Loss measurement; Mutual information; Random variables; Axiomatic characterizations; Causality measures; Directed information; Logarithmic loss
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2015.2462848
  • Filename
    7173043