  • DocumentCode
    3127166
  • Title
    Lossy common information of two dependent random variables
  • Author
    Viswanatha, Kumar ; Akyol, Emrah ; Rose, Kenneth
  • Author_Institution
    ECE Dept., Univ. of California, Santa Barbara, Santa Barbara, CA, USA
  • fYear
    2012
  • fDate
    1-6 July 2012
  • Firstpage
    528
  • Lastpage
    532
  • Abstract
    The two most prevalent notions of common information are due to Wyner and Gács-Körner, and both can be stated as two different characteristic points in the lossless Gray-Wyner region. Although these quantities can be easily evaluated for random variables with infinite entropy (e.g., continuous random variables), the operational significance underlying their definitions applies only to the lossless framework. The primary objective of this paper is to generalize these two notions of common information to the lossy Gray-Wyner network, which extends the theoretical intuition underlying their definitions to general sources and distortion measures. We begin with the lossy generalization of Wyner's common information, defined as the minimum rate on the shared branch of the Gray-Wyner network at minimum sum rate when the two decoders reconstruct the sources subject to individual distortion constraints. We derive a complete single-letter information-theoretic characterization of this quantity and use it to compute the common information of symmetric bivariate Gaussian random variables. We then derive similar results to generalize Gács-Körner's definition to the lossy framework. These two characterizations allow us to carry the practical insight underlying the two notions of common information over to general sources and distortion measures.
  • Keywords
    Gaussian processes; entropy; random processes; characteristic points; distortion constraints; distortion measures; infinite entropy; lossless Gray-Wyner region; lossy Gray-Wyner network; lossy common information; random variables; symmetric bivariate Gaussian random variables; two dependent random variables; Distortion measurement; Entropy; Joints; Markov processes; Random variables; Rate-distortion; Gács and Körner's common information; Lossy Gray-Wyner network; Wyner's common information
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    2012 IEEE International Symposium on Information Theory Proceedings (ISIT)
  • Conference_Location
    Cambridge, MA
  • ISSN
    2157-8095
  • Print_ISBN
    978-1-4673-2580-6
  • Electronic_ISBN
    2157-8095
  • Type
    conf
  • DOI
    10.1109/ISIT.2012.6284246
  • Filename
    6284246
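
As background for the Gaussian example cited in the abstract, here is a minimal worked sketch. It uses the standard definition of Wyner's common information and the closed form known from the literature for a symmetric bivariate Gaussian pair with correlation coefficient ρ; these expressions are background assumptions rather than quotations from this record, and the paper's lossy characterization coincides with this lossless value only under suitable distortion constraints.

    C(X;Y) = \min_{P_{W \mid X,Y} :\; X - W - Y} I(X,Y;W)

    C(X;Y) = \tfrac{1}{2} \log \frac{1+\rho}{1-\rho} \quad \text{(symmetric bivariate Gaussian with correlation } \rho\text{)}

For example, with \rho = 0.5 and base-2 logarithms, C(X;Y) = \tfrac{1}{2}\log_2 3 \approx 0.79 bits.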