  • DocumentCode
    1434828
  • Title
    Complex-Valued Random Vectors and Channels: Entropy, Divergence, and Capacity
  • Author
    Tauböck, Georg
  • Author_Institution
    Inst. of Telecommun., Vienna Univ. of Technol., Vienna, Austria
  • Volume
    58
  • Issue
    5
  • fYear
    2012
  • fDate
    May 1, 2012
  • Firstpage
    2729
  • Lastpage
    2744
  • Abstract
    Recent research has demonstrated significant achievable performance gains by exploiting the circularity/noncircularity or properness/improperness of complex-valued signals. In this paper, we investigate the influence of these properties on important information-theoretic quantities such as entropy, divergence, and capacity. We prove two maximum entropy theorems that strengthen previously known results (a brief numerical sketch of the underlying entropy-maximizing role of properness follows this record). The proof of the first maximum entropy theorem is based on the so-called circular analog of a given complex-valued random vector. The introduction of the circular analog is additionally supported by a characterization theorem that employs a minimum Kullback–Leibler divergence criterion. In the proof of the second maximum entropy theorem, results about the second-order structure of complex-valued random vectors are exploited. Furthermore, we address the capacity of multiple-input multiple-output (MIMO) channels. Regardless of the specific distribution of the channel parameters (noise vector and channel matrix, if modeled as random), we show that the capacity-achieving input vector is circular for a broad range of MIMO channels (including coherent and noncoherent scenarios). Finally, we investigate the case of an improper, Gaussian-distributed noise vector. We compute both the capacity and the capacity-achieving input vector and show that improperness increases capacity, provided that the complementary covariance matrix is exploited. Otherwise, a capacity loss occurs, for which we derive an explicit expression.
  • Keywords
    Gaussian noise; MIMO communication; covariance matrices; entropy; Gaussian distributed noise vector; Kullback–Leibler divergence criterion; MIMO channels; capacity-achieving input vector; circular analog; circularity/noncircularity; complementary covariance matrix; complex-valued random vectors; maximum entropy theorem; multiple-input multiple-output channels; information-theoretic quantities; Covariance matrix; Entropy; MIMO; Matrix decomposition; Noise; Symmetric matrices; Vectors; Capacity; Kullback–Leibler divergence; circular; circular analog; differential entropy; improper; multiple-input multiple-output (MIMO); mutual information; noncircular; proper
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2012.2184638
  • Filename
    6142094
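
The maximum entropy results summarized in the abstract build on a standard background fact: for a fixed covariance matrix, a proper (circularly symmetric) complex Gaussian vector attains the largest differential entropy, and any nonzero complementary covariance reduces it. The sketch below illustrates this numerically using the well-known augmented-covariance entropy formula h = n log(πe) + (1/2) log det [[C, C̃], [C̃*, C*]]. It is only an illustration of that background result under these assumptions, not a reproduction of the paper's proofs; the function name entropy_complex_gaussian is ours.

```python
import numpy as np

def entropy_complex_gaussian(C, C_tilde=None):
    """Differential entropy (nats) of an n-dimensional complex Gaussian vector z
    with covariance C = E[z z^H] and complementary covariance C_tilde = E[z z^T],
    computed from the augmented covariance [[C, C_tilde], [conj(C_tilde), conj(C)]].
    For C_tilde = 0 (a proper, i.e. circularly symmetric, vector) this reduces
    to the familiar log det(pi * e * C)."""
    n = C.shape[0]
    if C_tilde is None:
        C_tilde = np.zeros_like(C)
    R = np.block([[C, C_tilde],
                  [np.conj(C_tilde), np.conj(C)]])
    _, logdet = np.linalg.slogdet(R)   # R is Hermitian PSD, so its determinant is real
    return n * np.log(np.pi * np.e) + 0.5 * logdet

# Scalar example: unit variance, pseudo-variance rho as the degree of improperness.
C = np.array([[1.0 + 0j]])
for rho in (0.0, 0.5, 0.9):
    h = entropy_complex_gaussian(C, np.array([[rho + 0j]]))
    print(f"rho = {rho}: h = {h:.4f} nats")
```

Running the sketch shows the entropy falling from log(πe) ≈ 2.14 nats at ρ = 0 (proper) to about 1.31 nats at ρ = 0.9, consistent with the entropy-maximizing role of properness that the paper's theorems strengthen and generalize.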