DocumentCode :
2029689
Title :
Derivatives of Mutual Information in Gaussian Vector Channels with Applications
Author :
Feiten, A. ; Hanly, S. ; Mathar, R.
Author_Institution :
Inst. for Theor. Inf. Technol., RWTH Aachen Univ., Aachen
fYear :
2007
fDate :
24-29 June 2007
Firstpage :
2296
Lastpage :
2300
Abstract :
In this paper, derivatives of mutual information for a general linear Gaussian vector channel are considered. We consider two applications. First, it is shown how the corresponding gradient relates to the minimum mean squared error (MMSE) estimator and its error matrix. Second, we determine the directional derivative of mutual information and use this geometrically intuitive concept to characterize the capacity-achieving input distribution of the above channel subject to certain power constraints. The well-known water-filling solution is revisited and obtained as a special case. Explicit solutions are also derived for shaping constraints on the maximum and on the Euclidean norm of the mean powers. Moreover, uncorrelated sum power constraints are considered; in this case the optimum input can always be achieved by linear precoding.
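The two applications summarized above lend themselves to a quick numerical illustration. The following NumPy sketch is not taken from the paper; the channel H, the covariance Q, the unit-variance noise normalization, and the names mutual_info and waterfill are illustrative assumptions for a real-valued channel. It checks one standard gradient/MMSE-type identity by finite differences and computes the classical water-filling allocation that the paper recovers as a special case.

import numpy as np

rng = np.random.default_rng(0)
n = 4
H = rng.standard_normal((n, n))                 # illustrative channel matrix

# --- Application 1 (sketch): gradient of mutual information vs. MMSE matrix ---
# For y = Hx + n with x ~ N(0, Q) and unit-variance noise, the mutual
# information in nats is I(Q) = 0.5 * logdet(I + H Q H^T).
def mutual_info(Q, H):
    return 0.5 * np.linalg.slogdet(np.eye(H.shape[0]) + H @ Q @ H.T)[1]

A = rng.standard_normal((n, n))
Q = A @ A.T + np.eye(n)                         # a full-rank input covariance
grad = 0.5 * H.T @ np.linalg.inv(np.eye(n) + H @ Q @ H.T) @ H
E = np.linalg.inv(np.linalg.inv(Q) + H.T @ H)   # MMSE error covariance of x given y
Qi = np.linalg.inv(Q)
# One way (via the matrix inversion lemma) to relate the gradient to the
# MMSE error matrix for this normalization:
assert np.allclose(grad, 0.5 * (Qi - Qi @ E @ Qi))
# Finite-difference check that grad really is the gradient of I(Q):
D = rng.standard_normal((n, n)); D = (D + D.T) / 2
eps = 1e-6
num = (mutual_info(Q + eps * D, H) - mutual_info(Q - eps * D, H)) / (2 * eps)
assert np.isclose(num, np.trace(grad @ D), rtol=1e-4)

# --- Application 2 (sketch): water-filling under a sum power constraint ---
def waterfill(gains, P):
    """Allocate total power P over parallel sub-channels with gains g_i,
    p_i = max(mu - 1/g_i, 0), where mu is the water level."""
    g = np.asarray(gains, dtype=float)
    gs = np.sort(g)[::-1]                       # strongest modes first
    for k in range(len(gs), 0, -1):
        mu = (P + np.sum(1.0 / gs[:k])) / k     # water level with k active modes
        if mu >= 1.0 / gs[k - 1]:               # weakest active mode gets power >= 0
            break
    return np.maximum(mu - 1.0 / g, 0.0), mu

gains, V = np.linalg.eigh(H.T @ H)              # sub-channel gains and eigen-directions
p, mu = waterfill(gains, P=4.0)
Q_opt = V @ np.diag(p) @ V.T                    # capacity-achieving input covariance
print("capacity (nats):", 0.5 * np.sum(np.log1p(p * gains)))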
Keywords :
Gaussian channels; least mean squares methods; matrix algebra; error matrix; linear Gaussian vector channels; minimum mean squared error estimator; mutual information derivative; Australia; Computer errors; Covariance matrix; Eigenvalues and eigenfunctions; Gaussian noise; Information technology; MIMO; Mutual information; Transmitters; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Information Theory, 2007. ISIT 2007. IEEE International Symposium on
Conference_Location :
Nice
Print_ISBN :
978-1-4244-1397-3
Type :
conf
DOI :
10.1109/ISIT.2007.4557562
Filename :
4557562