DocumentCode :
905712
Title :
Information rates of non-Gaussian processes
Author :
Gerrish, A.M. ; Schultheiss, P.M.
Volume :
10
Issue :
4
fYear :
1964
fDate :
10/1/1964
Firstpage :
265
Lastpage :
271
Abstract :
The rate distortion function R(D) of an information source was introduced by Shannon to specify the channel capacity required in transmitting information from the source with an average distortion not exceeding D. Exact rates have been calculated for Gaussian sources under a mean-square error criterion. For non-Gaussian continuous sources, Shannon has given upper and lower bounds on R(D). In specific cases, the difference between these two bounds may not be sufficiently small to provide a useful estimate of R(D). The present paper is concerned with improving estimates of information rates of non-Gaussian sources under a mean-square error criterion. The sources considered are ergodic, and their statistical properties are characterized by a bounded and continuous n-dimensional probability density function. The paper gives a set of necessary and sufficient conditions for R(D) to equal Shannon's lower bound. For sources satisfying these conditions, exact rate calculations are possible. For sources that do not satisfy the required conditions, an improved upper bound is obtained that never exceeds Shannon's upper bound. Under rather general conditions, the new upper bound approaches Shannon's lower bound for small values of distortion, so that the true value of R(D) can be estimated very accurately for small D.
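Illustrative note (not part of the original record): the two Shannon bounds referred to in the abstract can be evaluated numerically for a concrete non-Gaussian source. The sketch below assumes a unit-variance memoryless Laplacian source under mean-square error; the function name and chosen distortion values are hypothetical, and the bounds used are the classical Shannon lower bound R_L(D) = h(X) - (1/2)ln(2*pi*e*D) and the same-variance Gaussian rate as an upper bound.

    import math

    def shannon_bounds_laplace(D, var=1.0):
        """Shannon's lower and upper bounds (in nats) on R(D) for a
        memoryless Laplacian source under mean-square error distortion.

        Lower bound: R_L(D) = h(X) - 0.5*ln(2*pi*e*D)
        Upper bound: rate of a Gaussian source of equal variance,
                     R_U(D) = max(0, 0.5*ln(var/D)).
        """
        b = math.sqrt(var / 2.0)         # Laplace scale parameter (var = 2*b^2)
        h = 1.0 + math.log(2.0 * b)      # differential entropy of Laplace(b), in nats
        lower = max(0.0, h - 0.5 * math.log(2.0 * math.pi * math.e * D))
        upper = max(0.0, 0.5 * math.log(var / D))
        return lower, upper

    if __name__ == "__main__":
        # Hypothetical distortion values; the gap between the bounds is the
        # uncertainty in R(D) that the paper's improved upper bound addresses.
        for D in (0.5, 0.1, 0.01):
            lo, hi = shannon_bounds_laplace(D)
            print(f"D={D:5.2f}  R_L={lo:.3f} nats  R_U={hi:.3f} nats  gap={hi - lo:.3f}")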
Keywords :
Rate-distortion theory; Channel capacity; Distortion; Entropy; Information rates; Probability density function; Rate-distortion; Signal generators; Signal sampling; Sufficient conditions; Upper bound;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.1964.1053705
Filename :
1053705