• DocumentCode
    908191
  • Title
    A per letter converse to the channel coding theorem
  • Author
    Reiffen, Barney
  • Volume
    12
  • Issue
    4
  • fYear
    1966
  • fDate
    10/1/1966
  • Firstpage
    475
  • Lastpage
    480
  • Abstract
    By relating the average probability of error to the distortion measure of a source-sink pair, we prove a converse to the channel coding theorem. This converse lower-bounds the probability of error per source letter. It differs from the more familiar "weak" and "strong" converses, which bound the probability of error of an entire message. The result is applicable to all stationary sources, all channels, and all block lengths. Lower-bounding the rate-distortion function of the source-sink pair with which the channel is to be used reduces the new result to a lower bound on the achievable probability of error per source letter, expressed in terms of the source entropy, alphabet size, and maximum achievable average mutual information on the channel. This latter result had previously been proved only for a memoryless channel operating with an independent-letter source.
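  • Note
    The bound described in the last two sentences of the abstract has the familiar Fano-type form.
    As an illustrative sketch only (not quoted from the paper), for a stationary source with
    per-letter entropy $H$, an alphabet of size $A$, and a channel whose maximum achievable
    average mutual information per source letter is $C$, a per-letter error probability $P_e$
    would have to satisfy
    \[
      h(P_e) + P_e \log(A - 1) \;\ge\; H - C,
      \qquad h(p) = -p \log p - (1 - p)\log(1 - p),
    \]
    where $h(\cdot)$ is the binary entropy function; since $h(P_e) \le \log 2$, this keeps $P_e$
    bounded away from zero whenever $H > C$.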
  • Keywords
    Block codes; Channel capacity; Channel coding; Decoding; Distortion measurement; Entropy; Information theory; Memoryless systems; Mutual information; Rate-distortion; Stochastic processes;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.1966.1053928
  • Filename
    1053928