  • DocumentCode
    912429
  • Title
    Another look at the coding theorem of information theory—A tutorial
  • Author
    Wyner, Aaron D.
  • Author_Institution
    Weizmann Institute of Science, Rehovot, Israel
  • Volume
    58
  • Issue
    6
  • fYear
    1970
  • fDate
    6/1/1970 12:00:00 AM
  • Firstpage
    894
  • Lastpage
    913
  • Abstract
    In this tutorial paper we consider the problem of the transmission of data from a general class of information sources over a general class of communication channels. The problem is the determination of the maximum attainable value of a (suitably defined) "reliability." In general the channel imposes limits on the attainable reliability in three ways: 1) by introducing "noise" into the system, 2) because of a "mismatch" between source and channel (for example, an analog data source and digital communication channel), and 3) because of "costs" associated with various channel inputs (for example, signal "power"). We assume that the system designer is allowed to interpose data processors between the source and channel input; and between the channel output and the user (called "encoder" and "decoder," respectively) to combat these limitations. Shannon's coding theorem, which is the subject of this paper, gives an answer to this question of maximum reliability in the special case where no limit is imposed on the complexity of these processors. Since this is a tutorial paper, we emphasize motivating material and discussion at the expense of mathematical details and proofs.
  • Keywords
    Binary sequences; Codes; Communication channels; Communication systems; Decoding; Digital communication; Reliability theory; Signal processing; Telephony; Tutorial;
  • fLanguage
    English
  • Journal_Title
    Proceedings of the IEEE
  • Publisher
    IEEE
  • ISSN
    0018-9219
  • Type
    jour
  • DOI
    10.1109/PROC.1970.7797
  • Filename
    1449727
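
  As a hedged illustration of the abstract's notion of a channel-imposed limit on reliability (this example is not drawn from the paper itself): for a binary symmetric channel with crossover probability p, Shannon's coding theorem gives the capacity C = 1 - H2(p) bits per channel use, the supremum of rates at which arbitrarily reliable transmission is possible. A minimal Python sketch of this standard formula, assuming only the standard library:

    # Minimal sketch (not from the paper): the binary symmetric channel (BSC)
    # illustrates how channel "noise" limits attainable reliability.
    # Shannon's coding theorem: reliable transmission is possible at any rate
    # below the capacity C; for a BSC with crossover probability p, C = 1 - H2(p).
    from math import log2

    def binary_entropy(p: float) -> float:
        """Binary entropy H2(p) in bits; H2(0) = H2(1) = 0 by convention."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1.0 - p) * log2(1.0 - p)

    def bsc_capacity(p: float) -> float:
        """Capacity in bits per channel use of a binary symmetric channel."""
        return 1.0 - binary_entropy(p)

    if __name__ == "__main__":
        for p in (0.0, 0.01, 0.1, 0.5):
            print(f"crossover p = {p:4.2f} -> capacity C = {bsc_capacity(p):.4f} bits/use")

  For example, p = 0.5 gives C = 0, matching the intuition that a channel whose output is independent of its input supports no reliable communication, no matter how complex the encoder and decoder are allowed to be.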