• DocumentCode
    1161425
  • Title
    Sequential neural text compression
  • Author
    Schmidhuber, Jürgen; Heil, Stefan
  • Author_Institution
    IDSIA, Lugano, Switzerland
  • Volume
    7
  • Issue
    1
  • fYear
    1996
  • fDate
    1/1/1996
  • Firstpage
    142
  • Lastpage
    146
  • Abstract
    The purpose of this paper is to show that neural networks may be promising tools for data compression without loss of information. We combine predictive neural nets and statistical coding techniques to compress text files. We apply our methods to certain short newspaper articles and obtain compression ratios exceeding those of the widely used Lempel-Ziv algorithms (which form the basis of the UNIX utilities “compress” and “gzip”). The main disadvantage of our methods is that they are about three orders of magnitude slower than standard methods.
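    The abstract only names the ingredients of the scheme (a predictive model driving a statistical coder), so the following is a minimal, hypothetical Python sketch of that general idea, not the authors' implementation: a toy order-2 character-frequency model stands in for the predictive neural network, and the reported coding cost is the ideal arithmetic-coding length, i.e. the sum of -log2 P(next char | context). The names ContextModel and ideal_code_length are invented for this sketch.

        # Sketch only: illustrates "predictive model + statistical coding",
        # not the network or coder used in the paper.
        import math
        from collections import defaultdict

        class ContextModel:
            """Toy order-2 frequency model; a stand-in for the predictive net."""
            def __init__(self):
                self.counts = defaultdict(lambda: defaultdict(int))

            def prob(self, context, char):
                # Laplace smoothing keeps every byte's probability nonzero,
                # as any model feeding an arithmetic coder must guarantee.
                table = self.counts[context]
                total = sum(table.values())
                return (table[char] + 1) / (total + 256)

            def update(self, context, char):
                self.counts[context][char] += 1

        def ideal_code_length(text, model):
            """Bits an ideal arithmetic coder would emit, coding text sequentially."""
            bits = 0.0
            for i, ch in enumerate(text):
                context = text[max(0, i - 2):i]
                bits += -math.log2(model.prob(context, ch))
                model.update(context, ch)  # adapt after coding, so a decoder can mirror it
            return bits

        if __name__ == "__main__":
            sample = "the quick brown fox jumps over the lazy dog " * 20
            bits = ideal_code_length(sample, ContextModel())
            print(f"{len(sample)} chars -> about {bits / 8:.0f} bytes "
                  f"({bits / len(sample):.2f} bits/char)")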
  • Keywords
    backpropagation; data compression; document handling; encoding; feedforward neural nets; file organisation; linear predictive coding; probability; feedforward neural networks; predictive neural networks; probability distribution; sequential text compression; statistical coding; Arithmetic; Character generation; Compression algorithms; Decoding; History; Huffman coding; Neural networks; Probability distribution; Table lookup
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/72.478398
  • Filename
    478398