• DocumentCode
    47231
  • Title
    A Case for a Value-Aware Cache
  • Author
    Arelakis, Angelos; Stenström, Per
  • Author_Institution
    Comput. Sci. & Eng., Chalmers Univ. of Technol., Gothenburg, Sweden
  • Volume
    13
  • Issue
    1
  • fYear
    2014
  • fDate
    Jan.-June 2014
  • Firstpage
    1
  • Lastpage
    4
  • Abstract
    Replication of values causes poor utilization of on-chip cache memory resources. This paper addresses the question: How many cache resources can be theoretically and practically saved if value replication is eliminated? We introduce the concept of value-aware caches and show that a sixteen times smaller value-aware cache can yield the same miss rate as a conventional cache. We then make a case for a value-aware cache design using Huffman-based compression. Since the value set is rather stable across the execution of an application, one can afford to reconstruct the coding tree in software. The decompression latency is kept short by our proposed novel pipelined Huffman decoder that uses canonical codewords. While the (loose) upper-bound compression factor is 5.2×, we show that, by eliminating cache-block alignment restrictions, it is possible to achieve a compression factor of 3.4× for practical designs. (A canonical-Huffman decoding sketch follows this record.)
  • Keywords
    Huffman codes; cache storage; data compression; data handling; tree codes; Huffman-based compression; cache-block alignment restriction elimination; coding tree reconstruction; decompression latency; on-chip cache memory resources; value replication; value-aware cache design; Clocks; Decoding; Engines; Huffman coding; Indexes; System-on-a-chip; B Hardware; B.3 Memory Structures; B.3.2 Design Styles; B.3.2.b Cache memories; E Data; E.4 Coding and Information Theory; E.4.a Data compaction and compression;
  • fLanguage
    English
  • Journal_Title
    Computer Architecture Letters
  • Publisher
    IEEE
  • ISSN
    1556-6056
  • Type
    jour
  • DOI
    10.1109/L-CA.2012.31
  • Filename
    6313585
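
The abstract attributes the short decompression latency to canonical Huffman codewords, which can be decoded with simple per-length range comparisons instead of a bit-by-bit tree walk. The C sketch below is a minimal, illustrative reconstruction of that property under assumed inputs; the toy alphabet, code lengths, and bitstream are made up for the example and this is not the authors' hardware design.

    /*
     * Minimal sketch (assumption-laden, not the paper's decoder):
     * canonical Huffman decoding driven only by per-symbol code lengths,
     * mirroring the idea that the coding tree can be built in software.
     */
    #include <stdio.h>
    #include <stdint.h>

    #define MAX_LEN     8   /* assumed maximum codeword length */
    #define NUM_SYMBOLS 4   /* assumed toy alphabet size       */

    /* Code lengths per symbol, e.g. produced offline from value frequencies. */
    static const int code_len[NUM_SYMBOLS] = { 1, 2, 3, 3 };

    int main(void) {
        int count[MAX_LEN + 1] = {0};       /* number of codes of each length  */
        int first_code[MAX_LEN + 1] = {0};  /* smallest code of each length    */
        int first_sym[MAX_LEN + 1] = {0};   /* index of first symbol per length*/
        int sym_by_code[NUM_SYMBOLS];       /* symbols in canonical order      */

        for (int s = 0; s < NUM_SYMBOLS; s++) count[code_len[s]]++;

        /* Canonical numbering: codes of length L continue where length L-1
         * codes ended, shifted left by one bit. */
        int code = 0, idx = 0;
        for (int l = 1; l <= MAX_LEN; l++) {
            first_code[l] = code;
            first_sym[l]  = idx;
            for (int s = 0; s < NUM_SYMBOLS; s++)   /* stable order by symbol */
                if (code_len[s] == l) sym_by_code[idx++] = s;
            code = (code + count[l]) << 1;
        }

        /* Toy bitstream: with the lengths above the canonical codes are
         * 0, 10, 110, 111, so the bits 0 10 111 decode to symbols 0, 1, 3. */
        const uint8_t bits[] = { 0, 1, 0, 1, 1, 1 };
        int nbits = (int)(sizeof(bits) / sizeof(bits[0]));

        int acc = 0, len = 0;
        for (int i = 0; i < nbits; i++) {
            acc = (acc << 1) | bits[i];
            len++;
            /* A codeword of length `len` is complete once acc falls inside the
             * contiguous range [first_code[len], first_code[len] + count[len]).
             * This per-length range check is what a pipelined decoder can test
             * for all candidate lengths concurrently. */
            if (len <= MAX_LEN && count[len] && acc - first_code[len] < count[len]) {
                printf("symbol %d\n",
                       sym_by_code[first_sym[len] + (acc - first_code[len])]);
                acc = 0;
                len = 0;
            }
        }
        return 0;
    }

Because canonical numbering places all codewords of a given length in one contiguous range, each pipeline stage only needs a subtraction and a comparison per candidate length, which is the property that keeps the decompression latency short in the design the abstract describes.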