A case for a value-aware cache
RapidPost
ISSN: 1556-6056
Angelos Arelakis, Chalmers University of Technology, Gothenburg
Per Stenstrom, Chalmers University of Technology, Gothenburg
Replication of values causes poor utilization of on-chip cache memory resources. This paper addresses the question: how much cache capacity can be saved, theoretically and practically, if value replication is eliminated? We introduce the concept of value-aware caches and show that a value-aware cache sixteen times smaller can yield the same miss rate as a conventional cache. We then make a case for a value-aware cache design using Huffman-based compression. Since the value set is rather stable across the execution of an application, one can afford to reconstruct the coding tree in software. The decompression latency is kept short by our proposed novel pipelined Huffman decoder that uses canonical codewords. While the (loose) upper-bound compression factor is 5.2X, we show that, by eliminating cache-block alignment restrictions, it is possible to achieve a compression factor of 3.4X for practical designs.
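The fast decoding the abstract attributes to canonical codewords comes from a well-known property: once codewords of each length are assigned consecutively, a code of length L is recognized with a single comparison against the first code of that length. The sketch below is a minimal software analogue of that idea (it is not the paper's pipelined hardware decoder); the function names and the toy symbol set are illustrative assumptions.

```python
def canonical_codes(lengths):
    """Assign canonical codewords from a {symbol: code_length} map.

    Symbols are sorted by (length, symbol); codewords of each length
    are consecutive integers, which is what enables fast decoding.
    """
    syms = sorted((l, s) for s, l in lengths.items() if l > 0)
    codes, first_code, first_sym, ordered = {}, {}, {}, []
    code, prev_len = 0, syms[0][0]
    for l, s in syms:
        code <<= (l - prev_len)          # extend code when length grows
        if l not in first_code:
            first_code[l] = code          # smallest codeword of length l
            first_sym[l] = len(ordered)   # index of its symbol
        codes[s] = (code, l)
        ordered.append(s)
        code += 1
        prev_len = l
    return codes, first_code, first_sym, ordered

def decode(bits, first_code, first_sym, ordered, counts):
    """Decode a list of bits; one compare per candidate code length."""
    out, value, length = [], 0, 0
    for b in bits:
        value = (value << 1) | b
        length += 1
        fc = first_code.get(length)
        if fc is not None and fc <= value < fc + counts[length]:
            # offset from the first code of this length indexes the symbol
            out.append(ordered[first_sym[length] + (value - fc)])
            value, length = 0, 0
    return out
```

For example, with lengths {'a': 1, 'b': 2, 'c': 2} the canonical codes are a=0, b=10, c=11, and decoding the bit string 01011 recovers a, b, c with one range check per bit position rather than a tree walk.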
Index Terms:
Huffman coding, Engines, Clocks, Indexes, Decoding, System-on-a-chip, E.4.a Data compaction and compression, B Hardware, B.3 Memory Structures, B.3.2 Design Styles, B.3.2.b Cache memories, E Data, E.4 Coding and Information Theory
Citation:
Angelos Arelakis, Per Stenstrom, "A case for a value-aware cache," IEEE Computer Architecture Letters, 14 Aug. 2013. IEEE Computer Society Digital Library. IEEE Computer Society, <http://doi.ieeecomputersociety.org/10.1109/L-CA.2012.31>