2009 IEEE International Symposium on Parallel & Distributed Processing
Efficient shared cache management through sharing-aware replacement and streaming-aware insertion policy
Rome, Italy
May 23-May 29
ISBN: 978-1-4244-3751-1
Yu Chen, Department of Computer Science and Technology, Tsinghua University, Beijing, China
Wenlong Li, Microprocessor Technology Lab, Intel Corp, China
Changkyu Kim, Microprocessor Technology Lab, Intel Corp, China
Zhizhong Tang, Department of Computer Science and Technology, Tsinghua University, Beijing, China
Multi-core processors with shared caches are now commonplace. However, prior work on shared cache management has focused primarily on multi-programmed workloads: such schemes consider how to partition the cache space when simultaneously running applications have different cache behaviors. In this paper, we examine policies for managing a shared cache while running a single multi-threaded application. First, we show that the shared-cache miss rate can be significantly reduced by reserving a certain amount of space for shared data; we therefore modify the replacement policy to dynamically partition each set between shared and private data. Second, we modify the insertion policy to prevent streaming data (data not reused before eviction) from being promoted to the MRU position. Finally, we use a low-overhead sampling mechanism to dynamically select the optimal policy. Compared to the LRU policy, our scheme reduces the miss rate by 8.7% on average for 8MB caches and by 20.1% for 16MB caches.
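The streaming-aware insertion idea from the abstract can be illustrated with a toy single-set simulation. The sketch below is a hypothetical model, not the paper's implementation (which also adds shared/private set partitioning and set sampling): the `CacheSet` class and the access pattern are invented for illustration. New lines enter either at the MRU position (conventional LRU) or at the LRU position, so a line that is never reused is evicted quickly instead of flushing the whole set.

```python
class CacheSet:
    """Toy model of one set in a set-associative cache.

    Tags are kept in recency order: index 0 is LRU, the last index is MRU.
    insert_at_mru=True models conventional LRU insertion; False models a
    streaming-aware policy that inserts new lines at the LRU position.
    (Hypothetical sketch, not the paper's actual hardware mechanism.)
    """

    def __init__(self, ways, insert_at_mru=True):
        self.ways = ways
        self.insert_at_mru = insert_at_mru
        self.tags = []
        self.hits = 0
        self.misses = 0

    def access(self, tag):
        if tag in self.tags:
            self.hits += 1
            self.tags.remove(tag)
            self.tags.append(tag)        # reuse promotes the line to MRU
            return True
        self.misses += 1
        if len(self.tags) == self.ways:
            self.tags.pop(0)             # evict the LRU line
        if self.insert_at_mru:
            self.tags.append(tag)        # classic LRU: new line enters at MRU
        else:
            self.tags.insert(0, tag)     # streaming-aware: new line enters at LRU
        return False


# A small reused working set (tags 0-3) interleaved with a long streaming
# scan (tags 100-115) that exceeds the 8-way set, repeated four times:
pattern = ([0, 1, 2, 3] + list(range(100, 116))) * 4

lru = CacheSet(ways=8, insert_at_mru=True)
streaming_aware = CacheSet(ways=8, insert_at_mru=False)
for t in pattern:
    lru.access(t)
    streaming_aware.access(t)

# Under MRU insertion the scan thrashes the whole set and tags 0-3 never
# survive until their next use; under LRU-position insertion they do.
print("LRU misses:", lru.misses)
print("streaming-aware misses:", streaming_aware.misses)
```

In this pattern the per-round working set (20 lines) exceeds the 8-way set, so plain LRU thrashes and gets no hits at all, while LRU-position insertion keeps the reused tags resident. This is the same intuition behind LIP-style insertion policies; the paper's full scheme additionally partitions each set between shared and private data and uses sampling to pick the best policy at runtime.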
Citation:
Yu Chen, Wenlong Li, Changkyu Kim, Zhizhong Tang, "Efficient shared cache management through sharing-aware replacement and streaming-aware insertion policy," IPDPS, pp. 1-11, 2009 IEEE International Symposium on Parallel & Distributed Processing, 2009.