Proceedings of the 22nd International Conference on Parallel Architectures and Compilation Techniques (2011)
Galveston, Texas, USA
Oct. 10, 2011 to Oct. 14, 2011
ISSN: 1089-795X
ISBN: 978-0-7695-4566-0
pp: 217
ABSTRACT
A number of hardware systems have been built or proposed to provide an interface for software to influence cache management. The combined software-hardware solution is called collaborative caching. Our previous work showed that, in theory, collaborative caching with LRU and MRU may enable a program to manage cache optimally. In this work we first present a prioritized LRU model: for each memory access, the program specifies a priority, which gives the target cache position for the accessed datum across all cache sizes. We prove that prioritized LRU maintains the inclusion property. As an alternative, we describe a dynamic cache control scheme based on the specified priority. This removes the limitation of our earlier LRU-MRU collaborative caching work, which required knowing the cache size.
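To make the model concrete, below is a minimal Python sketch of a priority-LRU stack simulator for a single cache size. It reads the abstract's description as: each access names the stack position at which the accessed datum is inserted, so priority 0 reproduces classic LRU insertion at the top of the stack, while the largest priority places the datum next to eviction, mimicking an MRU hint. The class name PriorityLRU, its interface, and the bottom-of-stack eviction rule are illustrative assumptions, not the paper's implementation.

# A minimal sketch (assumed reading of the model, not the paper's code)
# of a priority-LRU simulator for one cache size C.  Position 0 is the
# top of the stack (farthest from eviction); position C-1 is the bottom
# (next to be evicted).  An access with priority p inserts the datum at
# stack position p: p = 0 behaves like classic LRU, p = C-1 like MRU.

class PriorityLRU:
    def __init__(self, size):
        self.size = size          # cache size C (number of blocks)
        self.stack = []           # stack[0] = top, stack[-1] = bottom

    def access(self, addr, priority):
        """Access addr with the given target position; return True on a hit."""
        hit = addr in self.stack
        if hit:
            self.stack.remove(addr)
        elif len(self.stack) == self.size:
            self.stack.pop()      # evict the block at the bottom
        pos = min(priority, len(self.stack))
        self.stack.insert(pos, addr)
        return hit


if __name__ == "__main__":
    cache = PriorityLRU(size=4)
    # priority 0 = normal LRU use; priority 3 = keep near eviction
    trace = [("a", 0), ("b", 0), ("c", 3), ("d", 0), ("a", 0)]
    for addr, prio in trace:
        print(addr, "hit" if cache.access(addr, prio) else "miss")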
INDEX TERMS
collaborative caching, priority LRU, dynamic LRU-MRU determination
CITATION
Xiaoming Gu, "Collaborative Caching for Unknown Cache Sizes", Proceedings of the 22nd International Conference on Parallel Architectures and Compilation Techniques, pp. 217, 2011, doi:10.1109/PACT.2011.50