Title :
Integrating Caching and Prefetching Mechanisms in a Distributed Transactional Memory
Author :
Dash, Alokika ; Demsky, Brian
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Univ. of California, Irvine, Irvine, CA, USA
Abstract :
We present a distributed transactional memory system that exploits a new opportunity to automatically hide network latency by speculatively prefetching and caching objects. The system includes an object caching framework, language extensions to support our approach, and symbolic prefetches. To our knowledge, this is the first prefetching approach that can prefetch objects whose addresses have not yet been computed or predicted. Our approach makes aggressive use of both prefetching and caching of remote objects to hide network latency, while relying on the transaction commit mechanism to preserve the simple transactional consistency model that we present to the developer. We have evaluated this approach on three distributed benchmarks, five scientific benchmarks, and several microbenchmarks, and found that it enables our benchmark applications to effectively utilize multiple machines and to benefit from prefetching and caching. We observed speedups of up to 7.26× for distributed applications using prefetching and caching, and of up to 5.55× for parallel applications on our system.
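As a rough illustration only (the paper itself defines the actual language extensions and runtime interfaces), the following self-contained Java sketch shows the idea behind a symbolic prefetch: rather than prefetching a concrete address, the requester names a chain of fields relative to an object it already knows, and the remote machine walks the chain itself and ships back every object on the path in a single round trip. All class and method names here (SymbolicPrefetchSketch, RemoteStore.resolve, and so on) are hypothetical and are not the system's API.

import java.util.*;

// Hypothetical sketch: a "symbolic prefetch" names a chain of fields
// (e.g. list.next.next) relative to an object the local machine already
// knows about. The remote machine walks the chain itself and ships every
// object on the path back, so the local machine never has to compute or
// predict the intermediate addresses.
public class SymbolicPrefetchSketch {

    // A remotely stored object: an id plus named references to other ids.
    static final class RemoteObject {
        final int id;
        final Map<String, Integer> fields = new HashMap<>();
        RemoteObject(int id) { this.id = id; }
    }

    // Stand-in for a remote machine's object store.
    static final class RemoteStore {
        final Map<Integer, RemoteObject> heap = new HashMap<>();

        // Resolve a symbolic path such as ["next", "next"] starting from
        // rootId, returning every object touched along the way so the
        // caller can cache them all after a single round trip.
        List<RemoteObject> resolve(int rootId, List<String> path) {
            List<RemoteObject> reply = new ArrayList<>();
            RemoteObject cur = heap.get(rootId);
            for (String field : path) {
                if (cur == null) break;
                reply.add(cur);
                Integer nextId = cur.fields.get(field);
                cur = (nextId == null) ? null : heap.get(nextId);
            }
            if (cur != null) reply.add(cur);
            return reply;
        }
    }

    public static void main(String[] args) {
        // Build a tiny linked structure on the "remote" machine: 1 -> 2 -> 3.
        RemoteStore remote = new RemoteStore();
        for (int id = 1; id <= 3; id++) remote.heap.put(id, new RemoteObject(id));
        remote.heap.get(1).fields.put("next", 2);
        remote.heap.get(2).fields.put("next", 3);

        // Local cache populated by one symbolic prefetch of "next.next"
        // issued against object 1; the addresses of objects 2 and 3 were
        // never known locally before the reply arrived.
        Map<Integer, RemoteObject> localCache = new HashMap<>();
        for (RemoteObject o : remote.resolve(1, List.of("next", "next"))) {
            localCache.put(o.id, o);
        }
        System.out.println("Cached object ids: " + localCache.keySet());
    }
}

In the system described in the abstract, such prefetches are issued speculatively inside transactions, and it is the transaction commit mechanism that ensures stale prefetched or cached copies cannot violate the transactional consistency model presented to the developer.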
Keywords :
cache storage; distributed shared memory systems; parallel machines; caching; distributed applications; distributed transactional memory system; language extensions; multiple machines; network latency; parallel applications; prefetching; remote objects; transaction commit mechanism; transactional consistency model; Arrays; Context; Prefetching; Runtime; Semantics; Distributed shared memory; prefetching; software transactional memory
Journal_Title :
IEEE Transactions on Parallel and Distributed Systems
DOI :
10.1109/TPDS.2011.23