Abstract:
In a shared storage environment, different types of applications share cache resources. Traditional cache management has two disadvantages. First, applications that share one cache space interfere with one another, so the cache resource is not shared fairly among applications. Second, overall resource utilization is low. To solve these problems, we design a cache management system, PIN-Cache, which adopts cache partitioning to manage cache sharing among applications. Each application owns an individual cache space, thereby reducing interference between applications with different access patterns. To verify the feasibility and effectiveness of PIN-Cache in a real system, we implement it in the Linux kernel as a pseudo device driver. In the system overhead test, the read performance of PIN-Cache is comparable to that of the Linux page cache, while its write performance is much better, which validates the effectiveness of the implementation. In the performance insulation test, compared with the Linux page cache, overall mean performance improves by up to 67% (45% on average), and the performance of a single application improves by up to 136%. This shows that PIN-Cache effectively achieves performance insulation and significantly reduces performance interference between applications. Moreover, in terms of cache gains, allocating a reasonable capacity to each application effectively improves overall performance.
Keywords:
PIN-Cache; Linux kernel; cache; cache partition; performance insulation; resource allocation; storage