• DocumentCode
    2522497
  • Title
    Dynamically adaptive fetch size prediction for data caches
  • Author
    Tang, Weiyu; Veidenbaum, Alex; Nicolau, Alex

  • Author_Institution
    Center for Embedded Computer Systems, University of California, Irvine, CA, USA
  • fYear
    2003
  • fDate
    17 July 2003
  • Firstpage
    40
  • Lastpage
    44
  • Abstract
    Cache line size has a significant impact on cache and overall CPU performance. The line size is typically fixed at design time and may not be optimal for a given program, or even across phases within a single program. Past attempts to achieve the effect of a dynamic line size have required complex hardware fetch size predictors. This paper proposes an adaptive fetch size predictor based on miss rate sampling; it requires little additional hardware and is straightforward to implement. Adaptive fetch size can be applied at either the L1 or the L2 cache and achieves significant miss rate reductions in both cases. On average, the L2 adaptive fetch size cache yields the highest overall performance improvement: a 15% average speedup, and up to a 50% speedup on individual programs. (A brief sketch of the sampling idea appears after this record.)
  • Keywords
    cache storage; instruction sets; parallel architectures; performance evaluation; program compilers; storage management; CPU; L1 cache; L2 adaptive fetch size cache; L2 cache; adaptive fetch size prediction; cache line size; data caches; hardware fetch size predictors; miss rate reductions; miss rate sampling; Embedded system; Hardware; Prefetching; Proposals; Sampling methods; Silicon
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    Innovative Architecture for Future Generation High-Performance Processors and Systems, 2003
  • ISSN
    1537-3223
  • Print_ISBN
    0-7695-2019-7
  • Type
    conf
  • DOI
    10.1109/IWIA.2003.1262781
  • Filename
    1262781
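
The abstract describes the mechanism only at a high level: miss rates are sampled periodically, and the sampled rates drive the choice of fetch size. The Python sketch below illustrates that general idea under stated assumptions; it is not the authors' predictor. The toy direct-mapped cache, the candidate fetch sizes, the sampling/stable interval lengths, and the synthetic address stream are all illustrative assumptions.

# Minimal sketch of miss-rate-sampling-driven fetch size adaptation.
# NOT the predictor from the paper; the cache model, candidate sizes,
# and interval lengths are assumptions made for illustration only.

import random

LINE_SIZE = 32                    # bytes per cache line (assumed)
NUM_SETS = 256                    # direct-mapped toy cache, one line per set (assumed)
CANDIDATE_FETCH = [1, 2, 4, 8]    # lines fetched per miss (assumed candidates)
SAMPLE_INTERVAL = 2000            # accesses per sampling sub-interval (assumed)
STABLE_INTERVAL = 20000           # accesses run with the chosen size (assumed)


class ToyCache:
    """Direct-mapped cache that fetches `fetch_size` consecutive lines on a miss."""

    def __init__(self):
        self.tags = [None] * NUM_SETS

    def access(self, addr, fetch_size):
        """Return True on a hit; on a miss, install an aligned block of lines."""
        line = addr // LINE_SIZE
        if self.tags[line % NUM_SETS] == line:
            return True
        base = (line // fetch_size) * fetch_size
        for l in range(base, base + fetch_size):
            self.tags[l % NUM_SETS] = l
        return False


def miss_rate(cache, addresses, fetch_size):
    """Run a slice of the stream through the cache and return its miss rate."""
    misses = sum(0 if cache.access(a, fetch_size) else 1 for a in addresses)
    return misses / max(1, len(addresses))


def adaptive_run(address_stream):
    """Alternate short sampling phases (try every candidate fetch size) with
    longer stable phases that use whichever size showed the lowest miss rate."""
    cache = ToyCache()
    it = iter(address_stream)

    def take(n):
        return [a for _, a in zip(range(n), it)]

    best = CANDIDATE_FETCH[0]
    while True:
        # Sampling phase: each candidate is measured on its own slice of the
        # stream, so the comparison is approximate by construction.
        sampled = {}
        for size in CANDIDATE_FETCH:
            chunk = take(SAMPLE_INTERVAL)
            if not chunk:
                return best
            sampled[size] = miss_rate(cache, chunk, size)
        best = min(sampled, key=sampled.get)

        # Stable phase: run with the winning fetch size until the next sample.
        chunk = take(STABLE_INTERVAL)
        if not chunk:
            return best
        miss_rate(cache, chunk, best)


if __name__ == "__main__":
    # Synthetic stream mixing sequential and random phases, so different
    # fetch sizes win at different times.
    random.seed(0)
    stream = []
    for _ in range(20):
        base = random.randrange(0, 1 << 20) * LINE_SIZE
        stream += [base + i * 4 for i in range(8000)]                   # sequential phase
        stream += [random.randrange(0, 1 << 24) for _ in range(8000)]   # random phase
    print("final fetch size chosen:", adaptive_run(stream))

The sketch is meant to mirror the abstract's claim of little additional hardware: the only extra state it relies on is a handful of per-candidate miss counts and the currently selected fetch size.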