Modeling cache performance for embedded systems

Ogechukwu Kingsley Ugwueze, Chijindu C. V., Udeze C. C., Ahaneku A. M., Eneh N. J., Obinna M. Ezeja, Edward C. Anoliefo

Abstract


This paper presents a cache performance model for embedded systems. The need for efficient cache design in embedded systems has led to the exploration of various design methods for finding optimal cache configurations for embedded processors. Better user experiences are realized by improving the performance parameters of embedded systems. This work presents a cache hit rate estimation model for embedded systems that can be used to explore optimal cache configurations using Bernoulli's binomial cumulative probability applied to reuse distance profiles. The model was evaluated using three MiBench benchmarks, namely bitcount, basicmath, and FFT, for cache sizes of 4 KB, 8 KB, 16 KB, 32 KB, and 64 KB under 2-way, 4-way, 8-way, and 16-way set-associative configurations, all using the least recently used (LRU) replacement policy. The results were compared with those obtained using sim-cheetah from the SimpleScalar simulator suite. The mean errors for the bitcount, basicmath, and FFT benchmarks are 0.0263%, 2.4476%, and 1.9000%, respectively, giving a mean error of 1.4579% over the three benchmarks. The margin of error in the results was below 5% and within acceptable limits, showing that the model can be used to estimate cache hit rates and to explore cache design options.
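The abstract describes estimating hit rates from reuse distance profiles via a binomial cumulative probability. The sketch below illustrates that general approach and is not the authors' implementation: for an A-way set-associative cache with S sets, a reference whose reuse distance is d (distinct blocks touched since the last access to the same block) is assumed to hit when fewer than A of those d blocks map to its own set, i.e. the binomial CDF with success probability 1/S evaluated at A-1. The histogram format, function names, and example parameters are illustrative assumptions.

# Hedged sketch: reuse-distance-based hit rate estimate for a set-associative
# LRU cache using the binomial cumulative probability (not the authors' code).
from math import comb

def hit_probability(reuse_distance, num_sets, associativity):
    """Probability that a reference with the given reuse distance hits.

    Assuming each intervening distinct block maps independently and uniformly
    to one of num_sets sets, the reference hits if fewer than `associativity`
    of them fall into its set: the binomial CDF evaluated at associativity - 1.
    """
    if reuse_distance < 0:  # cold (first-ever) reference: compulsory miss
        return 0.0
    p = 1.0 / num_sets
    return sum(
        comb(reuse_distance, k) * (p ** k) * ((1.0 - p) ** (reuse_distance - k))
        for k in range(min(associativity, reuse_distance + 1))
    )

def estimate_hit_rate(rd_histogram, cache_bytes, line_bytes, associativity):
    """Weight each reuse distance's hit probability by its reference count."""
    num_sets = cache_bytes // (line_bytes * associativity)
    total = sum(rd_histogram.values())
    hits = sum(count * hit_probability(d, num_sets, associativity)
               for d, count in rd_histogram.items())
    return hits / total if total else 0.0

# Example: a toy reuse-distance histogram (distance -> reference count);
# -1 marks cold misses.  A 4 KB cache with 32-byte lines, 4-way associative.
if __name__ == "__main__":
    histogram = {-1: 120, 0: 5000, 4: 1500, 16: 800, 64: 300, 256: 90}
    print("estimated hit rate: %.4f"
          % estimate_hit_rate(histogram, 4 * 1024, 32, 4))

In practice, the reuse-distance histogram would come from profiling a benchmark's memory trace (for example, the MiBench workloads mentioned in the abstract), and the estimate would be swept over cache size and associativity to explore design options.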

Keywords


Cache designs; Cache memory; Cache model; Cache performance; Embedded systems; Reuse distance



DOI: https://doi.org/10.11591/eei.v10i5.2459



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.


Bulletin of Electrical Engineering and Informatics (BEEI)
ISSN: 2089-3191, e-ISSN: 2302-9285
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).