Towards Memory Prefetching with Neural Networks: Challenges and Insights
Accurate memory prefetching is paramount for processor performance, and modern processors employ various techniques to identify and prefetch different memory access patterns. While most modern prefetchers target spatio-temporal patterns by matching memory addresses that are accessed in close proximity (either in space or time), the recently proposed concept of semantic locality views locality as an artifact of the algorithmic level and searches for correlations between memory accesses and program state. Although this approach has been shown to be effective, capturing semantic locality requires significant associative learning capabilities. In this paper, we utilize neural networks for this task. Artificial neural networks are becoming increasingly effective at pattern recognition and associative learning of complex relations, and we leverage recent advances in this field to propose a conceptual neural network prefetcher. We show that by targeting semantic locality, this prefetcher can learn distinct memory access patterns that cannot be covered by other state-of-the-art prefetchers. We evaluate the neural network prefetcher over SPEC2006, Graph500, and a variety of handwritten kernels, and show that it can deliver an average speedup of 22%, and up to 5x on some kernels. We also explore the limitations of using neural networks for prefetching. Ultimately, we conclude that although many challenges must still be overcome before a feasible, power-efficient implementation can be reached, the neural network prefetcher's potential gains over state-of-the-art prefetchers justify further exploration.
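The abstract describes the prefetcher only conceptually. As a loose illustration of what "correlating program state with memory accesses" might look like in practice, the sketch below (not the authors' design) maps a load instruction's program counter and its recent address-delta history to a predicted next delta, framed as classification over a bounded delta vocabulary. All names and modeling choices here (NeuralPrefetcher, the feature set, the MLP structure) are assumptions made purely for illustration.

```python
# Hypothetical sketch only; this is not the architecture from the paper.
import torch
import torch.nn as nn

class NeuralPrefetcher(nn.Module):
    """Predict the next address delta from program state:
    the load's program counter (PC) plus its recent delta history."""
    def __init__(self, num_pcs, num_deltas, history_len=4,
                 embed_dim=32, hidden_dim=64):
        super().__init__()
        self.pc_embed = nn.Embedding(num_pcs, embed_dim)
        self.delta_embed = nn.Embedding(num_deltas, embed_dim)
        self.mlp = nn.Sequential(
            nn.Linear(embed_dim * (1 + history_len), hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_deltas),  # logits over candidate deltas
        )

    def forward(self, pc_ids, delta_history):
        # pc_ids: (batch,); delta_history: (batch, history_len)
        pc_vec = self.pc_embed(pc_ids)                         # (batch, embed_dim)
        hist_vec = self.delta_embed(delta_history).flatten(1)  # (batch, history_len * embed_dim)
        return self.mlp(torch.cat([pc_vec, hist_vec], dim=1))

# Toy usage: predict the next delta for two in-flight loads.
model = NeuralPrefetcher(num_pcs=1024, num_deltas=256)
pc = torch.tensor([17, 42])
history = torch.tensor([[1, 1, 2, 3], [4, 4, 4, 4]])
logits = model(pc, history)
predicted_delta_id = logits.argmax(dim=1)  # index into the delta vocabulary
```

Framing prediction as classification over a small vocabulary of frequently observed deltas, rather than regressing raw 64-bit addresses, keeps the output space tractable; this is a common choice in machine-learning approaches to prefetching.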