
Learning Memory Access Patterns
https://arxiv.org/abs/1803.02329
======
Abstract

The explosion in workload complexity and the recent slow-down in Moore's law
scaling call for new approaches towards efficient computing. Researchers are
now beginning to use recent advances in machine learning in software
optimizations, augmenting or replacing traditional heuristics and data
structures. However, the space of machine learning for computer hardware
architecture is only lightly explored. In this paper, we demonstrate the
potential of deep learning to address the von Neumann bottleneck of memory
performance. We focus on the critical problem of learning memory access
patterns, with the goal of constructing accurate and efficient memory
prefetchers. We relate contemporary prefetching strategies to n-gram models in
natural language processing, and show how recurrent neural networks can serve
as a drop-in replacement. On a suite of challenging benchmark datasets, we
find that neural networks consistently demonstrate superior performance in
terms of precision and recall. This work represents the first step towards
practical neural-network-based prefetching, and opens a wide range of exciting
directions for machine learning in computer architecture research.
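The n-gram analogy the abstract draws can be made concrete with a small sketch. The idea is to treat the sequence of address *deltas* (differences between consecutive accesses) as a vocabulary and predict the next delta from the previous one, exactly as a bigram language model predicts the next token. This is an illustrative table-based baseline of the kind the paper compares against, not the paper's neural model; the class name and interface here are hypothetical.

```python
from collections import defaultdict, Counter

class DeltaNgramPrefetcher:
    """Hypothetical bigram model over address deltas.

    Mirrors the paper's framing: address deltas are "words", and the
    next delta is predicted from the previous one the way a bigram
    language model predicts the next token.
    """

    def __init__(self):
        # prev delta -> histogram of the deltas that followed it
        self.table = defaultdict(Counter)
        self.prev_addr = None
        self.prev_delta = None

    def observe(self, addr):
        """Record one memory access and update the bigram counts."""
        if self.prev_addr is not None:
            delta = addr - self.prev_addr
            if self.prev_delta is not None:
                self.table[self.prev_delta][delta] += 1
            self.prev_delta = delta
        self.prev_addr = addr

    def predict(self):
        """Prefetch candidate: most frequent delta seen after the last one."""
        if self.prev_delta is None or not self.table[self.prev_delta]:
            return None
        next_delta, _ = self.table[self.prev_delta].most_common(1)[0]
        return self.prev_addr + next_delta

# Usage: a strided walk (stride 64, i.e. one cache line at a time)
p = DeltaNgramPrefetcher()
for addr in range(0, 64 * 10, 64):
    p.observe(addr)
print(p.predict())  # -> 640: delta 64 has always followed delta 64
```

A recurrent network replaces the fixed-length lookup table with a learned hidden state, which is what lets it capture longer and irregular histories; precision and recall are then computed by comparing predicted addresses against the addresses actually accessed next.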

