Improving cache replacement with machine learning


Date

2023

Abstract

The performance of caching systems relies heavily on the effectiveness of cache replacement policies, which aim to optimize the use of cache storage and improve cache hit rates. Traditional policies such as Least Recently Used (LRU) and Least Frequently Used (LFU) are widely deployed but adapt poorly to changing workloads and do little to minimize the use of expensive storage. Recently, researchers have proposed machine learning-based approaches to cache replacement that learn from the access history of cache blocks and predict how useful each block will be. These approaches have shown promising results in improving cache hit rates and reducing overall storage cost compared to traditional policies, employing algorithms such as Linear Regression (LR), K-Nearest Neighbor (KNN), and Deep Learning (DL). The solution proposed here implements a dataset-specific prediction model: it first derives a training dataset using an algorithm that exploits future references and therefore labels cache blocks with very high accuracy, and a supervised machine learning model then learns patterns from this dataset, so that knowledge of future reference behaviour informs the cache replacement decision-making process. This makes the replacement strategy more resilient, because traditional and state-of-the-art models rely on past cache data only. The evaluation and the proposed future work strengthen the case for this method, as several factors identified in this research may significantly improve the results and substantially benefit current cache replacement strategies. The document also describes the technical environments used to evaluate the models. Training the machine learning models and generating the dataset demand significant memory and CPU resources, and a dedicated GPU is required in some scenarios; the document explains how to manage these requirements.
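The core idea summarized above, deriving training labels from future references and then fitting a supervised model on features a real cache can compute at runtime, can be sketched roughly as follows. This is a minimal illustration under assumed details (a reuse-within-horizon label, recency/frequency features, and a KNN classifier standing in for the thesis's models); it is not the thesis implementation.

```python
# Illustrative sketch only (not the thesis code): label a block-access trace with
# a future-looking oracle in the spirit of Belady's algorithm, then train a
# supervised model on past-only features so the learned policy can run online.
from collections import defaultdict
from sklearn.neighbors import KNeighborsClassifier

def label_with_future_reuse(trace, horizon=5):
    """Label each access 1 if the block is referenced again within `horizon`
    future accesses (worth keeping), else 0 (a good eviction candidate)."""
    next_use = [float("inf")] * len(trace)
    seen_ahead = {}
    for i in range(len(trace) - 1, -1, -1):        # backward scan finds the next use
        next_use[i] = seen_ahead.get(trace[i], float("inf"))
        seen_ahead[trace[i]] = i
    return [1 if (nu - i) <= horizon else 0 for i, nu in enumerate(next_use)]

def past_only_features(trace):
    """Features a real cache can compute online: recency and access frequency."""
    last_access, freq, feats = {}, defaultdict(int), []
    for i, blk in enumerate(trace):
        feats.append([i - last_access.get(blk, -1), freq[blk]])
        last_access[blk] = i
        freq[blk] += 1
    return feats

if __name__ == "__main__":
    # Tiny synthetic trace of block IDs; a real evaluation would use workload traces.
    trace = [1, 2, 3, 1, 2, 1, 4, 5, 1, 2, 6, 1, 2, 3, 1]
    X, y = past_only_features(trace), label_with_future_reuse(trace)
    model = KNeighborsClassifier(n_neighbors=3).fit(X, y)
    # At eviction time, score the resident blocks and evict the one least likely
    # to be reused soon (lowest predicted probability of class 1).
    print(model.predict_proba(X)[:5])
```

The important property of this setup is that future references are consulted only while building the labels; the deployed model sees nothing a conventional cache could not observe, which is what allows the future reference pattern to guide replacement decisions at runtime.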

Citation

Devinda, K.S. (2023). Improving cache replacement with machine learning [Master's thesis, University of Moratuwa]. Institutional Repository University of Moratuwa. https://dl.lib.uom.lk/handle/123/23587
