You are required to read and agree to the below before accessing a full-text version of an article in the IDE article repository.
The full-text document you are about to access is subject to national and international copyright laws. In most cases (but not necessarily all) this means that personal use is allowed, provided that the copyright owner is duly acknowledged and respected. All other use typically requires explicit permission (often in writing) from the copyright owner.
For the reports in this repository we specifically note that
- the use of articles under IEEE copyright is governed by the IEEE copyright policy (available at http://www.ieee.org/web/publications/rights/copyrightpolicy.html)
- the use of articles under ACM copyright is governed by the ACM copyright policy (available at http://www.acm.org/pubs/copyright_policy/)
- technical reports and other articles issued by Mälardalen University are free for personal use. For other use, the explicit consent of the authors is required
- in other cases, please contact the copyright owner for detailed information
By accepting I agree to acknowledge and respect the rights of the copyright owner of the document I am about to access.
If you are in doubt, feel free to contact webmaster@ide.mdh.se
Predicting Cache Behaviour of Concurrent Applications
Publication Type:
Conference/Workshop Paper
Venue:
29th IEEE International Conference on Emerging Technologies and Factory Automation
Abstract
Modern digital solutions are built around a variety of applications, and the continuous integration of these applications drives technological advancement. It is therefore essential to understand how these applications will behave when they run together. However, this can be challenging to interpret due to the increasing complexity of the execution details. One such fundamental detail is the utilization of the shared cache, as it goes hand in hand with the computation capacity of computer systems. Since cache utilization behavior is too complex to capture with a few simple assumptions, we have investigated whether this behavior can be predicted with the help of machine learning. We trained a deep neural network on examples that represent the cache behavior of applications running alone and running concurrently on the same core. The Long Short-Term Memory (LSTM) network learns the entire execution period of each application in the training set. As a result, given the L1 cache misses of two applications (each running alone), it can predict the cache behavior of the two applications running together, without actually executing them together. The model returns a time series that reflects the cache behavior under concurrency.
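As a rough illustration of the kind of model the abstract describes, the sketch below pairs the solo-run L1 cache-miss traces of two applications as input to an LSTM and maps them to a predicted concurrent-run trace. The layer sizes, sequence length, and feature layout are illustrative assumptions (here in PyTorch), not details taken from the paper.

```python
import torch
import torch.nn as nn

class CacheLSTM(nn.Module):
    """Hypothetical sketch: map two solo-run cache-miss traces to one
    predicted concurrent-run trace, one value per time step."""

    def __init__(self, hidden=32):
        super().__init__()
        # 2 features per time step: L1 misses of app A and app B, each run alone
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden, batch_first=True)
        # project the hidden state to one predicted miss count per step
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                   # x: (batch, time, 2)
        out, _ = self.lstm(x)               # out: (batch, time, hidden)
        return self.head(out).squeeze(-1)   # (batch, time)

model = CacheLSTM()
# 4 dummy trace pairs, 100 time steps each (random stand-in data)
solo_traces = torch.rand(4, 100, 2)
pred = model(solo_traces)                   # predicted concurrent-run time series
print(pred.shape)                           # torch.Size([4, 100])
```

In practice such a model would be trained with a regression loss (e.g. mean squared error) against measured concurrent-run traces; the sketch only shows the input/output shape of the prediction step.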
Bibtex
@inproceedings{Imtiaz7052,
author = {Shamoona Imtiaz and Moris Behnam and Gabriele Capannini and Jan Carlson and Marcus J{\"a}gemar},
title = {Predicting Cache Behaviour of Concurrent Applications},
month = {September},
year = {2024},
booktitle = {29th IEEE International Conference on Emerging Technologies and Factory Automation},
url = {http://www.es.mdu.se/publications/7052-}
}