You are required to read and agree to the following before accessing a full-text version of an article in the IDE article repository.

The full-text document you are about to access is subject to national and international copyright laws. In most cases (but not necessarily all), personal use is allowed, provided that the copyright owner is duly acknowledged and respected. All other use typically requires the explicit permission (often in writing) of the copyright owner.

For the reports in this repository, we specifically note that:

  • the use of articles under IEEE copyright is governed by the IEEE copyright policy (available at http://www.ieee.org/web/publications/rights/copyrightpolicy.html)
  • the use of articles under ACM copyright is governed by the ACM copyright policy (available at http://www.acm.org/pubs/copyright_policy/)
  • technical reports and other articles issued by Mälardalen University are free for personal use. For other use, the explicit consent of the authors is required
  • in other cases, please contact the copyright owner for detailed information

By accepting, I agree to acknowledge and respect the rights of the copyright owner of the document I am about to access.

If you are in doubt, feel free to contact webmaster@ide.mdh.se

Methodological Principles for Reproducible Performance Evaluation in Cloud Computing

Authors:

Alessandro Papadopoulos, Laurens Versluis, André Bauer, Nikolas Roman Herbst, Jóakim von Kistowski, Ahmed Ali-Eldin, Cristina Abad, J. Nelson Amaral, Petr Tuma, Alexandru Iosup

Publication Type:

Journal article

Venue:

IEEE Transactions on Software Engineering

Publisher:

IEEE

DOI:

10.1109/TSE.2019.2927908


Abstract

The rapid adoption and the diversification of cloud computing technology heighten the importance of a sound experimental methodology for this domain. This work investigates how to measure and report performance in the cloud, and how well the cloud research community is already doing it. We propose a set of eight important methodological principles that combine best practices from adjacent fields with concepts applicable only to clouds, and with new ideas about the time-accuracy trade-off. We show how these principles can be applied using a practical use-case experiment. To this end, we analyze the ability of the newly released SPEC Cloud IaaS benchmark to follow the principles, and showcase real-world experimental studies in common cloud environments that meet the principles. Last, we report on a systematic literature review covering top conferences and journals in the field from 2012 to 2017, analyzing whether the practice of reporting cloud performance measurements follows the proposed eight principles. Worryingly, this systematic survey and the subsequent two-round human reviews reveal that few of the published studies follow the eight experimental principles. We conclude that, although these important principles are simple and basic, the cloud community has yet to adopt them broadly to deliver sound measurement of cloud environments.

Bibtex

@article{Papadopoulos5558,
  author    = {Alessandro Papadopoulos and Laurens Versluis and Andr{\'e} Bauer and Nikolas Roman Herbst and J{\'o}akim von Kistowski and Ahmed Ali-Eldin and Cristina Abad and J. Nelson Amaral and Petr Tuma and Alexandru Iosup},
  title     = {Methodological Principles for Reproducible Performance Evaluation in Cloud Computing},
  volume    = {47},
  number    = {8},
  pages     = {1528--1543},
  month     = {August},
  year      = {2021},
  journal   = {IEEE Transactions on Software Engineering},
  publisher = {IEEE},
  doi       = {10.1109/TSE.2019.2927908},
  url       = {http://www.es.mdu.se/publications/5558-}
}