Reproducibility initiative at ASPLOS'20


Important dates

Paper decision: November 20, 2019
Artifact submission: December 4, 2019
Artifact decision: January 15, 2020
Camera-ready paper: January 20, 2020
Conference: March 16-20, 2020
Reproducibility discussion: TBA

Reproducibility chairs

Technical support

Reproducibility committee

TBA

Motivation

Authors of accepted ASPLOS'20 papers are invited to formally submit their supporting materials (code, data, models, workflows, results) to the Artifact Evaluation (AE) process. AE is run by a separate committee whose task is to assess how well the submitted artifacts support the work described in the accepted papers, reproducing at least some of the experiments. Submission is voluntary and will not influence the final decision regarding the papers.

Since a full validation of computer architecture experiments is not always trivial and may require expensive computational resources, we use a multistage Artifact Evaluation: at ASPLOS'20 we will validate only the "availability" and "functionality/reusability" of the submitted artifacts. Thus, depending on the evaluation results, camera-ready papers will include an Artifact Appendix and will receive at most two ACM badges of approval printed on the first page (note that authors still need to provide a small sample dataset to test the functionality of their artifacts):

[ ACM "Artifacts Available" badge ] or [ ACM "Artifacts Evaluated - Functional" badge ]

Based on the successful ASPLOS-ReQuEST'19 experience, we envision the second AE stage as a special reproducibility session or an open tournament at the next conference, where the experimental results from the above papers with artifacts would be fully validated. This stage is still under discussion, so feel free to send your feedback to the ASPLOS AE chairs!

Public discussion

We plan to organize an open session at ASPLOS to discuss the artifact evaluation results and a common methodology for the full validation and comparison of computer architecture experiments (see the related SIGARCH blog posts "A Checklist Manifesto for Empirical Evaluation: A Preemptive Strike Against a Replication Crisis in Computer Science" and "Artifact Evaluation for Reproducible Quantitative Research").

Artifact submission

Prepare your submission and the Artifact Appendix using the submission guidelines, and register it at the ASPLOS'20 AE website. Your submission will then be reviewed according to the reviewing guidelines. Please do not forget to provide a list of hardware, software, benchmark, and data set dependencies in your artifact abstract - this is essential for finding appropriate evaluators!
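As a rough illustration, a dependency list in an artifact abstract might look like the hypothetical sketch below (all hardware, software, and data set names here are made up for illustration only; follow the actual submission guidelines for the required fields):

    Hardware: x86-64 server with at least 32 GB RAM; NVIDIA GPU optional
    Software: Ubuntu 18.04, GCC 7+, Python 3.6+, CMake 3.10+
    Benchmarks: SPEC CPU2017 (license required), PARSEC 3.0
    Data sets: small sample inputs included with the artifact (~200 MB)
    Disk space: ~10 GB
    Approximate run time: ~2 hours for the functionality test

A concrete list like this helps the AE chairs match your artifact with evaluators who have access to the required hardware, licenses, and software environment.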

Papers that successfully pass AE will receive a set of ACM badges of approval printed on the papers themselves and available as metadata in the ACM Digital Library (it is now possible to search for papers with specific badges in the ACM DL). Authors of such papers will have the option to include the Artifact Appendix (up to 2 pages) in the camera-ready paper and to share their artifacts in the ACM DL.

At the end of the process we will inform you how to add the badges to your camera-ready paper.

Questions and feedback

Please check the AE FAQ and feel free to ask questions or share your feedback and suggestions via the dedicated AE discussion group.