Artifact Submission

Authors of all accepted articles (including articles accepted subject to minor revisions) are eligible to submit artifacts.

Your submission should consist of three parts, all in one file:

  1. a Markdown-formatted README.md for your artifact;
  2. a URL pointing to either
    • a single archive file containing the artifact (recommended), or
    • a public source control repository;
  3. a hash certifying the version of the artifact at submission time, either
    • a SHA-256 hash of the single file (use the sha256sum command-line tool to generate it), or
    • the full commit hash for the repository (e.g., from git reflog --no-abbrev); example commands are shown below.
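
For instance, assuming the artifact is packaged as a file named artifact.tgz (a placeholder name), the following commands produce the required hash; the git commands apply if you submit a repository instead:

```
# Hash a single-file artifact (artifact.tgz is a placeholder name):
sha256sum artifact.tgz

# Or, for a repository submission, record the full commit hash of the
# submitted revision (both commands print unabbreviated hashes):
git rev-parse HEAD
git reflog --no-abbrev
```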

The URL must be a public Google Drive, Dropbox, GitHub, Bitbucket, or GitLab URL, to help protect the anonymity of the reviewers.

Artifacts do not need to be anonymous; reviewers will be aware of author identities.

Submission System

Submit your artifact documentation via the online submission system.

Please upload a single file that includes the URL for your artifact, the hash of the artifact at submission time, and the content of the README file.

Contents of the Artifact README.md

Your README.md should consist of three parts:

  • Getting Started Guide
  • Overview of Claims
  • For each claim: step-by-step instructions for how you propose reviewers evaluate it with your artifact (with appropriate references to the relevant sections of your article)

Getting Started Guide

Your “Getting Started Guide” should contain setup instructions (including, for example, a pointer to the VM player software, its version, passwords if needed, etc.) and basic testing of your artifact that you expect a reviewer to be able to complete in 30 minutes or less. Reviewers will follow all the steps in the guide during an initial kick-the-tyres phase. The Getting Started Guide should be as simple as possible, and yet it should stress the key elements of your artifact. Anyone who has successfully followed the Getting Started Guide should have no technical difficulties with the rest of your artifact.
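
As a rough illustration, the basic-testing portion of a Getting Started Guide might walk reviewers through steps like these; the image file, image name, and smoke-test script are hypothetical placeholders, not requirements:

```
# Load the provided container image and open a shell in it (names are illustrative):
docker load < artifact-image.tar.gz
docker run --rm -it artifact:latest bash

# Inside the container, run a quick smoke test that finishes in a few minutes:
cd /artifact
./smoke_test.sh    # hypothetical script; should print a clear pass/fail summary
```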

Overview of Claims

Your artifact’s documentation should include the following:

  • A list of claims from your article supported by the artifact, and how they are supported.
  • A list of claims from your article not supported by the artifact, and why not.

Examples of justifiable reasons for unsupported claims: performance claims that cannot be reproduced in a VM, benchmarks that the authors are not allowed to redistribute, etc.

Artifact reviewers will use this documentation to base their evaluations on these specific claims. They will also consider whether the provided evidence adequately supports the claim that the artifact works as expected.

Step-by-Step Instructions

The “Step-by-Step Instructions” explain how to reproduce any experiments or other activities that support the individual claims in your article. Write this for readers who have a deep interest in your work and are studying it to improve it or compare against it. If your artifact runs for more than a few minutes to exhibit the desired effect, point this out, note how long it is expected to run (roughly), and explain how to run it on smaller inputs. Reviewers may choose to run on smaller or larger inputs depending on available hardware.

Where appropriate, include descriptions of and links to files (included in the archive) that represent expected outputs (e.g., the log files your tool is expected to generate on the given inputs); if some warnings are safe to ignore, explain which ones.
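
One simple way to make outputs checkable, sketched below with placeholder tool and file names, is to bundle the expected logs and let reviewers compare against them directly:

```
# Re-run the tool on a given input and compare against the bundled expected log
# (tool, input, and log names are placeholders):
./mytool examples/input1.txt > output1.log
diff output1.log expected/output1.log && echo "output matches the expected log"
```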

Packaging the Artifact

When packaging your artifact, please keep in mind (a) how accessible you are making your artifact to other researchers and (b) that AEC members will have limited time in which to assess each artifact.

We ask that you make the artifact available as a Docker image or a virtual machine (VM) image in OVF/OVA format, with the artifact and all necessary libraries pre-installed. A container or VM provides an easily reproducible environment, and it gives the AEC confidence that errors or other problems cannot harm their machines.
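
If you choose the Docker route, the image can be built and exported as a single file roughly as follows; the Dockerfile, image name, and file name are illustrative assumptions:

```
# Build the image from the artifact's Dockerfile, then export it as one file:
docker build -t artifact:latest .
docker save artifact:latest | gzip > artifact-image.tar.gz
# Reviewers can restore it later with: docker load < artifact-image.tar.gz
```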

You should make your artifact available as a single archive file. Please use a widely available compressed archive format such as ZIP (.zip), tar and gzip (.tgz), or tar and bzip2 (.tbz2). Please use open formats for documents, preferably CSV or JSON for data.
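
For example, assuming the artifact lives in a directory named artifact/ (a placeholder), either of the following produces a suitable single archive:

```
# Package the artifact directory as one compressed archive (names are illustrative):
tar czf artifact.tgz artifact/
# or
zip -r artifact.zip artifact/
```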

To ensure quality packaging, we strongly recommend testing your artifact on a fresh machine (or VM), following exactly the instructions you have given.
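
A minimal sketch of such a test, assuming the artifact directory is the current directory and that an Ubuntu base image matches your claimed platform support:

```
# Start from a pristine container and follow your own README from the top
# (the base image is only an example; use whatever platform you claim to support):
docker run --rm -it -v "$PWD:/artifact" ubuntu:22.04 bash
# then, inside the container:
cd /artifact && cat README.md   # and execute each step exactly as written
```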

Authors submitting machine-checked proof artifacts should consult Marianna Rapoport’s Proof Artifacts: Guidelines for Submission and Reviewing.

Common Issues

Common issues in the kick-the-tyres phase are:

  • Overstating platform support. Several artifacts claiming to need only a UNIX-like system failed severely under macOS, in particular those requiring 32-bit compilers, which newer macOS versions no longer ship. We recommend that future artifacts scope their claimed support more narrowly. Generally, this could be fixed by the authors providing a Dockerfile (see the sketch after this list).
  • Missing dependencies, or poor documentation of dependencies. The single most effective way to avoid these sorts of issues ahead of time is to run the instructions independently on a fresh machine, VM, or Docker container.
  • Comparing against existing tools on new benchmarks, but not including ways to reproduce the other tools’ executions.
  • Not explaining how to interpret results. For instance, an artifact may run successfully and produce the output that was the basis for the article, yet offer reviewers no way to check that output for consistency with the article. Examples include generating a list of warnings without documenting which are true and which are false positives, and generating large tables of numbers that are presented graphically in the article without providing a way to produce analogous visualizations.
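
As mentioned above, a Dockerfile is often the easiest way to pin down exactly which platform and dependencies an artifact needs. A minimal sketch, with an illustrative base image and package list:

```
# Write a Dockerfile that records the supported platform and dependencies
# (base image and packages are assumptions; adapt them to your artifact):
cat > Dockerfile <<'EOF'
FROM ubuntu:22.04
RUN apt-get update && apt-get install -y build-essential python3
COPY . /artifact
WORKDIR /artifact
EOF
docker build -t artifact:latest .
```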

Conflicts of Interest

Conflicts of interest for AEC members are handled by the chairs. Conflicts of interest involving one of the two AEC chairs are handled by the other AEC chair, or by the associate editor of the current volume if both chairs are conflicted.