Traditionally, technical research papers are published without any accompanying artifacts (tools, data, models, videos, and so on), even though such artifacts may serve as crucial and detailed evidence for the quality of the results that the associated paper offers. Artifacts support the repeatability of experiments and precise comparison with alternative approaches, thus raising the quality of the research area as a whole. They may also make it easier for other researchers to perform their own experiments, thus helping the original authors to disseminate their ideas in detail.
The artifact evaluation (AE) process at ECOOP 2021 is a continuation of the AE process at previous ECOOP editions and at several other conferences, including ESEC/FSE, OOPSLA, PLDI, ISSTA, HSCC, and SAS; see the authoritative Artifact Evaluation for Software Conferences website.
Call for Reviewers
This year, the Artifact Evaluation Committee (AEC) will consist primarily of experienced early-career researchers who are invited by the Artifact Evaluation co-chairs. In addition, to foster diversity and train the next generation of researchers, we will also recruit AEC members through a self-nomination process. We expect to recruit PhD students (and postdocs) who have at least one peer-reviewed publication.
If you are interested, you can nominate yourself by submitting the ECOOP’21 AEC application form. You will receive an email confirmation after submitting your application.
Please note the AEC member application deadline listed under important dates. We will respond to all self-nominations approximately two weeks after the application deadline has passed.
Call for Artifacts
Authors of accepted papers at ECOOP 2021 can have their artifacts evaluated by an Artifact Evaluation Committee (AEC). Artifacts that live up to the expectations created by the paper will be marked with a badge in the proceedings. Artifacts that are deemed especially meritorious will be singled out for special recognition in the proceedings and at the conference.
The AE process is run by a committee separate from the program committee; its task is to assess how well the artifacts support the work described in the papers. The submission of an artifact is voluntary and will not influence the final decision regarding the papers (artifacts are submitted after the notification of acceptance has been sent out).
A submitted artifact should be consistent with the associated paper. It should be documented well enough to be accessible to a general computer scientist with an interest in the research area who has read the associated paper. A submitted artifact is treated as confidential, just like a submitted paper.
Artifacts of a paper may take many forms; these include, but are not limited to, tool implementations, frameworks, libraries, data sets, and mechanized proofs. Please contact the AEC co-chairs if you have any questions about what may be submitted as an artifact.
Submission link: https://ecoop21-ae.hotcrp.com/
Every submission must include:
- An abstract that briefly describes the artifact.
- A PDF file that describes the artifact in detail and provides instructions for using it.
- A URL for downloading the artifact.
- A PDF file of the accepted paper.
When packaging your artifact for submission, please take the following into consideration. Your artifact should be as accessible to the AEC members as possible, and it should be easy for them to make quick progress in evaluating it. Please provide some simple scenarios describing concretely how the artifact is intended to be used; for a tool, this would include specific inputs to provide or actions to take, together with the expected output or behavior in response. In addition to these tightly controlled scenarios that you prepare for the AEC members to try out, it may be very useful to suggest some variations along the way, so that the AEC members can see that the artifact is robust enough to tolerate experimentation. For artifacts that are tools, one very convenient way for reviewers to learn about your artifact is to include a video showing the artifact in use in a simple scenario, with verbal comments explaining what is going on.
To avoid problems with software dependencies and installation, as well as to ensure that artifacts can still be run in the future, artifacts must be made available either as a Docker image (https://www.docker.com/) or as a virtual machine image (for example, VirtualBox, VMware, or a similar widely available platform) containing the artifact already installed. The artifact must be either uploaded as a self-contained archive file, using a widely supported archive format (e.g., zip, tgz), or must be cloneable from an online public repository (e.g., GitHub). Please use widely supported open formats for documents, and preferably CSV or JSON format for data.
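As an illustration only, a containerized artifact might be described by a Dockerfile along the following lines; every image, package, and path name here is hypothetical and should be replaced with your own:

```dockerfile
# Hypothetical Dockerfile for an artifact submission (all names are illustrative).
FROM ubuntu:20.04

# Install every dependency inside the image, so reviewers need
# no network access during the evaluation.
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 make && \
    rm -rf /var/lib/apt/lists/*

# Copy the pre-built tool, the benchmarks, and the getting-started guide.
COPY tool/ /artifact/tool/
COPY benchmarks/ /artifact/benchmarks/
COPY README.md /artifact/

# Drop reviewers into the artifact directory with an interactive shell.
WORKDIR /artifact
CMD ["/bin/bash"]
```

The resulting image can then be exported as a self-contained archive (for example with `docker save`) or rebuilt by reviewers from a public repository containing this file.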
Submitted artifacts will go through a two-phase evaluation:
- Kicking-the-tires: Reviewers check the artifact integrity and look for any possible setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, VM won’t start, immediate crashes on the simplest example, etc.). Authors are informed of the outcome and, in case of technical problems, they can help solve them during a brief author response period.
- Artifact assessment: Reviewers evaluate the artifacts, checking if they live up to the expectations created by the papers.
- An artifact may be awarded one or more of the following badges (see ACM Artifact Review and Badging for details):
- Functional - The artifacts associated with the research are found to be documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation.
- Reusable - The artifacts associated with the paper are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated – Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing are facilitated.
- Available - Author-created artifacts relevant to this paper have been placed on a publicly accessible archival repository. A DOI or link to this repository along with a unique identifier for the object is provided.
- Artifacts that go beyond expectations of quality will receive a Distinguished Artifact award. The selection procedure will be based on review scores and feedback from the program committee.
Authors will be given a 4-day period to read and respond to the kick-the-tires reports on their artifacts. Authors may be asked for clarifications in case reviewers encountered problems that would prevent them from properly evaluating the artifact.
When submitting artifacts, we encourage you to read the HOWTO for AEC Submitters. We would also like to provide artifact authors with general information on what makes artifacts good or bad, together with suggestions for good practices. In a nutshell:
Committee members want artifacts that:
- Contain all dependencies (Linux container / VM)
- Have few setup steps
- Have getting started guides where all instructions are tested
- Include some documentation on the code and layout of the artifact
- Have a short run reviewers can try first (several minutes max)
- Show progress messages (percentage complete) during longer runs
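As a sketch of the last two points, a quick-run script might iterate over a handful of small inputs and report a percentage as it goes; all file and variable names below are hypothetical, and the actual tool invocation is left as a placeholder:

```shell
#!/bin/sh
# Hypothetical "kick the tires" script: a run of a few minutes at most,
# printing progress so reviewers can see the artifact is alive.
set -e

inputs="ex1 ex2 ex3 ex4 ex5"   # a handful of small, fast examples
total=5
done_count=0

for input in $inputs; do
    # ... invoke the tool on $input here ...
    done_count=$((done_count + 1))
    echo "[$((done_count * 100 / total))%] finished $input"
done

echo "Quick run complete: $done_count/$total examples."
```

A longer, full-evaluation script can follow the same pattern, so reviewers can first confirm the short run works before committing to the long one.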
Authors should avoid:
- Downloading content over the internet during experiments or tests
- Closed source software libraries, frameworks, operating systems, and container formats
- Experiments or tests that run for multiple days
Authors and Reviewers of Proof Artifacts: We encourage authors and reviewers of mechanized proofs to consult the recent guidelines for submitting and reviewing proof artifacts.
The process for evaluation is based on and consistent with the ACM Artifact Review and Badging and NISO’s guidelines for reproducibility badging. However, ACM is not involved in the implementation or evaluation process on behalf of EAPLS.