We are excited to announce that the SC20 Reproducibility Committee has selected the SC19 paper “MemXCT: Memory-Centric X-ray CT Reconstruction with Massive Parallelization”, by Mert Hidayetoğlu, Tekin Biçer, Simon Garcia de Gonzalo, Bin Ren, Doğa Gürsoy, Rajkumar Kettimuthu, Ian T. Foster, and Wen-mei W. Hwu, to serve as the Student Cluster Competition (SCC) benchmark for the Reproducibility Challenge this year. A team of reviewers selected the paper from 45 finalist papers, based on each paper’s Artifact Description (AD) and its suitability for the SCC. The authors and the Reproducibility Committee have been working to create a reproducible benchmark that builds on the paper’s results. At SC20, the sixteen SCC teams will be asked to run the benchmark, replicating the findings of the original paper under different settings and with different datasets.
What makes the work of the student teams particularly relevant is the replication of the paper’s results across the sixteen different clusters that the teams will field. In the era of heterogeneous computing, porting an application from one platform to another is not a simple task. The work of the student teams at SC20 is a fantastic way to dive into reproducibility challenges across heterogeneous platforms and emerge with shareable, robust insights. It is the ensemble of each team’s implementation and execution of the challenge on sixteen different platforms that will earn this paper ACM’s “Results Replicated” badge in the ACM Digital Library. Sharing is at the core of the Reproducibility Challenge – so the work of the SCC teams will be collected and published. We have already published three special issues in Parallel Computing from three previous years (SC16, SC17, SC18).
Putting Together the Benchmark
Many volunteers participate in selecting the paper, creating the benchmark for the SCC teams, assessing the students’ work, and publishing the special journal issue.
During the first round of reviews, which assessed feasibility for the competition, the reviewers examined whether each finalist paper had an application that the student teams could run on the broad range of hardware types and cluster configurations typically fielded at the SCC. This initial review eliminated nearly 80% of the candidate papers because, for example, they used proprietary compilers, ran only on specific hardware, or required a larger scale to reproduce their results than the SCC clusters could provide.
A second round of reviews, with at least three reviews per paper, looked for the application best suited to the SCC teams. The committee then ranked the submissions on criteria such as the application’s real-world impact as understood by undergraduates and the student experience of working with the benchmark. Where the submitted paper did not provide enough information, discussions with the authors of the finalist papers focused on the feasibility of adapting their applications to the Student Cluster Competition. Following these interviews, the committee met to decide which application to invite.
The selection of the paper is only one step in a long process that ends with the preparation of the Reproducibility Challenge benchmark – one of four benchmarks that the students must complete during the competition. The Reproducibility benchmark will be revealed at SC20. Following the conference, we will publish the students’ reports from the SC20 SCC Reproducibility Challenge to demonstrate the effectiveness of the SCC teams and their success in replicating the paper’s results on the sixteen platforms.
Mark Your Calendar
The Student Cluster Competition will be held Monday–Wednesday, November 16–18, 2020. Visit the SCC booth in the Exhibit Hall at SC20 and chat with the students about the Reproducibility Challenge. We invite you to celebrate the student participants and the authors of the selected paper at the Awards Ceremony on the Thursday of the conference. And don’t miss the SCC reports to follow!
Join us in Atlanta to meet these amazing students and watch them race to reproduce this benchmark and three other HPC applications on the exhibit floor.
Scott Michael, SC20 Student Cluster Competition Chair
Stephen Lien Harrell, SC20 Reproducibility Challenge Chair