SC20 Is Everywhere We Are

Virtual Event FAQ

Days Held

Sunday–Wednesday, November 8–11, 2020

Read a post-competition summary of the VSCC in the blog

The Student Cluster Competition (SCC) was developed in 2007 to provide an immersive high performance computing experience to undergraduate and high school students. For SC20, the competition has moved to the cloud to accommodate remote participation, becoming the Virtual Student Cluster Competition (VSCC). With sponsorship from vendor partners, student teams design and build virtual clusters in the Microsoft Azure cloud, learn scientific applications, apply optimization techniques to their chosen cloud configurations, and compete from around the world in a 72-hour challenge to complete a set of benchmarks and real-world scientific workloads. The VSCC gives teams the opportunity to show off their HPC knowledge to conference attendees and judges.

VSCC Winning Team

Highest Linpack Benchmark & Overall Winner

Tsinghua University, China

  • Students: Chen Zhang, Jiajie Chen, Yutian Wang, Runxin Zhong, Mingshu Zhai, Zeyu Song
  • Advisors: Jidong Zhai, Wentao Han, Lin Gan

VSCC Orientation & Kickoff

An orientation video for students participating in the VSCC, and a brief kickoff including an introduction from SC20 General Chair Christine E. Cuicchi

View Orientation Video on YouTube

View Kickoff Video on YouTube

VSCC FAQ

Learn more about participating in the Virtual Student Cluster Competition

What is VSCC?

The VSCC is a fully virtual version of the Student Cluster Competition that allows for remote participation.

Does my team need to bring physical hardware?

No. All the competition applications will be executed in the cloud.

How will the competition rules change?

Please stay tuned for more details.

How can my team’s institution/vendor partner remain involved?

We encourage institutions/vendor partners to remain involved in the following ways:

  • Provide training and interact closely with their teams.
  • Provide technical assistance to teams as they familiarize themselves with the cloud.

Reproducibility Challenge

One of the applications presented to the student teams is the Reproducibility Challenge, in which students attempt to reproduce results from an accepted paper from the prior year’s Technical Program.

Students have the opportunity to interact directly with the paper’s authors as they attempt to reproduce specific results and conclusions from the paper. As part of this challenge, each student team writes a reproducibility report detailing their experience in reproducing the results from the paper. Authors of the most highly rated reproducibility reports may be invited to submit their reports to a reproducibility special issue.

Teams & Process

Teams are composed of six students, an advisor, and vendor partners. The students provide their skills and enthusiasm, the advisor provides guidance, the vendor provides resources (e.g., software, expertise, travel funding), and Microsoft Azure provides the cloud credits for the competition. Students work with their advisors to craft a proposal that describes the team, how the cloud budget will be utilized, and their approach to the competition. The VSCC committee reviews each proposal and provides comments for all submissions. The requirements for a team's proposed cloud deployment are that it be able to run the applications and exercises during the competition and stay within the monetary budget stated in the competition's rules.

Support Provided

For the Virtual Student Cluster Competition, selected teams receive full virtual conference registration for each team member and one advisor. As the competition is part of the Students@SC program, students can also participate in Mentor–Protégé Matching and the Job Fair. Student teams may also seek additional support from Microsoft by contacting their regional representative.

History

For more information about SCC in past years, including team profiles, photos, winners, and more: studentclustercompetition.us

VSCC Mystery Application

The VSCC is looking for scientific applications from the HPC community that could be used as the VSCC Mystery Application. If you have a scientific application that you think would be a great fit for the competition, please complete the form at the link below.

The application owner for the selected VSCC Mystery Application will receive complimentary registration to SC20.

Eligibility

  • Each submission must list an application owner who will:
    • be responsible for answering questions from the VSCC teams.
    • prepare test and input decks for the competition.
    • be available to serve as a judge during SC20.
  • The application should not have export control restrictions.
  • The application must have updated documentation.
  • Submissions and selections must be kept confidential until the beginning of the VSCC, when the selected mystery application will be revealed.

Submissions will be accepted until July 3, 2020.

Submit a VSCC Mystery Application

VSCC Benchmarks & Applications

Three Benchmarks and Four Applications for the VSCC

Benchmarks

 

LINPACK Benchmark
http://top500.org/project/linpack/

The Linpack Benchmark is a measure of a computer’s floating-point rate of execution. It is determined by running a computer program that solves a dense system of linear equations. It is used by the TOP500 as a tool to rank peak performance. The benchmark allows the user to scale the size of the problem and to optimize the software in order to achieve the best performance for a given machine. This performance does not reflect the overall performance of a given system, as no single number ever can. It does, however, reflect the performance of a dedicated system for solving a dense system of linear equations. Since the problem is very regular, the performance achieved is quite high, and the performance numbers give a good indication of peak performance.
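
As a rough illustration of what the benchmark measures, the sketch below times a dense linear solve in Python and converts the standard HPL operation count (2/3·n³ + 2·n²) into GFLOP/s. This is not the HPL code used for official runs; the problem size is a placeholder that a team would scale to the memory of its chosen cloud nodes.

# Toy LINPACK-style measurement: time a dense solve and report GFLOP/s.
# Illustration only; official results come from the actual HPL benchmark.
import time
import numpy as np

n = 8000                                   # placeholder problem size; scale to node memory
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

t0 = time.perf_counter()
x = np.linalg.solve(A, b)                  # LU factorization plus triangular solves
elapsed = time.perf_counter() - t0

flops = (2.0 / 3.0) * n**3 + 2.0 * n**2    # standard HPL operation count
print(f"n={n}  time={elapsed:.2f}s  rate={flops / elapsed / 1e9:.1f} GFLOP/s")
print(f"relative residual: {np.linalg.norm(A @ x - b) / np.linalg.norm(b):.2e}")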

 

HPCG Benchmark
http://hpcg-benchmark.org/

The High Performance Conjugate Gradients (HPCG) Benchmark project is an effort to create a new metric for ranking HPC systems. HPCG is intended as a complement to the High Performance LINPACK (HPL) benchmark, currently used to rank the TOP500 computing systems. The computational and data access patterns of HPL are still representative of some important scalable applications, but not all. HPCG is designed to exercise computational and data access patterns that more closely match a different and broad set of important applications, and to give incentive to computer system designers to invest in capabilities that will have impact on the collective performance of these applications.
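
Since HPCG's kernel mix is what distinguishes it from HPL, the sketch below runs an unpreconditioned conjugate-gradient loop whose operations (sparse matrix-vector products, dot products, and vector updates) are the patterns HPCG stresses. The reference HPCG implementation is a C++ code with MPI and OpenMP built around a preconditioned solve on a 3D 27-point stencil; this toy 1D Python version only illustrates the kernel mix.

# Minimal conjugate-gradient loop illustrating the kernels HPCG exercises:
# sparse matrix-vector products, dot products (global reductions), vector updates.
# The real HPCG is a C++/MPI code on a 3D stencil; this is a toy 1D Laplacian.
import numpy as np
import scipy.sparse as sp

n = 1_000
A = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x = np.zeros(n)
r = b - A @ x                    # initial residual
p = r.copy()
rs_old = r @ r
for it in range(5 * n):
    Ap = A @ p                   # sparse matrix-vector product (memory-bandwidth bound)
    alpha = rs_old / (p @ Ap)    # dot products become global reductions in parallel runs
    x += alpha * p               # AXPY-style vector updates
    r -= alpha * Ap
    rs_new = r @ r
    if np.sqrt(rs_new) < 1e-8:
        break
    p = r + (rs_new / rs_old) * p
    rs_old = rs_new
print(f"stopped after {it + 1} iterations with residual norm {np.sqrt(rs_new):.2e}")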

 

IO500 Benchmark
http://io500.org

The IO500 benchmark is a benchmark suite for High-Performance IO. It harnesses existing and trusted open-source benchmarks such as IOR and MDTest, bundling execution rules and multiple workloads to evaluate and analyze storage systems under various IO patterns. The IO500 benchmark is designed to provide performance boundaries of HPC storage for data and metadata operations under what are commonly observed to be both easy and difficult IO patterns from multiple concurrent clients. Moreover, there is a phase that scans for previously created files matching certain conditions using a (possibly file-system-specific) parallel find utility, to evaluate the speed of namespace traversal and file-attribute retrieval. The final score used to rank submissions in the list is a combined score across all the executed benchmarks.
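
As a rough sketch of how a single ranking number can be produced from many phases, the example below combines hypothetical bandwidth results (GiB/s) and metadata results (kIOPS) using geometric means. The phase names, units, and geometric-mean combination mirror the commonly published IO500 scoring approach but should be treated as assumptions here; the IO500 submission rules linked below are the authoritative definition.

# Hedged sketch of an IO500-style combined score built from geometric means.
# Phase names, units, and the geometric-mean combination are assumptions that
# mirror the published IO500 scoring; see the official submission rules.
from math import prod

def geometric_mean(values):
    values = list(values)
    return prod(values) ** (1.0 / len(values))

bandwidth_gib_s = {            # hypothetical IOR data phases, GiB/s
    "ior-easy-write": 12.4,
    "ior-easy-read": 15.1,
    "ior-hard-write": 0.9,
    "ior-hard-read": 2.3,
}
metadata_kiops = {             # hypothetical mdtest/find metadata phases, kIOPS
    "mdtest-easy-write": 80.0,
    "mdtest-easy-stat": 210.0,
    "mdtest-hard-write": 11.0,
    "find": 340.0,
}

bw_score = geometric_mean(bandwidth_gib_s.values())
md_score = geometric_mean(metadata_kiops.values())
score = (bw_score * md_score) ** 0.5       # combined score used for ranking
print(f"BW={bw_score:.2f} GiB/s  MD={md_score:.2f} kIOPS  SCORE={score:.2f}")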

The VSCC results will be included in a dedicated list that will be released during SC20.

IO500 Submission Rules

 

Applications

 

Parallel Computing with Climate Models
The Community Earth System Model (CESM) is a state-of-the-art climate model developed by the National Center for Atmospheric Research. It discretizes and parameterizes Earth system motion and processes (atmosphere, ocean, land, etc.) over tens of thousands of grid boxes, often using many parallel processors. Climate models like this one are used for understanding scenarios of how the Earth system might respond to change. Competition entrants will design and build a cluster that can run several self-contained test cases using CESM. The benchmark for this challenge will be the speed at which the test case can be completed, which includes reading input files, actual compute time, and writing output from the model.
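
Because the metric for this challenge is end-to-end wall-clock time (reading input, computing, and writing output together), a minimal sketch of that kind of measurement is shown below. The command is a placeholder, not an actual CESM invocation; real CESM cases are configured and launched through the model's own case-control scripts.

# End-to-end wall-clock timing of a model run, including input and output time.
# The command below is a stand-in; real CESM cases are built and launched through
# CESM's own case scripts, not a single command like this.
import subprocess
import sys
import time

placeholder_cmd = [sys.executable, "-c", "print('stand-in for a CESM test case')"]

start = time.perf_counter()
result = subprocess.run(placeholder_cmd, capture_output=True, text=True)
elapsed = time.perf_counter() - start

print(result.stdout.strip())
print(f"exit code {result.returncode}, end-to-end wall-clock time {elapsed:.2f} s")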

 

Gromacs
http://www.gromacs.org/About_Gromacs

GROMACS is a versatile package to perform molecular dynamics, i.e. simulate the Newtonian equations of motion for systems with hundreds to millions of particles. It is primarily designed for biochemical molecules like proteins, lipids and nucleic acids that have a lot of complicated bonded interactions, but since GROMACS is extremely fast at calculating the nonbonded interactions (that usually dominate simulations) many groups are also using it for research on non-biological systems, e.g. polymers. This year we are looking into the possibility of simulating problems related to the COVID-19 pandemic.
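
To illustrate what simulating the Newtonian equations of motion means at its simplest, the toy sketch below integrates a few dozen Lennard-Jones particles with velocity Verlet. GROMACS itself uses highly optimized integrators (leap-frog by default), full force fields, neighbor lists, and parallel decomposition; none of that is reflected here, and the parameters are arbitrary placeholders.

# Toy molecular-dynamics loop: velocity-Verlet integration of Lennard-Jones particles.
# GROMACS uses optimized integrators, real force fields, and parallel decomposition;
# this sketch only illustrates integrating Newton's equations of motion.
import numpy as np

def lj_forces(pos, epsilon=1.0, sigma=1.0):
    """All-pairs Lennard-Jones forces (no cutoff, no periodic boundaries; toy scale only)."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            rij = pos[i] - pos[j]
            r2 = rij @ rij
            sr6 = (sigma**2 / r2) ** 3
            fij = 24.0 * epsilon * (2.0 * sr6**2 - sr6) / r2 * rij
            forces[i] += fij
            forces[j] -= fij
    return forces

# 27 particles on a small cubic lattice, spacing close to the LJ minimum.
grid = np.arange(3, dtype=float)
pos = 1.2 * np.array(np.meshgrid(grid, grid, grid)).reshape(3, -1).T
vel = np.zeros_like(pos)
mass, dt = 1.0, 0.002

f = lj_forces(pos)
for step in range(200):                      # velocity-Verlet time stepping
    pos += vel * dt + 0.5 * (f / mass) * dt**2
    f_new = lj_forces(pos)
    vel += 0.5 * (f + f_new) / mass * dt
    f = f_new

print("mean particle speed after 200 steps:", np.linalg.norm(vel, axis=1).mean())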

 

Mystery Application
At the start of the competition, teams will be given the application code and datasets for a mystery application. Students will be expected to build, optimize, and run this mystery application entirely during the competition.

 

Reproducibility Challenge
The SC20 Reproducibility Committee has selected the paper “MemXCT: Memory-Centric X-ray CT Reconstruction with Massive Parallelization” by Mert Hidayetoğlu, Tekin Biçer, Simon Garcia de Gonzalo, Bin Ren, Doğa Gürsoy, Rajkumar Kettimuthu, Ian T. Foster, and Wen-mei W. Hwu to serve as the Virtual Student Cluster Competition (VSCC) benchmark for the Reproducibility Challenge this year. A team of reviewers selected the paper from 45 finalists based on the paper’s Artifact Descriptor (AD) and its suitability to the VSCC. The authors and the Reproducibility Committee have been working to create a reproducible benchmark that builds on the paper’s results. During the VSCC, the 16 VSCC teams will be asked to run the benchmark, replicating the findings from the original paper under different settings and with different datasets.

VSCC Requirements & Regulations

Teams, Institutions/Vendors, & Azure Deployments

VSCC Team Requirements

  • A team consists of 6 undergraduate or high school students and one primary advisor.
  • All team members must be enrolled at an educational institution but MUST NOT have been granted an undergraduate degree before the start of the VSCC.
  • A team may be composed of members from multiple educational institutions including combined high school and college students.
  • Advisors are required to be staff, faculty or graduate students of the team’s educational institution(s) or sponsoring HPC center.
  • Each team must have a sponsoring institution. In addition, it is highly recommended for an institution or HPC center to have a vendor partner to help offset costs for the event.

 

VSCC Institution/Vendor Requirements

  • Institutions, sponsoring HPC centers and/or their vendor partners must provide team training and support. Each team will be provided an allocation of credits on Microsoft Azure. The deployment’s software must meet competition rules.
  • In the months leading up to the competition, teams will be provided a month-by-month Azure budget for testing and training. Institutions and vendor partners should provide their teams with software and application support for practice and preparation, ideally for three (3) months or more prior to the competition.
  • Vendors are encouraged to provide training and interact closely with their teams in designing the cloud deployment.

 

VSCC Azure Deployments

  • All hardware must be commercially available by the start of the competition.
  • No hardware in the competition system may be subject to a Non-Disclosure Agreement (NDA).
  • All technical specifications of all hardware components should be available to the general public at competition start.
  • All performance results from the competition hardware must be able to be published without restriction.

VSCC Logistics

Orientation Briefing & Competition Stages

VSCC Schedule

  • Competition Stages: The competition will run continuously for 72 hours beginning on Sunday, November 8 and finishing on Wednesday, November 11. Benchmarks and applications will be released to the teams in stages and any benchmark or application results can be submitted throughout the competition.
      • Benchmarking Stage: Comprises the benchmarks, which will be released at the beginning of the competition.
      • Application Workload Stage: The application workload will be divided into stages. The input data for each stage will be released at different times throughout the 72-hour window.

VSCC Team Applications

Recommendations for Preparing Your VSCC Team Application

  • Before submitting your VSCC Team Application, please carefully review the VSCC Rules section.
  • Team Applications will be reviewed by the VSCC Committee following the rubric outlined below:
    • Strength of Team (5 points)
    • Strength of Approach for Software and Cloud Administration (5 points)
    • Strength of Vendor/Institution Relationship (5 points)
    • Strength of Diversity (5 points)
    • Team Preparation (5 points)
    • Teams that include institutions competing for the first time will receive 2.5 extra points. Cross-institutional teams are encouraged.
  • To assist you in preparing your application, please make sure that the responses for each section of the application address the following:
    • Strength of Team:
      • Motivation for participating in the competition
      • Outcome the team expects from the competition (e.g., how will HPC impact the team’s academic careers?)
    • Strength of Approach for Software and Cloud Administration:
      • Describe your plan for the use of the Azure budget for the competition. Use a hypothetical budget of $2,500 for testing and training and $1,250 for the competition itself (a minimal budget-arithmetic sketch follows this list).
      • Explain why the cloud strategy coupled with the software plan will be successful.
      • Describe the strategy for running applications and/or optimization during the competition.
      • Explain why the choice of deployment is well suited for the competition applications.
      • Describe how the team will manage the system’s administration and application workflow.
    • Strength of Vendor/Institution Relationship:
      • What support will the team receive from vendors and/or the institution?
      • If reviewers have questions about the type of support provided, who from the vendor and team can answer them?
    • Strength of Diversity:
      • Does the team include meaningful contributions by groups that are traditionally underrepresented in the country of the sponsoring institution?
      • What efforts were made during the team selection process to reach underrepresented communities?
    • Team Preparation:
      • What relevant courses are available, and which have team members attended?
      • What HPC or cloud resources will be used to investigate the applications before initial access to Azure is granted?
      • What is the team’s method for preparing for the competition?
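
As referenced in the cloud-administration item above, here is a minimal budget-arithmetic sketch a team might adapt when planning its deployment. The instance types and hourly rates are hypothetical placeholders, not actual Azure pricing; only the $1,250 competition budget and the 72-hour duration come from this page.

# Hypothetical budget check for the 72-hour competition window.
# Instance names and hourly rates are placeholders, NOT actual Azure pricing; only
# the $1,250 budget and 72-hour duration are taken from the competition description.
COMPETITION_BUDGET_USD = 1250.00
COMPETITION_HOURS = 72

proposed_deployment = [
    # (hypothetical instance type, count, assumed $/hour per instance)
    ("hpc-compute-node", 4, 3.20),
    ("login-head-node", 1, 0.50),
]

projected_cost = sum(count * hourly_rate * COMPETITION_HOURS
                     for _, count, hourly_rate in proposed_deployment)
remaining = COMPETITION_BUDGET_USD - projected_cost

print(f"Projected spend over {COMPETITION_HOURS} hours: ${projected_cost:,.2f}")
print(f"Budget remaining: ${remaining:,.2f}")
if remaining < 0:
    print("Over budget: reduce instance count, size, or hours of use.")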

VSCC Rules

General Rules, Azure Cloud Provision, & System Software

Violation of any rule may result in a team’s disqualification from the competition or a point penalization at the discretion of the VSCC committee. Any unethical conduct not otherwise covered in these rules will also be penalized at the discretion of the VSCC Committee.

The following violations will result in immediate disqualification:

  • Having anyone other than the 6 registered team members in the team’s booth during competition hours.
  • Any communication between your cloud resource and a network other than the approved cloud networks.

 

A. General Competition Rules

  1. Safety First: All VSCC operations are always subject to safety as the first consideration. If a task cannot be done safely, then it is unacceptable. When in doubt, ask a VSCC committee member or your VSCC team liaison.
  2. No Assistance from Non-Team Members: Once the competition starts, student teams will not be allowed to receive assistance from anyone, including their advisor.
  3. Stay Under Budget: The Azure budget allowed for the VSCC will be given to teams in advance of the competition. Point penalties will be assessed if teams go over this budget.
  4. No External Computational Assistance: All benchmarks and application workloads must be run on the Azure cloud using the budget allotted to each team. Submitting results generated from any other source will result in disqualification.
  5. Teams Must Conduct Themselves Professionally: Teams must conduct themselves professionally and adhere to the SC Code of Conduct. Students must compete fairly and ethically.

 

B. Azure Cloud Provision

  1. Prior to the start of the competition, each team will receive a testing and training budget for the Microsoft Azure cloud. The training and testing budget will be distributed in monthly installments up until the competition.
  2. At the start of the competition, each individual team will receive a set budget for the Microsoft Azure cloud. Teams are not allowed to supplement their allotted budget during the competition. No additional cycles will be provided.
  3. Teams are allowed to select from the various offerings that the Microsoft Azure cloud provides. Please note that some offerings are more expensive than others.
  4. Accepted teams must submit a detailed Final Architecture Proposal by the deadline listed on the SC submissions website. Failure to submit this form by the deadline will result in an automatic disqualification. The detail in the Final Architecture Proposal should be sufficient for the judging panel to determine the required allotment of Azure resources for each resource type used, and whether all the applications will easily port to and run in the proposed deployment.

 

C. System Software

  1. Teams may choose any operating system and software stack that will run the applications and display visualizations to conference attendees.
  2. Teams may pre-load and test the applications and other software on their cloud deployment.
  3. Teams may study and tune the open-source benchmarks and applications for their platforms. Any changes to application source code must be shared with the VSCC committee.

VSCC Applications

Applications closed.

View application details
