Thoughts at the Threshold of the Exascale Era

Steve Conway
Senior Advisor, HPC Market Dynamics, Hyperion Research

An exaflop is an arbitrary milestone, a nice round figure with the kind of symbolic lure the four-minute mile once held. As with earlier computing milestones, which have arrived roughly once a decade, it will be reached in three stages. First comes peak exaflop performance, then a Linpack/TOP500 exaflop, and finally the one that counts most but in the past has been celebrated least: sustained exaflop performance on a full, challenging 64-bit user application.
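
For scale, an exaflop is 10^18 64-bit floating-point operations per second. A minimal sketch, using entirely hypothetical machine figures (no real system is implied), of how far apart the three milestones can sit on a single system:

```python
# Hypothetical figures for an imaginary system; real values vary by machine.
PEAK_EXAFLOPS = 1.5        # theoretical peak: every FP unit busy every cycle
LINPACK_EXAFLOPS = 1.1     # Rmax on the dense HPL benchmark (the TOP500 metric)
SUSTAINED_EXAFLOPS = 0.15  # a full 64-bit user application, often far lower

for label, rate in [("peak", PEAK_EXAFLOPS),
                    ("Linpack", LINPACK_EXAFLOPS),
                    ("sustained application", SUSTAINED_EXAFLOPS)]:
    print(f"{label}: {rate:.2f} EF = {rate * 1e18:.3g} FLOP/s "
          f"({rate / PEAK_EXAFLOPS:.0%} of peak)")
```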

Not long ago, a fundamental premise underlying advanced supercomputer development was that evolutionary market forces were too slow and governments needed to stimulate revolutionary progress. The idea was that the government would do the heavy lifting to pave the way, and the mainstream HPC market would follow to take advantage of the revolutionary advances. In our annual HPC predictions, Hyperion Research analysts (then at IDC) pointed out the risk that the government-supported high-end HPC market might split off as a separate ecological niche, while the mainstream market continued to evolve on its own inertial path.

That split hasn’t happened. Instead, government officials for the most part have realized that they are no longer the primary drivers of HPC; market forces have usurped that role. As the worldwide HPC market has diversified and expanded over the past three decades, from about $2 billion in 1990 to more than $27 billion in 2019, government has kept an important role but no longer occupies the kingpin position it once held. Government officials in most HPC-exploiting countries have adjusted their strategies to take better advantage of market forces, especially technology commoditization and open standards.

The fact that governments have met HPC market forces partway is a good thing for almost all parties (some legacy codes still need to be modernized). It means that many of the government-supported technology advances for exascale computing will sooner or later benefit the mainstream HPC market, including SMEs that buy only a rack or two of technical servers. Some of these advances may spill over into the larger IT market, well beyond the boundaries of HPC.

That, in turn, means that savvy government officials can help to justify the skyrocketing investments needed for exascale supercomputers by pointing to likely ROI in the larger HPC and IT markets, including potential benefits for industry and commerce.

Measuring these real and potential benefits therefore becomes important. Until recently, even the most powerful, expensive new supercomputers were designed in part as “Linpack machines.” Even though the TOP500 Linpack test was never intended to measure the performance of high-performance computers across a broad spectrum of workload types, many government funders saw superior Linpack performance as a mark of leadership.
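
For context, the TOP500 metric comes from HPL, which solves a single dense linear system; its standard operation count is:

$$\text{operations} \approx \tfrac{2}{3}\,n^3 + 2n^2 \qquad \text{for an } n \times n \text{ system, i.e., } n^2 \text{ matrix entries}$$

Because compute scales as $n^3$ while data scales only as $n^2$, the benchmark’s arithmetic intensity grows with problem size; it rewards raw floating-point throughput and says little about the memory and interconnect behavior that dominates many real applications.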

The NCSA “Blue Waters” procurement set a different example by placing overwhelming stress on the assessed needs of user applications, targeting performance gains on these applications as the primary measure of success. Exascale initiatives around the world have largely followed suit, though sometimes with one eye still on the TOP500 list. The point here is that if more buyers of HPC systems, at all price points, target performance gains on user applications and treat the Linpack benchmark as the narrow indicator it was designed to be, the result should be better system balance and wider applicability of HPC systems over time.

First-generation exascale systems have been designed to alleviate existing architectural imbalance, something especially important given the rise of AI and other data-intensive workflows. Studies by Hyperion Research and others confirm that many of the economically important AI applications, including precision medicine, automated driving, and Smart City development, will benefit from architectures with sufficient balance to efficiently support concurrent simulation and analytics runs. An important future capability will be integrating the sometimes orthogonal results of simulation and analytics runs, rather than leaving that integration to happen inside the brains of researchers.
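
As a rough illustration of the pattern those balanced architectures target, here is a minimal producer-consumer sketch, not any vendor’s API, in which a toy “simulation” streams timesteps to an “analytics” consumer instead of writing everything to disk for post-hoc analysis:

```python
import queue
import threading

# Bounded queue: back-pressure keeps the simulation and analytics
# sides running at a matched pace rather than flooding memory.
results = queue.Queue(maxsize=4)

def simulate(steps):
    for t in range(steps):
        field = [t * 0.1] * 8      # stand-in for a computed field at timestep t
        results.put((t, field))    # hand off while the next step computes
    results.put(None)              # sentinel: simulation finished

def analyze():
    while (item := results.get()) is not None:
        t, field = item
        print(f"step {t}: mean = {sum(field) / len(field):.2f}")

threading.Thread(target=simulate, args=(10,)).start()
analyze()
```

In a real workflow the hand-off would run across nodes over the interconnect, which is exactly where architectural balance, memory bandwidth and network capacity alongside FLOPS, earns its keep.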

More good news is that governments around the world have increasingly recognized that HPC is a transformational technology that can boost not only scientific leadership, but also industrial and economic competitiveness. Accompanying this recognition is the notion that HPC is too strategic to outsource to another country, meaning to the U.S. in most cases. Exascale initiatives in Asia and Europe are promoting the development of indigenous technologies, often in conjunction with non-native components. In the short term these indigenous movements may cause some understandable market disruption, but in the end they should increase choices and competition in the global HPC market.

Hyperion Research has said for some years that software advances will be even more important than hardware progress in determining HPC’s future. It’s gratifying to see national and regional exascale initiatives increase funding for exascale software development, although the amounts still seem unequal to the task.

The long-term good news is that HPC has become a mature market, one driven by market forces. That gives strong assurance that the market will behave rationally over time. Demand, in the form of buyer and user requirements, will increasingly win out.
