You’ve heard that the first computer was the size of a small house, right? And how amazing it is that we all carry computers around in our pockets now? Well, some computers still are the size of houses, or even apartment buildings. These machines are so big because speed at that scale takes an enormous amount of hardware: thousands of processors working in parallel. And they’re capable of some amazing things.
Exascale supercomputers are the next frontier in computing. They can quickly analyze massive volumes of data and realistically simulate many of the extremely complex processes and relationships behind the fundamental forces of the universe—in a way that’s never been done before. Many industries and systems could be affected, including precision medicine, climate science, and nuclear physics. Here’s a little more about how exascale computing works and how it stands to change the world.
How is computer speed measured?
One way scientists measure computer performance is in floating-point operations per second (FLOPS). These operations are simple arithmetic, like addition or multiplication, involving numbers that contain decimals (floating-point numbers), like 3.5. A person can typically solve one such operation with a pencil and paper in about a second; that’s 1 FLOPS. Computers solve these operations much faster, so much faster that scientists use metric prefixes to talk about the speed.
A typical laptop is capable of a few teraFLOPS; one teraFLOPS is a trillion floating-point operations per second.
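The unit can be illustrated with a quick, deliberately naive benchmark. The Python sketch below times a loop of floating-point multiplications; because interpreter overhead dominates, it measures far less than the hardware’s true capability. The function name and loop count are illustrative choices, not a standard benchmark.

```python
import time

def estimate_flops(n: int = 10_000_000) -> float:
    """Crudely estimate floating-point operations per second.

    Times n floating-point multiplications in a plain Python loop.
    Interpreter overhead dominates, so this vastly understates the
    hardware's real capability; it only illustrates the unit itself.
    """
    x = 3.5  # a floating-point number, as in the example above
    start = time.perf_counter()
    for _ in range(n):
        x = x * 1.0000001  # one floating-point multiplication per pass
    elapsed = time.perf_counter() - start
    return n / elapsed

print(f"Roughly {estimate_flops():,.0f} FLOPS (interpreter-bound, not hardware peak)")
```

A compiled, vectorized program on the same laptop would report orders of magnitude more; supercomputer rankings use carefully tuned benchmarks for exactly that reason.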
What is a supercomputer?
The first supercomputer, the CDC 6600, was developed in 1964 and ran at 3,000,000 FLOPS, or 3 megaFLOPS.
Since then, research teams have been in a constant race to build a faster computer. In 1996, computers hit the terascale milestone—that’s 12 zeros—when the US Department of Energy’s Intel ASCI Red supercomputer was measured at 1.06 teraFLOPS. The Roadrunner supercomputer was the first to pass the petascale milestone (15 zeros) when it was recorded running 1.026 petaFLOPS in 2008.
Exascale computing is roughly a million times faster than ASCI Red’s 1996 performance. “Exa” means 18 zeros, so an exascale computer can perform more than 1,000,000,000,000,000,000 FLOPS, or 1 exaFLOPS. To put that power in context: a person performing one calculation every second, without stopping, would need about 31,688,765,000 years to do what an exascale computer does in a single second.
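The arithmetic behind that comparison is easy to check: divide one second’s worth of exascale operations by the number of seconds in a year. The sketch below assumes a 365.25-day year; other year conventions shift the result slightly, which is why published figures vary a little.

```python
# One exaFLOPS machine performs 10**18 operations in one second.
OPS_IN_ONE_EXASCALE_SECOND = 10**18

# A person doing one operation per second, nonstop.
# Assumes a 365.25-day (Julian) year.
SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60  # 31,557,600

years = OPS_IN_ONE_EXASCALE_SECOND / SECONDS_PER_YEAR
print(f"{years:,.0f} years")  # roughly 31.7 billion years
```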
In May 2022, the Frontier supercomputer at the Oak Ridge National Laboratory in Tennessee clocked in at 1.1 exaFLOPS, becoming the first exascale computer on record and the current fastest supercomputer in the world. Over the coming years, Frontier could reach a theoretical peak of two exaFLOPS.
Which industries could be affected by exascale computing?
Exascale computing could allow scientists to solve problems that have until now been impossible. With exascale, exponential increases in memory, storage, and compute power may drive breakthroughs in several industries: energy production, storage, transmission, materials science, heavy industry, chemical design, AI and machine learning, cancer research and treatment, earthquake risk assessment, and many more. Here are some of the areas where exascale computing might be used:
- Clean energy. Exascale computing could help develop resilient clean-energy systems: for example, designing new materials that perform well in extreme environments, or energy systems that adapt to changes in the water cycle.
- Medical research. Exascale computing can support the analysis of massive data volumes and complex environmental genomes. It can also support cancer research in analyzing patient genetics, tumor genomes, molecular simulations, and more.
- Manufacturing. Using exascale computing could accelerate the adoption of additive manufacturing by allowing faster and more accurate modeling and simulation of manufacturing components.
How is exascale computing different from quantum computing?
Exascale computers are digital computers, like today’s laptops and phones, but with much more powerful hardware. On the other hand, quantum computers are a totally new approach to building a computer. Quantum computers won’t replace today’s computers. But using the principles of quantum physics, quantum computing will be able to solve very complex statistical problems that are difficult for today’s computers. Quantum computing has so much potential and momentum that McKinsey has identified it as one of the next big trends in tech.
Put simply, exascale computing, like all classical computing, is built on bits. A bit is a unit of information that stores either a zero or a one. Quantum computing, by contrast, is built on qubits, which can hold a combination of zero and one at the same time, a state known as superposition. When classical computers solve a problem with multiple variables, they must run a new calculation every time a variable changes; each calculation is a single path to a single result. Quantum computers have an exponentially larger working space, so they can explore a massive number of paths simultaneously. For certain kinds of problems, that makes quantum computers much, much faster than classical computers.
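The difference in working space can be made concrete with a toy sketch. This is plain Python bookkeeping, not a quantum simulation: a register of n classical bits occupies exactly one of its 2^n possible states, while an n-qubit state is described by 2^n amplitudes at once.

```python
from itertools import product

n = 3  # a tiny register, for illustration

# A classical n-bit register: 2**n possible states, but it occupies
# exactly ONE of them at any moment.
classical_states = list(product([0, 1], repeat=n))
print(len(classical_states), "possible classical states; the register holds one")

# An n-qubit register is described by 2**n amplitudes at once,
# a weighted combination over all of those states (a superposition).
# Toy example: the equal superposition, amplitude 1/sqrt(2**n) each.
amplitude = (1 / 2**n) ** 0.5
state_vector = [amplitude] * 2**n
print(len(state_vector), "amplitudes describe one 3-qubit state")
```

The squared magnitudes of the amplitudes sum to one, which is what lets a quantum computer weigh many paths at once yet still produce a single measured outcome.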
For a more in-depth exploration of these topics, see McKinsey’s insights on digital. Learn more about our Digital Practice—and check out digital-related job opportunities if you’re interested in working at McKinsey.
“Quantum computing use cases are getting real—what you need to know,” December 14, 2021, Matteo Biondi, Anna Heid, Nicolaus Henke, Niko Mohr, Lorenzo Pautasso, Ivan Ostojic, Linde Wester, and Rodney Zemmel.
“Top trends in tech,” June 11, 2021, Jacomo Corbo, Nicolaus Henke, and Ivan Ostojic.