Making fault-tolerant quantum computers a reality

Quantum computing (QC)—and particularly quantum error correction—has the potential to open up unprecedented business capabilities across a range of industries and applications, including in pharmaceuticals, green technology, finance, and transportation. Surging investments in quantum start-ups in 2024, which were 50 percent higher than in 2023, and faster-than-expected innovation could propel the quantum market to almost $100 billion by 2035, according to our latest Quantum Technology Monitor report.

However, the quantum industry continues to face a daunting challenge: noise. Quantum noise, which arises from sources such as the surrounding environment, temperature fluctuations, and neighboring qubits, destabilizes quantum systems and leads to decoherence that limits their utility. Depending on the qubit modality (such as superconducting qubits, neutral atoms, or trapped ions), different kinds of noise manifest as technological challenges that require a holistic approach to both hardware and software.

Enter quantum robustness: QC systems’ resilience to this noise and, consequently, to decoherence. Quantum robustness allows each qubit modality to scale toward quantum utility and fault-tolerant quantum computing (FTQC) and can be achieved with three main techniques: quantum error suppression, quantum error detection and correction, and quantum error mitigation. Thus far, many quantum hardware players have focused only on suppression and mitigation to advance hardware. While hardware challenges remain, quantum industry players are now starting to develop error detection and correction as well.

In this post, we discuss quantum robustness and six criteria (including modality, compatibility, and code distance) that quantum leaders can use to assess where the various available and developing techniques can support their robustness strategy. Quantum leaders, CTOs, and investors who prioritize achieving quantum robustness can develop a competitive edge in offering the critical, high-impact solutions that will advance this emerging industry.

Quantum robustness: Controlling the noise

In quantum systems, noise arises easily if the system isn’t properly isolated from its environment. Quantum noise disrupts the execution of quantum algorithms by degrading the qubits’ entanglement and superposition properties. The resulting loss of quantum information makes computation unreliable and limits quantum utility.

To bring the noise under control, quantum industry leaders need to pursue quantum robustness, which can be achieved with a complementary set of solutions: error suppression, error detection and correction, and error mitigation (Exhibit 1). Each of these solutions faces its own challenges in reaching quantum utility and full-scale FTQC.

Exhibit 1: How quantum robustness could conquer noise and decoherence with three techniques used in concert.
  - Error suppression: applied during preprocessing; no overhead; sits close to the hardware.
  - Error detection and error correction: applied in real time; incurs qubit overhead; sits in the middleware and hardware.
  - Error mitigation: applied during postprocessing; incurs quantum processing overhead; sits in the middleware and software.

Error suppression

Error suppression is a preprocessing technique that includes standard practices such as modifying or adding control signals, the electrical signals that drive qubits. Carefully shaped control signals counteract noise by canceling unwanted energy shifts, reducing errors in quantum operations. The technique does not increase runtime or require additional qubits. The challenge is that suppression is never perfect; on its own, it cannot reach fault tolerance.
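
As a minimal illustration of the idea, the Python sketch below simulates one common suppression practice, a spin-echo (dynamical-decoupling) pulse sequence: a qubit that drifts because of a constant, unknown frequency offset is refocused by inserting two X pulses into its idle time. The offset, idle time, and initial state are illustrative assumptions rather than parameters of any particular hardware.

```python
import numpy as np

# Pauli X gate and a superposition state |+>, which is sensitive to dephasing.
X = np.array([[0, 1], [1, 0]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

def free_evolution(delta, t):
    """Unitary for an unwanted energy shift H = (delta/2) * Z acting for time t."""
    return np.array([[np.exp(-1j * delta * t / 2), 0],
                     [0, np.exp(1j * delta * t / 2)]])

delta = 0.8  # assumed frequency offset caused by noise (unknown to the device)
tau = 1.0    # total idle time during which the qubit would dephase

# Without suppression: the qubit drifts freely for the whole idle time.
no_echo = free_evolution(delta, tau) @ plus

# With suppression: split the idle time in half and insert X ("echo") pulses,
# so the phase picked up in the second half cancels the phase from the first.
echo = X @ free_evolution(delta, tau / 2) @ X @ free_evolution(delta, tau / 2) @ plus

fidelity = lambda psi: abs(np.vdot(plus, psi)) ** 2
print(f"Fidelity with |+> after free drift:    {fidelity(no_echo):.3f}")
print(f"Fidelity with |+> after echo sequence: {fidelity(echo):.3f}")  # ~1.000
```

Because the second half of the evolution undoes the phase accumulated during the first, the echoed qubit returns to its initial state even though the offset was never measured; the error is suppressed without extra qubits or extra runtime.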

Error detection and correction

Error detection and correction are real-time techniques in which a logical qubit is defined and encoded across a set of physical qubits so that the logical qubit has a significantly lower error rate than any standalone physical qubit. The redundant physical qubits serve as ancilla (or helper) qubits to protect the quantum information. Three of the most important families of quantum error correction (QEC) codes are repetition codes, surface codes, and quantum low-density parity check (qLDPC) codes (Exhibit 2).

Exhibit 2: Three families of error correction codes: surface codes, quantum low-density parity check (qLDPC) codes, and repetition codes.
  - Surface codes: errors are detected via measurement and comparison of neighboring qubits; they are among the most fault-tolerant QEC codes and can handle various error types, but they require a high number of physical qubits.
  - qLDPC codes: quantum information is encoded through non-nearest neighbors, building long-distance connections; they offer efficient encoding and decoding, but their sparse matrix operations can be resource-intensive.
  - Repetition codes: information is encoded by repeating the quantum state across multiple physical qubits in a linear arrangement; they are effective at correcting bit-flip errors, but decoding is challenging and susceptible to error propagation during syndrome measurements.

Real-time error detection identifies when errors occur by measuring properties of qubits or performing operations (for example, stabilizer measurements) that reveal whether errors have affected the quantum state. Detection is required for correction, but it is also useful on its own, for example in dynamic stabilization.1
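
To make the mechanics concrete, the Python sketch below walks through the simplest code family mentioned above, a three-qubit repetition code, on a small statevector simulation: a logical qubit is encoded across three physical qubits, a bit-flip error is injected, parity (stabilizer-style) measurements produce a syndrome, and the syndrome pinpoints which qubit to flip back. The encoded amplitudes and the choice of which qubit fails are illustrative, and a production decoder would be far more involved.

```python
import numpy as np
from functools import reduce

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op(single, position, n=3):
    """Tensor a single-qubit operator into an n-qubit operator at `position`."""
    factors = [single if i == position else I2 for i in range(n)]
    return reduce(np.kron, factors)

# Encode a logical qubit a|0> + b|1> as a|000> + b|111> (3-qubit repetition code).
a, b = 0.6, 0.8
state = np.zeros(8, dtype=complex)
state[0b000], state[0b111] = a, b

# A bit-flip error strikes one physical qubit (here: the middle one).
state = op(X, 1) @ state

# Detection: measure the parity checks Z0Z1 and Z1Z2. For code states these
# parities are deterministic, so their expectation values form the syndrome.
syndrome = tuple(int(round(np.real(state.conj() @ op(Z, i) @ op(Z, i + 1) @ state)))
                 for i in range(2))

# Correction: each syndrome pattern points to the qubit that flipped (or to none).
flipped = {(1, 1): None, (-1, 1): 0, (-1, -1): 1, (1, -1): 2}[syndrome]
if flipped is not None:
    state = op(X, flipped) @ state

print("syndrome:", syndrome)                             # (-1, -1): middle qubit flipped
print("recovered:", np.allclose(state[[0, 7]], [a, b]))  # True
```

The syndrome reveals where the error occurred without measuring the encoded amplitudes themselves, which is what lets correction proceed without destroying the stored quantum information.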

While QC players are already solving for and integrating QEC into their quantum systems,2 Asif Sinay, CEO and co-founder of Qedma, noted that achieving QEC and improving quantum utility will require a significant number of qubits and a high threshold for error rates. This is because QEC is resource-intensive and often requires many ancilla qubits to obtain actual correction. To realize the next level of quantum utility, QC developers need to reduce the number of ancilla qubits needed to perform correction.

Error mitigation

Error mitigation is a postprocessing technique that uses algorithmic calculations and additional quantum processing unit (QPU) overhead to improve error robustness. Statistical methods, noise analysis, and the application of anti-noise pulses can mitigate errors in quantum circuits. However, error mitigation requires multiple circuit runs to gather sufficient statistics, which increases QPU time and cost.
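
To show where the extra QPU runs go, the sketch below implements one widely used statistical technique, zero-noise extrapolation (this post does not single out any specific method): the same circuit is executed at several deliberately amplified noise levels, and the noiseless expectation value is estimated by extrapolating the trend back to zero noise. The exponential noise model, decay rate, and shot counts are toy assumptions standing in for real hardware behavior.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy model: the measured expectation value of some observable decays
# exponentially with the noise level (noise_scale = 1 is the device's native noise).
IDEAL_VALUE = 0.85  # what a noiseless device would report (assumed)
DECAY_RATE = 0.4    # assumed strength of the noise

def run_circuit(noise_scale, shots=4000):
    """Pretend to run the circuit with noise amplified by `noise_scale`."""
    true_value = IDEAL_VALUE * np.exp(-DECAY_RATE * noise_scale)
    # Finite sampling: estimate the expectation value from `shots` +/-1 outcomes.
    p_plus = (1 + true_value) / 2
    outcomes = rng.choice([1, -1], size=shots, p=[p_plus, 1 - p_plus])
    return outcomes.mean()

# Run at several amplified noise levels (for example, by stretching gate pulses).
scales = np.array([1.0, 1.5, 2.0, 3.0])
estimates = np.array([run_circuit(s) for s in scales])

# Extrapolate to zero noise with an exponential fit: log E = log E0 - rate * scale.
slope, intercept = np.polyfit(scales, np.log(estimates), deg=1)
mitigated = np.exp(intercept)

print(f"raw estimate at native noise:  {estimates[0]:.3f}")
print(f"zero-noise extrapolated value: {mitigated:.3f}")
print(f"ideal (hidden) value:          {IDEAL_VALUE:.3f}")
```

The cost is visible in the loop over noise levels: each additional level multiplies the number of circuit executions, which is precisely the QPU-time overhead described above.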

Integrating quantum robustness solutions

Quantum industry leaders can integrate quantum robustness solutions into the tech stack on multiple levels (Exhibit 3).

Exhibit 3: How quantum robustness solutions map to different parts of the tech stack (excluding hardware improvements); some layers of the stack have seen recent progress, while others have the most room to improve.
  - Error mitigation relates to the quantum operating system and quantum instruction set architecture layers and provides a software interface to control quantum hardware.
  - Error suppression relates to the quantum control processor, control electronics, quantum error correction, feedback calibration, and readout electronics layers and aims to lower the rate at which errors occur in quantum systems.
  - Error detection and correction relates to the same layers as suppression and is key for quantum robustness.

Michael Biercuk, CEO and founder of Q-CTRL and a professor at the University of Sydney, states that “large-scale machines will continuously evolve—quantum error correction doesn’t instantly fix all errors once implemented. Instead, error correction must be iteratively developed and combined with complementary techniques to yield performance and efficiency gains that meet the needs of key applications and hardware constraints.” As a result, today’s strategic quantum robustness road maps need to combine error suppression, error detection and correction, and error mitigation techniques.

Assessing QEC methodologies for a full-stack solution

When developing a robustness strategy, QC companies strive for a full-stack solution that seamlessly integrates hardware, middleware, and software. Quantum leaders can consider six criteria when comparing QEC methodologies as part of their robustness strategy:

  1. Modality compatibility: To what extent is the QEC method compatible with different quantum hardware modalities?
  2. Architecture support: Is the QEC method aligned with the specific characteristics of different qubit connectivity patterns? This determines how feasible it is to implement QEC in certain quantum processor architectures.
  3. Qubit overhead: How many physical qubits are needed to represent a single logical qubit? A higher overhead means more physical qubits are required to reach fault tolerance, which limits scalability; code families such as surface, repetition, and qLDPC codes differ substantially in how much overhead they demand (the sketch after this list illustrates the trade-off for surface codes).
  4. Code distance: What is the minimum number of physical qubit errors that must occur before they irreparably corrupt a logical qubit? Logical qubits with higher code distances are more likely to resist errors and sustain fault tolerance.
  5. Scalability: Can the QEC system expand to larger quantum systems while effectively managing resources and maintaining robust performance?
  6. Recent advancements: What is the rate of progress of QC research and development, including the development of specific error correction codes?3 This highlights the solutions’ readiness for practical deployment.
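
Criteria 3 and 4 trade off against each other directly. As a back-of-the-envelope illustration, the Python sketch below estimates the qubit overhead and error-handling capability of the rotated surface code at several code distances; the qubit-count formula and the rule-of-thumb logical error rate are standard approximations, and the physical error rate and threshold values are assumptions rather than measured figures.

```python
# Rule-of-thumb comparison of qubit overhead vs. code distance for the
# rotated surface code (illustrative only; the constants and the physical
# error rate below are assumptions, not measurements from any specific device).
P_PHYSICAL = 1e-3   # assumed physical error rate per operation
P_THRESHOLD = 1e-2  # assumed code threshold

def surface_code_estimates(distance: int) -> dict:
    physical_qubits = 2 * distance**2 - 1     # d^2 data qubits + d^2 - 1 ancillas
    correctable_errors = (distance - 1) // 2  # errors correctable per round
    # Common approximation for the logical error rate per round:
    logical_error = 0.1 * (P_PHYSICAL / P_THRESHOLD) ** ((distance + 1) / 2)
    return {"distance": distance,
            "physical qubits per logical qubit": physical_qubits,
            "correctable errors": correctable_errors,
            "approx. logical error rate": f"{logical_error:.1e}"}

for d in (3, 5, 7, 11, 25):
    print(surface_code_estimates(d))
```

Each step up in code distance suppresses logical errors sharply but multiplies the physical-qubit bill, which is why reducing overhead (for example, with qLDPC codes) is such an active area of development.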

QEC priorities for quantum players

As Steve Brierley, CEO and founder of Riverlane, stated, “It is increasingly certain that Q-day is coming, but people are not really getting ready for that; what they should have done in AI, they should be doing that right now in quantum.”

To move beyond the noisy intermediate-scale quantum computing era, quantum industry leaders and investors will need to make QEC technology and investment a priority while leveraging value chain partnerships.

For C-level executives. C-level executives of emerging quantum companies need to build partnerships across the value chain and focus on the interplay between hardware and software, which could unlock more of hardware’s potential value. C-level executives will need to understand the challenges of quantum utility and scaling quantum computers as well as the breakthroughs that are required in error handling strategies.

For hardware start-ups. Hardware start-ups will need to partner with dedicated software players4 to incorporate QEC solutions into their tech stack or build error correction methods in-house to provide full-stack error robustness solutions for industry adopters.

For quantum leaders. Quantum leaders will need to understand the role of error handling techniques in developing QC and focus on building partnerships along the value chain to stay competitive.

For investors. Investors will want to back quantum technologies that have the highest likelihood of reaching fault tolerance and look out for key differentiators such as hardware and software developments that include error correction. Many hardware players are already developing error correction solutions as a critical part of their road maps and are forming partnerships to obtain the control and error handling software required to achieve fully fault-tolerant quantum computing.

The technologies that enable quantum robustness, including error correction, are developing quickly. The C-level executives, quantum leaders, and investors who make investing in these technologies a priority could lead the industry toward FTQC while also positioning their companies to benefit from the potential $0.9 trillion to $2.0 trillion in value QC technologies could generate by 2035.

Henning Soller is a partner in McKinsey’s Frankfurt office, Martina Gschwendtner is a consultant in the Munich office, and Victor Kermans is a consultant in the Brussels office.

The authors wish to thank Matija Zesko for his contributions to this post.

1 Dynamic stabilization refers to the process by which qubits are constantly rotated in just the right way to make them effectively immune to the noise that would normally randomize them.
2 See, for example, The Keyword, “Meet Willow, our state-of-the-art quantum chip,” blog entry by Hartmut Neven, Google, June 12, 2025.
3 Microsoft Azure Quantum Blog, “Microsoft and Quantinuum create 12 logical qubits and demonstrate a hybrid, end-to-end chemistry simulation,” blog entry by Krysta Svore, Microsoft, September 10, 2024.
4 David Marshall, “Qedma’s error mitigation software now available within IBM’s Qiskit Functions,” VMblog.com, September 16, 2024.