Navigating the Noise: The Potential of NISQ Computers

The term NISQ computer is frequently used in the quantum computing space to describe the first generation of quantum processors. Coined by John Preskill in 2018, NISQ stands for Noisy Intermediate-Scale Quantum. Currently available NISQ processors typically range from around 20 to slightly over 100 qubits, the quantum counterpart of the classical bit.

However, as currently designed, NISQ computers are neither accurate enough to deliver error-free results nor scalable enough to solve real-world business problems. Both shortcomings trace back to the same root cause: these early machines still lack the quantum error correction needed for fault tolerance, and meeting that requirement is what drives their scalability challenges.

While these early machines are not yet capable of delivering the production-grade power and value that quantum computers will ultimately offer, ongoing research in the field holds promise for the development of more advanced and powerful quantum processors.

What is Quantum Error Correction?

Quantum Error Correction (QEC) is an essential component of quantum computing as it helps to address the challenge of preserving the fragile superposition states of qubits, which are susceptible to environmental noise and decoherence.

In contrast to classical computing, where binary states of 0s and 1s are stable and easily maintained, qubits in quantum information processing exist in superpositions of multiple states, which are highly susceptible to decoherence from a variety of sources, including thermal and electromagnetic noise. The preservation of superposition states is crucial in quantum computing, and QEC techniques are used to detect and correct errors that occur during the computation process.

The primary objective of QEC is to protect the encoded quantum information by detecting and correcting errors without disturbing the encoded state. To achieve this, a redundant encoding is applied, distributing the information among multiple physical qubits. This redundancy allows errors to be detected and corrected without losing the original quantum state, as the sketch below illustrates.
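To make the redundancy idea concrete, here is a minimal sketch of the textbook three-qubit bit-flip code, simulated with NumPy state vectors. The amplitudes, qubit ordering, and the single injected error are illustrative assumptions rather than any particular hardware implementation; the point is that the parity-check "syndrome" locates the error without revealing, or disturbing, the encoded amplitudes.

```python
# Minimal simulation of the three-qubit bit-flip repetition code.
import numpy as np

# Single-qubit basis states, the bit-flip (X) gate, and the identity.
ZERO = np.array([1.0, 0.0])
ONE = np.array([0.0, 1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)

# Logical qubit |psi> = a|0> + b|1> (an arbitrary superposition).
a, b = 0.6, 0.8

# Redundant encoding: |psi>_L = a|000> + b|111>, spreading the
# information across three physical qubits.
ket000 = np.kron(np.kron(ZERO, ZERO), ZERO)
ket111 = np.kron(np.kron(ONE, ONE), ONE)
encoded = a * ket000 + b * ket111

# Inject a bit-flip error on the middle physical qubit.
error = np.kron(np.kron(I2, X), I2)
corrupted = error @ encoded

# Syndrome extraction: the parity checks Z1*Z2 and Z2*Z3 reveal which
# qubit flipped without exposing the amplitudes a and b.
Z = np.diag([1.0, -1.0])
Z1Z2 = np.kron(np.kron(Z, Z), I2)
Z2Z3 = np.kron(I2, np.kron(Z, Z))
s1 = int(round(corrupted @ Z1Z2 @ corrupted))  # -1 if qubits 1 and 2 disagree
s2 = int(round(corrupted @ Z2Z3 @ corrupted))  # -1 if qubits 2 and 3 disagree

# Decode the syndrome into a correction and apply it.
corrections = {
    (1, 1): np.eye(8),                      # no error detected
    (-1, 1): np.kron(np.kron(X, I2), I2),   # flip qubit 1 back
    (-1, -1): np.kron(np.kron(I2, X), I2),  # flip qubit 2 back
    (1, -1): np.kron(I2, np.kron(I2, X)),   # flip qubit 3 back
}
recovered = corrections[(s1, s2)] @ corrupted

print("syndrome:", (s1, s2))
print("original state recovered:", np.allclose(recovered, encoded))
```

Real QEC codes, such as the surface code, generalize this parity-check idea to handle phase-flip errors as well as bit flips, which is where the large qubit overheads discussed below come from.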

In summary, QEC preserves the superposition states of qubits in the presence of environmental noise and decoherence: redundant encoding and error-correction techniques detect and correct errors without disrupting the encoded quantum state.

Quantum Error Correction and NISQ Computers

Quantum computers are highly sensitive to external factors, which can cause errors in their calculations. This phenomenon is known as noise, and it is a natural occurrence in any quantum system. Because of this sensitivity, shielding quantum computers from external noise is challenging; even nearby microwave radiation can cause decoherence, destroying quantum states and introducing errors. This highlights the need for error-correction methods in NISQ computers.

QEC refers to the codes and algorithms designed to identify and rectify errors in quantum computers. Early QEC schemes encode a logical qubit, meaning that quantum information that would otherwise be stored in a single qubit is shared across several “supporting” physical qubits. The primary objective is to safeguard the original quantum information from errors while the quantum system processes it.

However, implementing QEC comes at a significant cost in qubits, and the number of error-correcting qubits needed varies with the hardware architecture and the type of computation being performed. The more noise present, the more qubits are required. Current estimates suggest that around 1,000 error-correcting (physical) qubits are needed for every single computational (logical) qubit.
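To see what that overhead implies, the short sketch below multiplies it out for a few algorithm sizes; the 1,000-to-1 ratio and the logical-qubit counts are illustrative assumptions, not hardware specifications.

```python
# Back-of-the-envelope estimate of QEC overhead, using the rough
# ~1,000 physical qubits per logical qubit figure cited above.
OVERHEAD = 1_000  # error-correcting (physical) qubits per logical qubit

for logical in (1, 50, 100, 1_000):
    physical = logical * OVERHEAD
    print(f"{logical:>5} logical qubits -> ~{physical:>9,} physical qubits")
```

Even a modest 100-logical-qubit algorithm lands around 100,000 physical qubits under this estimate, orders of magnitude beyond today's NISQ processors.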

Unfortunately, current NISQ systems, whether quantum annealing or gate-model machines, are not scalable beyond a few hundred usable qubits. As a result, full quantum error correction remains an insurmountable challenge for existing NISQ systems; more sophisticated and scalable machines are needed to overcome this critical limitation.

A Glimpse of the Future

As we look to the future of quantum computing, we must consider the challenges posed by noise and decoherence, as well as the resulting errors that limit the scalability of these machines. However, there are some positive developments that offer potential solutions to these problems.

One promising avenue is the discovery of new problems that can be tackled with NISQ computers, as users and vendors continue to explore what these machines can do. Additionally, there is a growing focus on designing algorithms that minimize the size and complexity of quantum computations, which could reduce the scale NISQ computers need to reach.

One example of such algorithms is our QAmplify software, which has demonstrated the ability to significantly expand the effective capacity of gate-model and quantum annealing machines on real-world optimization problems. Specifically, QAmplify can amplify the capabilities of gate-model machines by 5x and of quantum annealing machines by up to 20x.

Another exciting development is the emergence of entirely new approaches to quantum computing. For instance, QCI recently achieved groundbreaking results with an entropy quantum computer, successfully solving a complex autonomous vehicle problem for the BMW Group. This represents the largest problem solved to date on a quantum computer and highlights the potential for innovative approaches to drive adoption of quantum computing in the marketplace, beyond the limitations of NISQ machines.

The Bottom Line

Quantum computing is poised to revolutionize our world, but the timing of its widespread impact remains uncertain. Current NISQ computers are just the first step towards moving quantum computing beyond its current status as an experimental science and realizing its full potential.

One of the major obstacles facing quantum computing is noise, which leads to processing errors. Because NISQ computers are inherently noisy, further innovation is necessary to improve their reliability and pave the way toward quantum computing’s true value.

Whether NISQ computers will benefit from continued QEC innovation or will require entirely new quantum approaches remains an open question, and opinions vary with experience and perspective.

Ultimately, the success of quantum computing will be determined by its ability to solve real-world problems with accuracy, efficiency, and sophistication. To prove that value, we need concrete evidence from real-world applications and computations, run on quantum computers that can handle these tasks with speed and precision.