Why are quantum computers taking so long to perfect?

The journey to perfecting quantum computers has been protracted, primarily due to the formidable challenges of maintaining quantum coherence and minimizing errors.

Introduction

In the past two decades, the race to harness the power of quantum computers has witnessed significant investments, with tech giants like Google, Microsoft, and IBM spearheading the charge.

With over $5 billion poured into this quest, the promise of a computing revolution looms on the horizon. At the forefront of this pursuit is Jay Gambetta, the mind behind IBM’s quantum leap, orchestrating the development of quantum computers and pushing the boundaries by putting these systems in the cloud.


The Quantum Promise

Quantum computers, operating on the mind-bending principles that govern matter at the atomic level, hold the potential to revolutionize fields ranging from drug discovery to cryptography. They can simulate and analyse complex molecular interactions with unprecedented speed, enabling rapid exploration of candidate drug compounds and their potential effects on biological systems.


In cryptography, the science and practice of encoding messages or data to protect them from unauthorized access or tampering, quantum computers could break existing cryptographic codes by solving certain mathematical problems, such as factoring the large integers that underpin widely used encryption, far faster than classical machines.

The allure of quantum computing lies in its ability to process information in ways that classical computers could only dream of.

Read More: Explained: What the hell is Artificial Intelligence – techovedas

A Decade of Disappointments

Amidst the promises, a crucial obstacle stands in the way — noise. Unlike classical computers that can withstand external disturbances with ease, the delicate nature of quantum systems makes them vulnerable to even the slightest interference.

For years, researchers grappled with the challenges of noisy circuitry, exploring applications that could make sense of the limited quantum capacity. The results were less than stellar, leading to a period of creeping disappointment. Yet, recent breakthroughs on both theoretical and experimental fronts have injected a renewed sense of optimism into the field.

According to Earl Campbell, the Vice President of Quantum Science at Riverlane, a quantum computing company based in Cambridge, UK, “I’m seeing much more evidence being presented in defence of optimism.”

What is the root cause of noise vulnerability?

Quantum computers are considered delicate because they rely on the principles of quantum mechanics, in which qubits can exist in multiple states simultaneously. This superposition makes them highly sensitive to external influences, such as stray photons, random signals, or vibrations, which can disrupt the fragile quantum states and lead to errors. Overcoming this noise is the main challenge quantum computers must solve to surpass their classical counterparts.
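
To make that fragility concrete, here is a minimal NumPy sketch, with an assumed and purely illustrative noise strength, of how random phase kicks from the environment wash out the interference that a superposition carries:

```python
import numpy as np

rng = np.random.default_rng(0)
shots, sigma = 10_000, 0.8  # assumed noise strength, illustrative only

# A qubit in the superposition |+> = (|0> + |1>)/sqrt(2); on each run
# the environment kicks the |1> amplitude by a random phase.
phases = rng.normal(0.0, sigma, shots)
amp0 = np.full(shots, 1 / np.sqrt(2))
amp1 = np.exp(1j * phases) / np.sqrt(2)

# Coherence is the averaged off-diagonal density-matrix element.
# It starts at 0.5 for a clean superposition and decays toward 0
# as phase noise scrambles the interference.
coherence = np.abs(np.mean(amp0 * np.conj(amp1)))
print(f"remaining coherence: {coherence:.3f}  (noise-free value: 0.500)")
```

The stronger the random phase noise, the closer the surviving coherence falls to zero, and with it goes the interference that quantum algorithms depend on.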

Read More: Qualcomm Invests 177 Cr For Design Centre in Chennai; Creating 1600 Jobs – techovedas

Cloud-Based Quantum Computing

A pivotal moment in the quantum computing saga came with the introduction of cloud-based quantum computing. Jay Gambetta, a physicist hailing from Australia, envisioned a world where physicists could operate IBM’s quantum bits, or “qubits”, themselves, and in 2014 he pitched the idea of cloud-based quantum computing to IBM’s executives.

IBM’s cloud-access quantum computer, born on May 4th, 2016, became an instant hit, with 7,000 people signing up in the first week alone. However, as researchers began testing quantum computers with a few qubits working together, unforeseen noise-related challenges emerged, impacting the performance of quantum algorithms.

Read More: Hotel Administration to Billion Dollar Company: Story of Qualcomm

Solving the Quantum Noise Problem

Researchers have responded to the noise challenge with a three-pronged approach: error suppression, quantum error correction (QEC), and error mitigation.

1. Error Suppression:

Dynamical decoupling (DD) applies carefully timed sequences of control pulses to average away undesirable noise (see the sketch after this list).

Derangement circuits prepare multiple identical copies of a quantum state and protect the computation by continuously rearranging (permuting) the qubits.

Quantum feedback control uses closed-loop classical feedback to steer the dynamics of the quantum system.
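
To see why pulse sequences can suppress noise, here is a minimal NumPy sketch of the simplest dynamical-decoupling sequence, the spin echo; the noise strength delta is an assumed, illustrative value rather than anything measured on hardware:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)

def rz(phi):
    """Unwanted dephasing, modelled as a Z rotation by angle phi."""
    return np.array([[np.exp(-1j * phi / 2), 0],
                     [0, np.exp(1j * phi / 2)]])

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # qubit in |+>
delta = 0.3  # assumed static noise per half-interval (illustrative)

# Free evolution: the phase error simply accumulates over both halves.
noisy = rz(delta) @ rz(delta) @ plus

# Spin echo: an X pulse between the two halves flips the qubit, so the
# noise picked up in the second half undoes the first.
echoed = rz(delta) @ X @ rz(delta) @ plus

fidelity = lambda psi: abs(np.vdot(plus, psi)) ** 2
print(f"fidelity without echo: {fidelity(noisy):.4f}")   # < 1
print(f"fidelity with echo:    {fidelity(echoed):.4f}")  # = 1
```

Real DD sequences generalise this single echo pulse into longer pulse trains that also cancel slowly drifting noise, but the refocusing principle is the same.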

2. Quantum Error Correction (QEC):

The foundational concept of QEC is the encoding of quantum information across multiple physical qubits. If quantum information is encoded onto a single physical qubit and that qubit experiences an error, the error can cascade destructively through the entire computation. By encoding the same quantum information across multiple highly entangled physical qubits, collectively referred to as a logical qubit, the system can detect and correct errors on individual qubits, thereby preserving the quantum information.

One of the earliest quantum error correction codes (QECC), the Shor code, encodes a logical qubit using only nine physical qubits.
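
The idea is easiest to see in the three-qubit bit-flip repetition code, the building block that the Shor code combines with a phase-flip code. Below is a minimal NumPy sketch (illustrative only, not any vendor's implementation) that encodes one logical qubit into three physical qubits, injects a single bit-flip error, identifies it from the parity-check syndrome, and corrects it:

```python
import numpy as np

def x_on(qubit, n=3):
    """Pauli X on one qubit of an n-qubit register (as a permutation)."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for basis in range(dim):
        U[basis ^ (1 << qubit), basis] = 1
    return U

# Encode one logical qubit a|0> + b|1> as a|000> + b|111>.
a, b = 0.6, 0.8
state = np.zeros(8, dtype=complex)
state[0b000], state[0b111] = a, b

# A single bit-flip error strikes a random physical qubit.
err = np.random.randint(3)
state = x_on(err) @ state

# Syndrome extraction: the parities of qubit pairs (0,1) and (1,2)
# identify the flipped qubit. Because the corrupted state sits in one
# syndrome eigenspace, the parities can be read off any basis state
# with nonzero amplitude without touching the encoded amplitudes.
basis = int(np.argmax(np.abs(state)))
bits = [(basis >> q) & 1 for q in range(3)]
s0, s1 = bits[0] ^ bits[1], bits[1] ^ bits[2]
flipped = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[(s0, s1)]

# Apply the correction and verify the logical state survived.
if flipped is not None:
    state = x_on(flipped) @ state
print("error on qubit", err, "-> corrected:",
      np.isclose(state[0b000], a) and np.isclose(state[0b111], b))
```

Because the syndrome reveals only which parity checks failed, never the encoded amplitudes a and b, the correction preserves the logical state; the Shor code applies the same trick in two layers to handle both bit flips and phase flips.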

3. Error Mitigation:

Of the three strategies, error mitigation is the one most visible to quantum programmers: error suppression and error correction happen at the hardware level, whereas error mitigation is applied classically after a circuit has executed. In a common form, measurement (readout) error mitigation, software analyses qubit error rates and connectivity to build a noise model, then applies the inverse of that model to the measurement results.
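
As a hedged single-qubit sketch of that idea, assume a made-up confusion matrix M (in practice it would be measured on the device during calibration); applying its inverse to the noisy output distribution recovers the ideal statistics:

```python
import numpy as np

# Calibration data: probability of reading outcome r given prepared
# state s. Columns: prepared |0>, prepared |1>. Values are assumed,
# illustrative numbers, not measurements from real hardware.
M = np.array([[0.97, 0.08],   # read 0
              [0.03, 0.92]])  # read 1

# Suppose the ideal circuit output is 50/50, but readout noise skews it.
ideal = np.array([0.5, 0.5])
measured = M @ ideal          # what the noisy device would report

# Mitigation: invert the noise model and apply it to the measurement.
mitigated = np.linalg.inv(M) @ measured
print("measured :", np.round(measured, 3))
print("mitigated:", np.round(mitigated, 3))
```

On many qubits the confusion matrix grows exponentially, so practical tools work with per-qubit or tensored approximations, but the post-processing principle is exactly this inversion.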

Read more – Explained: What the hell is a Quantum Computer?

Quantum Computing’s Stride

The amalgamation of these noise-handling techniques marks a significant milestone, especially when the notion of obtaining useful results from small-scale, noisy processors seemed like a dead end.

According to Michael Biercuk, director of the Quantum Control Laboratory at the University of Sydney, “I think we’re well on the way now. I don’t see any fundamental issues at all.”


These advancements coincide with general improvements in hardware performance, reducing baseline errors and increasing the number of qubits on each processor. Researchers hope that quantum computers will soon outperform the best classical machines for certain tasks.

IBM’s Quantum Processors and Beyond

IBM’s quantum processors, including Eagle, Osprey, and Condor, signify a shift towards scalability. The modular Heron design and the upcoming Flamingo processor in 2025 aim to enable truly large-scale quantum computation by allowing quantum information to flow between different processors without hindrance.

Quantum Utility: A New Frontier

The debate in the quantum computing community has long revolved around supremacy, advantage, and utility. IBM’s recent move to retire entry-level processors in favor of the 127-qubit Eagle processor is aimed at pushing researchers to prioritize truly useful tasks. IBM claims Eagle is a “utility-scale” processor, providing useful results to problems that challenge the best scalable classical methods.

Conclusion: The future outlook


Quantum computing’s progress is not only about achieving computational supremacy but also about utility and practicality. The quantum revolution is no longer confined to the realms of theory and experimentation. The persistent efforts to tame quantum noise, and the optimism surrounding recent breakthroughs, suggest that quantum computing is not a distant dream but a fast-approaching reality.
