Why is Quantum Computing So Hard to Understand?

Preface

As a software engineer, I recently embarked on a journey to learn quantum computing. The experience has been both fascinating and frustrating, as I found it incredibly difficult to come across simple explanations for this complex field. In light of this, I decided to start a “Simple Quantum Computing” series. Through this series, I aim to demystify quantum computing for myself and for you, dear reader, as we explore what could very well be the most revolutionary computing technology of the 21st century.

Introduction

Quantum computing represents the bleeding edge of technology. As software engineers, it is one of the most advanced areas we can dive into. Its potential impact on the world is enormous, particularly when it comes to cryptography: algorithms such as Shor's show that a sufficiently powerful quantum computer could break much of modern public-key cryptography, even though today's hardware is not yet capable of doing so.

Despite this promise, quantum computing has not yet gained widespread popularity. Unlike artificial intelligence, which saw a surge of interest with the advent of models like ChatGPT, quantum computing remains a niche field. This is partly due to the limited job market for quantum skills, but it is also due to the steep learning curve associated with the technology.

So, why exactly is quantum computing so hard to understand?

Analog Computational Values

As software engineers, we are used to binary computational values—0s and 1s. These discrete values form the foundation of classical computing: processor words are built from them, and every computation within a CPU reduces to these simple, binary states.

Quantum Processing Units (QPUs), however, operate on a completely different principle. They use analog, or continuous, values instead of discrete ones. There is no clear-cut 0 or 1 in quantum computing; instead, a qubit's state is a superposition of the quantum states |0⟩ and |1⟩, weighted by complex numbers called amplitudes. These states are written as kets, a notation I will delve into further in a later post.
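To make this concrete, here is a minimal sketch in plain Python (the particular amplitude values are just an illustrative assumption): a single-qubit state is a pair of complex amplitudes whose squared magnitudes must sum to 1.

```python
import math

# A hypothetical single-qubit state: alpha*|0> + beta*|1>,
# where alpha and beta are complex amplitudes.
alpha = complex(1 / math.sqrt(2), 0)
beta = complex(0, 1 / math.sqrt(2))  # amplitudes can be imaginary

# A valid quantum state must be normalised: |alpha|^2 + |beta|^2 == 1.
norm = abs(alpha) ** 2 + abs(beta) ** 2
print(round(norm, 10))  # 1.0
```

Note that there is no boolean anywhere: the "value" of the qubit is the pair of continuous complex amplitudes itself.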

This shift from binary to analog values presents a significant challenge. As developers, we are not trained to work with these continuous values. For example, there is no straightforward equivalent of a boolean in quantum computing—no simple 0 or 1 to latch onto.

Moreover, modern QPUs must be controlled by classical CPUs. This means that the analog values computed by a QPU have to be translated into discrete values for the CPU to process. This translation—measurement—introduces a loss of information and a level of unpredictability, because its outcome is probabilistic rather than deterministic. The notion of working with probabilities—something that quantum computing heavily relies on—is foreign to most of us in the field of computer science, where even pseudo-random numbers are not truly random.
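This probabilistic readout can be sketched in plain Python (the probabilities below are assumed for illustration): repeated "measurements" of the same superposition yield different bits, and only the statistics are predictable.

```python
import random

# Hypothetical measurement probabilities of an equal superposition:
p0 = 0.5  # probability of reading |0>, i.e. |alpha|^2
p1 = 0.5  # probability of reading |1>, i.e. |beta|^2

# Each "measurement" collapses the analog state to a discrete bit.
# Repeating it many times only reveals the probabilities statistically.
shots = 10_000
results = random.choices([0, 1], weights=[p0, p1], k=shots)
zeros = results.count(0)
print(zeros / shots)  # roughly 0.5, but different on every run
```

This is why quantum programs are typically run for many "shots": a single run tells you almost nothing about the underlying amplitudes.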

Overall, the different computational bases and the need to convert between them add a significant layer of complexity that we are simply not accustomed to dealing with.

Complex Mathematics

Another major hurdle in understanding quantum computing is the complex mathematics involved. To grasp the concepts of quantum computing, one must be comfortable with advanced mathematical theories and methods that go beyond basic arithmetic or algebra.

Some of the key mathematical areas used in quantum computing include:

  • Complex Numbers: One of the fundamental concepts is the extension of the real number system to include imaginary numbers. A complex number is expressed as a + bi, where a and b are real numbers, and i is the imaginary unit, satisfying i² = −1.
  • Complex Analysis: This is the study of functions that operate on complex numbers. It includes topics such as analytic functions, complex integration, and conformal mappings.
  • Abstract Algebra: This area involves the study of algebraic structures like groups, rings, and fields. These structures are used to generalise algebraic concepts and solve equations in more abstract settings.
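As a quick taste of the first item, Python supports complex numbers natively, which makes it easy to experiment with them before touching any quantum-specific tooling:

```python
# Python writes the imaginary unit i as j, as in electrical engineering.
i = 1j
print(i ** 2)         # (-1+0j): i squared is -1

z = 3 + 4j            # a + bi with a = 3, b = 4
print(abs(z))         # 5.0: the modulus, sqrt(a^2 + b^2)
print(z.conjugate())  # (3-4j): the complex conjugate
```

The modulus and the conjugate in particular show up constantly in quantum computing, since measurement probabilities are squared magnitudes of amplitudes.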

While it is possible to create simpler mathematical models by abstracting parts of this complex mathematics, understanding the original concepts is still crucial, and may be difficult (definitely for me!).

Programming a Processing Unit

Beyond the challenge of analog values and complex mathematics, there is the task of actually programming quantum processing units. Programming modern QPUs is reminiscent of programming early CPUs in the 1950s. If you compare quantum programming frameworks like Qiskit to early high-level programming languages like Autocode, you will notice striking similarities.

Both require you to think in terms of bits (or qubits), registers, memory, and gates. It is a very low-level form of programming that most modern software engineers are unaccustomed to. We lack the high-level frameworks and abstractions that we are familiar with in classical computing, making it even harder to get started in quantum computing.
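To give a flavour of this gate-level style without depending on any framework, here is a minimal plain-Python sketch (not Qiskit itself): a qubit is a two-element state vector, and a gate is a 2×2 matrix applied to it.

```python
import math

# Start in the basis state |0>, represented as the vector (1, 0).
state = [complex(1), complex(0)]

# The Hadamard gate puts a basis state into an equal superposition.
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

def apply_gate(gate, state):
    """Multiply the 2x2 gate matrix by the 2-element state vector."""
    return [sum(gate[r][c] * state[c] for c in range(2)) for r in range(2)]

state = apply_gate(H, state)
probs = [abs(a) ** 2 for a in state]
print(probs)  # both outcomes now have probability ~0.5
```

Frameworks like Qiskit hide the matrix multiplication, but the mental model is the same: you build circuits gate by gate, much as early programmers composed machine instructions.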

Conclusion

Quantum computing represents a paradigm shift in how we approach computation, challenging many of the fundamental principles that have guided software engineering for decades. The complexity of analog computational values, the necessity to grasp advanced mathematics, and the low-level nature of quantum programming make it a field that is difficult to understand and master. 

However, these very challenges are what make quantum computing so promising and exciting. Just as classical computing evolved from low-level machine code to high-level languages and user-friendly frameworks, quantum computing will likely undergo a similar evolution. We are simply not there yet.

In this post series – Simple Quantum Computing – I plan to write about the different aspects of quantum computing that I found difficult to learn, and do my best to explain them in much simpler terms. I hope it will help both me and you understand what I believe to be the most promising computational technology of the 21st century.