How Close Are We—Really—to Building a Quantum Computer?
The development of a functional quantum computer, capable of revolutionizing various fields of science and technology, is a race that has attracted the attention of major tech companies such as IBM, Microsoft, Google, and Intel. Although the realization of such a machine is likely more than a decade away, these companies continuously celebrate incremental achievements in the field. While increasing the number of quantum bits, or qubits, on a processor chip is a common milestone, the journey toward quantum computing encompasses more than just manipulating subatomic particles.
A qubit, the basic unit of information in quantum computing, can represent both 0 and 1 simultaneously, a phenomenon known as superposition. This property lets a quantum processor explore many computational paths at once, greatly enhancing speed and capacity for certain problems. However, not all qubits are alike: there are several types with different characteristics. In a spin qubit, for example, the spin state of a single electron encodes the 0 or 1. Despite their potential, all qubits are extremely fragile and require very low temperatures, around 20 millikelvin, to remain stable.
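The superposition described above can be made concrete with a small sketch. A single qubit's state is a pair of complex amplitudes, one for 0 and one for 1, and measurement yields each outcome with probability equal to the squared magnitude of its amplitude. The function name below is illustrative, not from any quantum library:

```python
import math

# A qubit's state is a pair of complex amplitudes (alpha, beta) for |0> and |1>.
# In superposition both amplitudes are nonzero; a measurement returns 0 with
# probability |alpha|^2 and 1 with probability |beta|^2 (the Born rule).

def measurement_probabilities(alpha: complex, beta: complex) -> tuple[float, float]:
    """Return (P(0), P(1)) for a qubit in state alpha|0> + beta|1>."""
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    norm = p0 + p1  # a valid state has norm 1; normalize defensively
    return p0 / norm, p1 / norm

# An equal superposition: alpha = beta = 1/sqrt(2)
amp = 1 / math.sqrt(2)
p0, p1 = measurement_probabilities(amp, amp)
print(p0, p1)  # roughly 0.5 each: "both 0 and 1" until measured
```

The key point the sketch makes is that superposition is not vagueness: the amplitudes are definite numbers, and only the measurement outcome is probabilistic.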
Building a quantum computer involves more than just the processor. These systems require new algorithms, software, interconnects, and other yet-to-be-invented technologies tailored to harness the immense processing power and enable data sharing or storage. The complexity involved is a significant obstacle to progress. Intel, for instance, introduced a 49-qubit processor named Tangle Lake and developed a virtual-testing environment for quantum computing software. However, to truly grasp quantum computer software development, hundreds or even thousands of qubits must be simulated.
In an interview with Scientific American, Jim Clarke, director of quantum hardware at Intel Labs, sheds light on the different approaches to building a quantum computer and the challenges associated with the technology. He explains that conventional computing relies on binary states: a transistor is either on or off. In contrast, a quantum computer exploits the superposition of qubits, each representing both 0 and 1 simultaneously until it is measured and collapses into a definite state. This lets quantum computers explore an exponentially larger state space than classical computers. Clarke's analogy of spinning coins illustrates the concept: while resting coins each show heads or tails, many coins spinning in the air jointly represent an exponential number of possible outcomes. The fragility of qubits stems from their susceptibility to disturbances such as noise, temperature changes, and vibrations, which can disrupt their operation and cause data loss. To guard against this, qubits typically require extremely low temperatures to remain stable.
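The "exponentially larger state space" is easy to quantify: describing n qubits requires 2^n complex amplitudes, just as n spinning coins have 2^n possible joint outcomes. A minimal sketch (the function name is an illustrative choice):

```python
# n classical bits hold exactly one of 2**n values at a time; a state of
# n qubits is described by 2**n complex amplitudes, all carried at once,
# like n coins spinning in the air.

def state_space_size(n_qubits: int) -> int:
    """Number of basis states (amplitudes) for an n-qubit register."""
    return 2 ** n_qubits

for n in (1, 2, 10, 50):
    print(n, state_space_size(n))
# 50 qubits already involve 2**50, about 1.1e15, amplitudes: far more
# numbers than a naive classical description can comfortably store.
```

The doubling with every added qubit is the source of both the promise (huge parallelism) and the difficulty (huge simulation cost) discussed below.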
Various types of qubits exist, each with its own methods of manipulation and entanglement, the linking of qubit states that lets groups of qubits work together on complex calculations. Superconducting qubits, the type used in Intel's Tangle Lake processor, require extreme cooling. Another approach uses trapped ions held in place by laser beams. Intel is also exploring a third type, silicon spin qubits, which resemble conventional silicon transistors but operate on a single electron. While superconducting qubits are technologically more mature, silicon spin qubits offer greater potential for scaling and commercialization.
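Whatever the hardware, entanglement looks the same mathematically. A minimal two-qubit state-vector sketch (pure Python, an illustration rather than any vendor's toolkit) shows a Hadamard gate plus a controlled-NOT producing a Bell state, whose two qubits can no longer be described independently:

```python
import math

# Two-qubit state vector: amplitudes indexed by basis states 00, 01, 10, 11.
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def hadamard_on_first(s):
    """Apply a Hadamard gate to the first qubit (mixes index pairs 0/2 and 1/3)."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot(s):
    """Flip the second qubit when the first is 1 (swaps |10> and |11>)."""
    return [s[0], s[1], s[3], s[2]]

state = cnot(hadamard_on_first(state))
print(state)  # ~[0.707, 0.0, 0.0, 0.707]: the entangled Bell state (|00>+|11>)/sqrt(2)
```

Measuring either qubit of the final state immediately fixes the other, which is exactly the interaction that multi-qubit algorithms exploit.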
The path toward quantum computing involves developing quantum chips in parallel with simulators running on supercomputers. The simulations serve as a foundation for designing architectures, compilers, and algorithms. However, the practicality of software and applications cannot be judged until physical systems with several hundred to a thousand qubits are available. There are two avenues for scaling quantum computers: adding more qubits of the current size, which quickly runs into physical space limits in large systems, or shrinking the qubits themselves. Intel is pursuing the latter with silicon spin qubits, which are significantly smaller than superconducting qubits.
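The limits of those supercomputer simulators follow directly from the exponential state space: a state-vector simulator must store 2^n complex amplitudes, about 16 bytes each in double precision. A back-of-the-envelope sketch (the function name is illustrative):

```python
# A state-vector simulator stores 2**n complex amplitudes, roughly 16 bytes
# each (two 8-byte floats), which is why simulating even ~50 qubits is hard
# and simulating hundreds is out of reach for this method.

def simulation_memory_bytes(n_qubits: int) -> int:
    """Naive state-vector memory footprint for n qubits."""
    return (2 ** n_qubits) * 16

print(simulation_memory_bytes(30) / 2 ** 30)  # 16.0 GiB: a workstation
print(simulation_memory_bytes(45) / 2 ** 40)  # 512.0 TiB: supercomputer territory
print(simulation_memory_bytes(100))           # astronomically beyond any machine
```

This is why software questions stay open until real hardware with hundreds of qubits exists: no classical simulator of this kind can stand in for it at that scale.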
Regarding the impact of quantum computing, Clarke mentions that the initial quantum algorithms proposed focus on security, chemistry, and materials modeling—problems that are challenging for classical computers. However, research groups and startups are also investigating machine learning and artificial intelligence (AI) using quantum computers. While AI development may be influenced more by conventional chips optimized for AI algorithms, quantum computing could still play a role in AI advancements.
As for the timeline of achieving working quantum computers that solve real-world problems, Clarke emphasizes that major advancements in computing technology historically took time. The first transistor was introduced in 1947, followed by the first integrated circuit in 1958, and Intel’s first microprocessor in 1971. Each milestone was more than a decade apart. While some may claim quantum computers are just a few years away, Clarke argues that they underestimate the complexity of the technology. He suggests that if, in 10 years, a quantum computer with a few thousand qubits becomes a reality, it would have a transformative effect similar to the advent of the first microprocessor.