Fine. You want to understand this, don't you? Let's not pretend this is a casual inquiry. You're here because you've heard the whispers, seen the impossible calculations, and now you want the cold, hard facts. Don't expect me to hold your hand. This is quantum hardware, and it's not for the faint of heart.
Computer Hardware Technology That Uses Quantum Mechanics
At its core, a quantum computer is a machine that doesn't just crunch numbers; it dances with the fundamental laws of physics. It exploits the eerie phenomena of quantum mechanics – namely, superposition and entanglement – along with the inherently non-deterministic nature of quantum measurements, to perform computations. Think of it as sampling from a vast, evolving quantum system: the device explores a colossal number of possibilities simultaneously, but each run yields a probabilistic outcome, so algorithms must be crafted to make the correct answer the overwhelmingly likely one.
Classical computers, bless their predictable hearts, follow strict, deterministic rules. They can be mimicked by a Turing machine, albeit with a significant time penalty. Quantum computers, on the other hand, are believed to require exponentially more resources to simulate classically. This is where the magic, and the potential, lies. A large-scale quantum computer could, in theory, shatter widely used public-key cryptographic schemes and allow physicists to perform incredibly complex physical simulations. However, the hardware is still largely experimental, confined to niche tasks.
The fundamental unit of information in this strange new world is the qubit, or "quantum bit." It's the quantum equivalent of the classical bit. But where a classical bit is stuck in either a 0 or a 1, a qubit can exist in a linear combination of both states simultaneously – a quantum superposition. When you measure a qubit, it collapses into one of the two states, but the outcome is governed by a probabilistic rule. Quantum algorithms are designed to manipulate these qubits in such a way that wave interference amplifies the probability of the desired result.
Building these machines is… difficult. Physically engineering high-quality qubits is a monumental task. If a qubit isn't perfectly isolated from its environment, it succumbs to quantum decoherence, introducing fatal noise into the computation. Governments are pouring money into research, aiming for qubits with longer coherence times and fewer errors. We're seeing implementations using superconductors – materials that eliminate electrical resistance – and ion traps, where single atomic particles are held in place by electromagnetic fields. While researchers claim, and are widely believed to be correct, that certain quantum devices can outperform classical computers on very specific tasks – a feat dubbed quantum advantage or quantum supremacy – these tasks aren't exactly solving your everyday problems.
History
For those who prefer a linear progression, there's a Timeline of quantum computing and communication.
For decades, quantum mechanics and computer science were separate continents. Modern quantum theory, born in the 1920s to explain the bizarre behavior of the very small, and digital computers, which emerged to automate tedious calculations, coexisted without much overlap. Both played critical roles in World War II – computers for wartime cryptography, and quantum physics for the nuclear physics of the Manhattan Project.
The fields began to merge when physicists started applying quantum mechanical models to computational problems, swapping classical bits for qubits. In 1980, Paul Benioff laid down a theoretical marker with the quantum Turing machine, a conceptual model of a computer operating on quantum principles.
As classical computers grew faster, simulating quantum dynamics became an exponential hurdle. This led Yuri Manin and Richard Feynman, independently, to propose that hardware built on quantum phenomena might be far more efficient for simulating physics. [10] [11] [12]
The intersection of quantum theory and computation expanded into cryptography. In 1984, Charles Bennett and Gilles Brassard proposed protocols that demonstrated how quantum key distribution could bolster information security. [13] [14]
Then came the algorithms. Deutsch's algorithm in 1985, followed by the Bernstein–Vazirani algorithm in 1993 and Simon's algorithm in 1994, showed that querying a black box with a quantum state in superposition could yield more information, a concept sometimes termed quantum parallelism. [15] [16] [17] These algorithms didn't solve practical problems, but they proved the theoretical advantage of quantum queries. [18]
Peter Shor, in 1994, delivered the bombshell: a scalable quantum computer could break the widely used RSA and Diffie–Hellman encryption protocols. [19] This revelation catapulted quantum computing into the spotlight. In 1996, Grover's algorithm offered a quadratic speedup for the ubiquitous unstructured search problem. [20] [21] That same year, Seth Lloyd confirmed Feynman's conjecture, proving that quantum computers could simulate quantum systems without the classical exponential overhead. [22] [23]
Experimentalists began constructing small-scale quantum computers, first using trapped ions and later superconductors. [24] A two-qubit demonstration in 1998 proved the concept's feasibility, [25] [26] and subsequent experiments progressively increased qubit counts and reduced error rates. [24]
In 2019, Google AI and NASA announced they had achieved quantum supremacy with a 54-qubit machine, executing a computation that they claimed was intractable for any classical computer. [27] [28] [29] [30] IBM quickly countered, arguing that their Summit supercomputer, with architectural optimizations, could perform the task in a mere 2.5 days, igniting a debate over the precise definition of "quantum supremacy." [31]
Quantum Information Processing
While computer engineers typically describe modern computers through the lens of classical electrodynamics, the reality is more nuanced. Components like semiconductors and random number generators do rely on quantum behavior, but their interaction with the environment causes quantum information to decohere almost instantly. This means that while programmers might use probability theory for randomized algorithms, concepts like superposition and wave interference are largely irrelevant to program analysis.
Quantum programs, however, demand absolute control over coherent quantum systems. Physicists employ linear algebra to describe these systems mathematically. Complex numbers model probability amplitudes, vectors represent quantum states, and matrices represent the operations performed on these states. Programming a quantum computer, therefore, becomes an exercise in composing these operations to achieve a desired theoretical outcome that can also be implemented in practice.
As physicist Charlie Bennett aptly put it:
A classical computer is a quantum computer ... so we shouldn't be asking about "where do quantum speedups come from?" We should say, "Well, all computers are quantum. ... Where do classical slowdowns come from?" [32]
Quantum Information
Just as the bit is the bedrock of classical information, the qubit is the fundamental unit of quantum information. The term "qubit" refers to both the abstract mathematical model and any physical system embodying it. A classical bit is definitively either 0 or 1. A qubit, too, has states, denoted |0⟩ and |1⟩, which correspond to the classical 0 and 1. However, these quantum states reside in a vector space, allowing them to be combined linearly. This combination, a superposition of |0⟩ and |1⟩, is what gives qubits their unique power. [33] [34]
A two-dimensional vector mathematically describes a qubit's state. Physicists commonly use bra–ket notation for this linear algebra. A general qubit state is expressed as α|0⟩ + β|1⟩, where |0⟩ and |1⟩ are the standard basis states, [a] and α and β are the probability amplitudes – complex numbers, in general. [34] If either α or β is zero, the qubit behaves like a classical bit. When both are non-zero, the qubit is in a superposition. Unlike classical probabilities, these amplitudes can be negative, enabling destructive wave interference.
When a qubit is measured in the standard basis, it collapses into a classical bit. The Born rule dictates the probabilities: measuring α|0⟩ + β|1⟩ yields |0⟩ with probability |α|² and |1⟩ with probability |β|². Any valid qubit state satisfies the normalization condition |α|² + |β|² = 1. For instance, measuring (1/√2)|0⟩ + (1/√2)|1⟩ results in either |0⟩ or |1⟩ with equal probability.
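To make the Born rule concrete, here is a minimal NumPy sketch, added as an illustration rather than taken from the cited sources; the amplitudes and the sample count are arbitrary example choices.

```python
# Minimal illustration of the Born rule for a single qubit (assumed example values).
import numpy as np

rng = np.random.default_rng(0)

# Amplitudes for the state alpha|0> + beta|1>; they must satisfy |alpha|^2 + |beta|^2 = 1.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
state = np.array([alpha, beta], dtype=complex)
assert np.isclose(np.vdot(state, state).real, 1.0)   # normalization condition

# Born rule: outcome probabilities are the squared magnitudes of the amplitudes.
probabilities = np.abs(state) ** 2                    # [|alpha|^2, |beta|^2]
samples = rng.choice([0, 1], size=10_000, p=probabilities)

print(probabilities)                                  # [0.5 0.5]
print(np.bincount(samples) / samples.size)            # empirical frequencies near 0.5 each
```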
Each additional qubit exponentially expands the state space. A two-qubit system exists in a four-dimensional vector space spanned by the basis vectors |00⟩, |01⟩, |10⟩, and |11⟩. The Bell state (1/√2)|00⟩ + (1/√2)|11⟩ is a prime example of entanglement, where the individual qubits lose their distinct states. For an n-qubit system, the vector space is 2ⁿ-dimensional. Simulating this classically becomes an astronomical task; representing a mere 100-qubit system requires storing 2¹⁰⁰ classical values.
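The exponential growth of the state space can be checked numerically. The sketch below (an added illustration; the qubit counts are arbitrary) builds the two-qubit basis states and the Bell state with Kronecker products and estimates the memory a dense state vector would require.

```python
# Illustration of tensor-product state construction and the 2**n scaling of state vectors.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Two-qubit basis states are Kronecker products of single-qubit states.
ket00 = np.kron(ket0, ket0)
ket11 = np.kron(ket1, ket1)

# The Bell state (|00> + |11>)/sqrt(2) from the text:
bell = (ket00 + ket11) / np.sqrt(2)
print(bell)                         # [0.707 0 0 0.707]

# Memory needed to store a dense n-qubit state as 128-bit complex numbers:
for n in (10, 30, 100):
    amplitudes = 2 ** n
    print(f"{n} qubits -> {amplitudes} amplitudes (~{amplitudes * 16} bytes)")
```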
Unitary Operators
The state of a quantum memory can be manipulated using quantum logic gates, analogous to how classical logic gates operate. One such example is the quantum NOT gate, represented by the matrix

X := \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.

Applying a logic gate to a quantum state vector is done through matrix multiplication. Thus, X|0⟩ = |1⟩ and X|1⟩ = |0⟩.
These single-qubit gates can be extended to multi-qubit systems in two ways: applying the gate to a specific qubit while leaving the others untouched, or applying it conditionally based on the state of another qubit. The controlled NOT (CNOT) gate illustrates the latter. It applies a NOT gate to the second qubit only if the first qubit is in the state |1⟩; otherwise, it leaves both qubits unchanged. The CNOT gate's matrix representation is:
CNOT := \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 \end{pmatrix}.
Mathematically, the operations are: CNOT|00⟩ = |00⟩, CNOT|01⟩ = |01⟩, CNOT|10⟩ = |11⟩, and CNOT|11⟩ = |10⟩.
Essentially, quantum computation unfolds as a network of these quantum logic gates and measurements. However, measurements can be deferred until the end of the computation, leading to quantum circuit diagrams that often depict only the gates.
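As a concrete illustration of gates acting by matrix multiplication, the following NumPy sketch (added here; not drawn from the cited sources) reproduces X|0⟩ = |1⟩ and the CNOT truth table given above.

```python
# Applying the X and CNOT gates from the text to computational basis states.
import numpy as np

ket = {  # single-qubit computational basis vectors
    "0": np.array([1, 0], dtype=complex),
    "1": np.array([0, 1], dtype=complex),
}
X = np.array([[0, 1],
              [1, 0]], dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

print(X @ ket["0"])   # |1>, i.e. [0, 1]
print(X @ ket["1"])   # |0>, i.e. [1, 0]

# Two-qubit basis states via the Kronecker product, then CNOT's action on each.
labels = ["00", "01", "10", "11"]
for a in "01":
    for b in "01":
        out = CNOT @ np.kron(ket[a], ket[b])
        print(f"CNOT|{a}{b}> = |{labels[int(np.argmax(np.abs(out)))]}>")
```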
Quantum Parallelism
Quantum parallelism is the idea that quantum computers can, in a sense, evaluate a function for multiple inputs simultaneously. This is achieved by preparing a quantum system in a superposition of input states and applying a unitary transformation that encodes the function. The resulting state holds all the output values within the superposition. This property is crucial for many quantum algorithms' speedups. However, it's not a magic bullet; measuring the final state yields only one result. To be truly useful, quantum algorithms must employ additional techniques to extract the desired information. [37] [38]
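A small numerical sketch can illustrate the idea. Below, a uniform superposition over all 3-bit inputs is prepared with Hadamard gates, and a phase oracle encodes an arbitrary example function f in a single matrix application; this is an added illustration with assumed choices of n and f, and a measurement still returns only one input.

```python
# Illustration of "quantum parallelism": one oracle application acts on all inputs at once.
import numpy as np

n = 3
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Hadamard on every qubit: H (x) H (x) H.
Hn = H
for _ in range(n - 1):
    Hn = np.kron(Hn, H)

zero = np.zeros(2 ** n, dtype=complex)
zero[0] = 1.0                        # |000>
superposition = Hn @ zero            # amplitude 1/sqrt(8) on every input x

def f(x: int) -> int:                # arbitrary example Boolean function
    return int(x % 3 == 0)

# Phase oracle U_f: flips the sign of the amplitude of every x with f(x) = 1.
Uf = np.diag([(-1) ** f(x) for x in range(2 ** n)]).astype(complex)
state = Uf @ superposition

print(np.round(state.real, 3))       # the signs now encode f(x) for all 8 inputs,
                                     # but measuring yields only a single random x
```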
Quantum Programming
Gate Array
A quantum circuit is a sequence of few-qubit quantum gates. Think of it as a network of logic gates and measurements. Measurements can be postponed to the end, so circuits often show only gates. Any quantum computation, which is essentially any unitary matrix of size 2ⁿ × 2ⁿ acting on n qubits, can be constructed from a small set of universal gates. One common universal set consists of all single-qubit gates together with the CNOT gate. The Solovay–Kitaev theorem guarantees that such a finite gate set can efficiently approximate any gate drawn from the continuous set of unitaries. [39]
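To illustrate a circuit as a product of gates from such a universal set, the sketch below (an added example) composes a Hadamard on the first qubit with a CNOT; the resulting unitary prepares the Bell state discussed earlier.

```python
# A two-gate circuit, expressed as a single 4x4 unitary via matrix multiplication.
import numpy as np

I = np.eye(2, dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# "H on qubit 0, identity on qubit 1", then CNOT; gates compose right to left.
circuit = CNOT @ np.kron(H, I)

ket00 = np.zeros(4, dtype=complex)
ket00[0] = 1.0
print(np.round(circuit @ ket00, 3))   # ~[0.707, 0, 0, 0.707]: the Bell state (|00>+|11>)/sqrt(2)

# The composed circuit is itself unitary, as every n-qubit computation must be.
print(np.allclose(circuit.conj().T @ circuit, np.eye(4)))   # True
```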
Measurement-Based Quantum Computing
In this model, computation is a sequence of Bell state measurements and single-qubit quantum gates applied to a highly entangled initial state, like a cluster state, using quantum gate teleportation.
Adiabatic Quantum Computing
Adiabatic quantum computing, often associated with quantum annealing, involves a slow, continuous transformation of an initial Hamiltonian into a final one. The solution to the problem is encoded in the ground state of this final Hamiltonian. The adiabatic theorem ensures that if the transformation is slow enough, the system remains in its ground state throughout the process. [40]
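The following NumPy sketch illustrates the adiabatic idea on a toy problem; the Hamiltonians, cost values, and interpolation schedule are assumed example choices, and a real annealer evolves a physical system rather than diagonalizing matrices.

```python
# Toy adiabatic interpolation H(s) = (1-s)*H_initial + s*H_final on two qubits.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=float)
I = np.eye(2)

# Driver Hamiltonian: transverse field, whose ground state is easy to prepare.
H_initial = -(np.kron(X, I) + np.kron(I, X))

# Problem Hamiltonian: a diagonal cost over the bitstrings 00, 01, 10, 11.
costs = np.array([3.0, 1.0, 2.0, 0.5])        # arbitrary example; minimum at |11>
H_final = np.diag(costs)

for s in np.linspace(0.0, 1.0, 11):
    H = (1 - s) * H_initial + s * H_final
    eigenvalues, eigenvectors = np.linalg.eigh(H)
    gap = eigenvalues[1] - eigenvalues[0]      # adiabatic theorem: evolve slowly relative to this gap
    peak = int(np.argmax(np.abs(eigenvectors[:, 0])))
    print(f"s={s:.1f}  gap={gap:.3f}  ground state peaks at basis index {peak}")

# At s = 1 the ground state sits on index 3 (bitstring 11), the lowest-cost answer.
```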
Neuromorphic Quantum Computing
This unconventional approach combines neuromorphic computing with quantum operations. It's proposed that quantum algorithms can be efficiently executed using this method. Both traditional quantum computing and neuromorphic quantum computing are physics-based approaches that diverge from the von Neumann architecture, constructing systems that leverage physical properties to find solutions.
Topological Quantum Computing
Topological quantum computers perform computations by braiding anyons in a 2D lattice. [41]
Quantum Turing Machine
The quantum counterpart to a Turing machine is the quantum Turing machine. [8] Various models, including quantum circuits, [42] one-way quantum computation, [43] adiabatic quantum computation, [44] and topological quantum computation, [45] have been shown to be equivalent to the quantum Turing machine, meaning one can simulate the others with at most polynomial overhead. However, for practical quantum computers, this overhead might be prohibitive.
Noisy Intermediate-Scale Quantum Computing
The threshold theorem shows how increasing the number of qubits devoted to error correction can mitigate errors, but fully fault-tolerant quantum computing remains a distant aspiration. [47] For now, NISQ (Noisy Intermediate-Scale Quantum) machines are seen as having specialized uses, but noise in quantum gates limits their reliability. [47] Researchers at Harvard have developed "quantum circuits" that correct errors more efficiently, potentially overcoming a major hurdle. [48] This work involved MIT, QuEra Computing, Caltech, and Princeton University, with funding from DARPA. [49] [50]
Quantum Cryptography and Cybersecurity
Conventional digital cryptography relies on algorithms that are easy to compute but computationally hard to reverse. Quantum computers, with their potential to break these algorithms, threaten this security. [51] Quantum cryptography, on the other hand, uses quantum mechanics for encryption, making it theoretically impossible to decode, even with a quantum computer. This comes with significant infrastructure demands and raises questions about legitimate access. [51]
Research in quantum and post-quantum cryptography is yielding advancements in quantum key distribution, quantum random number generation, and early technology demonstrations. [52]
Communication
Quantum cryptography opens new avenues for secure data transmission, notably through quantum key distribution, which leverages entangled quantum states. [52] : 1017 The act of sending quantum states ensures that any eavesdropper would disturb the system, leaving a detectable trace. [53] This allows for the establishment of shared private information resistant to interception. [13] [54]
While fiber-optic cables can transmit quantum information over short distances, developing reliable hardware like quantum repeaters is crucial for long-distance quantum networks. Such networks could enable distributed quantum computing and enhanced quantum sensing. [55] [56]
Algorithms
The quest for quantum algorithms primarily focuses on the quantum circuit model, though exceptions like the quantum adiabatic algorithm exist. These algorithms are often categorized by the type of speedup they offer over their classical counterparts. [57]
Algorithms providing more than polynomial speedup include Shor's algorithm for factoring, and related algorithms for discrete logarithms, Pell's equation, and the hidden subgroup problem for finite abelian groups. These rely on the quantum Fourier transform. No classical algorithm matching this speedup has been found, and the discovery of one is considered unlikely. [57] Provable speedups for oracle problems like Simon's problem and the Bernstein–Vazirani problem exist in the quantum query model, but these don't always translate to practical advantages.
Other problems, such as simulating quantum systems in chemistry and solid-state physics, approximating Jones polynomials, and solving linear systems of equations via the quantum algorithm for linear systems of equations, show super-polynomial speedups and are BQP-complete. If a classical algorithm could solve any of these efficiently, then no BQP problem would admit a super-polynomial quantum speedup, which is generally considered unlikely. [59]
Quantum algorithms are also being explored for cryptography, optimization, and machine learning, though these applications are still in early research phases, requiring significant advances in error correction and hardware scalability. [60]
Algorithms like Grover's algorithm and amplitude amplification offer polynomial, specifically quadratic, speedups. [57] While modest, these speedups are widely applicable. [21] However, these are often against theoretical worst-case classical algorithms, and practical advantages are yet to be demonstrated.
Simulation of Quantum Systems
Given that chemistry and nanotechnology depend on understanding quantum systems, which are notoriously difficult to simulate classically, quantum simulation is poised to be a major application. [61] It could also shed light on phenomena in extreme conditions, such as within a collider. [62] In June 2023, IBM researchers reported that a quantum computer had outperformed a conventional supercomputer on a physics problem. [63] [64]
The production of ammonia for fertilizer via the Haber process consumes about 2% of global energy. [65] Quantum simulations could optimize this process, potentially improving energy efficiency. [66] Some predict this optimization by the mid-2020s, [67] while others foresee a longer timeline. [68]
Post-Quantum Cryptography
A significant implication of quantum computing is its potential to break current cryptographic systems. Integer factorization, the basis of public key cryptography, is considered intractable for classical computers for large numbers. [69] However, Shor's algorithm on a quantum computer could factor integers exponentially faster, rendering many current cryptographic systems vulnerable. [70] This would have profound consequences for electronic privacy and security, as algorithms like RSA, Diffie–Hellman, and elliptic curve Diffie–Hellman could be compromised.
The field of post-quantum cryptography is dedicated to identifying cryptographic systems resilient to quantum attacks. [71] [72] Some algorithms, like the McEliece cryptosystem, are based on problems in coding theory not known to be vulnerable to Shor's algorithm. [71] [73] Lattice-based cryptosystems also appear resistant, although solving the dihedral hidden subgroup problem remains an open challenge. [74] Grover's algorithm, while offering a quadratic speedup for brute-force attacks on symmetric (secret-key) algorithms, effectively halves the key length – AES-256 would offer security comparable to AES-128 against classical brute-force search. [75]
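As a back-of-the-envelope check of the "halved key length" statement, the arithmetic for a brute-force key search is:

```latex
% Classical exhaustive search over a k-bit key inspects O(2^k) candidates;
% Grover's algorithm needs only O(\sqrt{2^k}) = O(2^{k/2}) oracle queries.
\sqrt{2^{256}} = 2^{128}
\quad\Longrightarrow\quad
\text{AES-256 against Grover} \approx \text{AES-128 against classical brute force}
```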
Search Problems
The most prominent example of a problem with a polynomial quantum speedup is unstructured search: finding a specific item in a database of size n. Grover's algorithm achieves this in O(√n) queries, a quadratic improvement over the classical Ω(n). This speedup is provably optimal. [21] Other examples include collision finding in two-to-one functions [76] and evaluating NAND trees. [77]
Grover's algorithm is effective for problems with the following characteristics: [78] [79]
- No inherent structure in the search space.
- The number of possible answers matches the number of inputs.
- A Boolean function exists to verify the correct answer.
For such problems, Grover's algorithm scales with the square root of the input size, while classical algorithms scale linearly. This applies to problems like the Boolean satisfiability problem, including password cracking and breaking symmetric ciphers, which are of interest to intelligence agencies. [81]
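A compact NumPy sketch of Grover's algorithm on a toy search space shows the square-root scaling in action; this is an added illustration, and the database size and marked item are arbitrary choices.

```python
# Grover's algorithm for unstructured search over N = 8 items, done with dense matrices.
import numpy as np

N = 8                                    # database size (3 qubits)
marked = 5                               # index of the item being searched for

# Oracle: flips the sign of the marked item's amplitude.
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: reflection about the uniform superposition.
s = np.full(N, 1 / np.sqrt(N))
diffusion = 2 * np.outer(s, s) - np.eye(N)

state = s.copy()
iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))   # O(sqrt(N)) oracle queries
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

print(iterations, "iterations")
print("P(marked item) =", round(float(state[marked] ** 2), 3))   # ~0.95, versus 1/8 for a blind guess
```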
Quantum Annealing
Quantum annealing uses the adiabatic theorem to solve problems. A system starts in the ground state of a simple Hamiltonian and slowly evolves into a more complex one whose ground state encodes the solution. If the evolution is sufficiently slow, the system remains in its ground state. Quantum annealing can solve Ising models and the equivalent QUBO problem, which can represent numerous combinatorial optimization problems, potentially aiding in computational biology. [82] [83]
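To show the kind of objective an annealer targets, the sketch below writes a tiny QUBO (here encoding max-cut on a triangle, an assumed example) and minimizes it by classical brute force; an annealer would instead encode the same objective as the ground state of an Ising Hamiltonian and evolve toward it.

```python
# A tiny QUBO instance: minimize x^T Q x over binary vectors x.
import itertools
import numpy as np

# Upper-triangular Q encoding max-cut on a triangle (negated, so minima are the best cuts).
Q = np.array([[-2.0,  2.0,  2.0],
              [ 0.0, -2.0,  2.0],
              [ 0.0,  0.0, -2.0]])

best_bits, best_value = None, np.inf
for bits in itertools.product([0, 1], repeat=3):
    x = np.array(bits, dtype=float)
    value = float(x @ Q @ x)             # QUBO objective
    if value < best_value:
        best_bits, best_value = bits, value

print("minimum:", best_bits, best_value)  # e.g. (0, 0, 1) with value -2.0: a maximum cut of 2 edges
```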
Machine Learning
The ability of quantum computers to generate outputs intractable for classical computers, combined with their linear algebraic foundation, fuels hope for faster machine learning tasks. [47] [84]
The HHL Algorithm, for instance, is believed to offer speedups. [47] [85] Researchers are also exploring quantum annealing for training Boltzmann machines and deep neural networks. [86] [87] [88]
Quantum computers excel at solving complex quantum many-body problems [22], making them ideal for applications in quantum chemistry and drug discovery. Quantum-enhanced generative models, including quantum GANs [90], could revolutionize drug discovery by navigating the vast structural space of molecules.
Engineering
As of 2023, classical computers still outperform quantum computers for practical applications. While quantum computers can solve specific mathematical problems faster, they offer no general advantage. Developing scalable quantum architectures faces significant engineering hurdles. [91] [92]
Challenges
Building a large-scale quantum computer is fraught with challenges. David DiVincenzo outlined key requirements: [94]
- Scalability: The ability to increase the number of qubits.
- Initialization: Qubits must be reliably settable to a known initial state.
- Coherence: Quantum gates must operate faster than decoherence times.
- Universality: A universal gate set is required.
- Readout: Qubits must be easily readable.
Sourcing components is also a major obstacle. Superconducting quantum computers from Google and IBM require helium-3, a byproduct of nuclear research, and specialized superconducting cables from a single Japanese manufacturer. [95]
Controlling multi-qubit systems necessitates generating and synchronizing numerous electrical signals with precise timing. This has spurred the development of quantum controllers, but scaling them for larger qubit counts remains a challenge. [96]
Decoherence
The primary adversary in quantum computing is decoherence. Isolating the quantum system from its environment is crucial, as external interactions destroy the delicate quantum states. Other decoherence sources include quantum gates themselves, lattice vibrations, and the spin of the underlying physical system. Decoherence is irreversible and non-unitary. Decoherence times (specifically the transverse relaxation time T2) typically range from nanoseconds to seconds, often requiring qubits to be cooled to 20 millikelvin using dilution refrigerators. [97] [98] Even cosmic rays can cause decoherence in milliseconds. [100]
Maintaining qubit states for extended periods eventually corrupts the superpositions, limiting how long a computation can run. [101] Optical approaches face even shorter timescales. Error rates are typically proportional to the ratio of operating time to decoherence time, so every operation must complete well within the decoherence time.
The threshold theorem suggests that if error rates are sufficiently low, quantum error correction can suppress errors. This allows for longer computation times, provided errors are corrected faster than they are introduced. A common target error rate for fault-tolerant computation is 10⁻³, assuming depolarizing noise.
Achieving this scalability requires a massive increase in qubit numbers. Factoring a 1000-bit number using Shor's algorithm might need around 10⁴ qubits without error correction, but this balloons to 10⁷ qubits with it. [102] Estimates suggest that factoring a 2048-bit integer would require approximately 3 million physical qubits on a fully error-corrected trapped-ion quantum computer, taking about 5 months. [105]
One promising approach combines low-density parity-check codes with cat qubits to suppress bit-flip errors. [106]
Another strategy involves topological quantum computers using anyons – quasi-particles whose braiding patterns are inherently stable. [107] [108] Non-Abelian anyons, in particular, can "remember" their manipulation history, making them robust for logic gates. [109] Microsoft and others are investing in this research.
Quantum Supremacy
John Preskill coined the term quantum supremacy to describe the engineering feat of demonstrating a quantum device performing a computation beyond the reach of the most powerful classical computers. [110] [47] [111] The problem itself doesn't need to be practically useful; it's a benchmark. [112]
In October 2019, Google AI, with NASA's assistance, claimed quantum supremacy using the Sycamore quantum computer. They asserted it performed a calculation 3 million times faster than Summit, the then-fastest classical supercomputer. [28] [113] [114] IBM disputed this, claiming Summit could do it in 2.5 days. [115] [116] Subsequent research has even shown classical computers can beat Sycamore on this specific sampling task. [120] [121] [122]
In December 2020, a Chinese research group claimed quantum supremacy using a photonic quantum computer, Jiuzhang, performing Boson sampling on 76 photons. [123] [124] [125] They stated their machine could generate samples in 20 seconds that would take a supercomputer 600 million years. [126]
These supremacy claims, while impressive engineering feats, often rely on contrived benchmark tasks with no immediate practical applications. [91] [128]
In January 2024, a study in Physical Review Letters offered direct verification of quantum supremacy experiments by computing exact amplitudes for experimental bitstrings on a Sunway supercomputer, showcasing advanced simulation capabilities. This highlights the dynamic nature of quantum computing validation. [129]
Skepticism
Despite optimistic projections and hardware advancements, a 2023 Nature article summarized current quantum computers as being "For now, [good for] absolutely nothing." [91] While they haven't surpassed classical computers in usefulness or efficiency, their long-term potential remains. A 2023 Communications of the ACM article argued that current quantum algorithms are "insufficient for practical quantum advantage without significant improvements across the software/hardware stack." [92] While some applications in chemistry and materials science show promise, others like machine learning are unlikely to see quantum advantage soon due to I/O constraints and data size.
Several factors contribute to this:
- Classical hardware and algorithms are rapidly improving, especially with GPU accelerators.
- Current quantum hardware generates limited entanglement before succumbing to noise.
- Quantum algorithms offer speedups only for specific tasks, and matching these to practical applications is challenging. Many require resources far beyond current capabilities. [130] [131] Processing large non-quantum datasets is particularly difficult. [92]
- Some quantum algorithms have been "dequantized," meaning efficient classical analogues have been found.
- The overhead of quantum error correction, necessary for scaling, might negate quantum speedups. [92]
- Abstract assumptions in algorithm complexity analysis, like perfect oracle functions or pre-encoded quantum states, may not hold in practice.
The challenge lies not only in building more qubits but in connecting them effectively and maintaining entanglement over time. As researchers seek new tasks for quantum computers, the risk of efficient non-quantum solutions emerging remains. Establishing rigorous lower bounds for classical algorithms and proving quantum advantages over them is crucial.
Bill Unruh expressed doubts about quantum computer practicality in 1994. [132] Paul Davies suggested a 400-qubit computer might conflict with the cosmological information bound implied by the holographic principle. [133] Skeptics like Gil Kalai question whether quantum supremacy will ever be achieved. [134] [135] [136] Physicist Mikhail Dyakonov articulated his skepticism:
So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be... about 10³⁰⁰ ... Could we ever learn to control the more than 10³⁰⁰ continuously variable parameters defining the quantum state of such a system? My answer is simple. No, never. [137]
Physical Realizations
A functional quantum computer requires a physical system to act as a programmable quantum register. [139] Researchers are exploring several technologies for reliable qubit implementation, with superconductors and trapped ions being the most developed. [140] [141] Topological quantum computers are also being investigated for their fault-tolerance potential. [142]
The first quantum logic gates were demonstrated with trapped ions, leading to prototype machines with up to 20 qubits. However, integrating these systems, which involve complex vacuum equipment, lasers, and RF electronics, with standard computing infrastructure is difficult. [143]
The largest commercial systems, based on superconductors, have reached 2000 qubits, but with error rates around 5%. These cryogenic devices require wafer-scale integration, a significant engineering challenge. [144]
Potential Applications
From a business perspective, quantum computing's potential applications fall into four main categories: cybersecurity, data analytics and artificial intelligence, optimization and simulation, and data management and searching. [145] Other areas include healthcare (drug discovery), financial modeling, and natural language processing. [146]
Theory
Computability
Any problem solvable by a classical computer can, in principle, be solved by a quantum computer. [147] Conversely, any quantum computer can be simulated by a classical Turing machine. This means quantum computers don't expand the realm of what's computable; they cannot solve undecidable problems like the halting problem. The Church–Turing thesis remains intact. [148]
Complexity
While not increasing computability, quantum computers are suspected to excel in speed. The class of problems efficiently solvable by a quantum computer with bounded error is BQP. It's widely believed that BPP ⊆ BQP and suspected that the inclusion is strict (BPP ⊊ BQP), meaning quantum computers are likely faster than classical ones for certain tasks. [59]
The relationship of BQP to classical complexity classes:
- P is a subset of BQP, which is itself a subset of PSPACE. [59]
- It's suspected that problems exist that are efficiently solvable by quantum computers but not classical ones (i.e., BQP is a strict superset of P). Integer factorization and discrete logarithms are prime candidates.
- It's believed that NP is not a subset of BQP (i.e., NP problems are not all efficiently solvable by quantum computers). [151]
See Also
- D-Wave Systems – A quantum computing company.
- Electronic quantum holography – An information storage technology.
- Glossary of quantum computing
- IARPA – An American government agency.
- India's quantum computer – India's proposed quantum computer.
- IonQ – A US information technology company.
- List of emerging technologies
- List of quantum computing journals
- List of quantum processors
- Magic state distillation – A quantum computing algorithm.
- Metacomputing – Computing for the purpose of computing.
- Natural computing – An academic field.
- Non-local quantum computation – A method of quantum computing via entanglement.
- Optical computing – Computing using photons.
- Quantum bus – A device for storing or transferring quantum information.
- Quantum cognition – Applying quantum theory to cognitive phenomena.
- Quantum sensor – A device measuring quantum effects.
- Quantum volume – A metric for quantum computer capabilities.
- Quantum weirdness – The counterintuitive aspects of quantum mechanics.
- Rigetti Computing – An American quantum computing company.
- Supercomputer – A type of powerful computer.
- Theoretical computer science – A subfield of computer science and mathematics.
- Unconventional computing – Computing using novel methods.
- Valleytronics – An experimental area in semiconductors.
Notes
- ^ The standard basis is also known as the computational basis. [35]