Loss of Quantum Coherence
In classical physics, when a target body interacts with environmental photons, the scattered photons leave the body's average motion essentially unchanged. In the quantum world, however, when photons scatter off a target body that exists in a superposition of states, something more profound happens: the scattered photons become entangled with the target. This entanglement disperses the phase coherence, the delicate quantum property that makes interference possible, from the target body into the combined system, and the interference pattern that would reveal the superposition becomes unobservable.
Concept
In quantum mechanics, a physical system is described by a quantum state, a mathematical object from which the Born rule yields probabilities for experimental outcomes. Quantum states may be pure or mixed; for a pure state there is, in principle, some measurement whose outcome is certain (probability 1). An isolated quantum state evolves unitarily, so a pure state remains pure. But the universe rarely allows such pristine isolation. When a system interacts with its environment, as it does during any measurement, the coherence is shared with the environmental degrees of freedom rather than destroyed. This sharing of quantum information with the environment is called quantum decoherence, or environmental decoherence. The quantum coherence does not vanish; it becomes entangled with countless environmental degrees of freedom and is no longer accessible at the level of the system alone, much as energy apparently lost to classical friction reappears as heat in the surroundings.
Decoherence can thus be viewed as the loss of information from a system into its environment. [1] Viewed in isolation, the system's dynamics are non-unitary and effectively irreversible, even though the combined system and environment evolve unitarily. [2]
History and Interpretation
Relation to Interpretation of Quantum Mechanics
Every attempt to bridge the gap between the abstract mathematics of quantum physics and our lived reality is an interpretation of quantum mechanics. [3] Decoherence calculations, being purely mathematical constructs, can be performed within any interpretational framework. However, the very nature of decoherence has inextricably linked it to the philosophical quandaries of interpretation from its inception. [4] [5]
Decoherence offers a compelling explanation for the apparent collapse of the wave function. It doesn't cause the collapse, but rather shows how the system’s components, through entanglement with the environment, become decoupled from a coherent system. They acquire their own phases from their immediate surroundings. While a global, universal wavefunction might still exist in its coherent state, its ultimate fate – whether it splits, collapses, or simply persists – remains a subject of intense interpretational debate. [4]
In the context of the measurement problem, decoherence explains how a quantum system transitions into a mixture of states that matches what observers actually perceive; in a measurement situation this mixture behaves like a proper ensemble, with each measurement realizing exactly one state from it.
The philosophical views of Werner Heisenberg and Niels Bohr, often grouped together as the "Copenhagen interpretation" despite their subtle differences, sit comfortably alongside decoherence. [6] [7] Heisenberg suggested as early as 1955 that environmental interactions would destroy quantum interference. [8] He identified the mechanism but did not fully articulate the crucial role of entanglement.
Origin of the Concepts
The first glimpse of what we now call quantum decoherence appeared in 1929, in Nevill Mott's solution to the Mott problem. [9] It was a precursor, later cited in the first modern theoretical treatment of the phenomenon. [10]
The term itself, however, was absent. In 1951, David Bohm, a physicist whose ideas often ran counter to the mainstream, introduced the concept, calling it the "destruction of interference in the process of measurement." [11] [12] Bohm later leveraged decoherence to articulate his unique take on the measurement process within the de Broglie–Bohm theory. [13]
The true significance of decoherence, however, was amplified in 1970 by the German physicist H. Dieter Zeh, [14] sparking a field of research that continues to this day. [15] While Zeh developed a comprehensive framework, the question of whether it truly solves the measurement problem remains a point of contention, even among its proponents. [16]
Zeh’s seminal 1970 paper, "On the Interpretation of Measurement in Quantum Theory," laid crucial groundwork. [4] [14] He viewed the wavefunction not as a mere calculational tool, but as a physical entity that must, at all times, evolve unitarily according to the Schrödinger equation. Unbeknownst to Zeh at the time, Hugh Everett III had already proposed a similar idea of a universally evolving wavefunction. [17] Zeh later incorporated Everett's "relative-state formulation" (popularized by Bryce DeWitt as the many-worlds interpretation) into his work. [4] For Zeh, the interpretation of quantum mechanics was paramount, and Everett's approach offered a natural fit. The scientific community's general disinterest in interpretational debates meant Zeh’s work languished until the early 1980s, when two pivotal papers by Wojciech Zurek breathed new life into the subject. [18] [19] Zurek, less concerned with interpretation and more with the precise dynamics of the density matrix, built upon Bohr's analysis of the double-slit experiment in response to the Einstein–Podolsky–Rosen paradox. [20] Zurek himself has suggested that decoherence offers a bridge between the Everettian and Copenhagen viewpoints. [4] [21]
Decoherence doesn't claim to cause wave-function collapse. Instead, it provides a framework for understanding why it appears to happen. The quantum system's information becomes entangled with the environment, effectively taking it beyond our practical reach. [22] [23] The argument that this unmeasurable, merged wavefunction still exists is, by definition, beyond empirical verification. Decoherence is crucial for understanding how quantum systems, after interacting with their environment, begin to adhere to classical probability rules, primarily through the suppression of interference terms when applying Born's probability rules.
However, some, like Anthony Leggett, have voiced skepticism about decoherence's ability to fully resolve the measurement problem. [24] [25]
Mechanisms
To build intuition for decoherence, let us set aside some mathematical rigor and work with a more accessible model. We first draw analogies between classical phase space and its quantum counterpart, the Hilbert space. We then use the more formal Dirac notation to show how decoherence destroys interference and the essentially quantum character of a system. Finally, we turn to the density-matrix approach for a broader perspective.
[Figure: quantum superposition of states and decoherence measurement through Rabi oscillations]
Phase-Space Picture
In non-relativistic quantum mechanics, an $N$-particle system is described by a wave function $\psi(x_1, x_2, \ldots, x_N)$, where each $x_i$ is a point in three-dimensional space. This bears a resemblance to the classical phase space, the $6N$-dimensional space defined by the positions and momenta of the $N$ particles. The quantum version, however, operates within a Hilbert space, a more abstract mathematical structure in which positions and momenta are represented by operators that do not necessarily commute, and in which $\psi$ is a complex-valued function on configuration space. Despite these differences, the analogy is useful.
Previously isolated, non-interacting systems occupy distinct regions, or subspaces, of this phase space. When these systems begin to interact, their state vectors are no longer confined to those subspaces. Instead, the combined state vector explores a larger volume of phase space, whose dimensionality is the product of the dimensions of the individual subspaces. The degree to which two vectors can interfere is a measure of their proximity in this phase space. When a system couples to its environment, the dimensionality of the combined space expands enormously, with each environmental degree of freedom contributing additional dimensions.
A system's wave function can be decomposed into a quantum superposition of basis states. Each decomposition corresponds to a projection onto a particular basis. If these basis elements interact with the environment in distinct ways, they will rapidly diverge from each other due to their independent unitary time evolution. After a brief interaction, the probability of them interfering again becomes vanishingly small; the process is effectively irreversible. These disparate elements become "lost" in the expanded phase space, a phenomenon that can be tracked using the Wigner quasi-probability distribution. This selection of basis elements that rapidly lose phase coherence is termed "environmentally-induced superselection," or einselection. [26] The decohered elements no longer exhibit quantum interference, much like in a double-slit experiment. These decohered elements are now quantumly entangled with the environment.
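The role of the Wigner quasi-probability distribution mentioned above can be made concrete numerically. The following is a minimal sketch, not from the source: it computes the Wigner function of a superposition of two Gaussian wavepackets and compares it with the corresponding decohered mixture. The separation `x0`, the grids, and the use of unnormalized states are illustrative choices.

```python
import numpy as np

# Phase-space view of a "cat" state: two Gaussian wavepackets at +/- x0.
# The oscillatory cross term of the Wigner function near x = 0 is the
# phase coherence that decoherence suppresses (hbar = 1).
x0 = 3.0
ys = np.linspace(-10, 10, 2001)      # integration variable
dy = ys[1] - ys[0]

def gauss(x, c):
    return np.exp(-(x - c) ** 2 / 2.0)

def wigner(psi, x, p):
    """W(x, p) = (1/pi) * integral dy  psi*(x+y) psi(x-y) exp(2ipy)."""
    integrand = np.conj(psi(x + ys)) * psi(x - ys) * np.exp(2j * p * ys)
    return np.real(np.sum(integrand) * dy) / np.pi

cat = lambda x: gauss(x, +x0) + gauss(x, -x0)    # coherent superposition

ps = np.linspace(-3, 3, 201)
W_cat = np.array([wigner(cat, 0.0, p) for p in ps])
W_mix = np.array([wigner(lambda x: gauss(x, +x0), 0.0, p)
                  + wigner(lambda x: gauss(x, -x0), 0.0, p) for p in ps])

# The pure superposition oscillates and goes negative at x = 0 (quantum
# interference); the decohered mixture of the two humps stays >= 0 there.
print("cat     min over p at x = 0:", W_cat.min())
print("mixture min over p at x = 0:", W_mix.min())
```

The negative fringes of the pure state are exactly the feature that gets "lost" in the enlarged phase space once the environment is traced out.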
Any measuring device, by its very nature, acts as an environment. It must possess a vast number of degrees of freedom to record observations. The interaction between the system and the measuring device leads to entanglement. Decoherence occurs when different parts of the system's wave function become entangled in different ways with the measuring device. For interference to occur between two einselected elements of the entangled state, both the system and the measuring device must significantly overlap. With a multi-degree-of-freedom measuring device, this overlap becomes highly improbable.
Consequently, the system behaves as a classical statistical ensemble of these elements rather than as a single coherent quantum superposition. From the perspective of the measuring device associated with each ensemble member, the system appears to have irreversibly collapsed onto a state with a definite value of the measured attributes. This explains how the Born rule coefficients can function as probabilities, offering one proposed route to resolving the quantum measurement problem.
Dirac Notation
Using Dirac notation, let the system initially be in the state

|\psi\rangle = \sum_i |i\rangle \langle i|\psi\rangle,

where the $|i\rangle$ form an einselected basis, and let the environment initially be in the state $|\epsilon\rangle$. The basis for the combined system and environment consists of the tensor products $|i\rangle \otimes |\epsilon\rangle = |i, \epsilon\rangle$. Before interaction, the joint state is

|\text{before}\rangle = \sum_i |i, \epsilon\rangle \langle i|\psi\rangle.
There are two extreme ways the system can interact with its environment:
- The system loses its identity and merges with the environment.
- The system remains undisturbed, while the environment is affected.
In general, the interaction is a mixture of these extremes.
System Absorbed by Environment
If the environment absorbs the system, each basis element $|i\rangle|\epsilon\rangle$ evolves into an environment state $|\epsilon_i\rangle$, and the joint state becomes

|\text{after}\rangle = \sum_i |\epsilon_i\rangle \langle i|\psi\rangle.

Unitary time evolution requires that the total state basis remain orthonormal, meaning $\langle\epsilon_i|\epsilon_j\rangle = \delta_{ij}$. This orthonormality of the environment states is the defining characteristic for einselection. [26]
System Not Disturbed by Environment
In an idealized measurement, the system is unaffected, but the environment is disturbed: each basis element $|i\rangle|\epsilon\rangle$ evolves into the product $|i\rangle|\epsilon_i\rangle$. The joint state becomes

|\text{after}\rangle = \sum_i |i, \epsilon_i\rangle \langle i|\psi\rangle.

Unitary evolution demands $\langle i|j\rangle\langle\epsilon_i|\epsilon_j\rangle = \delta_{ij}$, with $\langle\epsilon_i|\epsilon_i\rangle = 1$. The crucial condition for decoherence, arising from the environment's vast number of degrees of freedom, is

\langle\epsilon_i|\epsilon_j\rangle \approx \delta_{ij}.

This approximation becomes more exact as the number of environmental degrees of freedom increases. [26]
If the system basis $|i\rangle$ were not einselected, the disturbed environment would not depend on $i$: there would be a single disturbed environment state $|\epsilon'\rangle$, and the condition above could not single out a preferred basis. This would imply the system basis is degenerate with respect to the environmentally defined measurement observable. For complex environmental interactions (as expected for typical macroscale interactions), a non-einselected basis is difficult to define.
Loss of Interference and the Transition from Quantum to Classical Probabilities
The power of decoherence lies in its ability to explain the vanishing of quantum interference terms and the transition from quantum amplitudes to classical probabilities. Consider the probability of a transition from $|\psi\rangle$ to $|\phi\rangle$ before environmental interaction. According to the Born probability rule, this is the modulus squared of the scalar product. Expanding this, we get:

\text{prob}_\text{before}(\psi \to \phi) = |\langle\psi|\phi\rangle|^2 = \Big|\sum_i \psi_i^* \phi_i\Big|^2 = \sum_i |\psi_i^* \phi_i|^2 + \sum_{ij;\, i \neq j} \psi_i^* \psi_j \phi_j^* \phi_i,

where $\psi_i = \langle i|\psi\rangle$, $\psi_i^* = \langle\psi|i\rangle$, and $\phi_i = \langle i|\phi\rangle$. The terms with $i \neq j$ represent quantum interference, a purely quantum phenomenon.

Now, let's calculate the probability after $|\psi\rangle$ has interacted with the environment. We sum over all possible environmental states $|\epsilon_j\rangle$ before squaring the modulus:

\text{prob}_\text{after}(\psi \to \phi) = \sum_j \Big|\sum_i \psi_i^* \phi_i \langle\epsilon_i|\epsilon_j\rangle\Big|^2.

Applying the decoherence condition $\langle\epsilon_i|\epsilon_j\rangle \approx \delta_{ij}$, the formula simplifies dramatically:

\text{prob}_\text{after}(\psi \to \phi) \approx \sum_i |\psi_i^* \phi_i|^2.

Comparing this to the pre-decoherence formula, we see that the summation sign has moved outside the modulus. The cross terms, the interference terms $\psi_i^* \psi_j \phi_j^* \phi_i$ with $i \neq j$, have vanished. Decoherence has irreversibly transformed quantum behavior (additive probability amplitudes) into classical behavior (additive probabilities). [26] [27] [28]
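For readers who like to check such formulas numerically, here is a small sketch (my illustration, with randomly chosen states; the system and environment dimensions are arbitrary). Nearly orthogonal random vectors in a large space stand in for the einselected environment records $|\epsilon_i\rangle$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 4096                       # system dim, environment dim (m >> n)

def rand_state(d):
    v = rng.normal(size=d) + 1j * rng.normal(size=d)
    return v / np.linalg.norm(v)

psi, phi = rand_state(n), rand_state(n)   # psi_i = <i|psi>, phi_i = <i|phi>
eps = np.array([rand_state(m) for _ in range(n)])   # records |eps_i>

amps = np.conj(psi) * phi                 # the amplitudes psi_i^* phi_i

# Before: square the *summed* amplitude -> interference terms survive.
prob_before = abs(amps.sum()) ** 2

# After: sum over environment states *before* squaring the modulus.
overlaps = eps.conj() @ eps.T             # <eps_i|eps_j> ~ delta_ij
prob_after = sum(abs(amps @ overlaps[:, j]) ** 2 for j in range(n))

print("prob_before (with interference):", prob_before)
print("prob_after  (decohered)        :", prob_after)
print("classical sum of |psi_i* phi_i|^2:", np.sum(abs(amps) ** 2))
```

With a large environment dimension, `prob_after` lands close to the classical sum, while `prob_before` differs from it by exactly the interference terms.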
However, some argue that the impact of decoherence on interference might not be as significant for the transition to classical limits as initially thought. [29]
In terms of density matrices, the loss of interference corresponds to the diagonalization of the "environmentally traced-over" density matrix. [26]
Density-Matrix Approach
In the language of density matrices, decoherence manifests as the decay, or near-instantaneous vanishing, of the off-diagonal elements of the partial trace of the combined system-environment density matrix. Tracing out the environment effectively transforms the pure state of the combined system into a reduced mixture for the system alone, creating the illusion of wave-function collapse. This is, again, environmentally-induced superselection, or einselection. [26] The advantage here is that the partial trace is independent of the specific environmental basis chosen.
Initially, the density matrix of the combined system is

\rho_\text{before} = |\psi\rangle\langle\psi| \otimes |\epsilon\rangle\langle\epsilon|,

where $|\epsilon\rangle$ is the environment's state. If no interaction occurs, tracing out the environment leaves the reduced density matrix for the system as $\rho_\text{sys} = |\psi\rangle\langle\psi|$. The transition probability is then

\text{prob}_\text{before}(\psi \to \phi) = \langle\phi|\rho_\text{sys}|\phi\rangle = |\langle\phi|\psi\rangle|^2,

which includes the interference terms.

Now, consider the situation after interaction. The combined density matrix is

\rho_\text{after} = |\text{after}\rangle\langle\text{after}| = \sum_{i,j} \psi_i \psi_j^*\, |i, \epsilon_i\rangle\langle j, \epsilon_j|.

Tracing out the environment and applying the decoherence/einselection condition $\langle\epsilon_i|\epsilon_j\rangle \approx \delta_{ij}$ (as shown by Erich Joos and H. D. Zeh in 1985) [30] yields a diagonal system density matrix:

\rho_\text{sys} \approx \sum_i |\psi_i|^2\, |i\rangle\langle i|.

The transition probability then becomes

\text{prob}_\text{after}(\psi \to \phi) = \langle\phi|\rho_\text{sys}|\phi\rangle \approx \sum_i |\psi_i|^2 |\phi_i|^2 = \sum_i |\psi_i^* \phi_i|^2,

which conspicuously lacks the interference terms $\psi_i^* \psi_j \phi_j^* \phi_i$ with $i \neq j$. The density matrix approach has been integrated with the Bohmian approach to create a reduced-trajectory approach that accounts for the system's reduced density matrix and the environment's influence. [31]
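The same diagonalization can be watched directly with a partial trace. The sketch below is an illustration under the same assumptions (random states, modest dimensions chosen for speed): it builds the combined pure state, traces out the environment, and inspects the reduced density matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 64                # system dim, environment dim (m >> n)

psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)

# Random environment records |eps_i>: high-dimensional random vectors are
# nearly orthogonal, which is exactly the einselection condition.
eps = rng.normal(size=(n, m)) + 1j * rng.normal(size=(n, m))
eps /= np.linalg.norm(eps, axis=1, keepdims=True)

# |after> = sum_i psi_i |i> (x) |eps_i>, stored as an (n*m)-vector
after = np.zeros(n * m, dtype=complex)
for i in range(n):
    vec = np.zeros(n); vec[i] = 1.0
    after += psi[i] * np.kron(vec, eps[i])

rho = np.outer(after, after.conj())                 # pure combined state
# Partial trace over the environment -> reduced density matrix of system
rho_sys = np.trace(rho.reshape(n, m, n, m), axis1=1, axis2=3)

print("diagonal      :", np.round(rho_sys.diagonal().real, 3))  # = |psi_i|^2
# Off-diagonals are suppressed by the ~1/sqrt(m) overlap of random records:
print("off-diag size :", round(abs(rho_sys[0, 1]), 3))
```

Increasing `m` (more environmental degrees of freedom) drives the off-diagonal elements toward zero, which is the einselection statement in density-matrix form.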
Operator-Sum Representation
Imagine a closed quantum system composed of a system $S$ and its environment (bath) $B$, with respective Hilbert spaces $\mathcal{H}_S$ and $\mathcal{H}_B$. The total Hamiltonian is

H = H_S \otimes I_B + I_S \otimes H_B + H_I,

where $H_S$ and $H_B$ are the system and bath Hamiltonians, $H_I$ is the interaction Hamiltonian, and $I_S$, $I_B$ are the identity operators on the two spaces. The time evolution of the combined density operator is unitary:

\rho_{SB}(t) = U(t)\, \rho_{SB}(0)\, U^\dagger(t), \qquad U(t) = e^{-iHt/\hbar}.

If the system and bath are initially unentangled, $\rho_{SB}(0) = \rho_S(0) \otimes \rho_B(0)$. The system's evolution then becomes

\rho_S(t) = \operatorname{Tr}_B\big[U(t)\,(\rho_S(0) \otimes \rho_B(0))\,U^\dagger(t)\big].

The system-bath interaction Hamiltonian can be written in the general form

H_I = \sum_i S_i \otimes B_i,

where the $S_i$ and $B_i$ act on the system and bath, respectively. This coupling is the root of decoherence. Tracing over the bath yields the reduced density matrix for the system, $\rho_S(t)$. If the bath is written in a diagonal basis, $\rho_B(0) = \sum_a \lambda_a |a\rangle\langle a|$, then the system's reduced density matrix evolves as

\rho_S(t) = \sum_l E_l(t)\, \rho_S(0)\, E_l^\dagger(t),

where the $E_l(t) = \sqrt{\lambda_a}\,\langle b|U(t)|a\rangle$ (a matrix element taken in the bath space, leaving an operator on the system, with $l$ running over the index pairs $(a, b)$) are the Kraus operators. The condition $\sum_l E_l^\dagger E_l = I_S$ must hold. If there is more than one term in the sum, the system's dynamics are non-unitary, and decoherence occurs.
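As a concrete instance of the operator-sum representation, the sketch below uses the standard single-qubit phase-damping channel; the strength `lam` is an illustrative parameter, not something fixed by the text.

```python
import numpy as np

# Operator-sum (Kraus) evolution for single-qubit phase damping.
lam = 0.3                                # illustrative strength in [0, 1]
K0 = np.array([[1, 0], [0, np.sqrt(1 - lam)]], dtype=complex)
K1 = np.array([[0, 0], [0, np.sqrt(lam)]], dtype=complex)
kraus = [K0, K1]

# Completeness: sum_l K_l^dagger K_l = I  (trace preservation)
assert np.allclose(sum(K.conj().T @ K for K in kraus), np.eye(2))

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+> = (|0>+|1>)/sqrt 2
rho = np.outer(plus, plus.conj())

for step in range(3):
    rho = sum(K @ rho @ K.conj().T for K in kraus)
    print(f"after {step + 1} applications, coherence = {abs(rho[0, 1]):.4f}")
# Populations stay fixed while the off-diagonal element shrinks by a factor
# sqrt(1 - lam) per application: non-unitary, purely decohering dynamics.
```

With more than one Kraus operator (here two), the map cannot be undone by any unitary, which is the operator-sum signature of decoherence.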
Semigroup Approach
A more general framework for decoherence is provided by the master equation, which describes the time evolution of the system's density matrix:

\frac{d}{dt}\rho_{S}(t) = -\frac{i}{\hbar}{\big [}{\tilde{H}}_{S}, \rho_{S}(t){\big ]} + L_{D}[\rho_{S}(t)],

where ${\tilde{H}}_S$ is the system Hamiltonian (with a possible unitary contribution from the bath) and $L_D$ is the Lindblad decohering term. [2] The Lindblad term is

L_{D}[\rho_{S}(t)] = \frac{1}{2}\sum_{\alpha,\beta=1}^{M}b_{\alpha \beta }{\Big (}{\big [}\mathbf{F}_{\alpha },\rho_{S}(t)\mathbf{F}_{\beta }^{\dagger }{\big ]}+{\big [}\mathbf{F}_{\alpha }\rho_{S}(t),\mathbf{F}_{\beta }^{\dagger }{\big ]}{\Big )}.

Here, the $\mathbf{F}_\alpha$ are basis operators for the $M$-dimensional space of bounded operators on the system's Hilbert space, and the $b_{\alpha\beta}$ are noise parameters characterizing the decohering processes. [35] The semigroup approach elegantly separates unitary and non-unitary (decohering) evolution: when all $b_{\alpha\beta} = 0$, the evolution is purely unitary. The master equation applies under specific conditions: a one-parameter semigroup evolution, complete positivity (probability preservation), and initial decoupling of the system and bath density matrices. [2]
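A minimal numerical integration of this master equation is sketched below, assuming a single decohering operator $\mathbf{F} = \sigma_z$ with an illustrative rate `gamma` (so the matrix $b_{\alpha\beta}$ reduces to one noise parameter); `omega` and the time step are likewise illustrative.

```python
import numpy as np

# Euler integration of the master equation for one qubit with
# H_S = (omega/2) sigma_z and a single decohering operator F = sigma_z.
omega, gamma, dt, steps = 1.0, 0.2, 0.001, 5000
sz = np.diag([1.0, -1.0]).astype(complex)
H = 0.5 * omega * sz

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

def lindblad_rhs(rho):
    unitary = -1j * (H @ rho - rho @ H)          # hbar = 1
    # Single-operator Lindblad term gamma (F rho F^+ - {F^+F, rho}/2),
    # which for F = sigma_z (F^+F = I) reduces to:
    decohere = gamma * (sz @ rho @ sz - rho)
    return unitary + decohere

for _ in range(steps):
    rho = rho + dt * lindblad_rhs(rho)

t = steps * dt
print("coherence |rho_01| :", abs(rho[0, 1]))
print("analytic prediction:", 0.5 * np.exp(-2 * gamma * t))
print("populations        :", np.round(rho.diagonal().real, 4))  # unchanged
```

The off-diagonal element decays as $e^{-2\gamma t}$ while the populations stay put: pure dephasing, the simplest non-unitary term the semigroup formalism can produce.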
Non-Unitary Modeling Examples
Decoherence models the non-unitary process where a system interacts with its environment. While the combined system and environment evolve unitarily, the system alone experiences irreversible transformations, leading to information loss. [2] This loss of quantum information is the essence of decoherence. [1]
Rotational Decoherence
Consider $N$ qubits coupled symmetrically to a bath. Suppose these qubits undergo a rotation around the $\hat{z}$ axis, whose eigenstates are $|0\rangle$ and $|1\rangle$; a random phase $\phi$ then emerges between these eigenstates:

|0\rangle \to |0\rangle, \qquad |1\rangle \to e^{i\phi}|1\rangle.

This transformation is governed by the rotation operator $R_z(\phi)$, and any qubit state $a|0\rangle + b|1\rangle$ transforms to $a|0\rangle + b\,e^{i\phi}|1\rangle$.

Because the phase $\phi$ is random rather than known, the state must be described by averaging the density matrix over the possible phases:

\rho = \int R_z(\phi)\, |\psi\rangle\langle\psi|\, R_z^\dagger(\phi)\, p(\phi)\, d\phi,

where $p(\phi)$ is a probability measure for $\phi$. Assuming a Gaussian distribution with zero mean and variance $\sigma^2$, the density matrix becomes

\rho = \begin{pmatrix} |a|^2 & a b^* e^{-\sigma^2/2} \\ a^* b\, e^{-\sigma^2/2} & |b|^2 \end{pmatrix}.

Notice how the off-diagonal elements (the coherence terms) decay as $\sigma^2$ increases. The qubits' density matrices become indistinguishable from classical mixtures, leading to decoherence. This "collective dephasing" destroys the mutual phases between the qubits: in the limit of large $\sigma^2$, each qubit behaves as a classical mixture of $|0\rangle$ and $|1\rangle$.
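The Gaussian averaging above is easy to reproduce by Monte Carlo. This sketch (with illustrative amplitudes, `sigma`, and sample count) applies $R_z(\phi)$ with random phases and compares the averaged off-diagonal element against the predicted factor $e^{-\sigma^2/2}$.

```python
import numpy as np

# Monte Carlo version of the phase-averaging integral: apply R_z(phi)
# with Gaussian-random phi and average the resulting density matrices.
rng = np.random.default_rng(2)
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)        # |psi> = a|0> + b|1>
psi = np.array([a, b], dtype=complex)
sigma = 1.0

rho_avg = np.zeros((2, 2), dtype=complex)
samples = 100_000
for phi in rng.normal(0.0, sigma, size=samples):
    Rz = np.diag([1.0, np.exp(1j * phi)])    # |0> -> |0>, |1> -> e^{i phi}|1>
    out = Rz @ psi
    rho_avg += np.outer(out, out.conj())
rho_avg /= samples

print("averaged |rho_01|           :", abs(rho_avg[0, 1]))
print("predicted |a b*| e^{-s^2/2} :", abs(a * b) * np.exp(-sigma**2 / 2))
```

The populations are untouched by the averaging; only the mutual phase information is lost.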
Depolarizing
Depolarizing is a non-unitary process that transforms pure states into mixed states. Its inverse would map some density matrices outside the space of valid (positive) states, so the process is not reversible. On the Bloch sphere, depolarizing contracts pure states on the surface toward mixed states in the interior.
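A minimal sketch of this contraction, using the textbook depolarizing map $\rho \to (1-p)\,\rho + p\,I/2$ with an illustrative error probability `p`:

```python
import numpy as np

# Depolarizing channel: rho -> (1 - p) rho + p I/2, which contracts the
# Bloch vector by a factor (1 - p) per application.
p = 0.25
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.diag([1.0, -1.0]).astype(complex)

def depolarize(rho):
    return (1 - p) * rho + p * I2 / 2

def bloch_length(rho):
    r = [np.trace(rho @ s).real for s in (sx, sy, sz)]
    return np.linalg.norm(r)

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())          # pure: on the sphere, |r| = 1
for step in range(3):
    rho = depolarize(rho)
    print(f"step {step + 1}: |r| = {bloch_length(rho):.4f}, "
          f"purity = {np.trace(rho @ rho).real:.4f}")
# |r| shrinks by (1 - p) each step; the state moves inside the sphere.
```

No unitary can re-inflate the Bloch vector of every state at once, which is the geometric picture of irreversibility here.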
Dissipation
Dissipation is a decohering process in which the populations of quantum states change due to entanglement with a bath. If a system can exchange energy with a bath at a lower temperature, it will cool down, with higher-energy eigenstates decohering into the ground state. Since these eigenstates are non-degenerate, they are distinguishable, rendering the process irreversible.
Timescales
For macroscopic objects, decoherence is an extraordinarily rapid process. These objects interact with countless microscopic entities, possessing an immense number of environmental degrees of freedom. This rapid decoherence explains why we don't observe quantum phenomena in everyday life and why classical fields emerge from the interaction of matter and radiation on a large scale. The time it takes for the off-diagonal components of the density matrix to vanish is the decoherence time, typically minuscule for macroscopic processes. [26] [27] [28] A modern, basis-independent definition quantifies this time by examining the short-time behavior of fidelity or purity decay. [36] [37]
Mathematical Details
Let's consider a system $A$ and its environment $B$, with a combined Hilbert space $\mathcal{H}_A \otimes \mathcal{H}_B$. This is a reasonable approximation when $A$ and $B$ are largely independent. The interaction with the environment is practically unavoidable; even an excited atom emits a photon. Let this interaction be described by a unitary transformation $U$ acting on $\mathcal{H}_A \otimes \mathcal{H}_B$. Suppose the initial state of the environment is $|\epsilon\rangle$, and system $A$ is in a superposition, so the combined initial state is

(c_1 |\psi_1\rangle + c_2 |\psi_2\rangle) \otimes |\epsilon\rangle,

where $|\psi_1\rangle$ and $|\psi_2\rangle$ are orthogonal, and there is no initial entanglement.

We can expand $U(|\psi_1\rangle \otimes |\epsilon\rangle)$ and $U(|\psi_2\rangle \otimes |\epsilon\rangle)$ in an orthonormal basis $\{|e_i\rangle\}$ for $\mathcal{H}_A$ as $\sum_i |e_i\rangle \otimes |f_{1i}\rangle$ and $\sum_i |e_i\rangle \otimes |f_{2i}\rangle$, respectively, where the $|f_{1i}\rangle$ and $|f_{2i}\rangle$ are (unnormalized) environment states. Given the vast number of environmental degrees of freedom, it is reasonable to assume that these environment states are approximately orthogonal to each other and to the states arising from the other component: $\langle f_{mi}|f_{nj}\rangle \approx 0$ unless $m = n$ and $i = j$. This is the decoherence property.

Taking the partial trace over the environment, the density state of $A$ is approximately diagonal:

\rho_A \approx \sum_i \big(|c_1|^2 \langle f_{1i}|f_{1i}\rangle + |c_2|^2 \langle f_{2i}|f_{2i}\rangle\big)\, |e_i\rangle\langle e_i|.

This represents a diagonal mixed state, devoid of interference, where probabilities add classically. The time it takes for $U(t)$ to produce this decoherence property is the decoherence time.
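Where does the decoherence property come from? A toy model, sketched below under the assumption that each of $N$ environment particles is rotated by a small illustrative angle `theta` only when the system is in its second component, shows the environment overlap shrinking exponentially with $N$.

```python
import numpy as np

# Each of N environment "particles" responds slightly differently to the
# two system components, so the overlap <f_1|f_2> of the two environment
# states decays exponentially in N: the decoherence property.
theta = 0.2   # per-particle disturbance angle (illustrative)

def env_state(angle, N):
    """N-fold tensor product of single-particle states rotated by angle."""
    single = np.array([np.cos(angle), np.sin(angle)])
    state = np.array([1.0])
    for _ in range(N):
        state = np.kron(state, single)
    return state

for N in (1, 5, 10, 20):
    f1 = env_state(0.0, N)        # environment scattered off |psi_1>
    f2 = env_state(theta, N)      # environment scattered off |psi_2>
    print(f"N = {N:2d}: |<f_1|f_2>| = {abs(f1 @ f2):.6f}"
          f"  (= cos(theta)^N = {np.cos(theta) ** N:.6f})")
```

Even a tiny per-particle disturbance drives the overlap, and with it the off-diagonal elements of $\rho_A$, exponentially toward zero as the environment grows.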
Experimental Observations
Quantitative Measurement
The rate of decoherence, influenced by factors like temperature and positional uncertainty, has been a subject of numerous experiments. [38]
In 1996, Serge Haroche and his team at the École Normale Supérieure in Paris provided the first quantitative measurement of decoherence gradually obliterating a quantum superposition. [39] They directed individual rubidium atoms, each in a two-state superposition, through a microwave-filled cavity. The atoms induced phase shifts in the microwave field, placing the field itself into a superposition. Imperfections in the cavity mirrors allowed the field to lose phase coherence to the environment. Haroche's team observed this decoherence by correlating the states of pairs of atoms sent through the cavity with varying time delays.
In July 2011, researchers from the University of British Columbia and University of California, Santa Barbara demonstrated that applying strong magnetic fields to single molecule magnets could suppress two of the three known sources of decoherence. [40] [41] [42] They were able to measure how decoherence depended on temperature and magnetic field strength.
Prevention
Concept
Decoherence is the enemy of quantum computation, stripping systems of their quantum properties and ushering in classical behavior. [43] Quantum computers, reliant on the delicate evolution of quantum coherences, are exquisitely sensitive to environmental noise. Electromagnetic fields, thermal fluctuations, and even the act of measurement itself can trigger decoherence.
This sensitivity poses a significant hurdle for building practical quantum computers. The coherence time, the duration a quantum state maintains its superposition, must be extended. [44] Preventing decoherence is paramount to ensuring the stability and reliability of quantum computations.
Methods and Tools
Numerous strategies have been devised to mitigate decoherence's detrimental effects.
Isolation from Environment
The most straightforward approach is to shield the quantum system from its environment.
- High vacuum: Minimizing interaction with air molecules by placing qubits in an ultra-high vacuum.
- Cryogenic cooling: Operating quantum systems at extremely low temperatures to damp thermal vibrations and noise.
- Electromagnetic shielding: Using materials like mu-metal or superconductors to block external electromagnetic fields, thereby reducing interference.
- Shielding from cosmic rays: In August 2020, scientists reported that ionizing radiation from space and terrestrial sources significantly limits qubit coherence times, underscoring the need for adequate shielding in future fault-tolerant quantum computers. [46] [47] [48]
- Better materials: Fabricating qubits from highly pure or isotopically enriched materials to minimize intrinsic noise from defects or nuclear spins.
- Circuit design: Optimizing quantum circuit layouts to enhance coherence, analogous to classical circuit design principles.
- Mechanical and optical isolation: Employing vibration isolation tables, acoustic shielding, and light-blocking enclosures to minimize mechanical and optical disturbances.
Quantum Error Correction
One of the most robust defenses against quantum decoherence is Quantum error correction (QEC). QEC encodes quantum information redundantly across multiple physical qubits, allowing errors to be detected and corrected without direct measurement of the quantum state itself. This redundancy is key; it assumes errors affect only a fraction of qubits at any given time. Representative QEC protocols include:
- Shor code: [49] This early code encodes a single qubit into nine physical qubits to protect against both bit-flip and phase-flip errors.
- Steane code: [50] A 7-qubit code capable of correcting an arbitrary error on a single qubit.
- Surface codes: [51] A more scalable approach using a 2D qubit lattice with a high error threshold.
- Bosonic codes: Specifically designed for continuous-variable systems.
The trade-off for QEC is significant: it requires a substantial number of physical qubits per logical qubit and introduces considerable computational overhead.
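The full Shor and Steane constructions are too long to sketch here, but the redundancy principle they share already appears in the simpler three-qubit repetition code, which protects one logical bit against bit flips. The sketch below is a classical-level simulation of that idea (no phase errors are modelled; `p_flip` and the trial count are illustrative):

```python
import numpy as np

# The redundancy idea behind QEC in its simplest form: the three-qubit
# repetition code against independent bit-flip errors.
rng = np.random.default_rng(3)
p_flip, trials = 0.05, 100_000

def run_trial():
    logical = rng.integers(0, 2)           # logical bit to protect
    code = np.array([logical] * 3)         # encode |b> -> |bbb>
    flips = rng.random(3) < p_flip         # independent bit-flip errors
    code ^= flips.astype(int)
    # Recovery by majority vote (in the quantum version, the parity
    # checks Z1Z2 and Z2Z3 identify the flipped qubit without ever
    # reading out the logical bit itself).
    decoded = int(code.sum() >= 2)
    return decoded == logical

ok = sum(run_trial() for _ in range(trials)) / trials
print("unprotected error rate:", p_flip)
print("encoded error rate    :", 1 - ok)   # ~ 3 p^2 (1 - p) + p^3
```

Two or more flips are needed to corrupt the logical bit, so the encoded error rate scales roughly as $3p^2$ rather than $p$, which is why redundancy pays off once the physical error rate is small enough.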
Dynamical Decoupling
Dynamical decoupling (DD) is another quantum control technique employed to combat decoherence, particularly in noisy environments. DD involves applying a sequence of precisely timed external control pulses to the quantum system, effectively averaging out environmental interactions. This technique manipulates the irreversible aspects of system-environment interactions through controllable external influences. [52] DD has been experimentally verified in systems like trapped ions [53] and superconducting qubits. [54] Notable DD sequences include:
- Spin echo (SE): A single π-pulse that inverts the system's state.
- Periodic dynamical decoupling (PDD): Regular application of control pulses to average environmental influence. [55]
- Carr–Purcell–Meiboom–Gill (CPMG) sequence: [56] An extension of SE using a series of π-pulses.
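To see why a single π-pulse helps, consider an ensemble of qubits with static but random frequency offsets. The sketch below (with an illustrative detuning spread and evolution time) compares free dephasing with a spin echo:

```python
import numpy as np

# Spin-echo sketch: an ensemble of qubits dephases under static, random
# frequency offsets; a pi-pulse at t/2 reverses the accumulated phase so
# the ensemble coherence revives at time t.
rng = np.random.default_rng(4)
offsets = rng.normal(0.0, 1.0, size=10_000)   # static detuning per qubit
t = 4.0

# Free evolution: each qubit accumulates phase = offset * t; the ensemble
# coherence |<e^{i phase}>| decays as the phases spread.
free = np.abs(np.mean(np.exp(1j * offsets * t)))

# Echo: evolve to t/2, apply a pi-pulse (phase -> -phase), evolve to t.
phase_half = offsets * (t / 2)
echo = np.abs(np.mean(np.exp(1j * (-phase_half + offsets * (t / 2)))))

print("coherence without echo:", free)  # ~ 0, up to ~1/sqrt(N) sampling noise
print("coherence with echo   :", echo)  # = 1: static dephasing is refocused
```

The echo refocuses only noise that is static over the pulse interval; CPMG and other multi-pulse sequences extend the same cancellation to slowly fluctuating environments.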