Quantum decoherence
Quantum decoherence is the loss of quantum coherence. In quantum mechanics, particles such as electrons are described by a wave function, a mathematical representation of the quantum state of a system; a probabilistic interpretation of the wave function is used to explain various quantum effects. As long as there exists a definite phase relation between different states, the system is said to be coherent. A definite phase relationship is necessary to perform quantum computing on quantum information encoded in quantum states. Coherence is preserved under the laws of quantum physics.
If a quantum system were perfectly isolated, it would maintain coherence indefinitely, but it would be impossible to manipulate or investigate it. If it is not perfectly isolated, for example during a measurement, coherence is shared with the environment and appears to be lost with time, a process called quantum decoherence. As a result of this process, quantum behavior is apparently lost, just as energy appears to be lost by friction in classical mechanics.
Decoherence was first introduced in 1970 by the German physicist H. Dieter Zeh[1] and has been a subject of active research since the 1980s.[2] Decoherence has been developed into a complete framework, but there is controversy as to whether it solves the measurement problem, as the founders of decoherence theory admit in their seminal papers.[3]
Decoherence can be viewed as the loss of information from a system into the environment (often modeled as a heat bath),[4] since every system is loosely coupled with the energetic state of its surroundings. Viewed in isolation, the system's dynamics are nonunitary (although the combined system plus environment evolves in a unitary fashion).[5] Thus the dynamics of the system alone are irreversible. As with any coupling, entanglements are generated between the system and environment. These have the effect of sharing quantum information with—or transferring it to—the surroundings.
Decoherence has been used to understand the possibility of the collapse of the wave function in quantum mechanics. Decoherence does not generate actual wavefunction collapse. It only provides a framework for apparent wavefunction collapse, as the quantum nature of the system "leaks" into the environment. That is, components of the wave function are decoupled from a coherent system and acquire phases from their immediate surroundings. A total superposition of the global or universal wavefunction still exists (and remains coherent at the global level), but its ultimate fate remains an interpretational issue. With respect to the measurement problem, decoherence provides an explanation for the transition of the system to a mixture of states that seem to correspond to those states observers perceive. Moreover, our observation tells us that this mixture looks like a proper quantum ensemble in a measurement situation, as we observe that measurements lead to the "realization" of precisely one state in the "ensemble".
Decoherence represents a challenge for the practical realization of quantum computers, since such machines are expected to rely heavily on the undisturbed evolution of quantum coherences. Simply put, they require that the coherence of states be preserved and that decoherence be managed, in order to actually perform quantum computation. The preservation of coherence, and mitigation of decoherence effects, are thus related to the concept of quantum error correction.
Mechanisms
To examine how decoherence operates, an "intuitive" model is presented. The model requires some familiarity with quantum theory basics. Analogies are made between visualisable classical phase spaces and Hilbert spaces. A more rigorous derivation in Dirac notation shows how decoherence destroys interference effects and the "quantum nature" of systems. Next, the density matrix approach is presented for perspective.
Phase-space picture
An N-particle system can be represented in nonrelativistic quantum mechanics by a wave function $\psi(x_1, x_2, \dots, x_N)$, where each $x_i$ is a point in 3-dimensional space. This has analogies with the classical phase space. A classical phase space contains a real-valued function in 6N dimensions (each particle contributes 3 spatial coordinates and 3 momenta). Our "quantum" phase space, on the other hand, involves a complex-valued function on a 3N-dimensional space. The position and momenta are represented by operators that do not commute, and $\psi$ lives in the mathematical structure of a Hilbert space. Aside from these differences, however, the rough analogy holds.
Different previously isolated, noninteracting systems occupy different phase spaces. Alternatively we can say that they occupy different lower-dimensional subspaces in the phase space of the joint system. The effective dimensionality of a system's phase space is the number of degrees of freedom present, which—in nonrelativistic models—is 6 times the number of a system's free particles. For a macroscopic system this will be a very large dimensionality. When two systems (and the environment would be a system) start to interact, though, their associated state vectors are no longer constrained to the subspaces. Instead the combined state vector time-evolves along a path through the "larger volume", whose dimensionality is the sum of the dimensions of the two subspaces. The extent to which two vectors interfere with each other is a measure of how "close" they are to each other (formally, their overlap or Hilbert-space scalar product) in the phase space. When a system couples to an external environment, the dimensionality of, and hence "volume" available to, the joint state vector increases enormously. Each environmental degree of freedom contributes an extra dimension.
The original system's wave function can be expanded in many different ways as a sum of elements in a quantum superposition. Each expansion corresponds to a projection of the wave vector onto a basis. The basis can be chosen at will. Let us choose an expansion where the resulting basis elements interact with the environment in an element-specific way. Such elements will—with overwhelming probability—be rapidly separated from each other by their natural unitary time evolution along their own independent paths. After a very short interaction, there is almost no chance of any further interference. The process is effectively irreversible. The different elements effectively become "lost" from each other in the expanded phase space created by coupling with the environment; in phase space, this decoupling is monitored through the Wigner quasiprobability distribution. The original elements are said to have decohered. The environment has effectively selected out those expansions or decompositions of the original state vector that decohere (or lose phase coherence) with each other. This is called "environmentally induced superselection", or einselection.[6] The decohered elements of the system no longer exhibit quantum interference between each other, as in a double-slit experiment. Any elements that decohere from each other via environmental interactions are said to be quantum-entangled with the environment. The converse is not true: not all entangled states are decohered from each other.
Any measuring device or apparatus acts as an environment, since at some stage along the measuring chain it has to be large enough to be read by humans. It must possess a very large number of hidden degrees of freedom. In effect, the interactions may be considered to be quantum measurements. As a result of an interaction, the wave functions of the system and the measuring device become entangled with each other. Decoherence happens when different portions of the system's wave function become entangled in different ways with the measuring device. For two einselected elements of the entangled system's state to interfere, both the original system and the measuring device in both elements must significantly overlap, in the scalar-product sense. If the measuring device has many degrees of freedom, it is very unlikely for this to happen.
As a consequence, the system behaves as a classical statistical ensemble of the different elements rather than as a single coherent quantum superposition of them. From the perspective of each ensemble member's measuring device, the system appears to have irreversibly collapsed onto a state with a precise value for the measured attributes, relative to that element. This, provided one explains how the Born-rule coefficients effectively act as probabilities as per the measurement postulate, constitutes a solution to the quantum measurement problem.
Dirac notation
Using Dirac notation, let the system initially be in the state

$$|\psi\rangle = \sum_i |i\rangle \langle i|\psi\rangle,$$

where the $|i\rangle$ form an einselected basis (environmentally induced selected eigenbasis[6]), and let the environment initially be in the state $|\epsilon\rangle$. The vector basis of the combination of the system and the environment consists of the tensor products of the basis vectors of the two subsystems. Thus, before any interaction between the two subsystems, the joint state can be written as

$$|\text{before}\rangle = \sum_i |i\rangle |\epsilon\rangle \langle i|\psi\rangle,$$

where $|i\rangle |\epsilon\rangle$ is shorthand for the tensor product $|i\rangle \otimes |\epsilon\rangle$. There are two extremes in the way the system can interact with its environment: either (1) the system loses its distinct identity and merges with the environment (e.g. photons in a cold, dark cavity get converted into molecular excitations within the cavity walls), or (2) the system is not disturbed at all, even though the environment is disturbed (e.g. the idealized nondisturbing measurement). In general, an interaction is a mixture of these two extremes that we examine.
System absorbed by environment
If the environment absorbs the system, each element of the total system's basis interacts with the environment such that

$$|i\rangle |\epsilon\rangle \quad \text{evolves into} \quad |\epsilon_i\rangle,$$

and so

$$|\text{before}\rangle = \sum_i |i\rangle |\epsilon\rangle \langle i|\psi\rangle \quad \text{evolves into} \quad |\text{after}\rangle = \sum_i |\epsilon_i\rangle \langle i|\psi\rangle.$$

The unitarity of time evolution demands that the total state basis remains orthonormal, i.e. the scalar or inner products of the basis vectors must vanish, since $\langle i|j\rangle = \delta_{ij}$:

$$\langle \epsilon_i | \epsilon_j \rangle = \delta_{ij}.$$
This orthonormality of the environment states is the defining characteristic required for einselection.[6]
System not disturbed by environment
In an idealised measurement, the system disturbs the environment, but is itself undisturbed by the environment. In this case, each element of the basis interacts with the environment such that

$$|i\rangle |\epsilon\rangle \quad \text{evolves into the product} \quad |i\rangle |\epsilon_i\rangle,$$

and so

$$|\text{before}\rangle = \sum_i |i\rangle |\epsilon\rangle \langle i|\psi\rangle \quad \text{evolves into} \quad |\text{after}\rangle = \sum_i |i\rangle |\epsilon_i\rangle \langle i|\psi\rangle.$$

In this case, unitarity demands that

$$\langle \epsilon_i | \epsilon_i \rangle = 1,$$

where $\langle \epsilon | \epsilon \rangle = 1$ was used. Additionally, decoherence requires, by virtue of the large number of hidden degrees of freedom in the environment, that

$$\langle \epsilon_i | \epsilon_j \rangle \approx \delta_{ij}.$$
As before, this is the defining characteristic for decoherence to become einselection.[6] The approximation becomes more exact as the number of environmental degrees of freedom affected increases.
Note that if the system basis $|i\rangle$ were not an einselected basis, then the last condition is trivial, since the disturbed environment is not a function of $i$, and we have the trivial disturbed environment basis $|\epsilon_i\rangle = |\epsilon'\rangle$. This would correspond to the system basis being degenerate with respect to the environmentally defined measurement observable. For a complex environmental interaction (which would be expected for a typical macroscale interaction) a non-einselected basis would be hard to define.
Loss of interference and the transition from quantum to classical probabilities
The utility of decoherence lies in its application to the analysis of probabilities, before and after environmental interaction, and in particular to the vanishing of quantum interference terms after decoherence has occurred. If we ask what is the probability of observing the system making a transition from $|\psi\rangle$ to $|\phi\rangle$ before $|\psi\rangle$ has interacted with its environment, then application of the Born probability rule states that the transition probability is the squared modulus of the scalar product of the two states:

$$\operatorname{prob}_\text{before}(\psi \to \phi) = |\langle\psi|\phi\rangle|^2 = \Big|\sum_i \psi_i^* \phi_i\Big|^2 = \sum_i |\psi_i^* \phi_i|^2 + \sum_{i \ne j} \psi_i^* \psi_j \phi_j^* \phi_i,$$

where $\psi_i = \langle i|\psi\rangle$, $\phi_i = \langle i|\phi\rangle$, etc.

The above expansion of the transition probability has terms that involve $\psi_i^* \psi_j \phi_j^* \phi_i$ with $i \ne j$; these can be thought of as representing interference between the different basis elements or quantum alternatives. This is a purely quantum effect and represents the non-additivity of the probabilities of quantum alternatives.

To calculate the probability of observing the system making a quantum leap from $|\psi\rangle$ to $|\phi\rangle$ after $|\psi\rangle$ has interacted with its environment, application of the Born probability rule states that we must sum over all the relevant possible states $|\epsilon_j\rangle$ of the environment before squaring the modulus:

$$\operatorname{prob}_\text{after}(\psi \to \phi) = \sum_j \Big| \sum_i \psi_i \phi_i^* \langle \epsilon_j | \epsilon_i \rangle \Big|^2.$$

The internal summation collapses when we apply the decoherence/einselection condition $\langle \epsilon_j | \epsilon_i \rangle \approx \delta_{ij}$, and the formula simplifies to

$$\operatorname{prob}_\text{after}(\psi \to \phi) \approx \sum_j |\psi_j^* \phi_j|^2.$$

If we compare this with the formula we derived before the environment introduced decoherence, we can see that the effect of decoherence has been to move the summation sign from inside the modulus sign to outside. As a result, all the cross or quantum interference terms

$$\sum_{i \ne j} \psi_i^* \psi_j \phi_j^* \phi_i$$

have vanished from the transition-probability calculation. The decoherence has irreversibly converted quantum behaviour (additive probability amplitudes) to classical behaviour (additive probabilities).[6][7][8]
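The move of the summation from inside the modulus to outside can be checked numerically. The following sketch is purely illustrative (the three-level states and coefficients are arbitrary choices, not taken from the text); it compares the two Born-rule expressions and verifies that their difference is exactly the sum of the cross terms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two arbitrary normalized states of a 3-level system in an einselected basis.
psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)
phi = rng.normal(size=3) + 1j * rng.normal(size=3)
phi /= np.linalg.norm(phi)

# Before decoherence: amplitudes add first, then the modulus is squared.
prob_before = abs(np.sum(np.conj(psi) * phi)) ** 2

# After decoherence (<e_i|e_j> ~ delta_ij): the moduli are squared first.
prob_after = np.sum(np.abs(np.conj(psi) * phi) ** 2)

# The difference is the sum of the cross (interference) terms psi_i* psi_j phi_j* phi_i.
cross = sum(
    np.conj(psi[i]) * psi[j] * np.conj(phi[j]) * phi[i]
    for i in range(3)
    for j in range(3)
    if i != j
)
print(np.isclose(prob_before, prob_after + cross.real))  # True
```

The interference contribution `cross` can be negative or positive, which is the non-additivity of quantum alternatives referred to above; after decoherence only the classically additive part survives.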
In terms of density matrices, the loss of interference effects corresponds to the diagonalization of the "environmentally tracedover" density matrix.[6]
Density-matrix approach
The effect of decoherence on density matrices is essentially the decay or rapid vanishing of the off-diagonal elements of the partial trace of the joint system's density matrix, i.e. the trace, with respect to any environmental basis, of the density matrix of the combined system and its environment. The decoherence irreversibly converts the "averaged" or "environmentally traced-over"[6] density matrix from a pure state to a reduced mixture; it is this that gives the appearance of wave-function collapse. Again, this is called "environmentally induced superselection", or einselection.[6] The advantage of taking the partial trace is that this procedure is indifferent to the environmental basis chosen.
Initially, the density matrix of the combined system can be denoted as

$$\rho = |\text{before}\rangle \langle\text{before}| = |\psi\rangle\langle\psi| \otimes |\epsilon\rangle\langle\epsilon|,$$

where $|\epsilon\rangle$ is the state of the environment. Then if the transition happens before any interaction takes place between the system and the environment, the environment subsystem has no part and can be traced out, leaving the reduced density matrix for the system:

$$\rho_\text{sys} = \operatorname{Tr}_\text{env}(\rho) = |\psi\rangle\langle\psi|\, \langle\epsilon|\epsilon\rangle = |\psi\rangle\langle\psi|.$$

Now the transition probability will be given as

$$\operatorname{prob}_\text{before}(\psi \to \phi) = \langle\phi|\, \rho_\text{sys}\, |\phi\rangle = |\langle\psi|\phi\rangle|^2 = \sum_i |\psi_i^* \phi_i|^2 + \sum_{i \ne j} \psi_i^* \psi_j \phi_j^* \phi_i,$$

where $\psi_i = \langle i|\psi\rangle$, $\phi_i = \langle i|\phi\rangle$, etc.

Now consider the case in which the transition takes place after the interaction of the system with the environment. The combined density matrix will be

$$\rho = |\text{after}\rangle \langle\text{after}| = \sum_{i,j} \psi_i \psi_j^*\, |i\rangle\langle j| \otimes |\epsilon_i\rangle\langle\epsilon_j|.$$

To get the reduced density matrix of the system, we trace out the environment and employ the decoherence/einselection condition $\langle\epsilon_j|\epsilon_i\rangle \approx \delta_{ij}$, and see that the off-diagonal terms vanish (a result obtained by Erich Joos and H. D. Zeh in 1985):[9]

$$\rho_\text{sys} = \operatorname{Tr}_\text{env}(\rho) = \sum_{i,j} \psi_i \psi_j^*\, |i\rangle\langle j|\, \langle\epsilon_j|\epsilon_i\rangle \approx \sum_i |\psi_i|^2\, |i\rangle\langle i|.$$

Similarly, the final reduced density matrix after the transition will be

$$\sum_j |\phi_j|^2\, |j\rangle\langle j|.$$

The transition probability will then be given as

$$\operatorname{prob}_\text{after}(\psi \to \phi) = \sum_i |\psi_i|^2\, |\phi_i|^2,$$

which has no contribution from the interference terms

$$\sum_{i \ne j} \psi_i^* \psi_j \phi_j^* \phi_i.$$
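The vanishing of the off-diagonal terms under the einselection condition can be illustrated with a small numerical sketch (illustrative only; the qubit amplitudes and the environment overlap values are assumed for the example):

```python
import numpy as np

# Qubit amplitudes for a|0> + b|1>, entangled with environment states |e_0>, |e_1>.
a, b = 0.6, 0.8

def reduced_rho(overlap):
    """rho_sys[i, j] = psi_i psi_j* <e_j|e_i>, for a given (real) overlap <e_0|e_1>."""
    psi = np.array([a, b])
    gram = np.array([[1.0, overlap],
                     [overlap, 1.0]])  # gram[j, i] = <e_j|e_i>
    return np.outer(psi, psi.conj()) * gram

rho_coherent = reduced_rho(1.0)   # environment carries no which-path record
rho_decohered = reduced_rho(0.0)  # einselection: <e_0|e_1> ~ 0

print(rho_coherent)   # off-diagonals a*b survive: still a pure superposition
print(rho_decohered)  # diagonal mixture diag(|a|^2, |b|^2)
```

The diagonal populations are identical in both cases; only the coherences are destroyed, exactly as in the reduced density matrix above.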
The density-matrix approach has been combined with the Bohmian approach to yield a reduced-trajectory approach, taking into account the system's reduced density matrix and the influence of the environment.[10]
Operator-sum representation
Consider a system S and environment (bath) B, which together form a closed system that can be treated quantum mechanically. Let $\mathcal{H}_S$ and $\mathcal{H}_B$ be the system's and bath's Hilbert spaces respectively. Then the Hamiltonian for the combined system is

$$H = H_S \otimes I_B + I_S \otimes H_B + H_I,$$

where $H_S$, $H_B$ are the system and bath Hamiltonians respectively, $H_I$ is the interaction Hamiltonian between the system and bath, and $I_S$, $I_B$ are the identity operators on the system and bath Hilbert spaces respectively. The time evolution of the density operator of this closed system is unitary and, as such, is given by

$$\rho(t) = U(t)\, \rho(0)\, U^\dagger(t),$$

where the unitary operator is $U(t) = e^{-iHt/\hbar}$. If the system and bath are not entangled initially, then we can write $\rho(0) = \rho_S(0) \otimes \rho_B(0)$. Therefore, the evolution of the system becomes

$$\rho(t) = U(t)\, \big(\rho_S(0) \otimes \rho_B(0)\big)\, U^\dagger(t).$$

The system–bath interaction Hamiltonian can be written in a general form as

$$H_I = \sum_i S_i \otimes B_i,$$

where $S_i \otimes B_i$ is an operator acting on the combined system–bath Hilbert space, and $S_i$, $B_i$ are the operators that act on the system and bath respectively. This coupling of the system and bath is the cause of decoherence in the system alone. To see this, a partial trace is performed over the bath to give a description of the system alone:

$$\rho_S(t) = \operatorname{Tr}_B\!\Big[ U(t)\, \big(\rho_S(0) \otimes \rho_B(0)\big)\, U^\dagger(t) \Big].$$

$\rho_S(t)$ is called the reduced density matrix and gives information about the system only. If the bath is written in terms of its set of orthogonal basis kets, that is, if it has been initially diagonalized, then $\rho_B(0) = \sum_j a_j |j\rangle\langle j|$. Computing the partial trace with respect to this (computational) basis gives

$$\rho_S(t) = \sum_l A_l\, \rho_S(0)\, A_l^\dagger,$$

where the $A_l$ are defined as the Kraus operators and are represented as (the index $l$ combines the indices $j$ and $k$)

$$A_l = \sqrt{a_j}\, \langle k|\, U(t)\, |j\rangle.$$

This is known as the operator-sum representation (OSR). A condition on the Kraus operators can be obtained by using the fact that $\operatorname{Tr}[\rho_S(t)] = 1$; this then gives

$$\sum_l A_l^\dagger A_l = I_S.$$

This restriction determines whether decoherence will occur or not in the OSR. In particular, when there is more than one term present in the sum for $\rho_S(t)$, then the dynamics of the system will be nonunitary, and hence decoherence will take place.
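As a concrete sketch (the phase-damping channel is a standard textbook example, not one given in this section, and the value of p is assumed), the following checks the completeness condition on a pair of Kraus operators and shows the resulting nonunitary loss of coherence:

```python
import numpy as np

# Phase-damping channel: two Kraus operators, so the OSR sum has more than
# one term and the dynamics are nonunitary (decoherence occurs).
p = 0.3
K0 = np.sqrt(1 - p) * np.eye(2)
K1 = np.sqrt(p) * np.diag([1.0, -1.0])

# Completeness condition: sum_l K_l^dagger K_l = I
completeness = K0.conj().T @ K0 + K1.conj().T @ K1
print(np.allclose(completeness, np.eye(2)))  # True

rho = 0.5 * np.ones((2, 2))  # pure state |+><+|
rho_out = K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T
print(rho_out[0, 1])  # 0.5 * (1 - 2p): the off-diagonal coherence has shrunk
```

The populations (diagonal entries) are untouched while the coherence is suppressed, which is the OSR picture of pure decoherence.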
Semigroup approach
A more general consideration for the existence of decoherence in a quantum system is given by the master equation, which determines how the density matrix of the system alone evolves in time (see also the Belavkin equation[11][12][13] for the evolution under continuous measurement). This uses the Schrödinger picture, where evolution of the state (represented by its density matrix) is considered. The master equation is

$$\frac{d}{dt}\rho_S(t) = -\frac{i}{\hbar} \big[\tilde{H}_S, \rho_S(t)\big] + L_D\big[\rho_S(t)\big],$$

where $\tilde{H}_S$ is the system Hamiltonian along with a (possible) unitary contribution from the bath, and $L_D$ is the Lindblad decohering term.[5] The Lindblad decohering term is represented as

$$L_D\big[\rho_S(t)\big] = \frac{1}{2} \sum_{\alpha,\beta=1}^{M} b_{\alpha\beta} \Big( \big[F_\alpha, \rho_S(t) F_\beta^\dagger\big] + \big[F_\alpha \rho_S(t), F_\beta^\dagger\big] \Big).$$

The $F_\alpha$ are basis operators for the M-dimensional space of bounded operators that act on the system Hilbert space $\mathcal{H}_S$ and are the error generators.[14] The matrix elements $b_{\alpha\beta}$ represent the elements of a positive semidefinite Hermitian matrix; they characterize the decohering processes and, as such, are called the noise parameters.[14] The semigroup approach is particularly nice, because it distinguishes between the unitary and decohering (nonunitary) processes, which is not the case with the OSR. In particular, the nonunitary dynamics are represented by $L_D$, whereas the unitary dynamics of the state are represented by the usual Heisenberg commutator. Note that when $L_D[\rho_S(t)] = 0$, the dynamical evolution of the system is unitary. The conditions for the evolution of the system density matrix to be described by the master equation are:[5]
 the evolution of the system density matrix is determined by a one-parameter semigroup,
 the evolution is "completely positive" (i.e. probabilities are preserved),
 the system and bath density matrices are initially decoupled.
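A minimal numerical sketch of this master equation can make the damping explicit. The specifics below are all assumptions for the example, not taken from the text: $\tilde{H}_S = 0$, a single error generator F = sigma_z, and a noise strength gamma.

```python
import numpy as np

gamma = 0.5                                  # assumed noise parameter
sz = np.diag([1.0, -1.0])                    # single error generator F = sigma_z
rho = 0.5 * np.ones((2, 2), dtype=complex)   # pure state |+><+|

# Euler-integrate d(rho)/dt = gamma * (F rho F^dag - (1/2){F^dag F, rho});
# with F = sigma_z, F^dag F = I, so the anticommutator term is just rho.
dt, steps = 1e-4, 20000                      # evolve to t = 2.0
for _ in range(steps):
    rho = rho + dt * gamma * (sz @ rho @ sz - rho)

t = dt * steps
print(abs(rho[0, 1]))  # ~ 0.5 * exp(-2 * gamma * t): exponential coherence decay
```

The diagonal populations stay fixed while the off-diagonal coherence decays exponentially; setting gamma to zero recovers purely unitary (here trivial) evolution, illustrating the clean separation between the commutator and Lindblad parts.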
Examples of nonunitary modelling of decoherence
Decoherence can be modelled as a nonunitary process by which a system couples with its environment (although the combined system plus environment evolves in a unitary fashion).[5] Thus the dynamics of the system alone, treated in isolation, are nonunitary and, as such, are represented by irreversible transformations acting on the system's Hilbert space $\mathcal{H}_S$. Since the system's dynamics are represented by irreversible transformations, any information present in the quantum system can be lost to the environment or heat bath. Alternatively, the decay of quantum information caused by the coupling of the system to the environment is referred to as decoherence.[4] Thus decoherence is the process by which information of a quantum system is altered by the system's interaction with its environment (which together form a closed system), creating an entanglement between the system and heat bath (environment). As such, since the system is entangled with its environment in some unknown way, a description of the system by itself cannot be made without also referring to the environment (i.e. without also describing the state of the environment).
Rotational decoherence
Consider a system of N qubits that is coupled to a bath symmetrically. Suppose this system of N qubits undergoes a rotation around the eigenstates of $J_z$. Then under such a rotation, a random phase $\phi$ will be created between the eigenstates $|0\rangle$, $|1\rangle$ of $J_z$. Thus these basis qubits $|0\rangle$ and $|1\rangle$ will transform in the following way:

$$|0\rangle \to |0\rangle, \qquad |1\rangle \to e^{i\phi}\, |1\rangle.$$

This transformation is performed by the rotation operator

$$R_z(\phi) = \begin{pmatrix} 1 & 0 \\ 0 & e^{i\phi} \end{pmatrix}.$$

Since any qubit in this space can be expressed in terms of the basis qubits, then all such qubits will be transformed under this rotation. Consider a qubit in a pure state $|\psi\rangle = a|0\rangle + b|1\rangle$. This state will decohere, since it is not "encoded" with the dephasing factor $e^{i\phi}$. This can be seen by examining the density matrix averaged over all values of $\phi$:

$$\rho = \int_{-\infty}^{\infty} p(\phi)\, R_z(\phi)\, |\psi\rangle\langle\psi|\, R_z^\dagger(\phi)\, d\phi,$$

where $p(\phi)$ is a probability density. If $p(\phi)$ is given as a Gaussian distribution

$$p(\phi) = \frac{1}{\sqrt{4\pi\alpha}}\, e^{-\phi^2/(4\alpha)},$$

then the density matrix is

$$\rho = \begin{pmatrix} |a|^2 & a b^*\, e^{-\alpha} \\ a^* b\, e^{-\alpha} & |b|^2 \end{pmatrix}.$$

Since the off-diagonal elements—the coherence terms—decay for increasing $\alpha$, the density matrices for the various qubits of the system will be indistinguishable. This means that no measurement can distinguish between the qubits, thus creating decoherence between the various qubit states. In particular, this dephasing process causes the qubits to collapse onto the $\hat{z}$ axis. This is why this type of decoherence process is called collective dephasing, because the mutual phases between all qubits of the N-qubit system are destroyed.
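The $e^{-\alpha}$ suppression of the coherences can be reproduced by averaging the rotated state over the Gaussian phase numerically. The state with a = b = 1/sqrt(2) and the value of alpha are assumptions for the example:

```python
import numpy as np

alpha = 0.8                         # assumed Gaussian width parameter
a = b = 1.0 / np.sqrt(2.0)          # example qubit a|0> + b|1>

# Gaussian phase density p(phi) with variance 2*alpha.
phis, dphi = np.linspace(-20.0, 20.0, 200001, retstep=True)
p = np.exp(-phis**2 / (4.0 * alpha)) / np.sqrt(4.0 * np.pi * alpha)

# The off-diagonal element of R_z(phi)|psi><psi|R_z(phi)^dagger is a b* e^{-i phi};
# averaging over phi multiplies it by E[e^{-i phi}] = e^{-alpha}.
off_diag = np.sum(a * np.conj(b) * np.exp(-1j * phis) * p) * dphi

print(abs(off_diag))                # ~ 0.5 * exp(-alpha)
```

The numerically averaged coherence matches the closed-form Gaussian result, while the populations |a|^2 and |b|^2 are unaffected by the phase average.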
Depolarizing
Depolarizing is a nonunitary transformation on a quantum system which maps pure states to mixed states. This is a nonunitary process, because any transformation that reverses this process will map states out of their respective Hilbert space, thus not preserving positivity (i.e. the original probabilities are mapped to negative probabilities, which is not allowed). The two-dimensional case of such a transformation would consist of mapping pure states on the surface of the Bloch sphere to mixed states within the Bloch sphere. This would contract the Bloch sphere by some finite amount, and the reverse process would expand the Bloch sphere, which cannot happen.
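For a single qubit, one common parametrization of this channel (an illustrative choice, not given in the text, as is the value of p) is the map rho → (1 − p)·rho + p·I/2, which shrinks the Bloch vector by the factor 1 − p:

```python
import numpy as np

p = 0.4                                     # assumed depolarizing probability
sx = np.array([[0.0, 1.0], [1.0, 0.0]])    # Pauli X

rho = 0.5 * (np.eye(2) + sx)               # pure state on the Bloch sphere (+x axis)
rho_out = (1.0 - p) * rho + p * 0.5 * np.eye(2)

# Bloch-vector x component before and after: contracted by (1 - p).
x_before = np.trace(rho @ sx).real
x_after = np.trace(rho_out @ sx).real
print(x_before, x_after)                   # 1.0 and (1 - p)
```

Inverting the map would require rescaling the Bloch vector by 1/(1 − p), pushing some states outside the sphere, i.e. onto non-positive "density matrices", which is why the reverse process is forbidden.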
Dissipation
Dissipation is a decohering process by which the populations of quantum states are changed due to entanglement with a bath. An example of this would be a quantum system that can exchange its energy with a bath through the interaction Hamiltonian. If the system is not in its ground state and the bath is at a temperature lower than that of the system, then the system will give off energy to the bath, and thus higher-energy eigenstates of the system Hamiltonian will decohere to the ground state after cooling and, as such, will all be nondegenerate. Since the states are nondegenerate, they are distinguishable, and thus this process is irreversible (nonunitary).
Timescales
Decoherence represents an extremely fast process for macroscopic objects, since these are interacting with many microscopic objects, with an enormous number of degrees of freedom, in their natural environment. The process is needed if we are to understand why we tend not to observe quantum behaviour in everyday macroscopic objects and why we do see classical fields emerge from the properties of the interaction between matter and radiation for large amounts of matter. The time taken for off-diagonal components of the density matrix to effectively vanish is called the decoherence time. It is typically extremely short for everyday, macroscale processes.[6][7][8] A modern basis-independent definition of the decoherence time relies on the short-time behavior of the fidelity between the initial state and the time-dependent state $\rho(t)$,[15] or, equivalently, the decay of the purity $\operatorname{Tr}\!\big(\rho(t)^2\big)$.[16]
Mathematical details
We assume for the moment that the system in question consists of a subsystem A being studied and the "environment" $\epsilon$, and the total Hilbert space is the tensor product of a Hilbert space $\mathcal{H}_A$ describing A and a Hilbert space $\mathcal{H}_\epsilon$ describing $\epsilon$, that is,

$$\mathcal{H} = \mathcal{H}_A \otimes \mathcal{H}_\epsilon.$$

This is a reasonably good approximation in the case where A and $\epsilon$ are relatively independent (e.g. there is nothing like parts of A mixing with parts of $\epsilon$ or conversely). The point is, the interaction with the environment is for all practical purposes unavoidable (e.g. even a single excited atom in a vacuum would emit a photon, which would then go off). Let's say this interaction is described by a unitary transformation U acting upon $\mathcal{H}$. Assume that the initial state of the environment is $|\text{in}\rangle$, and the initial state of A is the superposition state

$$c_1 |\psi_1\rangle + c_2 |\psi_2\rangle,$$

where $|\psi_1\rangle$ and $|\psi_2\rangle$ are orthogonal, and there is no entanglement initially. Also, choose an orthonormal basis $\{|e_i\rangle\}_i$ for $\mathcal{H}_\epsilon$. (This could be a "continuously indexed basis" or a mixture of continuous and discrete indexes, in which case we would have to use a rigged Hilbert space and be more careful about what we mean by orthonormal, but that's an inessential detail for expository purposes.) Then, we can expand

$$U\big(|\psi_1\rangle \otimes |\text{in}\rangle\big)$$

and

$$U\big(|\psi_2\rangle \otimes |\text{in}\rangle\big)$$

uniquely as

$$\sum_i |f_{1i}\rangle \otimes |e_i\rangle$$

and

$$\sum_i |f_{2i}\rangle \otimes |e_i\rangle,$$

respectively. One thing to realize is that the environment contains a huge number of degrees of freedom, a good number of them interacting with each other all the time. This makes the following assumption reasonable in a handwaving way, which can be shown to be true in some simple toy models. Assume that there exists a basis for $\mathcal{H}_\epsilon$ such that $|f_{1i}\rangle$ and $|f_{1j}\rangle$ are all approximately orthogonal to a good degree if $i \ne j$, and the same thing for $|f_{2i}\rangle$ and $|f_{2j}\rangle$, and also for $|f_{1i}\rangle$ and $|f_{2j}\rangle$ for any $i$ and $j$ (the decoherence property).
This often turns out to be true (as a reasonable conjecture) in the position basis, because how A interacts with the environment would often depend critically upon the position of the objects in A. Then, if we take the partial trace over the environment, we would find the density state is approximately described by

$$\rho \approx \sum_i \big( |c_1|^2\, |f_{1i}\rangle\langle f_{1i}| + |c_2|^2\, |f_{2i}\rangle\langle f_{2i}| \big),$$

that is, we have a diagonal mixed state, there is no constructive or destructive interference, and the "probabilities" add up classically. The time it takes for U(t) (the unitary operator as a function of time) to display the decoherence property is called the decoherence time.
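A toy numerical version of this construction (the environment dimension, amplitudes, and random environment kets are all assumptions for illustration): entangle a qubit with two random environment kets, which for a large environment are nearly orthogonal, and take the partial trace.

```python
import numpy as np

d = 256                                     # environment dimension (assumed)
rng = np.random.default_rng(1)

def random_ket(dim):
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

e0, e1 = random_ket(d), random_ket(d)       # nearly orthogonal for large d

c1, c2 = 0.6, 0.8
state = np.concatenate([c1 * e0, c2 * e1])  # c1 |0>|e0> + c2 |1>|e1>

# Partial trace over the environment: reshape to (2, d, 2, d), trace axes 1 and 3.
full = np.outer(state, state.conj()).reshape(2, d, 2, d)
rho_A = np.trace(full, axis1=1, axis2=3)

print(np.round(np.abs(rho_A), 3))  # close to diag(|c1|^2, |c2|^2); off-diagonals ~ 1/sqrt(d)
```

The residual off-diagonal element is c1 c2 ⟨e1|e0⟩, whose magnitude shrinks like 1/sqrt(d) as the environment grows, which is the toy-model version of the decoherence property above.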
Experimental observations
Quantitative measurement
The decoherence rate depends on a number of factors, including temperature or uncertainty in position, and many experiments have tried to measure it depending on the external environment.[17]
The process of a quantum superposition being gradually obliterated by decoherence was quantitatively measured for the first time by Serge Haroche and his co-workers at the École Normale Supérieure in Paris in 1996.[18] Their approach involved sending individual rubidium atoms, each in a superposition of two states, through a microwave-filled cavity. The two quantum states both cause shifts in the phase of the microwave field, but by different amounts, so that the field itself is also put into a superposition of two states. Due to photon scattering on cavity-mirror imperfections, the cavity field loses phase coherence to the environment.
Haroche and his colleagues measured the resulting decoherence via correlations between the states of pairs of atoms sent through the cavity with various time delays between the atoms.
Reducing environmental decoherence
In July 2011, researchers from University of British Columbia and University of California, Santa Barbara were able to reduce environmental decoherence rate "to levels far below the threshold necessary for quantum information processing" by applying high magnetic fields in their experiment.[19][20][21]
In August 2020, scientists reported that ionizing radiation from environmental radioactive materials and cosmic rays may substantially limit the coherence times of qubits if they are not shielded adequately, which may be critical for realizing fault-tolerant superconducting quantum computers in the future.[22][23][24]
Criticism
Criticism of the adequacy of decoherence theory to solve the measurement problem has been expressed by Anthony Leggett: "I hear people murmur the dreaded word 'decoherence'. But I claim that this is a major red herring".[25] Concerning the experimental relevance of decoherence theory, Leggett has stated: "Let us now try to assess the decoherence argument. Actually, the most economical tactic at this point would be to go directly to the results of the next section, namely that it is experimentally refuted! However, it is interesting to spend a moment enquiring why it was reasonable to anticipate this in advance of the actual experiments. In fact, the argument contains several major loopholes".[26]
In interpretations of quantum mechanics
Before an understanding of decoherence was developed, the Copenhagen interpretation of quantum mechanics treated wavefunction collapse as a fundamental, a priori process. Decoherence as a possible explanatory mechanism for the appearance of wavefunction collapse was first developed by David Bohm in 1952, who applied it to Louis de Broglie's pilot-wave theory, producing Bohmian mechanics,[27][28] the first successful hidden-variables interpretation of quantum mechanics. Decoherence was then used by Hugh Everett in 1957 to form the core of his many-worlds interpretation.[29] However, decoherence was largely ignored for many years (with the exception of Zeh's work),[1] and not until the 1980s[30][31] did decoherence-based explanations of the appearance of wavefunction collapse become popular, with the greater acceptance of the use of reduced density matrices.[9][7] The range of decoherence-based interpretations has subsequently been extended around the idea, such as consistent histories. Some versions of the Copenhagen interpretation have been modified to include decoherence.
Decoherence does not claim to provide a mechanism for actual wavefunction collapse; rather it puts forth a reasonable framework for the appearance of wavefunction collapse. The quantum nature of the system is simply "leaked" into the environment, so that a total superposition of the wave function still exists, but exists, at least for all practical purposes,[32] beyond the realm of measurement.[33] Of course, by definition, the claim that a merged but unmeasurable wave function still exists cannot be proven experimentally. Decoherence is needed to understand why a quantum system begins to obey classical probability rules after interacting with its environment (due to the suppression of the interference terms when applying Born's probability rules to the system).
See also
 Dephasing
 Dephasing rate SP formula
 Einselection
 Ghirardi–Rimini–Weber theory
 H. Dieter Zeh
 Interpretations of quantum mechanics
 Objectivecollapse theory
 Partial trace
 Photon polarization
 Quantum coherence
 Quantum Darwinism
 Quantum entanglement
 Quantum superposition
 Quantum Zeno effect
References
 H. Dieter Zeh, "On the Interpretation of Measurement in Quantum Theory", Foundations of Physics, vol. 1, pp. 69–76, (1970).
 Schlosshauer, Maximilian (2005). "Decoherence, the measurement problem, and interpretations of quantum mechanics". Reviews of Modern Physics. 76 (4): 1267–1305. arXiv:quant-ph/0312059. Bibcode:2004RvMP...76.1267S. doi:10.1103/RevModPhys.76.1267. S2CID 7295619.
 Joos and Zeh (1985) state "Of course no unitary treatment of the time dependence can explain why only one of these dynamically independent components is experienced." And in a recent review on decoherence, Joos (1999) states "Does decoherence solve the measurement problem? Clearly not. What decoherence tells us is that certain objects appear classical when observed. But what is an observation? At some stage we still have to apply the usual probability rules of quantum theory." See Adler, Stephen L. (2003). "Why decoherence has not solved the measurement problem: a response to P.W. Anderson". Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics. 34 (1): 135–142. arXiv:quant-ph/0112095. Bibcode:2003SHPMP..34..135A. doi:10.1016/S1355-2198(02)00086-2. S2CID 21040195.
 Bacon, D. (2001). "Decoherence, control, and symmetry in quantum computers". arXiv:quant-ph/0305025.
 Lidar, Daniel A.; Whaley, K. Birgitta (2003). "Decoherence-Free Subspaces and Subsystems". In Benatti, F.; Floreanini, R. (eds.). Irreversible Quantum Dynamics. Springer Lecture Notes in Physics. 622. Berlin. pp. 83–120. arXiv:quant-ph/0301032. Bibcode:2003LNP...622...83L. doi:10.1007/3-540-44874-8_5. ISBN 978-3-540-40223-7. S2CID 117748831.
 Zurek, Wojciech H. (2003). "Decoherence, einselection, and the quantum origins of the classical". Reviews of Modern Physics. 75 (3): 715. arXiv:quant-ph/0105127. Bibcode:2003RvMP...75..715Z. doi:10.1103/revmodphys.75.715. S2CID 14759237.
 Wojciech H. Zurek, "Decoherence and the transition from quantum to classical", Physics Today, 44, pp. 36–44 (1991).
 Zurek, Wojciech (2002). "Decoherence and the Transition from Quantum to Classical—Revisited" (PDF). Los Alamos Science. 27. arXiv:quant-ph/0306072. Bibcode:2003quant.ph..6072Z.
 E. Joos and H. D. Zeh, "The emergence of classical properties through interaction with the environment", Zeitschrift für Physik B, 59(2), pp. 223–243 (June 1985): eq. 1.2.
 A. S. Sanz, F. Borondo: A quantum trajectory description of decoherence, quant-ph/0310096v5.
 V. P. Belavkin (1989). "A new wave equation for a continuous nondemolition measurement". Physics Letters A. 140 (7–8): 355–358. arXiv:quant-ph/0512136. Bibcode:1989PhLA..140..355B. doi:10.1016/0375-9601(89)90066-2. S2CID 6083856.
 Howard J. Carmichael (1993). An Open Systems Approach to Quantum Optics. Berlin/Heidelberg/New York: Springer-Verlag.
 Michel Bauer; Denis Bernard; Tristan Benoist. Iterated Stochastic Measurements (Technical report). arXiv:1210.0425. Bibcode:2012JPhA...45W4020B. doi:10.1088/1751-8113/45/49/494020.
 Lidar, D. A.; Chuang, I. L.; Whaley, K. B. (1998). "Decoherence-Free Subspaces for Quantum Computation". Physical Review Letters. 81 (12): 2594–2597. arXiv:quant-ph/9807004. Bibcode:1998PhRvL..81.2594L. doi:10.1103/PhysRevLett.81.2594. S2CID 13979882.
 Beau, M.; Kiukas, J.; Egusquiza, I. L.; del Campo, A. (2017). "Nonexponential quantum decay under environmental decoherence". Phys. Rev. Lett. 119 (13): 130401. arXiv:1706.06943. Bibcode:2017PhRvL.119m0401B. doi:10.1103/PhysRevLett.119.130401. PMID 29341721. S2CID 206299205.
 Xu, Z.; García-Pintos, L. P.; Chenu, A.; del Campo, A. (2019). "Extreme Decoherence and Quantum Chaos". Phys. Rev. Lett. 122 (1): 014103. arXiv:1810.02319. Bibcode:2019PhRvL.122a4103X. doi:10.1103/PhysRevLett.122.014103. PMID 31012673. S2CID 53628496.
 Dan Stahlke. "Quantum Decoherence and the Measurement Problem" (PDF). Retrieved 23 July 2011.
 M. Brune, E. Hagley, J. Dreyer, X. Maître, A. Maali, C. Wunderlich, J. M. Raimond, S. Haroche (9 December 1996). "Observing the Progressive Decoherence of the "Meter" in a Quantum Measurement". Phys. Rev. Lett. 77 (24): 4887–4890. Bibcode:1996PhRvL..77.4887B. doi:10.1103/PhysRevLett.77.4887. PMID 10062660.CS1 maint: uses authors parameter (link)
 "Discovery may overcome obstacle for quantum computing: UBC, California researchers". University of British Columbia. 20 July 2011. Retrieved 23 July 2011.
"Our theory also predicted that we could suppress the decoherence, and push the decoherence rate in the experiment to levels far below the threshold necessary for quantum information processing, by applying high magnetic fields. (...) Magnetic molecules now suddenly appear to have serious potential as candidates for quantum computing hardware," said Susumu Takahashi, assistant professor of chemistry and physics at the University of Southern California. "This opens up a whole new area of experimental investigation with sizeable potential in applications, as well as for fundamental work."
 "USC Scientists Contribute to a Breakthrough in Quantum Computing". University of California, Santa Barbara. 20 July 2011. Retrieved 23 July 2011.
 "Breakthrough removes major hurdle for quantum computing". ZDNet. 20 July 2011. Retrieved 23 July 2011.
 "Quantum computers may be destroyed by high-energy particles from space". New Scientist. Retrieved 7 September 2020.
 "Cosmic rays may soon stymie quantum computing". phys.org. Retrieved 7 September 2020.
 Vepsäläinen, Antti P.; Karamlou, Amir H.; Orrell, John L.; Dogra, Akshunna S.; Loer, Ben; Vasconcelos, Francisca; Kim, David K.; Melville, Alexander J.; Niedzielski, Bethany M.; Yoder, Jonilyn L.; Gustavsson, Simon; Formaggio, Joseph A.; VanDevender, Brent A.; Oliver, William D. (August 2020). "Impact of ionizing radiation on superconducting qubit coherence". Nature. 584 (7822): 551–556. arXiv:2001.09190. doi:10.1038/s41586-020-2619-8. ISSN 1476-4687. PMID 32848227. S2CID 210920566. Retrieved 7 September 2020.
 Leggett, A. J. (2001). "Probing quantum mechanics towards the everyday world: where do we stand?". Physica Scripta. 102 (01): 69–73. doi:10.1238/Physica.Topical.102a00069.
 Leggett, A. J. (2002). "Testing the limits of quantum mechanics: Motivation, state of play, prospects". Journal of Physics: Condensed Matter. 14 (15): R415–R451. doi:10.1088/0953-8984/14/15/201.
 David Bohm, A Suggested Interpretation of the Quantum Theory in Terms of "Hidden Variables", I, Physical Review, (1952), 85, pp. 166–179.
 David Bohm, A Suggested Interpretation of the Quantum Theory in Terms of "Hidden Variables", II, Physical Review, (1952), 85, pp. 180–193.
 Hugh Everett, Relative State Formulation of Quantum Mechanics, Reviews of Modern Physics, vol. 29, (1957) pp. 454–462.
 Wojciech H. Zurek, Pointer Basis of Quantum Apparatus: Into what Mixture does the Wave Packet Collapse?, Physical Review D, 24, pp. 1516–1525 (1981).
 Wojciech H. Zurek, EnvironmentInduced Superselection Rules, Physical Review D, 26, pp. 1862–1880, (1982).
 Roger Penrose (2004), The Road to Reality, pp. 802–803: "...the environmental-decoherence viewpoint [...] maintains that state vector reduction [the R process] can be understood as coming about because the environmental system under consideration becomes inextricably entangled with its environment. [...] We think of the environment as extremely complicated and essentially 'random' [...], accordingly we sum over the unknown states in the environment to obtain a density matrix [...] Under normal circumstances, one must regard the density matrix as some kind of approximation to the whole quantum truth. For there is no general principle providing an absolute bar to extracting information from the environment. [...] Accordingly, such descriptions are referred to as FAPP [for all practical purposes]".
 Huw Price (1996), Times' Arrow and Archimedes' Point, p. 226: "There is a world of difference between saying 'the environment explains why collapse happens where it does' and saying 'the environment explains why collapse seems to happen even though it doesn't really happen'."
Further reading
 Schlosshauer, Maximilian (2007). Decoherence and the Quantum-to-Classical Transition (1st ed.). Berlin/Heidelberg: Springer.
 Joos, E.; et al. (2003). Decoherence and the Appearance of a Classical World in Quantum Theory (2nd ed.). Berlin: Springer.
 Omnes, R. (1999). Understanding Quantum Mechanics. Princeton: Princeton University Press.
 Zurek, Wojciech H. (2003). "Decoherence and the transition from quantum to classical – REVISITED", arXiv:quant-ph/0306072 (An updated version of Physics Today, 44:36–44 (1991) article)
 Schlosshauer, Maximilian (23 February 2005). "Decoherence, the Measurement Problem, and Interpretations of Quantum Mechanics". Reviews of Modern Physics. 76 (2004): 1267–1305. arXiv:quant-ph/0312059. Bibcode:2004RvMP...76.1267S. doi:10.1103/RevModPhys.76.1267. S2CID 7295619.
 J. J. Halliwell, J. Perez-Mercader, Wojciech H. Zurek, eds, The Physical Origins of Time Asymmetry, Part 3: Decoherence, ISBN 0-521-56837-4
 Berthold-Georg Englert, Marlan O. Scully & Herbert Walther, Quantum Optical Tests of Complementarity, Nature, Vol 351, pp. 111–116 (9 May 1991) and (same authors) The Duality in Matter and Light, Scientific American, pp. 56–61 (December 1994). Demonstrates that complementarity is enforced, and quantum interference effects destroyed, by irreversible object-apparatus correlations, and not, as was previously popularly believed, by Heisenberg's uncertainty principle itself.
 Mario Castagnino, Sebastian Fortin, Roberto Laura and Olimpia Lombardi, A general theoretical framework for decoherence in open and closed systems, Classical and Quantum Gravity, 25, pp. 154002–154013, (2008). A general theoretical framework for decoherence is proposed, which encompasses formalisms originally devised to deal just with open or closed systems.
External links
 Decoherence.info by Erich Joos
 http://plato.stanford.edu/entries/qm-decoherence/
 Schlosshauer, Maximilian (2005). "Decoherence, the measurement problem, and interpretations of quantum mechanics". Reviews of Modern Physics. 76 (4): 1267–1305. arXiv:quant-ph/0312059. doi:10.1103/RevModPhys.76.1267. S2CID 7295619.
 Dass, Tulsi (2005). "Measurements and Decoherence". arXiv:quant-ph/0505070.
 A detailed introduction from a graduate student's website at Drexel University
 Quantum Bug: Qubits might spontaneously decay in seconds, Scientific American (October 2005)
 Quantum Decoherence and the Measurement Problem