One of the biggest challenges in the foundations of physics is the quantum measurement problem. It is associated with a few key (distinct but related) questions.
i. How does a measurement convert a coherent state undergoing unitary dynamics to a "classical" mixed state for which we can talk about probabilities of outcomes?
ii. Why is the outcome of an individual measurement always definite for the "pointer states" of the measuring apparatus?
iii. Can one derive the Born rule, which gives the probability of a particular outcome?
Emergence of the classical world from the quantum world via decoherence
A quantum system always interacts to some extent with its environment. This interaction leads to decoherence, whereby quantum interference effects are washed out. Consequently, superposition states of the system decay into mixed states described by a diagonal density matrix. A major research goal of the past three decades has been understanding decoherence and the extent to which it provides answers to the quantum measurement problem. At first sight, decoherence theory seems to give a mechanism and time scale for the “collapse of the wavefunction” within the framework of unitary dynamics. However, this is not quite the case, because decoherence is not the same as a projection (which is what a single quantum measurement is). Decoherence does not produce definite outcomes, only statistical mixtures; it resolves the issue only if one identifies ensembles of measured states with the ensembles described by the decohered density matrix (the statistical interpretation of quantum mechanics). Thus, decoherence by itself does not solve the quantum measurement problem, since individual measurements always produce definite outcomes; it seems to answer the first question above, but not the last two. On the other hand, Zurek has pushed the decoherence picture further and given a “derivation” of the Born rule within its framework.
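To make the mechanism concrete, here is a minimal numerical sketch (my own toy illustration, not taken from any of the works discussed here) of pure dephasing of a single qubit: the off-diagonal elements of the density matrix are suppressed by an assumed factor exp(-t/tau_D), where the decoherence time tau_D is just an illustrative parameter, while the diagonal probabilities are untouched.

```python
import numpy as np

tau_D = 1.0                                  # assumed decoherence time (arbitrary units)
psi = np.array([1.0, 1.0]) / np.sqrt(2)      # superposition (|0> + |1>)/sqrt(2)
rho0 = np.outer(psi, psi.conj())             # pure-state density matrix

for t in (0.0, 0.5, 2.0, 10.0):
    decay = np.exp(-t / tau_D)
    rho_t = rho0 * np.array([[1.0, decay],   # off-diagonal (coherence) terms decay;
                             [decay, 1.0]])  # diagonal probabilities do not
    purity = np.trace(rho_t @ rho_t).real    # 1 for a pure state, 1/2 for a maximally mixed qubit
    print(f"t = {t:4.1f}   rho = {np.round(rho_t, 3).tolist()}   purity = {purity:.3f}")
```

Even for t much larger than tau_D the state is still a 50/50 statistical mixture; nothing in this description picks out one definite outcome, which is precisely the limitation noted above.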
One approach to solving the problem is to view quantum theory as only an approximate theory. In particular, it could be an effective theory that emerges from some underlying theory which becomes relevant at time and length scales much smaller than those at which quantum theory has been precisely tested by experiment.
Emergence of quantum field theory from a “classical” statistical theory
Einstein did not accept the statistical nature of quantum theory and believed that it should be derivable from a more “realistic” theory. In particular, he suggested that “a complete physical description, the statistical quantum theory would … take an approximately analogous position to the statistical mechanics within the framework of classical mechanics.”
Einstein's challenge was taken up in a concrete and impressive fashion by Stephen Adler in a book, “Quantum Theory as an Emergent Phenomenon: The Statistical Mechanics of Matrix Models as the Precursor of Quantum Field Theory”, published in 2004. A helpful summary is given in a review by Pearle.
The starting point is "classical" dynamical variables qr and pr which are NxN matrices, where N is even. Half of these variables are bosonic, and the others are fermionic. They all obey Hamilton's equations of motion for an unspecified Hamiltonian H. Three quantities are conserved: H, the fermion number N, and (very importantly) the traceless anti-self-adjoint matrix,
where the first term is the sum for all the bosonic variables of their commutator, and the second is the sum over anti-commutators for the fermionic variables.
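As a quick illustration of the stated properties of C, the short sketch below (my own check, not from Adler's book) verifies numerically that the bosonic part of the sum is traceless and anti-self-adjoint when the q_r and p_r are self-adjoint matrices; the fermionic terms involve Grassmann-valued matrices, which ordinary NumPy arrays cannot represent, so they are left out.

```python
import numpy as np

def random_hermitian(N):
    """A random self-adjoint (Hermitian) NxN matrix."""
    a = np.random.randn(N, N) + 1j * np.random.randn(N, N)
    return (a + a.conj().T) / 2

N, n_pairs = 8, 3   # illustrative sizes; Adler's N is meant to be large
# Bosonic contribution to C: a sum of commutators [q_r, p_r] of self-adjoint matrices.
C_bosonic = sum(q @ p - p @ q
                for q, p in ((random_hermitian(N), random_hermitian(N))
                             for _ in range(n_pairs)))

print("traceless:         ", abs(np.trace(C_bosonic)) < 1e-10)
print("anti-self-adjoint: ", np.allclose(C_bosonic.conj().T, -C_bosonic))
```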
Quantum theory is obtained by averaging over all the classical variables with respect to a canonical ensemble with three (matrix) Lagrange multipliers [analogues of the temperature and chemical potential of conventional statistical mechanics] corresponding to the conserved quantities H, N, and C. The expectation values of the diagonal elements of C are all assumed to have the same value: hbar!
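Written out schematically (this is my paraphrase of the construction; as I understand it, the multipliers conjugate to the scalar quantities H and N are scalars while the one conjugate to the matrix C is itself a matrix, and the signs and normalizations below are not meant to be exact), the ensemble weight has the familiar generalized-Gibbs form:

```latex
% Schematic canonical-ensemble weight of trace dynamics (conventions approximate):
% tau and eta are Lagrange multipliers for H and N, and lambda is the matrix
% Lagrange multiplier conjugate to the conserved matrix C.
\rho \;\propto\; \exp\!\Big[\, -\tau\, H \;-\; \eta\, N \;-\; \operatorname{Tr}\!\big(\lambda\, C\big) \,\Big]
```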
An analogue of the equipartition theorem in classical statistical mechanics (which looks like a Ward identity in quantum field theory) leads to dynamical equations (trace dynamics) for effective fields. To make these equations look like regular quantum field theory, an assumption is made about a hierarchy of length, energy, and "temperature" [Lagrange multiplier] scales, which causes the dynamics to be dominated by C rather than by the trace Hamiltonian H. Adler suggests these scales may be Planck scales. Then the usual quantum dynamical equations, and the Dirac correspondence between Poisson brackets and commutators, emerge. Most of the actual details of the trace Hamiltonian H do not matter; this is another case of universality, a common characteristic of emergent phenomena.
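To spell out what "the usual quantum dynamical equations" means here (these are the standard relations, written in my own notation with an "eff" label for the ensemble-averaged effective fields):

```latex
% The relations that emerge for the effective (ensemble-averaged) fields:
% canonical commutators, Heisenberg-picture evolution, and the Dirac correspondence.
[\,q_r^{\rm eff},\, p_s^{\rm eff}\,] = i\hbar\,\delta_{rs}, \qquad
i\hbar\,\frac{dA^{\rm eff}}{dt} = [\,A^{\rm eff},\, H^{\rm eff}\,], \qquad
\{A, B\}_{\rm PB} \;\longrightarrow\; \frac{1}{i\hbar}\,[A, B].
```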
The “classical” field C fluctuates about its average value. These fluctuations can be identified with corrections to locality in quantum field theory and with the noise terms that appear in the modified Schrödinger equation of "physical collapse" models of quantum theory.
More recently, theorists including Gerard 't Hooft and John Preskill have investigated how quantum mechanics might emerge from other deterministic systems. This is sometimes known as the emergent quantum mechanics (EmQM) hypothesis.
Underlying deterministic systems considered include
Hamilton-Randers systems defined in co-tangent spaces of large-dimensional configuration spaces,
neural networks,
cellular automata,
fast-moving classical variables, and the
boundary of a local classical model with a length that is exponentially large in the number of qubits in the quantum system.
In most of these versions of EmQM the length scale at which the underlying theory becomes relevant is conjectured to be of the order of the Planck length.
The fact that quantum theory can emerge from such a diverse range of underlying theories again illustrates universality.
The question of quantum physics emerging from an underlying classical theory is not just a question in the foundations of physics or in philosophy. Slagle points out that emergent quantum mechanics may mean that the computational power of quantum computers is severely limited. He has proposed a specific experimental protocol to test for EmQM. A large number d of entangling gates (the circuit depth) is applied to n qubits prepared in a computational-basis state, followed by the inverse gates and then a measurement in the computational basis. Standard quantum mechanics predicts that the fidelity (the probability of recovering the initial state) decays exponentially with d, whereas for EmQM it should decay much faster above some critical depth, for sufficiently large n.
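The protocol is simple enough to sketch in code. The toy simulation below is my own illustration (not Slagle's implementation); the function names, the brickwork layout of random two-qubit gates, and the error probability p per qubit per gate are all illustrative choices. It runs the echo circuit under standard quantum mechanics with a crude random-Pauli error model, giving the roughly exponential decay of fidelity with depth d against which a sharper EmQM-style collapse of the fidelity would be compared.

```python
import numpy as np

def random_two_qubit_gate():
    """Haar-random 4x4 unitary via QR decomposition of a complex Gaussian matrix."""
    z = np.random.randn(4, 4) + 1j * np.random.randn(4, 4)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # phase fix for uniformity

def apply_gate(state, gate, qubits, n):
    """Apply a k-qubit gate (2^k x 2^k matrix) to the given qubits of an n-qubit statevector."""
    k = len(qubits)
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate.reshape([2] * (2 * k)), psi,
                       axes=(list(range(k, 2 * k)), qubits))
    psi = np.moveaxis(psi, list(range(k)), qubits)
    return psi.reshape(-1)

PAULIS = [np.array([[0, 1], [1, 0]]),             # X
          np.array([[0, -1j], [1j, 0]]),          # Y
          np.array([[1, 0], [0, -1]])]            # Z

def add_noise(state, qubits, n, p):
    """With probability p per qubit, apply a random Pauli (X, Y, or Z) error."""
    for q in qubits:
        if np.random.rand() < p:
            state = apply_gate(state, PAULIS[np.random.randint(3)], [q], n)
    return state

def echo_fidelity(n=6, depth=10, p=0.01, shots=50):
    """Average probability of returning to |0...0> after d random layers and their inverses."""
    fids = []
    for _ in range(shots):
        psi0 = np.zeros(2 ** n, dtype=complex)
        psi0[0] = 1.0                             # start in the computational-basis state |00...0>
        psi = psi0.copy()
        layers = []
        for layer in range(depth):
            pairs = [(i, i + 1) for i in range(layer % 2, n - 1, 2)]   # brickwork pattern
            gates = [(random_two_qubit_gate(), list(pair)) for pair in pairs]
            layers.append(gates)
            for g, qs in gates:
                psi = add_noise(apply_gate(psi, g, qs, n), qs, n, p)
        for gates in reversed(layers):            # undo the circuit, gate by gate
            for g, qs in reversed(gates):
                psi = add_noise(apply_gate(psi, g.conj().T, qs, n), qs, n, p)
        fids.append(abs(np.vdot(psi0, psi)) ** 2)
    return np.mean(fids)

for d in (2, 5, 10, 20):
    print(f"depth {d:2d}: fidelity ~ {echo_fidelity(depth=d):.3f}")
```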
Independent of experimental evidence, EmQM provides an alternative interpretation of quantum theory, one that avoids thorny issues such as those associated with the many-worlds interpretation.