Tuesday, May 6, 2025

Characteristics of static disorder can emerge from electron-phonon interactions

Electronic systems with large amounts of static disorder can exhibit distinctive properties, including localisation of electronic states and sub-gap tails in the density of states and in the electronic absorption.

Eric Heller and collaborators have recently published a nice series of papers showing how these properties can also appear, at least on sufficiently long time scales, in the absence of disorder, due to the electron-phonon interaction. On a technical level, they use a coherent-state representation for the phonons, which provides a natural way of taking a classical limit, similar to what is done for photons in quantum optics. Details are set out in the following paper
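
For orientation, recall that a coherent state is an eigenstate of the phonon annihilation operator, and the expectation value of the lattice displacement in such a state oscillates like a classical wave (a textbook result; the single-mode notation, with mass M and frequency ω_q, is my own, not that of the paper):

\[
a_q \lvert \alpha_q \rangle = \alpha_q \lvert \alpha_q \rangle, \qquad
\langle \alpha_q \rvert\, \hat{u}_q(t)\, \lvert \alpha_q \rangle
= \sqrt{\frac{2\hbar}{M\omega_q}}\; |\alpha_q| \cos(\omega_q t - \varphi_q),
\]

so that large \(|\alpha_q|\) corresponds to the classical limit of the lattice vibration.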

Coherent charge carrier dynamics in the presence of thermal lattice vibrations, Donghwan Kim, Alhun Aydin, Alvar Daza, Kobra N. Avanaki, Joonas Keski-Rahkonen, and Eric J. Heller

This work brought back memories from long ago when I was a postdoc with John Wilkins. I was puzzled by several related things about quasi-one-dimensional electronic systems, such as polyacetylene, that undergo a Peierls instability. First, the zero-point motion of the lattice was comparable to the lattice dimerisation that produces the energy gap at the Fermi energy. Second, even in clean systems, there was a large sub-gap optical absorption. Third, there was no sign of the square-root singularity in the density of states predicted by theories that treat the lattice classically, i.e., that calculate electronic properties in the Born-Oppenheimer approximation.
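
The third puzzle can be made concrete. For a rigid, dimerised lattice, the mean-field (Born-Oppenheimer) treatment of the standard one-dimensional Dirac model of the Peierls state predicts an inverse-square-root singularity in the density of states at the gap edge (per unit length; v_F is the Fermi velocity and 2Δ the energy gap):

\[
\rho(E) = \frac{1}{\pi\hbar v_F}\, \frac{|E|}{\sqrt{E^2 - \Delta^2}}, \qquad |E| > \Delta,
\]

and it is this singularity that is absent in the experimental data.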

I found that, on the energy scales relevant to the sub-gap absorption, the phonons could be treated as static disorder, allowing me to make use of known exact results for one-dimensional Dirac equations with random disorder. This resolved the puzzles.
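
Schematically, the relevant model is a one-dimensional Dirac equation with a spatially random gap parameter (often called the fluctuating gap model); in notation of my own choosing,

\[
H = -i\hbar v_F\, \sigma_3\, \partial_x + \Delta(x)\, \sigma_1, \qquad
\langle \delta\Delta(x)\, \delta\Delta(x') \rangle = \eta\, \delta(x - x'),
\]

where σ₁ and σ₃ are Pauli matrices, δΔ(x) is the fluctuation of the gap about its mean value, and η sets the strength of the effective static disorder.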

Effect of Lattice Zero-Point Motion on Electronic Properties of the Peierls-Fröhlich State

The disorder model can also be motivated by considering the Feynman diagrams in the perturbation expansion of the electronic Green's function in powers of the electron-phonon interaction. In the limit that the phonon frequency ω₀ is small, all the diagrams become like those for a disordered system, where the strength of the static disorder is given by the mean-square lattice fluctuation,

\[
\langle u^2 \rangle = \frac{\hbar}{2M\omega_0}\, \coth\!\left(\frac{\hbar\omega_0}{2k_B T}\right),
\]

which interpolates between the zero-point fluctuations as T → 0 and the thermal fluctuations for k_B T ≫ ħω₀.

I then teamed up with another postdoc, Kihong Kim, who calculated the optical conductivity for this disorder model.

Universal subgap optical conductivity in quasi-one-dimensional Peierls systems

Two things were surprising about our results. First, the theory agreed well with experimental results for a range of materials, including the temperature dependence. Second, the frequency dependence had a universal form. Wilkins was clever and persistent at extracting such forms, probably from his experience working on the Kondo problem.

Friday, May 2, 2025

Could quantum mechanics be emergent?

One of the biggest challenges in the foundations of physics is the quantum measurement problem. It is associated with a few key (distinct but related) questions.

i. How does a measurement convert a coherent state undergoing unitary dynamics to a "classical" mixed state for which we can talk about probabilities of outcomes?

ii. Why is the outcome of an individual measurement always definite for the "pointer states" of the measuring apparatus?

iii. Can one derive the Born rule, which gives the probability of a particular outcome?

Emergence of the classical world from the quantum world via decoherence

A quantum system always interacts to some extent with its environment. This interaction leads to decoherence, whereby quantum interference effects are washed out. Consequently, superposition states of the system decay into mixed states described by a diagonal density matrix. A major research goal of the past three decades has been understanding decoherence and the extent to which it provides answers to the quantum measurement problem.

At first sight, decoherence theory seems to give a mechanism and time scale for the “collapse of the wavefunction” within the framework of unitary dynamics. However, decoherence is not the same as a projection (which is what a single quantum measurement is): it produces statistical mixtures, not definite outcomes. Decoherence only resolves the issue if one identifies the ensemble of measurement outcomes with the ensemble described by the decohered density matrix (the statistical interpretation of quantum mechanics). In other words, decoherence does not solve the quantum measurement problem, since individual measurements always produce definite outcomes. Thus, it seems decoherence answers the first question above, but not the last two. On the other hand, Zurek has pushed the decoherence picture further and given a “derivation” of the Born rule within its framework.
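
As a toy illustration of the distinction, here is a minimal Python sketch (my own, not from any of the papers discussed) of pure dephasing of a single qubit in its pointer basis: the off-diagonal coherences decay, leaving a statistical mixture rather than a single definite outcome.

```python
import numpy as np

# Density matrix of the superposition |+> = (|0> + |1>)/sqrt(2)
rho0 = 0.5 * np.array([[1, 1],
                       [1, 1]], dtype=complex)

def dephase(rho, t, tau):
    """Suppress the off-diagonal coherences by exp(-t/tau) in the pointer basis.
    This is a phase-damping (pure dephasing) channel, not a projection."""
    out = rho.copy()
    decay = np.exp(-t / tau)
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

rho_late = dephase(rho0, t=10.0, tau=1.0)
print(np.round(rho_late.real, 4))
# -> approximately [[0.5, 0.], [0., 0.5]]: a statistical mixture of the two
#    pointer states, with Born-rule weights, but no single definite outcome.
```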

One approach to solving the problem is to view quantum theory as only an approximate theory. In particular, it could be an effective theory emerging from some underlying theory that operates at time and length scales much smaller than those at which quantum theory has been precisely tested by experiment.

Emergence of quantum field theory from a “classical” statistical theory

Einstein did not accept the statistical nature of quantum theory and believed it should be derivable from a more “realistic” theory. In particular, he suggested that, given “a complete physical description, the statistical quantum theory would …. take an approximately analogous position to the statistical mechanics within the framework of classical mechanics.”

Einstein's challenge was taken up in a concrete and impressive fashion by Stephen Adler in his book, “Quantum Theory as an Emergent Phenomenon: The Statistical Mechanics of Matrix Models as the Precursor of Quantum Field Theory”, published in 2004. A helpful summary is given in a review by Pearle.

The starting point is “classical” dynamical variables q_r and p_r, which are N×N matrices, where N is even. Half of these variables are bosonic, and the other half are fermionic. They all obey Hamilton's equations of motion for an unspecified Hamiltonian H. Three quantities are conserved: H, the fermion number N, and (very importantly) the traceless anti-self-adjoint matrix

\[
C = \sum_{r \in \mathrm{bosons}} [q_r, p_r] \;-\; \sum_{r \in \mathrm{fermions}} \{q_r, p_r\},
\]

where the first term is the sum over all the bosonic variables of their commutators, and the second is the sum over anti-commutators of the fermionic variables.

Quantum theory is obtained by tracing over all the classical variables with respect to a canonical ensemble with three (matrix) Lagrange multipliers [analogues of the temperature and chemical potential in conventional statistical mechanics] corresponding to the conserved quantities H, N, and C. The expectation values of the diagonal elements of C are all assumed to have the same value: ħ!
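
Schematically, the ensemble has the canonical form one would guess from conventional statistical mechanics (the notation here is mine, adapted loosely from Pearle's review, and should be taken as indicative rather than exact):

\[
\rho(\{q_r, p_r\}) \propto \exp\!\left(-\tau H - \eta N - \mathrm{Tr}\,[\tilde{\lambda}\, C]\right),
\]

where τ and η are the multipliers conjugate to H and N, and λ̃ is the matrix multiplier conjugate to C.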

An analogue of the equipartition theorem of classical statistical mechanics (which looks like a Ward identity in quantum field theory) leads to dynamical equations (trace dynamics) for effective fields. To make these equations look like regular quantum field theory, an assumption is made about a hierarchy of length, energy, and “temperature” [Lagrange multiplier] scales, which causes the dynamics to be dominated by C rather than by the trace Hamiltonian H. Adler suggests these may be Planck scales. Then the usual quantum dynamical equations, and the Dirac correspondence between Poisson brackets and commutators, emerge. Most of the details of the trace Hamiltonian H do not matter: another case of universality, a common characteristic of emergent phenomena.
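
Written schematically (again in my notation, not Adler's), the emergent relations are the canonical commutator, the Heisenberg equation of motion, and the Dirac correspondence:

\[
[q_{\mathrm{eff}}, p_{\mathrm{eff}}] = i\hbar, \qquad
i\hbar\,\frac{dA_{\mathrm{eff}}}{dt} = [A_{\mathrm{eff}}, H_{\mathrm{eff}}], \qquad
\{A, B\}_{\mathrm{PB}} \;\longrightarrow\; \frac{1}{i\hbar}[A, B].
\]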

The “classical” field C fluctuates about its average value. These fluctuations can be identified with corrections to locality in quantum field theory and with the noise terms that appear in the modified Schrödinger equation of “physical collapse” models of quantum theory.

More recently, theorists including Gerard 't Hooft and John Preskill have investigated how quantum mechanics can emerge from other deterministic systems. This is sometimes known as the emergent quantum mechanics (EmQM) hypothesis.

Underlying deterministic systems considered include

Hamilton-Randers systems defined in co-tangent spaces of large-dimensional configuration spaces,

neural networks,

cellular automata,

fast-moving classical variables, and the

boundary of a local classical model with a length that is exponentially large in the number of qubits in the quantum system.

In most of these versions of EmQM the length scale at which the underlying theory becomes relevant is conjectured to be of the order of the Planck length.

The fact that quantum theory can emerge from such a diverse range of underlying theories again illustrates universality.

The question of whether quantum physics emerges from an underlying classical theory is not just a question in the foundations of physics or in philosophy. Slagle points out that Emergent Quantum Mechanics may mean that the computational power of quantum computers is severely limited. He has proposed a specific experimental protocol to test for EmQM. A large number d of entangling gates (the circuit depth) is applied to n qubits prepared in the computational basis, followed by the inverse gates and then by measurement in the computational basis. In standard quantum mechanics, the fidelity should decay exponentially with d (due to ordinary gate errors), whereas for EmQM it should decay much faster above some critical depth, for sufficiently large n; a toy sketch of the protocol is given below.
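
Here is a minimal numpy sketch of the echo protocol (my own illustration, not Slagle's code; the uniform per-gate error rate is an assumed stand-in for ordinary hardware noise):

```python
import numpy as np

def haar_random_unitary(dim, rng):
    """Haar-distributed random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases  # multiplies column j of q by phases[j]

def echo_fidelity(n_qubits, depth, error_rate, rng):
    """Apply `depth` layers of random entangling unitaries, then their inverses,
    and return the probability of recovering the initial computational-basis state.
    Ordinary imperfections are modelled crudely as a per-gate fidelity loss."""
    dim = 2 ** n_qubits
    psi0 = np.zeros(dim, dtype=complex)
    psi0[0] = 1.0
    layers = [haar_random_unitary(dim, rng) for _ in range(depth)]
    psi = psi0
    for u in layers:              # forward circuit
        psi = u @ psi
    for u in reversed(layers):    # inverse circuit (exact echo in ideal QM)
        psi = u.conj().T @ psi
    ideal = abs(np.vdot(psi0, psi)) ** 2           # equals 1 up to round-off
    return (1 - error_rate) ** (2 * depth) * ideal  # exponential decay in depth

rng = np.random.default_rng(0)
for d in [1, 5, 10, 20]:
    print(d, echo_fidelity(n_qubits=3, depth=d, error_rate=0.01, rng=rng))
# Standard QM: log-fidelity is linear in d. The EmQM signature (per Slagle)
# would be a much faster collapse above some critical depth for large n.
```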

Independent of experimental evidence, EmQM provides an alternative interpretation of quantum theory that avoids thorny issues such as those raised by the many-worlds interpretation.
