Monday, May 26, 2025

Emergence and quantum theories of gravity

Einstein’s theory of General Relativity successfully describes gravity at large scales of length and mass. In contrast, quantum theory describes matter at small scales of length and mass. Emergence is central to most attempts to unify the two theories. Before considering specific examples, it is useful to make some distinctions.

First, a quantum theory of gravity is not necessarily the same as a theory to unify gravity with the three other forces described by the Standard Model. Whether the two problems are inextricable is unknown.

Second, there are two distinct possibilities on how classical gravity might emerge from a quantum theory. In Einstein’s theory of General Relativity, space-time and gravity are intertwined. Consequently, the two possibilities are as follows.

i. Space-time is not emergent. Classical General Relativity emerges from an underlying quantum field theory describing fields at small length scales, probably comparable to the Planck length.

ii. Space-time emerges from some underlying granular structure. In some limit, classical gravity emerges with the space-time continuum. 

Third, there are "bottom-up" and "top-down" approaches to discovering how classical gravity emerges from an underlying quantum theory, as was emphasised by Bei Lok Hu.

Finally, there is the possibility that quantum theory itself is emergent, as discussed in an earlier post about the quantum measurement problem. Some proposals of Emergent Quantum Mechanics (EmQM) attempt to include gravity.

I now mention several different approaches to quantum gravity and, for each, point out how they fit into the distinctions above.

Gravitons and semi-classical theory

A simple bottom-up approach is to start with classical General Relativity and consider gravitational waves as the normal modes of oscillation of the space-time continuum. They have a linear dispersion relation and travel at the speed of light. They are analogous to sound waves in an elastic medium. Semi-classical quantisation of gravitational waves leads to gravitons, the quanta of a massless spin-2 field. They are the analogue of phonons in a crystal or photons in the electromagnetic field. However, this reveals nothing about an underlying quantum theory, just as phonons with a linear dispersion relation reveal nothing about the underlying crystal structure.
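To make the analogy concrete, here is the standard linearised treatment (textbook material, in my notation). Writing the metric as a small perturbation of flat space-time,

\[ g_{\mu\nu} = \eta_{\mu\nu} + h_{\mu\nu}, \qquad |h_{\mu\nu}| \ll 1, \]

the vacuum Einstein equations in the transverse-traceless gauge reduce to a wave equation,

\[ \Box \, h_{\mu\nu} = 0, \]

with plane-wave solutions \( h_{\mu\nu} \propto e^{i(\mathbf{k}\cdot\mathbf{x} - \omega t)} \) and the linear dispersion relation \( \omega = c|\mathbf{k}| \). Semi-classical quantisation assigns each mode quanta of energy \( \hbar\omega \): gravitons, with two polarisations, just as for photons.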

On the other hand, one can start with a massless spin-2 quantum field and consider how it scatters off massive particles. In the 1960s, Weinberg showed that gauge invariance of the scattering amplitudes implies the equivalence principle (inertial and gravitational mass are identical) and the Einstein field equations. In a sense, this is a top-down approach, as it derives General Relativity from an underlying quantum theory. In passing, I mention that Weinberg used a similar approach to derive charge conservation and Maxwell’s equations of classical electromagnetism, and classical Yang-Mills theory for non-abelian gauge fields.

Weinberg pointed out that this could go against his reductionist claim that in the hierarchy of the sciences, the arrows of the explanation always point down, saying “sometimes it isn't so clear which way the arrows of explanation point… Which is more fundamental, general relativity or the existence of particles of mass zero and spin two?”

More recently, Weinberg discussed General Relativity as an effective field theory:

"... we should not despair of applying quantum field theory to gravitation just because there is no renormalizable theory of the metric tensor that is invariant under general coordinate transformations. It increasingly seems apparent that the Einstein–Hilbert Lagrangian √gR is just the least suppressed term in the Lagrangian of an effective field theory containing every possible generally covariant function of the metric and its derivatives..."

This is a bottom-up approach. Weinberg then went on to discuss a top-down approach:

“it is usually assumed that in the quantum theory of gravitation, when Λ reaches some very high energy, of the order of 10^15 to 10^18 GeV, the appropriate degrees of freedom are no longer the metric and the Standard Model fields, but something very different, perhaps strings... But maybe not..."

String theory 

Versions of string theory from the 1980s aimed to unify all four forces. They were formulated in terms of nine spatial dimensions and a large internal symmetry group, such as SO(32), with supersymmetric strings as the fundamental units. In the low-energy limit, vibrations of the strings are identified with elementary particles in four-dimensional space-time. A particle with mass zero and spin two appears as an immediate consequence of the symmetries of the string theory. Hence, string theory was originally claimed to be a quantum theory of gravity. However, subsequent developments have found that there are many alternative string theories, and it is not possible to formulate the theory in terms of a unique vacuum.

AdS/CFT correspondence

In the context of string theory, this correspondence conjectures a duality between classical gravity in anti-de Sitter space-time (AdS) and quantum conformal field theories (CFTs), including some gauge theories. This connection can be interpreted in two different ways. One is that space-time emerges from the quantum theory. Alternatively, the quantum theory emerges from the classical gravity theory. This ambiguity of interpretation has been highlighted by Alyssa Ney, a philosopher of physics. In other words, it is ambiguous which of the two sides of the duality is the more fundamental. Witten has argued that AdS/CFT suggests that gauge symmetries are emergent. However, I cannot follow his argument.

Seiberg reviewed different approaches within the string theory community that lead to spacetime being emergent. An example of a toy model is a matrix model for quantum mechanics [which can be viewed as a zero-dimensional field theory]. Perturbation expansions can be viewed as discretised two-dimensional surfaces. In a large-N limit, two-dimensional space and general covariance (the starting point for general relativity) both emerge. Thus, both two-dimensional gravity and spacetime can be emergent. However, this type of emergence is distinct from how low-energy theories emerge. Seiberg also notes that there are no examples of toy models where time (which is associated with locality and causality) is emergent.

Loop quantum gravity 

This is a top-down approach in which both space-time and gravity emerge together from a granular structure, sometimes referred to as a "spin foam" or a “spin network”; it has been reviewed by Rovelli. The starting point is Ashtekar’s demonstration that General Relativity can be described using the phase space of an SU(2) Yang-Mills theory. A boundary in four-dimensional space-time can be decomposed into cells, and this can be used to define a dual graph (lattice) Γ. The gravitational field on this discretised boundary is represented by the Hilbert space of a lattice SU(2) Yang-Mills theory. The quantum numbers used to define a basis for this Hilbert space are the graph Γ, the “spin” [SU(2) quantum number] associated with the face of each cell, and the volumes of the cells. The Planck length sets a lower bound on the size of the cells. In the limit of the continuum and then of large spin, or vice versa, one obtains General Relativity.

Quantum thermodynamics of event horizons

A bottom-up approach was taken by Padmanabhan. He emphasises Boltzmann's insight that "matter can only store and transfer heat because of internal degrees of freedom". In other words, if something has a temperature and entropy, then it must have a microstructure. He applies this insight by considering the connection between event horizons in General Relativity and the temperature of the thermal radiation associated with them. He frames his research as attempting to estimate Avogadro’s number for space-time.

The temperature and entropy associated with event horizons have been calculated for the following specific space-times (the standard expressions for the temperatures are collected below the list):

a. For accelerating frames of reference (Rindler space-time), there is an event horizon that exhibits Unruh radiation, with a temperature first calculated by Fulling, Davies, and Unruh.

b. The black hole horizon in the Schwarzschild metric has the temperature of Hawking radiation.

c. The cosmological horizon in de Sitter space is associated with a temperature proportional to the Hubble constant H, as discussed in detail by Gibbons and Hawking.
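For reference, the standard expressions for these three temperatures (with fundamental constants restored; notation mine) are

\[ k_B T_{\rm Unruh} = \frac{\hbar a}{2\pi c}, \qquad k_B T_{\rm Hawking} = \frac{\hbar c^3}{8\pi G M}, \qquad k_B T_{\rm dS} = \frac{\hbar H}{2\pi}, \]

where a is the proper acceleration of the observer and M is the mass of the black hole.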

Padmanabhan considers the number of degrees of freedom on the boundary of the event horizon, N_s, and in the bulk, N_b. He argues for the holographic principle that N_s = N_b. On the boundary surface, there is one degree of freedom associated with every Planck area, N_s = A/L_p^2, where L_p is the Planck length and A is the surface area, which is related to the entropy of the horizon, as first discussed by Bekenstein and Hawking. In the bulk, classical equipartition of energy is assumed, so the bulk energy E = N_b k_B T/2.

Padmanabhan gives an alternative perspective on cosmology through a novel derivation of the dynamic equations for the scale factor R(t) in the Friedmann-Robertson-Walker metric of the universe in General Relativity. His starting point is a simple argument leading to

\[ \frac{dV}{dt} = L_p^2 \left( N_s - N_b \right), \]

where V = 4\pi/(3H^3) is the Hubble volume, H is the Hubble constant, and L_p is the Planck length. The right-hand side is zero for the de Sitter universe, which is predicted to be the asymptotic state of our current universe.
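As a quick consistency check (in units with c = 1, where the Hubble radius is 1/H and the horizon area is A = 4\pi/H^2): for the de Sitter universe H is constant, so the Hubble volume is constant, dV/dt = 0, and the equation reduces to holographic equipartition,

\[ N_s = \frac{A}{L_p^2} = \frac{4\pi}{L_p^2 H^2} = N_b . \]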

He also presents an argument that the cosmological constant is related to the Planck length, leading to an expression involving a dimensionless constant μ of order unity that gives a value consistent with observation.

Tuesday, May 20, 2025

The triumphs of lattice gauge theory

When first proposed by Ken Wilson in 1974, lattice gauge theory was arguably a toy model, i.e., an oversimplification. He treated space-time as a discrete lattice purely to make analysis more tractable. Borrowing insights and techniques from lattice models in statistical mechanics, Wilson could then argue for quark confinement, showing that the confining potential grows linearly with distance.
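The key step (standard material, sketched in my notation) is the behaviour of the Wilson loop in the strong-coupling expansion: for a large closed loop C enclosing area A,

\[ \langle W(C) \rangle \sim e^{-\sigma A} . \]

For a rectangular loop of spatial extent R and temporal extent T, this "area law" corresponds to a potential V(R) ≈ σR between a static quark and antiquark, i.e., a potential that grows linearly with separation, with string tension σ.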

Earlier, in 1971, Wegner had proposed a Z2 gauge theory in the context of generalised Ising models in statistical mechanics to show how a phase transition was possible without a local order parameter, i.e., without symmetry breaking. Later, it was shown that the phase transition is similar to the confinement-deconfinement phase transition that occurs in QCD. [A nice review from 2014 by Wegner is here]. This work also provided a toy model to illustrate the possibility of a quantum spin liquid.
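To give a feel for how such models are simulated, here is a minimal sketch (my own toy code, not from any of the papers mentioned) of a Metropolis Monte Carlo simulation of a Z2 lattice gauge theory; the lattice size, coupling, and sweep count are arbitrary choices. A two-dimensional lattice is used purely for brevity; the confinement-deconfinement transition Wegner discussed occurs in higher dimensions, but the update scheme is identical.

```python
import numpy as np

# Toy Metropolis simulation of a Z2 lattice gauge theory (after Wegner).
# Link variables U = +1 or -1 live on the links of an L x L square lattice,
# with action S = -beta * (sum over plaquettes of the product of the
# four links around each plaquette).

rng = np.random.default_rng(0)
L, beta, n_sweeps = 16, 0.4, 200          # arbitrary toy parameters

# links[x, y, mu]: link leaving site (x, y) in direction mu (0 = x, 1 = y)
links = rng.choice([-1, 1], size=(L, L, 2))

def plaquette(x, y):
    """Product of the four links around the plaquette with corner (x, y)."""
    return (links[x, y, 0] * links[(x + 1) % L, y, 1]
            * links[x, (y + 1) % L, 0] * links[x, y, 1])

def delta_S(x, y, mu):
    """Change in the action if link (x, y, mu) is flipped.
    In 2D each link borders exactly two plaquettes."""
    if mu == 0:
        p = plaquette(x, y) + plaquette(x, (y - 1) % L)
    else:
        p = plaquette(x, y) + plaquette((x - 1) % L, y)
    return 2.0 * beta * p                  # flipping the link reverses both plaquettes

for sweep in range(n_sweeps):
    for x in range(L):
        for y in range(L):
            for mu in range(2):
                dS = delta_S(x, y, mu)
                if dS <= 0 or rng.random() < np.exp(-dS):
                    links[x, y, mu] *= -1  # accept the flip

avg = np.mean([plaquette(x, y) for x in range(L) for y in range(L)])
print(f"average plaquette at beta = {beta}: {avg:.3f}")
```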

Perhaps what was not anticipated was that lattice QCD could be used to accurately calculate properties of elementary particles.

The discrete nature of lattice gauge theory means it is amenable to numerical simulation. It is not necessary to simulate the continuum limit of real spacetime directly: because of universality, the continuum physics can be extracted from the discrete lattice. Due to increases in computational power over the past 50 years and innovations in algorithms, lattice QCD can now be used to calculate properties of nucleons and mesons, such as masses and decay rates, with impressive accuracy. The figure below is taken from a 2008 article in Science.

The masses of three mesons are typically used to fix the masses of the light and strange quarks and the overall length scale. The masses of nine other particles, including the nucleon, are then calculated with uncertainties of less than one per cent, in agreement with experimental values.

An indication that this is a strong coupling problem is that about 95 per cent of the mass of nucleons comes from the interactions. Only about 5 per cent is from the rest mass of the constituent quarks.

For more background on computational lattice QCD, there is a helpful 2004 Physics Today article, which drew a critical response from Herbert Neuberger. A recent (somewhat) pedagogical review by Sasa Prelovsek just appeared on the arXiv.


Tuesday, May 6, 2025

Characteristics of static disorder can emerge from electron-phonon interactions

Electronic systems with large amounts of static disorder can exhibit distinctive properties, including localisation of electronic states and sub-gap tails in the density of states and in the electronic absorption.

Eric Heller and collaborators have recently published a nice series of papers showing how these properties can also appear, at least on sufficiently long time scales, in the absence of disorder, due to the electron-phonon interaction. On a technical level, a coherent state representation for the phonons is used. This provides a natural way of taking a classical limit, similar to what is done in quantum optics for photons. Details are set out in the following paper:

Coherent charge carrier dynamics in the presence of thermal lattice vibrations, Donghwan Kim, Alhun Aydin, Alvar Daza, Kobra N. Avanaki, Joonas Keski-Rahkonen, and Eric J. Heller

This work brought back memories from long ago when I was a postdoc with John Wilkins. I was puzzled by several related things about quasi-one-dimensional electronic systems, such as polyacetylene, that undergo a Peierls instability. First, the zero-point motion of the lattice was comparable to the lattice dimerisation that produced an energy gap at the Fermi energy. Second, even in clean systems, there was a large subgap optical absorption. Third, there was no sign of the square-root singularity in the density of states predicted by theories that treated the lattice classically, i.e., calculated electronic properties in the Born-Oppenheimer approximation.

I found that, on the energy scales relevant to the sub-gap absorption, the phonons could be treated like static disorder, and I could make use of known exact results for one-dimensional Dirac equations with random disorder. This resolved the puzzles.

Effect of Lattice Zero-Point Motion on Electronic Properties of the Peierls-Fröhlich State
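Schematically (my notation, not taken verbatim from the paper), the relevant model is a one-dimensional Dirac equation with a spatially random mass (gap),

\[ H = -i\hbar v_F \, \sigma_z \frac{\partial}{\partial x} + \Delta(x)\, \sigma_x, \qquad \langle \delta\Delta(x)\, \delta\Delta(x') \rangle = \eta\, \delta(x - x'), \]

where v_F is the Fermi velocity, Δ(x) fluctuates about the mean dimerisation gap, and η measures the disorder strength. Exact results are known for the density of states and localisation length of such random-mass Dirac models.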

The disorder model can also be motivated by considering the Feynman diagrams for the perturbation expansion of the electronic Green's function in powers of the electron-phonon interaction. In the limit that the phonon frequency is small, all the diagrams become like those for a disordered system, where the strength of the static disorder is proportional to the mean-square fluctuation of the lattice displacement, which for a harmonic phonon of frequency ω₀ is ⟨u²⟩ = (ħ/2Mω₀) coth(ħω₀/2k_BT).

I then teamed up with another postdoc, Kihong Kim, who calculated the optical conductivity for this disorder model.

Universal subgap optical conductivity in quasi-one-dimensional Peierls systems

Two things were surprising about our results. First, the theory agreed well with experimental results for a range of materials, including the temperature dependence. Second, the frequency dependence had a universal form. Wilkins was clever and persistent at extracting such forms, probably from his experience working on the Kondo problem.

Friday, May 2, 2025

Could quantum mechanics be emergent?

One of the biggest challenges in the foundations of physics is the quantum measurement problem. It is associated with a few key (distinct but related) questions.

i. How does a measurement convert a coherent state undergoing unitary dynamics to a "classical" mixed state for which we can talk about probabilities of outcomes?

ii. Why is the outcome of an individual measurement always definite for the "pointer states" of the measuring apparatus?

iii. Can one derive the Born rule, which gives the probability of a particular outcome?

Emergence of the classical world from the quantum world via decoherence

A quantum system always interacts to some extent with its environment. This interaction leads to decoherence, whereby quantum interference effects are washed out. Consequently, superposition states of the system decay into mixed states described by a diagonal density matrix. A major research goal of the past three decades has been understanding decoherence and the extent to which it provides answers to the quantum measurement problem. One apparent achievement is that decoherence theory seems to give a mechanism and time scale for the “collapse of the wavefunction” within the framework of unitary dynamics. However, this is not quite the case, because decoherence is not the same as a projection (which is what a single quantum measurement is). Decoherence does not produce definite outcomes, but rather statistical mixtures, whereas measurements always produce definite outcomes. Decoherence only resolves the issue if one identifies the ensemble of measured states with the ensemble described by the decohered density matrix (the statistical interpretation of quantum mechanics). Thus, it seems decoherence answers the first question above but not the last two, and so does not by itself solve the quantum measurement problem. On the other hand, Zurek has pushed the decoherence picture further and given a “derivation” of the Born rule within its framework.
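A minimal textbook illustration (my notation): for a single qubit prepared in the superposition a|0⟩ + b|1⟩, coupling to the environment suppresses the off-diagonal elements of the density matrix,

\[ \rho(t) = \begin{pmatrix} |a|^2 & a b^* \, e^{-t/\tau_d} \\ a^* b \, e^{-t/\tau_d} & |b|^2 \end{pmatrix}, \]

where τ_d is the decoherence time. As t → ∞ this becomes a diagonal mixture with probabilities |a|² and |b|², but nothing in the unitary dynamics selects a single definite outcome.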

One approach to solving the problem is to view quantum theory as only an approximate theory. In particular, it could be an effective theory for some underlying theory valid at time and length scales much smaller than those for which quantum theory has been precisely tested by experiments. 

Emergence of quantum field theory from a “classical” statistical theory

Einstein did not accept the statistical nature of quantum theory and considered that it should be derivable from a more “realistic” theory. In particular, he suggested that, given “a complete physical description, the statistical quantum theory would ... take an approximately analogous position to the statistical mechanics within the framework of classical mechanics.”

Einstein's challenge was taken up in a concrete and impressive fashion by Stephen Adler in a book, “Quantum Theory as an Emergent Phenomenon: The Statistical Mechanics of Matrix Models as the Precursor of Quantum Field Theory”, published in 2004.  A helpful summary is given in a review by Pearle.

The starting point is "classical" dynamical variables q_r and p_r which are N×N matrices, where N is even. Half of these variables are bosonic, and the others are fermionic. They all obey Hamilton's equations of motion for an unspecified Hamiltonian H. Three quantities are conserved: H, the fermion number N_F, and (very importantly) the traceless anti-self-adjoint matrix

\[ \tilde{C} = \sum_{r \in B} [q_r, p_r] \; - \; \sum_{r \in F} \{q_r, p_r\}, \]

where the first term is the sum over all the bosonic variables of their commutators, and the second is the sum over the anti-commutators of the fermionic variables.

Quantum theory is obtained by tracing over all the classical variables with respect to a canonical ensemble with three (matrix) Lagrange multipliers [analogues of the temperature and chemical potential in conventional statistical mechanics] corresponding to the conserved quantities H, N_F, and C̃. The expectation values of the diagonal elements of C̃ are assumed to all have the same value, ħ!
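Schematically (my paraphrase, suppressing the technical details of Adler's construction), the ensemble weight has the familiar canonical form, with one Lagrange multiplier per conserved quantity,

\[ \rho \propto \exp\!\left[ -\tau H - \eta N_F - \mathrm{Tr}\,(\tilde{\lambda}\, \tilde{C}) \right], \]

analogous to ρ ∝ e^{−βH} in ordinary statistical mechanics; the matrix multiplier λ̃ is the one that dominates the emergent quantum dynamics.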

An analogue of the equipartition theorem in classical statistical mechanics (which looks like a Ward identity in quantum field theory) leads to dynamical equations (trace dynamics) for effective fields. To make these equations look like regular quantum field theory, an assumption is made about a hierarchy of length, energy, and "temperature" [Lagrange multiplier] scales, which causes the trace dynamics to be dominated by C̃ rather than by the trace Hamiltonian H. Adler suggests these scales may be Planck scales. Then, the usual quantum dynamical equations and the Dirac correspondence between Poisson brackets and commutators emerge. Most of the actual details of the trace Hamiltonian H do not matter; this is another case of universality, a common characteristic of emergent phenomena.

The “classical” field C̃ fluctuates about its average value. These fluctuations can be identified with corrections to locality in quantum field theory and with the noise terms that appear in the modified Schrödinger equation of "physical collapse" models of quantum theory.

More recently, theorists including Gerard ’t Hooft and John Preskill have investigated how quantum mechanics can emerge from other deterministic systems. This is sometimes known as the emergent quantum mechanics (EmQM) hypothesis.

Underlying deterministic systems considered include:

Hamilton-Randers systems defined in co-tangent spaces of large-dimensional configuration spaces,

neural networks,

cellular automata,

fast-moving classical variables, and

the boundary of a local classical model with a length that is exponentially large in the number of qubits in the quantum system.

In most of these versions of EmQM the length scale at which the underlying theory becomes relevant is conjectured to be of the order of the Planck length.

The fact that quantum theory can emerge from such a diverse range of underlying theories again illustrates universality.

The question of whether quantum physics emerges from an underlying classical theory is not just a question in the foundations of physics or in philosophy. Slagle points out that Emergent Quantum Mechanics may mean that the computational power of quantum computers is severely limited. He has proposed a specific experimental protocol to test for EmQM. A large number d of entangling gates (the circuit depth d) are applied to n qubits prepared in the computational basis, followed by the inverse gates, and then by measurement in the computational basis. Standard quantum mechanics predicts that the fidelity decays exponentially with d, whereas EmQM predicts a much faster decay above some critical d, for sufficiently large n.
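To make the expected baseline concrete, suppose (hypothetically) that each gate fails independently with probability ε and that each layer contains of order n gates. Standard quantum mechanics then predicts an echo fidelity

\[ F(d) \approx (1 - \varepsilon)^{2nd} \approx e^{-2\varepsilon n d}, \]

a smooth exponential decay, whereas under EmQM F(d) should collapse much faster once d exceeds a critical depth.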

Independent of experimental evidence, EmQM provides an alternative interpretation of quantum theory that avoids thorny proposals such as the many-worlds interpretation.

Friday, April 25, 2025

Phase diagrams elucidate emergence

Phase diagrams have been ubiquitous in materials science for decades. They show what states of matter are thermodynamically stable depending on the value of external parameters such as temperature, pressure, magnetic field, or chemical composition. However, they are only beginning to be appreciated in other fields. Recently, Bouchaud argued that they need to be used more to understand agent-based models in the social sciences.

For theoretical models, whether in condensed matter, dynamical systems, or economics, phase diagrams can show how the state of the system predicted by the model has qualitatively different properties depending on the parameters in the model, such as the strength of interactions. 

Phase diagrams illustrate discontinuities, how quantitative changes produce qualitative changes (tipping points), and diversity (simple models can describe rich behaviour). They show how robust and universal a state is, i.e., whether it exists only with fine-tuning of parameters. Theoretical phase diagrams can expand our scientific imagination, suggesting new regimes that might be explored by experiments. An example is how the phase diagram for QCD matter (shown below) has suggested new experiments, such as those at the Relativistic Heavy Ion Collider (RHIC).

For dynamical systems, I recently illustrated this with the phase diagram for the Lorenz model. It shows the parameter ranges in which strange attractors exist.

Today, for theoretical models of strongly correlated electron systems, it is common to map out phase diagrams as a function of the model parameters. However, this was not always the case. It was more common to investigate a model only for specific parameter values that were deemed relevant to specific materials. Perhaps Anderson stimulated the newer approach when, in 1961, he drew the phase diagram for the mean-field solution of his model for local moments in metals, in a paper that was partly the basis of his 1977 Nobel Prize.
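As a small illustration of what mapping out such a mean-field phase diagram involves, here is a minimal sketch (my own toy code; the parameter values are arbitrary) of the Hartree-Fock treatment of the Anderson impurity model, where a local moment corresponds to a self-consistent solution with unequal spin occupations.

```python
import numpy as np

# Minimal sketch: Hartree-Fock (mean-field) treatment of the Anderson
# impurity model, after Anderson (1961). A local moment forms when the
# self-consistent spin occupations n_up and n_dn become unequal.
# Self-consistency: n_sigma = (1/pi) * arccot((E_d + U * n_{-sigma}) / Delta),
# with the Fermi energy at zero and Delta the hybridisation width.

def occupations(E_d, U, Delta, n_iter=5000, mix=0.5):
    """Iterate the self-consistency equations from a spin-biased start."""
    n_up, n_dn = 0.9, 0.1  # biased guess, to find a magnetic solution if one exists
    for _ in range(n_iter):
        # arccot(E/Delta) = arctan2(Delta, E), so n = arctan2(Delta, E) / pi
        new_up = np.arctan2(Delta, E_d + U * n_dn) / np.pi
        new_dn = np.arctan2(Delta, E_d + U * n_up) / np.pi
        n_up = (1 - mix) * n_up + mix * new_up
        n_dn = (1 - mix) * n_dn + mix * new_dn
    return n_up, n_dn

# Scan part of the (E_d/U, U/Delta) plane and mark where a moment survives
Delta = 1.0
for U in [2.0, 5.0, 10.0]:
    for x in [0.0, -0.25, -0.5, -0.75, -1.0]:  # x = E_d / U
        n_up, n_dn = occupations(x * U, U, Delta)
        moment = "yes" if abs(n_up - n_dn) > 1e-3 else "no"
        print(f"U/Delta = {U:5.1f}, E_d/U = {x:5.2f}: local moment {moment}")
```

At the particle-hole symmetric point (E_d = -U/2), this reproduces the classic mean-field criterion that a moment forms only when U is large compared to the hybridisation width.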

At a minimum, a phase diagram should show the state with the emergent property and the disordered state. Diagrams that contain multiple phases may provide hints for developing a theory for a specific phase. For example, for the high-Tc cuprate superconductors, the proximity of the Mott insulating, pseudogap, and non-Fermi liquid metal phases has aided and constrained theory development.

Phase diagrams constrain theories: they provide a minimum criterion for what a successful theory should explain, even if only qualitatively. Phase diagrams also illustrate the potential and pitfalls of mean-field theories. Sometimes mean-field theories get qualitative details correct, even for complex phase diagrams, and can show what emergent states are possible. Ginzburg-Landau and BCS theories are mean-field theories and work extremely well for many superconductors. On the other hand, in systems with large fluctuations, mean-field theory may fail spectacularly; such systems are often the most interesting and theoretically challenging.

Thursday, April 17, 2025

Lamenting the disintegration of elite USA universities

Elite universities in the USA have nurtured and enhanced my whole academic life. In 1983, I moved to the USA as an international student, commenced a Ph.D. at Princeton, and then worked at Northwestern and Ohio State. After I returned to Australia in 1994, I visited the USA every year for several weeks for conferences, collaborations, and university visits. Much of my research was shaped by ideas I got from those trips. This blog started through the influence of I2CAM, a wonderful institution funded by the NSF. My movement into chemical physics was facilitated by attending workshops at the Telluride Science Center. I deeply appreciate my colleagues (and their institutions) for their stimulation, support, interest, encouragement, and hospitality. 

My trips to the USA ended only with COVID-19, retirement, family health issues, and a new general aversion to international travel. Currently, I would be too scared to travel to the USA, based on what I read in the Travel Section of The Sydney Morning Herald.

Most importantly, what I have learned and done has been built largely on intellectual foundations laid by people in these elite universities.  Other parts of the world have played a role too, but my focus here is the USA due to current political events leading to the impending disintegration of these universities.

I readily acknowledge that these universities have flaws and need reform. On this blog, I occasionally discussed issues, such as the obsession with money, metrics, management, and marketing. Teaching undergraduates and robust scholarship have sometimes become subsidiary. I have critiqued some of the flaky science published in luxury journals by groups from these universities.

Nevertheless, if something is broken you do not fix it by smashing it. Consider a beautiful ancient vase with a large crack. You do not restore the vase by smashing it and hiring your teenage cousin to make a new one.

Reading about what is happening raises multiple questions. What is really happening? Why is it happening? How significant is it? What might it lead to? How should individuals and institutions respond? 

Today, when I was on the UQ campus, it was serene, and the challenges my colleagues are facing, as formidable and important as they are, seemed trifling compared to what I imagine is happening on Ivy campuses right now. In passing, I mention that Australia is not completely immune to what is happening in the USA. Universities here that receive some research grant funding from the USA government have had it paused or cancelled.

I can't imagine what it would be like to be an international student at Princeton right now.

On the one hand, I do not feel qualified to comment on what is happening as I am so distant. On the other hand, I do want to try and express some solidarity with and appreciation of institutions and colleagues that have blessed me and the world. I make a few general observations. This is my advice, for what it is worth, to my younger self.

Protect your mental health. You, your colleagues, and your institution are encountering an existential crisis, perhaps like none encountered before. Don't live in denial. But also don't let this consume you and destroy you as a person or a community. Limit your intake of news and how much you think about it and discuss it. Practise the basics: exercise; eat, drink, and sleep well; get help sooner rather than later; limit screen time; rest.

Expect the unexpected. Expect more surprises, pain, uncertainty, instability, intra-institutional conflict, and disappointments. 

Get the big picture. This is about a lot more than federal funding for universities. There are broader issues about what a university is actually for. What do you want to preserve and protect? What are you willing to compromise on? Beyond the university, many significant issues are at stake concerning politics, democracy, economics, pluralism, culture, and the law. This is an opportunity, albeit a scary one, to think about and learn about these issues.

Make the effort to have conversations across the divides. Try to have civil and respectful discussions with people with different perspectives on how individuals and institutions should respond to the current situation. Talk to colleagues in the humanities and social sciences. Talk to those with different political perspectives, both inside and outside the university.

Read widely. History is instructive but not determinative. I recommend two very short books that I think are relevant and helpful.

On Tyranny: Twenty Lessons from the Twentieth Century by Timothy Snyder.

The Power of the Powerless, by Vaclav Havel, first published in 1978 in the context of living in communist totalitarian Czechoslovakia. I have a Penguin Vintage edition which includes a beautiful introduction by Timothy Snyder, written in 2018, for a 40th Anniversary edition. 

I thank Charles Ringma for bringing both books to my attention.

What do you think? I would love to hear from people in US universities who are living through this.

Remembering the student protestors who died 36 years ago

Memorial plaque in the Great Court of the University of Queensland today.