Tuesday, May 20, 2025

The triumphs of lattice gauge theory

When first proposed by Ken Wilson in 1974, lattice gauge theory was arguably a toy model, i.e., an oversimplification. He treated space-time as a discrete lattice purely to make analysis more tractable. Borrowing insights and techniques from lattice models in statistical mechanics, Wilson could then argue for quark confinement, showing that the confining potential was linear with distance.

Earlier, in 1971, Wegner had proposed a Z2 gauge theory in the context of generalised Ising models in statistical mechanics to show how a phase transition was possible without a local order parameter, i.e., without symmetry breaking. Later, it was shown that the phase transition is similar to the confinement-deconfinement phase transition that occurs in QCD. [A nice review from 2014 by Wegner is here]. This work also provided a toy model to illustrate the possibility of a quantum spin liquid.
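
As an aside for readers who like to tinker: Wegner's Z2 model is also about the simplest lattice gauge theory to simulate numerically. Below is a minimal Metropolis Monte Carlo sketch (my own illustration, not code from any of the references) for the Z2 gauge theory on a small periodic three-dimensional cubic lattice, with action S = -beta * (sum of plaquette products). The lattice size and coupling are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(0)

L, d = 6, 3    # small periodic cubic lattice (illustrative)
beta = 0.7     # inverse coupling (illustrative)

# Z2 link variables sigma[x, y, z, mu] = +/-1
links = rng.choice([-1, 1], size=(L, L, L, d))

def shift(site, mu, n=1):
    """Site shifted by n lattice units in direction mu, with periodic boundaries."""
    s = list(site)
    s[mu] = (s[mu] + n) % L
    return tuple(s)

def plaquette(site, mu, nu):
    """Product of the four links around the (mu, nu) plaquette based at site."""
    return (links[site + (mu,)] * links[shift(site, mu) + (nu,)]
            * links[shift(site, nu) + (mu,)] * links[site + (nu,)])

def sweep():
    """One Metropolis sweep: attempt to flip every link once on average."""
    for _ in range(L ** 3 * d):
        site = tuple(rng.integers(0, L, size=3))
        mu = int(rng.integers(0, d))
        # sum of the 2(d-1) plaquettes that contain this link
        plaqs = 0
        for nu in range(d):
            if nu != mu:
                plaqs += plaquette(site, mu, nu) + plaquette(shift(site, nu, -1), mu, nu)
        dS = 2.0 * beta * plaqs          # change in action S = -beta * sum of plaquettes
        if dS <= 0 or rng.random() < np.exp(-dS):
            links[site + (mu,)] *= -1    # accept the flip

for it in range(200):
    sweep()
    if it % 50 == 0:
        avg = np.mean([plaquette((x, y, z), mu, nu)
                       for x in range(L) for y in range(L) for z in range(L)
                       for mu in range(d) for nu in range(mu + 1, d)])
        print(f"sweep {it:3d}: average plaquette = {avg:.3f}")
```

Scanning beta while monitoring the average plaquette (or, better, Wilson loops) is a crude way to see the confinement-deconfinement transition mentioned above; in three dimensions it occurs near beta ~ 0.76.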

Perhaps what was not anticipated was that lattice QCD could be used to accurately calculate the properties of elementary particles.

The discrete nature of lattice gauge theory means it is amenable to numerical simulation. It is not necessary to have the continuum limit of real spacetime because of universality. Due to increases in computational power over the past 50 years and innovations in algorithms, lattice QCD can be used to calculate properties of nucleons and mesons, such as masses and decay rates, with impressive accuracy. The figure below is taken from a 2008 article in Science.

The masses of three mesons are typically used to fix the masses of the light and strange quarks and the length scale. The masses of nine other particles, including the nucleon, are then calculated with an uncertainty of less than one per cent, and are in agreement with experimental values.

An indication that this is a strong coupling problem is that about 95 per cent of the mass of nucleons comes from the interactions. Only about 5 per cent is from the rest mass of the constituent quarks.

For more background on computational lattice QCD, there is a helpful 2004 Physics Today article, which drew a critical response from Herbert Neuberger. A recent (somewhat) pedagogical review by Sasa Prelovsek just appeared on the arXiv.


Tuesday, May 6, 2025

Characteristics of static disorder can emerge from electron-phonon interactions

Electronic systems with large amounts of static disorder can exhibit distinct properties, including localisation of electronic states and sub-gap band tails in the density of states and electronic absorption. 

Eric Heller and collaborators have recently published a nice series of papers that show how these properties can also appear, at least on sufficiently long time scales, in the absence of disorder, due to the electron-phonon interaction. On a technical level, a coherent state representation for phonons is used. This provides a natural way of taking a classical limit, similar to what is done in quantum optics for photons. Details are set out in the following paper 

Coherent charge carrier dynamics in the presence of thermal lattice vibrations, Donghwan Kim, Alhun Aydin, Alvar Daza, Kobra N. Avanaki, Joonas Keski-Rahkonen, and Eric J. Heller

This work brought back memories from long ago when I was a postdoc with John Wilkins. I was puzzled by several related things about quasi-one-dimensional electronic systems, such as polyacetylene, that undergo a Peierls instability. First, the zero-point motion of the lattice was comparable to the lattice dimerisation that produces an energy gap at the Fermi energy. Second, even in clean systems, there was a large sub-gap optical absorption. Third, there was no sign of the square-root singularity in the density of states predicted by theories that treated the lattice classically, i.e., that calculated electronic properties in the Born-Oppenheimer approximation.

I found that, on the energy scales relevant to the sub-gap absorption, the phonons could be treated like static disorder, and I made use of known exact results for one-dimensional Dirac equations with random disorder. This explained the puzzles.

Effect of Lattice Zero-Point Motion on Electronic Properties of the Peierls-Fröhlich State

The disorder model can also be motivated by considering the Feynman diagrams for the perturbation expansion of the electronic Green's function in powers of the electron-phonon interaction. In the limit that the phonon frequency is small, all the diagrams become like those for a disordered system, with the strength of the static disorder set by the magnitude of the thermal and zero-point lattice fluctuations.
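
A caricature of this picture (my own illustrative sketch, not the calculation in the papers above) is a dimerised tight-binding chain in which Gaussian randomness in the bond alternation stands in for the frozen lattice fluctuations. Diagonalising many realisations shows states filling in below the clean gap edge:

```python
import numpy as np

rng = np.random.default_rng(1)

N = 400         # chain length (illustrative)
t = 1.0         # average hopping
delta = 0.1     # dimerisation; the clean gap edge is at |E| = 2*delta
W = 0.05        # std of bond disorder, mimicking frozen lattice fluctuations
samples = 200   # number of disorder realisations

energies = []
for _ in range(samples):
    # alternating bonds t +/- delta, plus Gaussian "lattice fluctuation" disorder
    bonds = t + delta * (-1) ** np.arange(N - 1) + W * rng.standard_normal(N - 1)
    H = np.diag(bonds, 1) + np.diag(bonds, -1)
    energies.append(np.linalg.eigvalsh(H))
energies = np.abs(np.concatenate(energies))

# crude density of states near the gap edge
hist, edges = np.histogram(energies, bins=40, range=(0.0, 4 * delta))
for lo, hi, count in zip(edges[:-1], edges[1:], hist):
    print(f"|E| in [{lo:.3f}, {hi:.3f}): {count}")
```

With W = 0 there are no states below |E| = 2*delta; with W > 0 a tail of sub-gap states appears, which is the qualitative point.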

I then teamed up with another postdoc, Kihong Kim, who calculated the optical conductivity for this disorder model.

Universal subgap optical conductivity in quasi-one-dimensional Peierls systems

Two things were surprising about our results. First, the theory agreed well with experimental results for a range of materials, including the temperature dependence. Second,  the frequency dependence had a universal form. Wilkins was clever and persistent at extracting such forms, probably from his experience working on the Kondo problem.

Friday, May 2, 2025

Could quantum mechanics be emergent?

One of the biggest challenges in the foundations of physics is the quantum measurement problem. It is associated with a few key (distinct but related) questions.

i. How does a measurement convert a coherent state undergoing unitary dynamics to a "classical" mixed state for which we can talk about probabilities of outcomes?

ii. Why is the outcome of an individual measurement always definite for the "pointer states" of the measuring apparatus?

iii. Can one derive the Born rule, which gives the probability of a particular outcome?

Emergence of the classical world from the quantum world via decoherence

A quantum system always interacts to some extent with its environment. This interaction leads to decoherence, whereby quantum interference effects are washed out. Consequently, superposition states of the system decay into mixed states described by a diagonal density matrix. A major research goal of the past three decades has been understanding decoherence and the extent to which it answers the quantum measurement problem. One achievement is that decoherence theory gives a mechanism and a time scale for the apparent “collapse of the wavefunction” within the framework of unitary dynamics. However, decoherence is not the same as a projection (which is what a single quantum measurement is): it produces statistical mixtures, not definite outcomes. Decoherence only resolves the issue if one identifies ensembles of measured states with the ensembles described by the decohered density matrix (the statistical interpretation of quantum mechanics). Thus, decoherence seems to answer the first question above, but not the last two: measurements always produce definite outcomes, and decoherence alone does not explain why. Zurek has pushed the decoherence picture further and given a “derivation” of the Born rule within its framework.
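
To make the distinction concrete, consider the textbook example of a single qubit (standard material, not tied to any particular model). Decoherence of the superposition alpha|0> + beta|1> suppresses the off-diagonal elements of the reduced density matrix,

\rho(t) \simeq \begin{pmatrix} |\alpha|^2 & \alpha\beta^* e^{-t/\tau_D} \\ \alpha^*\beta\, e^{-t/\tau_D} & |\beta|^2 \end{pmatrix},

so for times much longer than the decoherence time \tau_D one is left with a diagonal (mixed) state with probabilities |\alpha|^2 and |\beta|^2. Nothing in this evolution selects which single outcome actually occurs in a given run.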

One approach to solving the problem is to view quantum theory as only an approximate theory. In particular, it could be an effective theory for some underlying theory valid at time and length scales much smaller than those for which quantum theory has been precisely tested by experiments. 

Emergence of quantum field theory from a “classical” statistical theory

Einstein did not accept the statistical nature of quantum theory and considered it should be derivable from a more “realistic” theory. In particular, he suggested “a complete physical description, the statistical quantum theory would …. take an approximately analogous position to the statistical mechanics within the framework of classical mechanics.”

Einstein's challenge was taken up in a concrete and impressive fashion by Stephen Adler in a book, “Quantum Theory as an Emergent Phenomenon: The Statistical Mechanics of Matrix Models as the Precursor of Quantum Field Theory”, published in 2004.  A helpful summary is given in a review by Pearle.

The starting point is "classical" dynamical variables q_r and p_r, which are NxN matrices, where N is even. Half of these variables are bosonic, and the others are fermionic. They all obey Hamilton's equations of motion for an unspecified Hamiltonian H. Three quantities are conserved: H, the fermion number N, and (very importantly) the traceless anti-self-adjoint matrix

C = \sum_{r \in B} [q_r, p_r] - \sum_{r \in F} \{q_r, p_r\},

where the first sum runs over the bosonic variables (commutators) and the second over the fermionic variables (anti-commutators).

Quantum theory is obtained by tracing over all the classical variables with respect to a canonical ensemble with three (matrix) Lagrange multipliers [analogues of temperature and chemical potential in conventional statistical mechanics] corresponding to the conserved quantities H, N, and C. The expectation values of the diagonal elements of C are assumed to all have the same value, hbar!

An analogue of the equipartition theorem in classical statistical mechanics (which looks like a Ward identity in quantum field theory) leads to dynamical equations (trace dynamics) for effective fields. To make these equations look like regular quantum field theory, an assumption is made about a hierarchy of length, energy, and "temperature" [Lagrange multiplier] scales, which causes the dynamics to be dominated by C rather than by the trace Hamiltonian H. Adler suggests these scales may be Planck scales. Then the usual quantum dynamical equations and the Dirac correspondence between Poisson brackets and commutators emerge. Most of the actual details of the trace Hamiltonian H do not matter; this is another case of universality, a common characteristic of emergent phenomena.

The “classical” field C fluctuates about its average value. These fluctuations can be identified with corrections to locality in quantum field theory and with the noise terms which appear in the modified Schrodinger equation of "physical collapse" models of quantum theory.

More recently, theorists including Gerard 't Hooft and John Preskill have investigated how quantum mechanics can emerge from other deterministic systems. This is sometimes known as the emergent quantum mechanics (EmQM) hypothesis.

Underlying deterministic systems considered include:

Hamilton-Randers systems defined in co-tangent spaces of large-dimensional configuration spaces,

neural networks,

cellular automata,

fast-moving classical variables, and

the boundary of a local classical model with a length that is exponentially large in the number of qubits in the quantum system.

In most of these versions of EmQM the length scale at which the underlying theory becomes relevant is conjectured to be of the order of the Planck length.

The fact that quantum theory can emerge from such a diverse range of underlying theories again illustrates universality.

The question of quantum physics emerging from an underlying classical theory is not just a question in the foundations of physics or in philosophy. Slagle points out that emergent quantum mechanics may mean that the computational power of quantum computers is severely limited. He has proposed a specific experimental protocol to test for EmQM. A large number d of entangling gates (the circuit depth d) is applied to n qubits initialised in the computational basis, followed by the inverse gates. This is followed by measurement in the computational basis. In standard quantum mechanics the fidelity should decay exponentially with d, whereas for EmQM it should decay much faster above some critical d, for sufficiently large n.
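
Below is a minimal state-vector sketch of this mirror-circuit idea (my own illustration; the qubit number, the per-gate Pauli error model, and the parameter values are assumptions, not Slagle's protocol verbatim). Random two-qubit gates are applied and then undone in reverse order, with a small error probability per gate standing in for ordinary hardware noise:

```python
import numpy as np

rng = np.random.default_rng(7)

n = 4      # number of qubits (kept small for a state-vector simulation)
p = 0.02   # per-gate Pauli error probability (assumed, crude stand-in for noise)

PAULIS = [np.array([[0, 1], [1, 0]]),
          np.array([[0, -1j], [1j, 0]]),
          np.array([[1, 0], [0, -1]])]

def random_unitary(dim):
    """Approximately Haar-random unitary from the QR decomposition of a Gaussian matrix."""
    z = rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diagonal(r) / np.abs(np.diagonal(r)))

def apply_two_qubit(psi, gate, q0, q1):
    """Apply a 4x4 gate to qubits q0 < q1 of an n-qubit state vector."""
    t = np.moveaxis(psi.reshape([2] * n), (q0, q1), (0, 1)).reshape(4, -1)
    t = (gate @ t).reshape([2, 2] + [2] * (n - 2))
    return np.moveaxis(t, (0, 1), (q0, q1)).reshape(-1)

def apply_one_qubit(psi, gate, q):
    """Apply a 2x2 gate to qubit q of an n-qubit state vector."""
    t = np.moveaxis(psi.reshape([2] * n), q, 0).reshape(2, -1)
    return np.moveaxis((gate @ t).reshape([2] * n), 0, q).reshape(-1)

def noisy(psi, a, b):
    """With probability p, hit one of the two qubits just acted on with a random Pauli."""
    if rng.random() < p:
        psi = apply_one_qubit(psi, PAULIS[rng.integers(0, 3)], int(rng.choice([a, b])))
    return psi

def mirror_fidelity(d, trials=100):
    """Average probability of returning to |0...0> after d gates and their inverses."""
    fids = []
    for _ in range(trials):
        gates = [(random_unitary(4), sorted(map(int, rng.choice(n, 2, replace=False))))
                 for _ in range(d)]
        psi = np.zeros(2 ** n, dtype=complex)
        psi[0] = 1.0
        for U, (a, b) in gates:              # forward half of the circuit
            psi = noisy(apply_two_qubit(psi, U, a, b), a, b)
        for U, (a, b) in reversed(gates):    # mirrored (inverse) half
            psi = noisy(apply_two_qubit(psi, U.conj().T, a, b), a, b)
        fids.append(abs(psi[0]) ** 2)
    return float(np.mean(fids))

for d in (5, 10, 20, 40):
    print(f"depth d = {d:3d}: fidelity ~ {mirror_fidelity(d):.3f}")
```

In this toy model the fidelity falls off roughly as (1-p)^{2d}, i.e., exponentially in the depth; Slagle's point is that an underlying EmQM would instead produce a much sharper drop beyond some critical depth.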

Independent of experimental evidence, EmQM provides an alternative interpretation of quantum theory that avoids thorny issues, such as those raised by the many-worlds interpretation.

Friday, April 25, 2025

Phase diagrams elucidate emergence

Phase diagrams have been ubiquitous in materials science for decades. They show what states of matter are thermodynamically stable depending on the value of external parameters such as temperature, pressure, magnetic field, or chemical composition. However, they are only beginning to be appreciated in other fields. Recently, Bouchaud argued that they needed to be used more to understand agent-based models in the social sciences.

For theoretical models, whether in condensed matter, dynamical systems, or economics, phase diagrams can show how the state of the system predicted by the model has qualitatively different properties depending on the parameters in the model, such as the strength of interactions. 

Phase diagrams illustrate discontinuities, how quantitative changes produce qualitative changes (tipping points), and diversity (simple models can describe rich behaviour). Phase diagrams show how robust and universal a state is, i.e., whether it only exists for fine-tuning of parameters. Theoretical phase diagrams can expand our scientific imagination, suggesting new regimes that might be explored by experiments. An example is how the phase diagram for QCD matter (shown below) has suggested new experiments, such as at the RHIC.

For dynamical systems, I recently illustrated this with the phase diagram for the Lorenz model, which shows the parameter ranges for which strange attractors exist.
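
As a concrete (and purely illustrative) example of mapping out such a diagram numerically, the sketch below integrates the Lorenz equations for the standard values sigma = 10 and b = 8/3 and estimates the largest Lyapunov exponent as the Rayleigh parameter r is varied; a negative value signals a stable fixed point, a positive value a strange attractor. The parameter values and the crude two-trajectory estimator are my own choices.

```python
import numpy as np

def lorenz_step(s, dt, r, sigma=10.0, b=8.0 / 3.0):
    """One fourth-order Runge-Kutta step of the Lorenz equations."""
    def f(v):
        x, y, z = v
        return np.array([sigma * (y - x), x * (r - z) - y, x * y - b * z])
    k1 = f(s)
    k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2)
    k4 = f(s + dt * k3)
    return s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

def largest_lyapunov(r, dt=0.01, steps=20000, eps=1e-8):
    """Crude largest Lyapunov exponent from the divergence of two nearby trajectories."""
    a = np.array([1.0, 1.0, 1.0])
    b2 = a + np.array([eps, 0.0, 0.0])
    total = 0.0
    for _ in range(steps):
        a = lorenz_step(a, dt, r)
        b2 = lorenz_step(b2, dt, r)
        dist = np.linalg.norm(b2 - a)
        total += np.log(dist / eps)
        b2 = a + (b2 - a) * (eps / dist)   # rescale the separation back to eps
    return total / (steps * dt)

# negative exponent: trajectories settle onto a fixed point; positive: chaos
for r in (0.5, 10.0, 28.0, 35.0):
    print(f"r = {r:5.1f}   lambda_max ~ {largest_lyapunov(r):+.3f}")
```

For the standard value r = 28 the estimate should be close to the commonly quoted value of about 0.9.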

Today, for theoretical models of strongly correlated electron systems, it is common to map out phase diagrams as a function of the model parameters. However, this was not always the case. It was more common to investigate a model only for specific parameter values that were deemed relevant to specific materials. Perhaps Anderson stimulated this new approach when, in 1961, he drew the phase diagram for the mean-field solution of his model for local moments in metals, a paper that was partly the basis of his 1977 Nobel Prize.
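
For reference (standard textbook background rather than anything new), the boundary in Anderson's mean-field phase diagram for the particle-hole symmetric case is a Stoner-like criterion: a local moment forms when

U \rho_d(\epsilon_F) = U/(\pi \Delta) > 1,

where U is the on-site Coulomb repulsion and \Delta is the width of the virtual bound state (the hybridisation width).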

At a minimum, a phase diagram should show the state with the emergent property and the disordered state. Diagrams that contain multiple phases may provide hints for developing a theory for a specific phase. For example, for the high-Tc cuprate superconductors, the proximity of the Mott insulating, pseudogap, and non-Fermi liquid metal phases has aided and constrained theory development.

Phase diagrams constrain theories as they provide a minimum criterion of something a successful theory should explain, even if only qualitatively. Phase diagrams illustrate the potential and pitfalls of mean-field theories. Sometimes they get qualitative details correct, even for complex phase diagrams, and can show what emergent states are possible. Ginzburg-Landau and BCS theories are mean-field theories and work extremely well for many superconductors. On the other hand, in systems with large fluctuations, mean-field theory may fail spectacularly, and they are sometimes the most interesting and theoretically challenging systems.

Thursday, April 17, 2025

Lamenting the disintegration of elite USA universities

Elite universities in the USA have nurtured and enhanced my whole academic life. In 1983, I moved to the USA as an international student, commenced a Ph.D. at Princeton, and then worked at Northwestern and Ohio State. After I returned to Australia in 1994, I visited the USA every year for several weeks for conferences, collaborations, and university visits. Much of my research was shaped by ideas I got from those trips. This blog started through the influence of I2CAM, a wonderful institution funded by the NSF. My movement into chemical physics was facilitated by attending workshops at the Telluride Science Center. I deeply appreciate my colleagues (and their institutions) for their stimulation, support, interest, encouragement, and hospitality. 

My trips to the USA only ended with COVID-19, retirement, family health issues, and my new general aversion to international travel. Currently I would be too scared to travel to the USA, based on what I read in the Travel Section of The Sydney Morning Herald.

Most importantly, what I have learned and done has been built largely on intellectual foundations laid by people in these elite universities.  Other parts of the world have played a role too, but my focus here is the USA due to current political events leading to the impending disintegration of these universities.

I readily acknowledge that these universities have flaws and need reform. On this blog, I occasionally discussed issues, such as the obsession with money, metrics, management, and marketing. Teaching undergraduates and robust scholarship has sometimes become subsidiary. I have critiqued some of the flaky science published in luxury journals by groups from these universities.

Nevertheless, if something is broken you do not fix it by smashing it. Consider a beautiful ancient vase with a large crack. You do not restore the vase by smashing it and hiring your teenage cousin to make a new one.

Reading about what is happening raises multiple questions. What is really happening? Why is it happening? How significant is it? What might it lead to? How should individuals and institutions respond? 

Today when I was on the UQ campus it was serene, and the challenges my colleagues are facing, as formidable and important as they are, seemed trifling compared to what I imagine is happening on Ivy campuses right now. In passing, I mention that Australia is not completely immune to what is happening in the USA. Universities here that receive some research grant funding from the USA government have had it paused or cancelled.

I can't imagine what it would be like to be an international student at Princeton right now.

On the one hand, I do not feel qualified to comment on what is happening as I am so distant. On the other hand, I do want to try and express some solidarity with and appreciation of institutions and colleagues that have blessed me and the world. I make a few general observations. This is my advice, for what it is worth, to my younger self.

Protect your mental health. You, your colleagues, and your institution are encountering an existential crisis, perhaps like none encountered before. Don't live in denial. But also don't let this consume you and destroy you as a person or a community. Limit your intake of news and how much you think about it and discuss it. Practise the basics: exercise; eat, drink, and sleep well; get help sooner rather than later; limit screen time; rest.

Expect the unexpected. Expect more surprises, pain, uncertainty, instability, intra-institutional conflict, and disappointments. 

Get the big picture. This is about a lot more than federal funding for universities. There are broader issues about what a university is actually for. What do you want to preserve and protect? What are you willing to compromise on? Beyond the university, many significant issues are at stake concerning politics, democracy, economics, pluralism, culture, and the law. This is an opportunity, albeit a scary one, to think about and learn about these issues.

Make the effort to have conversations across the divides. Try to  have civil and respectful discussions with people with different perspectives on how individuals and institutions should respond to the current situation. Talk to colleagues in the humanities and social sciences. Talk to those with different political perspectives, both inside and outside the university.

Read widely. History is instructive but not determinative. I recommend two very short books that I think are relevant and helpful.

On Tyranny: Twenty Lessons from the Twentieth Century by Timothy Snyder.

The Power of the Powerless, by Vaclav Havel, first published in 1978 in the context of living in communist totalitarian Czechoslovakia. I have a Penguin Vintage edition which includes a beautiful introduction by Timothy Snyder, written in 2018, for a 40th Anniversary edition. 

I thank Charles Ringma for bringing both books to my attention.

What do you think? I would love to hear from people in US universities who are living through this.

Saturday, April 12, 2025

An authoritarian government takes over universities: one case history

Adventures of a Bystander, by Peter Drucker, contains the following account. Drucker was a faculty member at Frankfurt University in 1933.

“[S]everal weeks after the Nazis had come to power, was the first Nazi-led faculty meeting at the University. Frankfurt was the first university the Nazis tackled, precisely because it was the most self-confidently liberal of major German universities, with a faculty that prided itself on its allegiance to scholarship, freedom of conscience, and democracy. The Nazis knew that control of Frankfurt University would mean control of German academia altogether. So did everyone at the University. 
Above all, Frankfurt had a science faculty distinguished both by its scholarship and by its liberal convictions; and outstanding among the Frankfurt scientists was a biochemist of Nobel Prize caliber and impeccable liberal credentials. When the appointment of a Nazi commissar for Frankfurt was announced around February 25 of that year and when not only every teacher but also every graduate assistant at the University was summoned to a faculty meeting to hear his new master, everybody knew that a trial of strength was at hand. … 
The new Nazi commissar wasted no time on the amenities…. [He] pointed his finger at one department chairman after another and said: ‘You either do what I tell you or we’ll put you into a concentration camp.’ 
There was dead silence when he finished; everybody waited for the distinguished biochemist. The great liberal got up, cleared his throat, and said: ‘Very interesting, Mr. Commissar, and in some respects very illuminating. But one point I didn’t get too clearly. Will there be more money for research in physiology?’ The meeting broke up shortly thereafter with the commissar assuring the scholars that indeed there would be plenty of money for ‘racially pure science’.”

I became aware of this chilling story through Peter Woit's blog, which got it from a blog post by Adam Przeworski.

Tuesday, March 25, 2025

Superconductivity: a poster child for emergence

Superconductivity beautifully illustrates the characteristics of emergent properties.

Novelty. 

Distinct properties of the superconducting state include zero resistivity, the Meissner effect, and the Josephson effect. The normal metallic state does not exhibit these properties.

At low temperatures, solid tin exhibits the property of superconductivity. However, a single atom of tin is not a superconductor. A small number of tin atoms has an energy gap due to pairing interactions, but not bulk superconductivity.

There is more than one superconducting state of matter. The order parameter may transform according to a non-trivial representation of the crystal symmetry group, and it can have spin-singlet or spin-triplet symmetry. Type II superconductors in a magnetic field have an Abrikosov vortex lattice, another distinct state of matter.

Unpredictability. 

Even though the underlying laws describing the interactions between electrons in a crystal have been known for one hundred years, the discovery of superconductivity in many specific materials was not predicted. Even after the BCS theory was worked out in 1957, the discovery of superconductivity in intermetallic compounds, cuprates, organic charge transfer salts, fullerenes, and heavy fermion compounds was not predicted.

Order and structure. 

In the superconducting state, the electrons become ordered in a particular way. The motion of the electrons relative to one another is not independent but correlated. Long-range order is reflected in the generalised rigidity, which is responsible for the zero resistivity. Properties of individual atoms (e.g., NMR chemical shifts) are different in the vacuum, the metallic state, and the superconducting state.

Universality. 

Properties of superconductivity such as zero electrical resistance, the expulsion of magnetic fields, quantisation of magnetic flux, and the Josephson effects are universal. The existence and description of these properties are independent of the chemical and structural details of the material in which the superconductivity is observed. This is why the Ginzburg-Landau theory works so well. In BCS theory, the temperature dependences of thermodynamic and transport properties are given by universal functions of T/Tc where Tc is the transition temperature. Experimental data is consistent with this for a wide range of superconducting materials, particularly elemental metals for which the electron-phonon coupling is weak.
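
Two familiar examples of such universal numbers in weak-coupling BCS theory are the ratio of the zero-temperature gap to the transition temperature and the relative jump in the specific heat at T_c:

2\Delta(0)/(k_B T_c) \approx 3.53, \qquad \Delta C / C_N \big|_{T_c} \approx 1.43.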

Modularity at the mesoscale. 

Emergent entities include Cooper pairs and vortices. There are two associated emergent length scales, typically much larger than the microscopic scales defined by the interatomic spacing or the Fermi wavelength of the electrons. The coherence length is associated with the energy cost of spatial variations in the order parameter. It defines the extent of the proximity effect, whereby the surface of a non-superconducting metal can become superconducting when it is in electrical contact with a superconductor. The coherence length turns out to be of the order of the size of the Cooper pairs in BCS theory. The second length scale is the magnetic penetration depth (also known as the London length), which determines the extent to which an external magnetic field can penetrate the surface of a superconductor. It is determined by the superfluid density. The relative size of the coherence length and the penetration depth determines whether an Abrikosov vortex lattice is stable in a sufficiently large magnetic field.
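
The standard way of quantifying this (textbook Ginzburg-Landau material) is through the dimensionless ratio

\kappa = \lambda_L / \xi,

with type II behaviour, and hence a stable Abrikosov vortex lattice in a sufficiently large field, occurring for \kappa > 1/\sqrt{2}.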

Quasiparticles. 

The elementary excitations are Bogoliubov quasiparticles, which are qualitatively different from particle and hole excitations in a normal metal. They are coherent superpositions of a particle and a hole excitation (relative to the Fermi sea), have zero charge, and only exist above the energy gap. The mixed particle-hole character of the quasiparticles is reflected in the phenomenon of Andreev reflection.

Singularities. 

Superconductivity is a non-perturbative phenomenon. In BCS theory, the transition temperature T_c and the excitation energy gap are non-analytic functions of the electron-phonon coupling constant lambda: T_c \sim \exp(-1/\lambda).
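
For reference, the standard weak-coupling BCS expression is

k_B T_c \approx 1.13\, \hbar\omega_D\, e^{-1/\lambda},

where \omega_D is the Debye frequency. Every derivative of e^{-1/\lambda} vanishes as \lambda \to 0^+, so the transition is missed at every finite order of perturbation theory in \lambda.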

A singular structure is also evident in the properties of the current-current correlation function. The limits of zero wavevector and zero frequency do not commute, and this is intimately connected with the non-zero superfluid density.
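
Schematically, for the transverse current-current correlation function \chi^T_{JJ}(q, \omega),

\lim_{q \to 0} \lim_{\omega \to 0} \chi^T_{JJ}(q, \omega) \neq \lim_{\omega \to 0} \lim_{q \to 0} \chi^T_{JJ}(q, \omega),

and the difference between the two orders of limits is a measure of the superfluid density.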

Effective theories.

These are illustrated in the figure below. The many-particle Schrodinger equation describes electrons and atomic nuclei interacting with one another. Many-body theory can be used to justify treating the electrons as a jellium liquid of non-interacting fermions that interact with phonons. Bardeen, Pines, and Frohlich showed that for that system there is an effective interaction between the fermions that is attractive. BCS theory includes a truncated version of this attractive interaction. Gorkov showed that Ginzburg-Landau theory could be derived from BCS theory. The London equations can be derived from Ginzburg-Landau theory. The Josephson equations include only the phase of the order parameter to describe a pair of coupled superconductors.

The historical development of these theories mostly went downwards: London preceded Ginzburg-Landau, which preceded BCS theory. Today, for specific materials where superconductivity is known to be due to electron-phonon coupling and the electron gas is weakly correlated, one can work upwards using computational methods such as Density Functional Theory (DFT) for Superconductors or Eliashberg theory with input parameters calculated from DFT-based methods. However, in practice this has had debatable success. The calculated superconducting transition temperatures typically vary with the approximations used in the DFT, such as the choice of functional and basis set, and often differ from experimental results by of the order of 50 per cent. This illustrates how hard prediction is for emergent phenomena.

Potential and pitfalls of mean-field theory. 

Mean-field approximations and theories can provide a useful guide as to what emergent properties are possible and a starting point for mapping out properties such as phase diagrams. For some systems and properties they work incredibly well; for others they fail spectacularly and are misleading.

Ginzburg-Landau theory and BCS theory are both mean-field theories. For three-dimensional superconductors they work extremely well. However, in two dimensions long-range order and the breaking of a continuous symmetry cannot occur, and the physics associated with the Berezinskii-Kosterlitz-Thouless transition takes over. Nevertheless, Ginzburg-Landau theory provides the background needed to justify the XY model description and the presence of vortices. Similarly, BCS theory fails for strongly correlated electron systems, but a version of BCS theory does give a surprisingly good description of the superconducting state.

Cross-fertilisation of fields. 

Concepts and methods developed for the theory of superconductivity bore fruit in other sub-fields of physics, including nuclear physics, elementary particle physics, and astrophysics. If the matter fields (associated with the electrons) coupled to the electromagnetic field (a U(1) gauge theory) are integrated out, one obtains a theory in which the photon has a mass. This is one perspective on the Meissner effect, in which the magnitude of an external magnetic field decays exponentially as it penetrates a superconductor. The idea of a massless gauge field acquiring a mass due to spontaneous symmetry breaking was central to steps towards the Standard Model made by Nambu and by Weinberg.
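
The simplest way to see this exponential decay is from the London equation (standard textbook material):

\nabla^2 \mathbf{B} = \mathbf{B} / \lambda_L^2, \qquad B(x) = B(0)\, e^{-x/\lambda_L},

where \lambda_L is the London penetration depth; in the gauge-theory language, the inverse penetration depth plays the role of the photon mass (in units where hbar = c = 1).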
