Monday, May 29, 2023

Spontaneous symmetry breaking and the thermodynamic limit

Spontaneous symmetry breaking is a fundamental concept in condensed matter physics and quantum field theory. Amongst philosophers of science, the concept is receiving increasing attention, particularly in the context of discussions about emergence.

How do we understand the following two observations about a system at zero temperature?

At zero temperature for a finite-sized system there is no symmetry breaking. The ground state transforms as the trivial representation of the symmetry group of the Hamiltonian. It is non-degenerate.

In the thermodynamic limit, there is a family of degenerate ground states. They are related to one another by transformations of the symmetry group. This concept is captured in the picture below of the Mexican hat potential.

Motion around the trough is associated with the Goldstone mode. Motion perpendicular to the trough is associated with the "Higgs boson".

How does this picture connect with a finite system?

An intuitive picture is that the ball in the trough has a finite mass, and so motion along the trough is like the quantum mechanics of a rotor with a finite moment of inertia. There is then a non-degenerate ground state with equal probability of being located at any angle. What might the moment of inertia be? For reasons described below, it turns out to be related to the superfluid stiffness.

For the case of a Heisenberg antiferromagnet, the physics was worked out by Phil Anderson in 1952 where he introduced the concept of a "tower of states" that become degenerate in the thermodynamic limit. 

They are described by the following effective Hamiltonian
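The Hamiltonian appears as an image in the original post. For reference, a standard form of Anderson's tower-of-states Hamiltonian, reconstructed from the symbols defined below (my transcription, not the original figure), is:

```latex
H_{\mathrm{eff}} = E_0 + \frac{c^2}{2 \rho_s V} \, \vec{S}^{\,2},
\qquad \vec{S}^{\,2} = S(S+1)
```

The prefactor can equivalently be written as 1/(2 chi V), where chi = rho_s/c^2 plays the role of the moment of inertia (per unit volume) of the rotor.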

Here c is the speed of the magnons (Goldstone bosons), vec(S) is the total spin, V is the volume of the system, and rho_s is the spin stiffness associated with the broken symmetry. As the thermodynamic limit is approached, the energy of these states scales as L^-d, where d is the dimension of the system. In contrast, the energy of the magnon states scales as L^-1. Thus, in exact diagonalisation of sufficiently large systems, the "tower of states" should lie clearly below the magnon states.
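The competition between these two finite-size energy scales can be illustrated with a toy numerical sketch. The units (c = rho_s = 1) and prefactors are my own schematic choices, not quantitative values for any particular lattice:

```python
# Illustrative comparison of finite-size energy scales in a 2D
# Heisenberg antiferromagnet. Units with c = rho_s = 1 are schematic.
import math

def tower_energy(S, L, d=2, c=1.0, rho_s=1.0):
    """Anderson tower of states: E(S) - E0 = c^2 S(S+1) / (2 rho_s L^d)."""
    return c**2 * S * (S + 1) / (2 * rho_s * L**d)

def magnon_energy(L, c=1.0):
    """Lowest magnon energy: E = c * k_min, with k_min = 2*pi/L."""
    return c * 2 * math.pi / L

# The tower states scale as 1/L^2 (in d=2), the magnons as 1/L,
# so the tower drops below the magnon branch as L grows.
for L in [4, 8, 16, 32]:
    print(L, tower_energy(1, L), magnon_energy(L))
```

Doubling L cuts the tower energies by a factor of four but the magnon energies only by a factor of two, which is why the tower separates out so cleanly in exact diagonalisation of large enough clusters.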

In 1992, all of the above was confirmed for the triangular lattice in numerical work by Bernu, Lhuillier, and Pierre.

In the figure below, the top panel shows the low-lying eigenstates. The lowest-energy states do scale with S(S+1). The middle panel shows how these states separate from the magnon states.


The figure below shows how the moment of inertia [proportional to the denominator in the tower of states equation above] does scale with the system size.

 

More recently there has been some interesting work that explores how the tower of states appears in the entanglement entropy.

Entanglement Entropy of Systems with Spontaneously Broken Continuous Symmetry

Max A. Metlitski, Tarun Grover

But for now, discussing that is above my pay grade 😀

I thank Gerard Milburn for asking me questions that led to me finally getting a better physical picture of the issues discussed here.

Thursday, May 25, 2023

The incomplete veil: from the macroscopic to the microscopic

It is natural to assume that scientists need to probe a system at the microscopic scale to learn about what is happening at that scale. If we take this view, we will necessarily be pessimistic about the "bottom-up" research strategy for quantum gravity advocated by Bei Lok Hu. It goes from macro- to micro-, the opposite of the more popular approaches of string theory and loop quantum gravity. However, the history of science shows that we can learn a lot about microscopics from probing systems at much greater length scales. Here are some examples.

Following Perrin's experiments and Einstein's theory of Brownian motion, almost all scientists believed that atoms were not just a mathematical convenience but did exist and were the basic constituents of liquids and solids. All this was before X-ray diffraction allowed the more direct study of crystals at the atomic scale.

Crystallography was pretty much settled as a field before there was any direct evidence of the atomic constituents and their spatial arrangement. Cleavage of crystals, facets observed in minerals, and group theory provided a complete classification of all possible crystal structures. Observations of crystal facets and different modes of sound can be sufficient to determine (or at least constrain options for) the crystal class. 

Figure from Traité de minéralogie (1801) by René Haüy. See also this.

In 1935, Linus Pauling proposed the crystal structure of common ice without any information from X-ray crystallography. He used only the measured value of the residual entropy, simple models of hydrogen bonding, and the Bernal-Fowler ice rules.
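Pauling's estimate is simple enough to reproduce in a few lines. Each molecule can orient its two protons among its four hydrogen bonds in 6 ways, but on average only 1/4 of these are compatible with the ice rules on neighbouring bonds, giving W = (3/2)^N configurations and a molar residual entropy of R ln(3/2):

```python
# Pauling's back-of-envelope estimate of the residual entropy of ice.
# W = (3/2)^N: 6 proton orientations per molecule, times an average
# compatibility factor of 1/4 from the ice rules on neighbouring bonds.
import math

R = 8.314  # gas constant, J/(mol K)

def pauling_residual_entropy():
    """Molar residual entropy S = R ln(3/2), in J/(mol K)."""
    return R * math.log(1.5)

print(round(pauling_residual_entropy(), 2))  # about 3.37 J/(mol K)
```

The result, about 3.37 J/(mol K), agrees remarkably well with the measured residual entropy of ice, about 3.4 J/(mol K).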

In 1961, the biochemist Peter Mitchell deduced the mechanism of the synthesis of ATP, the molecule responsible for energy transport in cells, without knowing any details of the molecular structure of cell membranes. He reasoned from thermodynamics and the fact that there was an electric potential across the cell membranes. His work led to the discovery of the enzyme ATP synthase, a molecular motor. The underlying physics is beautifully described by Phil Nelson in his text, Biological Physics.
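The flavour of Mitchell's thermodynamic reasoning can be captured in a back-of-envelope calculation. All numerical values below are illustrative textbook estimates (not from the original post): a membrane potential of roughly 0.15 V, a pH difference of order one, and an in vivo cost of ATP synthesis of roughly 50 kJ/mol:

```python
# Back-of-envelope chemiosmosis, in the spirit of Mitchell's argument.
# All numerical values are illustrative textbook estimates.
import math

F = 96485.0  # Faraday constant, C/mol
R = 8.314    # gas constant, J/(mol K)
T = 310.0    # body temperature, K

def proton_motive_force(delta_psi=0.15, delta_pH=0.75):
    """Proton-motive force (V): Delta_p = Delta_psi + (2.303 R T / F) * Delta_pH."""
    return delta_psi + 2.303 * R * T / F * delta_pH

def energy_per_mole_protons(delta_p):
    """Free energy released per mole of protons crossing the membrane (J/mol)."""
    return F * delta_p

dp = proton_motive_force()
G_proton = energy_per_mole_protons(dp)  # roughly 20 kJ/mol
G_ATP = 50e3                            # typical in vivo cost of ATP synthesis, J/mol
protons_per_ATP = math.ceil(G_ATP / G_proton)
print(dp, G_proton / 1e3, protons_per_ATP)
```

With these numbers, at least three protons must cross the membrane per ATP synthesised, a constraint deduced purely from thermodynamics, with no molecular detail, which is exactly the spirit of Mitchell's argument.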

I see two important and related lessons for today from these historical examples.  

1. We have access to amazing computational power and microscopic probes. However, before rushing off to use them, ponder what constraints on the microscopic might be deduced from macroscopic observations.

2. Given that a quantum theory of gravity seems so elusive, more resources might be invested in the macro- to micro- strategy.

Aside: Overall, I think this post is going against the strong claims that Bob Laughlin makes in "The Dark Side of Protection", chapter 12 in A Different Universe.

Tuesday, May 23, 2023

Condensed Matter Physics: A Very Short Introduction now available on Kindle

The good news is that if you read books on Kindle, my book Condensed Matter Physics: A Very Short Introduction can now be purchased as an e-book on Amazon for US$7.50. It seems to be available only on the USA site, but I used my USA Amazon account and downloaded it.

I am no fan of Amazon and minimise my purchases from them. They are my shop of last resort. I understand that some readers will not want to go this route.

The bad news is that the production of print copies continues to progress slowly. Depending on the country, different sites advertise it being available at various dates over the next three months...  

I look forward to readers' feedback.

Monday, May 15, 2023

Two distinctly different routes to a quantum theory of gravity

Emergence in condensed matter physics may provide some valuable insights into the elusive search for a quantum theory of gravity. There is a helpful discussion by Bei Lok Hu in Emergent/quantum gravity: macro/micro structures of spacetime.

Hu makes a distinction between two approaches that he characterises as "bottom-up" and "top-down". Both have the common goal of understanding how space-time and Einstein's classical theory of gravity can emerge from some more "fundamental" theory that describes physics at higher energies and shorter distances, such as the Planck scale.

1. Going from the micro- to the macro-

Examples of this approach are string theory (à la Schwarz, Green, and Witten) and loop quantum gravity. The respective microscopic entities are strings and spin foams. This approach is motivated by the success of the standard model of elementary particles and gauge fields. One starts with a well-defined "classical" action inspired by symmetry (and broken symmetry) and uses quantum field theory to calculate observable properties. Generally, one is quantising the classical theory of gravity. Perhaps we should not be surprised that this approach has not borne fruit, as we know from condensed matter physics that deducing emergent (macro-)properties from a microscopic theory is extremely hard.

This picture is taken from a recent Scientific American article

Hu also has the following valuable insight about whether quantising classical theory is the right approach.

[quantising the classical theory of spacetime] will not lead to a microscopic theory of spacetime. In the analogy of a crystal made of atoms, quantizing the vibrational modes yields phonons, not atoms. Finding the atomic structure of matter does not come from simply quantizing its collective degrees of freedom, but takes a very different path.

Hu calls this approach "top-down" as it involves going from high energies down to low energies. I found this confusing, as I tend to think of this approach as going up in distance, i.e., from the bottom structures (short distances) up to the top structures (long distances).

2. Going from the macro- to the micro-

This approach is also ambitious. By considering the macroscopic theory (classical space-time and General Relativity) and the associated observed structures the goal is to deduce something about the microscopic theory, even without probes to investigate reality on very short distance scales. 

History suggests this is not completely fanciful. Consider for example the path to the belief that liquids and crystals were actually made of atoms. Einstein's theory of Brownian motion and Perrin's experiments were not at the atomic scale. People had deduced that crystals were made of arrays of atoms before the discovery of x-ray diffraction from crystals.

Space-time and the metric are viewed as collective variables, like order parameters in condensed matter.

Hu calls this approach "bottom-up", advocates it, and explores some possible ways to pursue it.

I thank Gerard Milburn for rekindling my interest in these issues.

Tuesday, May 9, 2023

Philosophers of science on which theories are fundamental

What is real? What is true? These big questions are central to philosophy and issues in the philosophy of science.

Emergent properties of complex systems raise similar philosophical questions such as  "What is fundamental?" and "Are quasiparticles real?".

Robert Batterman is a philosopher of science who is the author of the book,

The Devil in the Details: Asymptotic Reasoning in Explanation, Reduction, and Emergence

In 2017, Batterman wrote an article in an issue of the Journal of Statistical Physics that was in memory of Leo Kadanoff.

Philosophical Implications of Kadanoff’s Work on the Renormalization Group

Below I reproduce some of the text as it provides a helpful (and disturbing) summary of how the philosophy of science has evolved.

There are very few natural philosophers anymore. The fields of philosophy and science parted company at the end of [the nineteenth] century. Philosophers more and more began to turn toward the disciplines of logic and the analysis of language, and their examination of the enterprise of science began to follow a different, less-engaged-with-scientific-detail, direction. They began to try to determine the logic and structure of scientific theorizing in a way that was much more arm-chair and much less concerned with details about individual theories. The aim was to construct or reconstruct the proper logical structures of scientific explanation, confirmation, and theory choice. The philosophical reconstructions were, by and large, designed to fit all empirical science. For example, an explanation in physics should share the same general (logical) form as explanations in biology, chemistry, or sociology. 

I find this problematic because how physicists and biologists do science, and the knowledge that they produce, are quite different. In fact, similar differences exist between elementary particle physics and condensed matter physics. The same applies to Batterman's next claim.

I think it is fair to say that from a philosophy of science point of view, physical theories are supposed to reflect our best attempts to understand nature. Philosophers are also enamored with the idea that theories have a certain logical structure—they can be written down in some kind of axiomatic form from which, given certain inputs, various features of physical systems (future states, e.g.) can be derived using logic and reasonably straightforward mathematics.

Furthermore, philosophers often distinguish fundamental from nonfundamental (or “phenomenological”) theories. This latter distinction presupposes the idea that fundamental theories are the ones that tell us really what nature is actually like at “bottom.” These presumably include quantum theory, quantum field theory, maybe a theory of quantum gravity, etc.

In contrast, Bob Laughlin argues that certain emergent properties are exact [such as the quantisation of magnetic flux in a superconductor, hydrodynamics, and sound waves] and so are more fundamental than microscopic theories [A Different Universe, pp. 36-40].

Batterman continues

Nonfundamental theories such as thermodynamics, continuum mechanics, and fluid dynamics, on the other hand, while pragmatically useful, are in a certain sense (exactly what sense is a matter of serious contention) superfluous. We could, in principle, solve problems involving the elastic bending of beams by starting from the fundamental atomic and subatomic theories of the constituents of the beam.

Nonfundamental theories don’t get nature right. Steel beams are not really the continua whose bending behaviors are described by the Navier–Cauchy equations. Gases are not continuous blobs of stuff. The important theories, according to many philosophers and, I believe, according to many physicists, are those that get the ontology right. In part, the (often unarticulated) reason for preferring fundamental theories over phenomenological theories is a realist presupposition that physical theories must accurately describe the world the way the world really is.

Perhaps the view that atoms are real but solids are not is reflected by Bertrand Russell in the opening paragraph of his book, The ABC of Atoms, published in 1923 and intended for popular audiences.


Phenomenological theories are often good for calculating, but they don’t accurately describe the world and so must, in a sense, play second fiddle to their fundamental partners.

This is also contentious. Thermodynamics, elasticity theory, and fluid dynamics are perfectly accurate and never wrong within their domain of validity. Many courses and texts on thermodynamics begin with the following quote from Einstein.

 A theory is the more impressive the greater the simplicity of its premises, the more different kinds of things it relates, and the more extended its area of applicability. Therefore the deep impression that classical thermodynamics made upon me. It is the only physical theory of universal content which I am convinced will never be overthrown, within the framework of applicability of its basic concepts.

I should stress that Batterman is not agreeing with or promoting the views I have questioned above. Rather, he is trying to characterise what many philosophers believe.

From Leo Szilard to the Tasmanian wilderness

Richard Flanagan is an esteemed Australian writer. My son recently gave our family a copy of Flanagan's recent book, Question 7. It is...