Monday, April 29, 2024

Emergence of the arrow of time

Time has a direction. The microscopic equations of motion in classical and quantum mechanics possess time-reversal symmetry. But this symmetry is broken for many macroscopic phenomena. This observation is encoded in the second law of thermodynamics. We experience the flow of time and distinguish past, present, and future. The arrow of time is manifest in phenomena that occur at scales covering many orders of magnitude. Here are some of these different arrows of time, listed in order of increasing time scales. They are discussed by Tony Leggett in chapter 5 of The Problems of Physics.

Elementary particle physics. CP violation is observed in certain phenomena associated with the weak nuclear interaction, such as the decay of neutral kaons, first observed in 1964. The CPT theorem shows that any local quantum field theory that is invariant under the "proper" Lorentz transformations must also be invariant under the combined CPT transformation. Hence CP violation implies that time-reversal (T) symmetry is broken. Direct violation of T symmetry was observed in 1998.

Electromagnetism. When an electric charge is accelerated, an electromagnetic wave propagates out from the charge towards infinity, and energy is transferred from the charge to its environment. We do not observe a wave that propagates in from infinity and converges on the accelerating charge, i.e., energy being transferred from the environment to the charge. Yet this possibility is allowed by the equations of motion for electromagnetism. Nature selects the "retarded" solution and not the "advanced" solution to the equations of motion.
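For concreteness (this equation is my addition, not part of Leggett's discussion), the retarded and advanced solutions for the scalar potential of a charge density ρ, written in the Lorenz gauge and SI units, differ only in the sign of the time delay; both solve Maxwell's equations, but only the retarded one is observed:

```latex
% Retarded (observed) and advanced (allowed but not observed) solutions for the scalar potential
\phi_{\text{ret/adv}}(\mathbf{r},t)
  = \frac{1}{4\pi\epsilon_0}\int d^3r'\;
    \frac{\rho\!\left(\mathbf{r}',\, t \mp |\mathbf{r}-\mathbf{r}'|/c\right)}{|\mathbf{r}-\mathbf{r}'|}
    \qquad (-\;\text{retarded},\quad +\;\text{advanced})
```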

Thermodynamics. Irreversibility occurs in isolated macroscopic systems. Heat never spontaneously flows from a cold body to a hotter one. Fluids spontaneously mix. There is a time ordering of the thermodynamic states of isolated macroscopic systems. The thermodynamic entropy encodes this ordering.

Psychological experience. We remember the past and think we can affect the future. We don’t think we can affect the past or know the future.

Biological evolution. Over time species adapt to their environment and become more complex and more diverse.

Cosmology. The universe had a beginning. The universe is expanding, not contracting. Density perturbations grow, independent of whether the universe is expanding or contracting (Hawking and Laflamme).

It is debatable to what extent these arrows of time are related to one another. 

The problem of how statistical mechanics connects time-reversible microscopic dynamics with macroscopic irreversibility is subtle and contentious. Joel Lebowitz claimed this problem was solved by Boltzmann, provided the distinction between typical and average behaviour is accepted, along with the Past Hypothesis, which states that the universe was initially in a state of extremely low entropy. David Wallace discussed the need to accept probabilities in the laws of physics and argued that the competing interpretations of probability, as frequency or as ignorance, matter. In contrast, David Deutsch claims that the second law of thermodynamics is an "emergent law": it cannot be derived from microscopic laws, any more than the principle of testability can.

I find the Past Hypothesis fascinating because it connects the arrow of time seen in the laboratory and everyday life (time scales of microseconds to years) to cosmology, covering timescales of the lifetime of the universe (10^10 years) and the “initial” state of the universe, perhaps at the end of the inflationary epoch (10^-33 seconds). This also raises questions about how to formulate the Second Law and the concept of entropy in the presence of gravity and on cosmological length and time scales. 

Monday, April 22, 2024

Effective theories in classical and quantum mechanics

Working in quantum many-body theory, I slowly learned that many key concepts and techniques have predecessors and analogues in classical systems and one-body quantum systems. Examples include Green's functions, path integrals, cumulants, the linked cluster theorem, the Hubbard-Stratonovich transformation (completing the square), mean-field theory, localisation due to disorder, and the BBGKY hierarchy. Learning the full-blown quantum many-body version is easier if you first understand the simpler analogues.

This post is about effective theories in classical systems and one-body quantum systems, following my earlier post about effective theories in quantum field theories of elementary particles.

Michèle Levi has a pedagogical article

Effective field theories of post-Newtonian gravity: a comprehensive review

This is motivated by the use of EFTs to describe the gravitational waves produced by the inspiral and merger of binary black holes and neutron stars. She discusses the different scales involved and how there are effective theories at each scale. She also puts these EFTs in the broader context of other fields.

Analogues in one-body quantum mechanics are also discussed in

Effective Field Theories, Reductionism and Scientific Explanation, by Stephan Hartmann

"In his beautiful book Qualitative Methods in Quantum Theory, Migdal (1977) discusses an instructive example from quantum mechanics. Let S be a system which is composed of a fast subsystem Sf and a slow subsystem Ss, characterised by two frequencies of and os. It can be shown that the effects of Sf on Ss can be taken into account effectively by adding a potential energy term to the Hamiltonian operator of Ss. In this case, as well as in many other cases, one ends up with an effective Hamiltonian operator for the subsystem characterised by the smaller frequency (or energy)."

An important example of this is the Born-Oppenheimer approximation which is based on the separation of time and energy scales associated with electronic and nuclear motion. It is used to describe and understand the dynamics of nuclei and electronic transitions in solids and molecules. The potential energy surfaces for different electronic states define an effective theory for the nuclei. Without this concept, much of theoretical chemistry and condensed matter would be incredibly difficult.
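As a minimal sketch of this idea (my own toy example, not taken from Migdal or the articles above), the code below diagonalises a two-level "electronic" Hamiltonian at each fixed value of a slow "nuclear" coordinate R; the resulting adiabatic energies E±(R) are the effective potentials that govern the slow motion. All parameter values are illustrative.

```python
# Born-Oppenheimer-style effective potentials from a toy fast/slow system.
# The fast ("electronic") subsystem is a 2x2 Hamiltonian that depends parametrically
# on the slow ("nuclear") coordinate R; diagonalising it at each R gives the
# adiabatic potential energy surfaces E_-(R) and E_+(R) for the slow motion.
import numpy as np

def electronic_hamiltonian(R, slope=0.5, coupling=0.3):
    """Toy 2x2 electronic Hamiltonian at fixed nuclear coordinate R (parameters are illustrative)."""
    return np.array([[ slope * R,  coupling],
                     [  coupling, -slope * R]])

def adiabatic_surfaces(R_grid):
    """Effective potentials for the slow coordinate: eigenvalues of H_el at each R."""
    return np.array([np.linalg.eigvalsh(electronic_hamiltonian(R)) for R in R_grid])

R_grid = np.linspace(-5.0, 5.0, 201)
surfaces = adiabatic_surfaces(R_grid)        # column 0: E_-(R), column 1: E_+(R)
gap = surfaces[100, 1] - surfaces[100, 0]    # avoided crossing at R = 0
print(f"Avoided-crossing gap at R=0: {gap:.2f} (equal to 2 x coupling)")
```

In a real molecule the 2x2 matrix is replaced by the full electronic Hamiltonian at fixed nuclear positions, but the logic is the same: solve the fast problem at each value of the slow variable, then use the resulting energies as an effective potential for the slow degrees of freedom.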

Tuesday, April 16, 2024

Physics on Netflix


The Netflix series, 3 Body Problem, features physics and physicists throughout. I am not a big fan of science fiction, but I watched the first episode to try and get a sense of why the series is attracting so much attention. The opening scene is rooted in history. It depicts a "struggle session" during the Cultural Revolution, featuring the denunciation and killing of a physics professor, who is the father of the main character in the series.

For some more on the intellectual and political background see

Wednesday, April 10, 2024

Effective quantum field theories and hierarchical reality

 Over the last hundred years, there has been a fruitful cross-fertilisation of concepts and techniques between the theory of condensed matter and the quantum theory of elementary particles and fields. Examples include spontaneous symmetry breaking, renormalisation, and BCS theory. Sometimes, these efforts have occurred in parallel and only later did people realise that two different communities were doing essentially the same thing but using different language. Other times, one community adopted ideas or techniques from the other.

Central to condensed matter theory are ideas of emergence, a hierarchy of scales, and effective theories that are valid at a particular scale. Elementary particle theorists such as Steven Weinberg often distinguish themselves as reductionists with different goals and approaches. I only recently became aware that effective field theories have become a big thing in the elementary particle community, and Weinberg has been one of the leaders of this!

There is a helpful article in the CERN Courier, published just a year ago.

A theory of theories

Michèle Levi takes a tour through the past, present and future of Effective Field Theory, with applications ranging from LHC physics to cosmology.

The figure below, taken from the article, shows a hierarchy of energy scales and the corresponding effective field theories (EFTs).

n.b. Energy increases from bottom to top. [This may be confusing for condensed matter physicists, as we tend to put the high-energy theories at the bottom].


SM is the Standard Model.
HQET is heavy quark effective theory, in which the heavy quark degrees of freedom are integrated out.
EW breaking is electroweak symmetry breaking, which occurs at the scale of the Higgs boson mass.
The smallest energy scale in the figure is Λ_QCD, which is of the order of the mass of the proton.

The standard model is now considered an effective field theory.

For the associated history and philosophy, I found this article helpful: Effective Field Theories, Reductionism and Scientific Explanation, by Stephan Hartmann.

The decoupling theorem, proved by Appelquist and Carazzone in 1975 [cited 2,500 times], is central to EFTs and to the existence of a hierarchy of scales.

In its simplest case, this theorem demonstrates that for two coupled systems with different energy scales m1 and m2 (with m2 > m1) and described by a renormalisable theory, there is always a renormalisation condition according to which the effects of the physics at scale m2 can be effectively included in the theory with the smaller scale m1 by changing the parameters of the corresponding theory. The decoupling theorem implies the existence of an EFT at scale m1 which will, however, cease to be applicable once the energy gets close to m2.
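A toy illustration of the theorem (my own, not from Hartmann's article): let a light field φ couple to a heavy field Φ of mass M, which plays the role of the scale m2. Integrating out Φ at tree level, that is, solving its equation of motion and expanding in powers of 1/M², leaves a local effective interaction for φ alone, valid only at energies well below M:

```latex
% Tree-level decoupling of a heavy field \Phi of mass M coupled to a light field \phi
\mathcal{L} = \tfrac{1}{2}(\partial\phi)^2 + \tfrac{1}{2}(\partial\Phi)^2
              - \tfrac{1}{2}M^2\Phi^2 - g\,\phi^2\Phi
\;\;\longrightarrow\;\;
\mathcal{L}_{\text{eff}} = \tfrac{1}{2}(\partial\phi)^2
              + \frac{g^2}{2M^2}\,\phi^4 + \mathcal{O}\!\left(\tfrac{1}{M^4}\right)
```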

There are two distinct approaches to finding effective theories at a particular scale, referred to as bottom-up and top-down approaches. 

Top-down requires one to have a theory at a higher energy scale and then integrate out the high energy degrees of freedom (fields and particles) to find the effective theory for the lower energy scale. This is what Wilson did in his RG approach to critical phenomena. Another example is how string theorists try to derive GR and the Standard Model starting with strings.

Bottom-up can always be done because one does not need to know the higher energy theory. One can often write down the Lagrangian for the EFT based on symmetry considerations and phenomenology. An example is Fermi's theory of beta decay and the weak interactions.
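The two routes meet in this example (a textbook relation, not spelled out in the Courier article): Fermi wrote down a four-fermion contact interaction bottom-up (later refined to the V−A form), and the later top-down matching to the electroweak theory, obtained by integrating out the heavy W boson at tree level, fixed its coupling in terms of the parameters of the higher-energy theory:

```latex
% Fermi's effective four-fermion interaction and its tree-level matching to the electroweak theory
\mathcal{L}_{\text{eff}} \supset -\frac{G_F}{\sqrt{2}}
  \left[\bar{\psi}_1 \gamma^\mu (1-\gamma_5)\,\psi_2\right]
  \left[\bar{\psi}_3 \gamma_\mu (1-\gamma_5)\,\psi_4\right],
\qquad
\frac{G_F}{\sqrt{2}} = \frac{g^2}{8\,M_W^2}
```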

In a previous post, I considered Bei Lok Hu's discussion of these two different routes to developing a quantum theory of gravity.

A major outstanding challenge in the theory of elementary particles and fields is the hierarchy problem: the measured values of some masses and coupling constants are many orders of magnitude different from the "bare" values that appear in the Lagrangian, so the bare values must be extremely fine-tuned against quantum corrections.

The articles I have read about the role of effective field theories make no mention of the corresponding issues in condensed matter or of how emergence is involved. Emergence occurs in systems with many interacting components. Here those components are the quantum fields and their modes with different momenta and energies. Hence, I would say that emergence is at the heart of big questions in the theory of elementary particles and fields.

Wednesday, April 3, 2024

Is biology better at computing than supercomputers?

Stimulated by discussions with Gerard Milburn about the physics of learning machines, I have been wondering about biomolecular machines, such as those that carry out the transcription of DNA and the translation of mRNA in protein synthesis. These are rather amazing machines.

I found an article that considers a problem simpler than learning: computation.

The thermodynamic efficiency of computations made in cells across the range of life

Christopher P. Kempes, David Wolpert, Zachary Cohen and Juan Pérez-Mercader


It considers the computation of translating a random set of 20 amino acids into a specific string for a specific protein. Actual thermodynamic values are compared to a generalised Landauer bound for computation. Below is the punchline (page 9).

Given that the average protein length is about 325 amino acids for 20 unique amino acids, we have that p_i = p = 1/20^325 = 1.46×10^−423, where there are 20^325 states, such that the initial entropy is S_I = ln(20^325) = 325 ln 20 ≈ 974, which gives the free energy change of kT(S_I − 0) = 4.03×10^−18 J, or 1.24×10^−20 J per amino acid. This value provides a minimum for synthesizing a typical protein.

We can also calculate the biological value from the fact that if four ATP equivalents are required to add one amino acid to the polymer chain, with a standard free energy of 47.7 kJ mol^−1 for ATP to ADP, then the energy cost is 1.03×10^−16 J, or 3.17×10^−19 J per amino acid.

This value is about 26 times larger than the generalized Landauer bound.

These results illustrate that translation operates at an astonishingly high efficiency, even though it is still fairly far away from the Landauer bound. To put these results in context, it is interesting to note that the best supercomputers perform a bit operation at approximately 5.27×10^−13 J per bit. In other words, the cost of computation in supercomputers is about eight orders of magnitude worse than the Landauer bound of kT ln 2 ≈ 2.9×10^−21 J for a bit operation, which is about six orders of magnitude less efficient than biological translation when both are compared to the appropriate Landauer bound. Biology is beating our current engineered computational thermodynamic efficiencies by an astonishing degree.
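As a quick back-of-the-envelope check of the numbers quoted above (my own sketch; it assumes T ≈ 300 K, which is not stated in the excerpt, and the constants used by the paper may differ slightly):

```python
# Reproduce the order-of-magnitude numbers for the thermodynamic cost of protein translation.
import numpy as np

k_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro's number, 1/mol
T = 300.0             # assumed temperature, K
L = 325               # average protein length, amino acids

# Generalised Landauer bound: selecting one specific sequence out of 20^L equally likely ones
S_I = L * np.log(20)                   # initial entropy in nats (= 325 ln 20)
landauer_protein = k_B * T * S_I       # ~ 4e-18 J per protein
landauer_aa = landauer_protein / L     # ~ 1.2e-20 J per amino acid

# Biological cost: ~4 ATP hydrolysed per amino acid, ~47.7 kJ/mol per ATP -> ADP
bio_aa = 4 * 47.7e3 / N_A              # ~ 3.2e-19 J per amino acid
print(f"Landauer bound:  {landauer_protein:.2e} J per protein, {landauer_aa:.2e} J per amino acid")
print(f"Biological cost: {bio_aa * L:.2e} J per protein, {bio_aa:.2e} J per amino acid")
print(f"Biology / Landauer ratio: {bio_aa / landauer_aa:.0f}")                          # ~ 26
print(f"Supercomputer (5.27e-13 J/bit) / kT ln 2: {5.27e-13 / (k_B*T*np.log(2)):.1e}")  # ~ 1.8e8
```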

From Leo Szilard to the Tasmanian wilderness

Richard Flanagan is an esteemed Australian writer. My son recently gave our family a copy of Flanagan's recent book, Question 7. It is...