Friday, December 20, 2024

From Leo Szilard to the Tasmanian wilderness

Richard Flanagan is an esteemed Australian writer. My son recently gave our family a copy of Flanagan's book, Question 7. It is a personal memoir that masterfully weaves together a dizzying array of topics, from nuclear physics to the Tasmanian wilderness. I mention it on this blog because of its endearing and fascinating portrayal of Leo Szilard, arguably one of the twentieth century's most creative, unconventional, and eccentric physicists.

The paragraph below gives an overview of the narrative that is used to weave together all the disparate topics.

“Without Rebecca West’s kiss H. G. Wells would not have run off to Switzerland to write a book in which everything burns, and without H. G. Wells’s book [The World Set Free] Leo Szilard would never have conceived of a nuclear chain reaction and without conceiving of a nuclear chain reaction he would never have grown terrified and without growing terrified Leo Szilard would never have persuaded Einstein to lobby Roosevelt and without Einstein lobbying Roosevelt there would have been no Manhattan Project and without the Manhattan Project there is no lever at 8.15 am on 6 August 1945 for Thomas Ferebee to release 31,000 feet over Hiroshima, there is no bomb on Hiroshima and no bomb on Nagasaki and 100,000 people or 160,000 people or 200,000 people live and my father dies. Poetry may make nothing happen, but a novel destroyed Hiroshima and without Hiroshima there is no me and these words erase themselves and me with them.”


You can read an extract here and a review in The Guardian here.

Wednesday, December 4, 2024

Are gravity and space-time emergent?

Attempts to develop a quantum theory of gravity continue to falter and stagnate. Given this, it is worth considering approaches that start with what we know about gravity at the macroscale and ask whether that knowledge provides any hints about an underlying, more microscopic theory. One such approach was taken by Thanu Padmanabhan and is elegantly described and summarised in a book chapter.

Gravity and Spacetime: An Emergent Perspective

Insights about microphysics from macrophysics 

Padmanabhan emphasises Boltzmann's insight: "matter can only store and transfer heat because of internal degrees of freedom". In other words, if something has a temperature and entropy then it must have a microstructure.

The approach of trying to surmise something about microphysics from macrophysics has a long and fruitful history, albeit probably with many false starts that we do not hear about. Kepler's snowflakes may have been the first example. Before people were completely convinced about the existence of atoms, the study of crystal facets and of Brownian motion provided hints of the atomic structure of matter. Planck deduced the existence of the quantum from the thermodynamics of black-body radiation.

Arguably, the first definitive determination of Avogadro's number was from Perrin's experiments on Brownian motion which involved macroscopic measurements.

Comparing classical statistical mechanics to bulk thermodynamic properties gave hints of an underlying quantum structure to reality. The Sackur-Tetrode equation for the entropy of an ideal gas hints at the quantisation of phase space. The Gibbs paradox hints that fundamental particles are indistinguishable. The third law of thermodynamics hints at the idea of quantum degeneracy.
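
To see how the first of these hints works, recall the Sackur-Tetrode formula for the entropy of a monatomic classical ideal gas (a standard textbook result, quoted here for reference):

S = N k_B [ ln( (V/N) (2 pi m k_B T / h^2)^{3/2} ) + 5/2 ]

The entropy is only finite and extensive if phase space is divided into cells of volume h^3. That Planck's constant appears at all in a purely classical, macroscopic quantity is a hint of an underlying quantum structure.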

Puzzles in classical General Relativity

Padmanabhan reviews aspects of the theory that some consider to be "algebraic accidents", but he suggests that they may be hints of something deeper. These include the role of boundary terms in variational principles, which he suggests hint at a classical holography (bulk behaviour is determined by the boundary). He also argues that the metric of space-time should not be viewed as a field, contrary to most attempts to develop a quantum field theory for gravity.

Thermodynamics of horizons

The key idea that is exploited to find the microstructure is that one can define a temperature and an entropy for null surfaces (event horizons). These have been calculated for specific systems (metrics) including the following:

For accelerating frames of reference (Rindler) there is an event horizon which exhibits Unruh radiation with a temperature that was calculated by Fulling, Davies and Unruh.

The black hole horizon in the Schwarzschild metric has the temperature of Hawking radiation.

The cosmological horizon in de Sitter space is associated with a temperature proportional to the Hubble constant H. [This was discussed in detail by Gibbons and Hawking in 1977].
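
For reference, the standard expressions for these three temperatures are (these are textbook results, not specific to Padmanabhan's work):

T_Unruh = ħ a / (2 pi c k_B), for an observer with uniform acceleration a,
T_Hawking = ħ c^3 / (8 pi G M k_B), for a Schwarzschild black hole of mass M,
T_deSitter = ħ H / (2 pi k_B), for the de Sitter horizon, H being the Hubble constant.

Each temperature is proportional to ħ and so vanishes in the classical limit; the thermodynamics of horizons is intrinsically quantum.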

Estimating Avogadro's number for space-time

Consider the number of degrees of freedom on the boundary, N_s, and in the bulk, N_b. 

On the boundary surface, there is one degree of freedom associated with every Planck area (L_p^2), where L_p is the Planck length, i.e., N_s = A/L_p^2, where A is the surface area, which is related to the entropy of the horizon (cf. Bekenstein and Hawking).

In the bulk, equipartition of energy is assumed, so the bulk energy E = N_b k T/2, where T is the temperature of the horizon.
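
To get a feel for the numbers, here is a rough estimate of N_s for the present Hubble horizon. This is my own back-of-the-envelope sketch (taking the horizon radius to be the Hubble radius c/H_0 with H_0 of about 70 km/s/Mpc), not a calculation from the book chapter.

import math

# Rough input numbers (my assumptions)
c = 3.0e8                  # speed of light (m/s)
H0 = 70e3 / 3.086e22       # Hubble constant, ~70 km/s/Mpc, converted to 1/s
L_P = 1.616e-35            # Planck length (m)

R_H = c / H0               # Hubble radius (m)
A = 4 * math.pi * R_H**2   # horizon area (m^2)
N_s = A / L_P**2           # one degree of freedom per Planck area

print(f"R_H ~ {R_H:.1e} m")
print(f"N_s ~ {N_s:.1e}")  # of order 10^123

The "Avogadro's number" for space-time is of order 10^123, dwarfing the 10^23 familiar from ordinary matter.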

An alternative perspective on cosmology 

He presents a novel derivation of the dynamic equations for the scale factor R(t) in the Friedmann-Robertson-Walker metric of the universe in General Relativity. His starting point is a simple argument leading to 

V is the Hubble volume, 4 pi/(3 H^3), where H is the Hubble constant, and L_P is the Planck length.
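
As I read Padmanabhan's papers on the "emergence of cosmic space", the relation in question has the form (in units with c = 1; the precise factors here are my reconstruction, so consult the chapter for his exact expression)

dV/dt = L_P^2 (N_s - N_b)

i.e., the Hubble volume changes at a rate set by the mismatch between the number of surface and bulk degrees of freedom.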

The right-hand side is zero for the de Sitter universe, which is predicted to be the asymptotic state of our current universe.

Possible insights about the cosmological constant

One of the biggest problems in theoretical physics is to explain why the cosmological constant has the value that it does.

There are two aspects to the problem.
1. The measured value is so small, 120 orders of magnitude smaller than what one estimates based on the quantum vacuum energy!

2. The measured value seems to require fine-tuning to 120 significant figures, i.e., the bare value must almost exactly cancel the contribution of the mass-energy of the quantum vacuum.

He presents an argument that the cosmological constant is related to the Planck length by an expression involving a dimensionless constant mu of order unity.

Details of his proposed solution are also discussed here.

I am not technically qualified to comment on the possible validity or usefulness of Padmanabhan's perspective and results. However, I think it provides a nice example of a modest and conventional scientific alternative to radical approaches, such as the multiverse, or to ideas that seem to be going nowhere, such as AdS/CFT, which are too often invoked or clung to in addressing these big questions.

Aside. In the same book, there is also a short and helpful chapter on loop quantum gravity, Quantum Spacetime, by Carlo Rovelli. He explicitly identifies the "atoms" of space-time as the elements of a "spin foam".

Tuesday, November 26, 2024

Emergent gauge fields in spin ices

Spin ices are magnetic materials in which geometrically frustrated magnetic interactions between the spins prevent long-range magnetic order and lead to a residual entropy similar to that of ice (solid water).

Spin ices provide a beautiful example of many aspects of emergence, including how surprising new entities can emerge at the mesoscale. I think the combined experimental and theoretical work on spin ice was one of the major achievements of condensed matter physics in the first decade of this century.

Novelty

Spin ices are composed of individual spins on a lattice. The system exhibits properties that the individual spins and the high-temperature state do not have. The novel properties can be understood in terms of an emergent gauge field. Novel entities include spin defects reminiscent of magnetic monopoles and Dirac strings.

State of matter

Spin ices exhibit a novel state of matter, the magnetic Coulomb phase. There is no long-range spin order, but there are power-law (dipolar) correlations that fall off as the inverse cube of distance.

Toy models

Classical models such as the Ising or Heisenberg models with antiferromagnetic nearest-neighbour interactions on the pyrochlore lattice exhibit the emergent physics associated with spin ices: absence of long-range order, residual entropy, ice-type rules for local order, and long-range dipolar spin correlations exhibiting pinch points. These toy models can be used to derive the gauge theories that describe emergent properties such as monopoles and Dirac strings.
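
The sketch below (my own toy calculation, following Pauling's classic argument for water ice rather than any specific paper) enumerates the ice-rule states of a single tetrahedron and combines them into the standard estimate of the residual entropy, S/(N k_B) ≈ (1/2) ln(3/2) per spin.

from itertools import product
from math import log

# Enumerate the 2^4 configurations of the four Ising spins on one tetrahedron,
# with +1 meaning the spin points "in" and -1 meaning it points "out".
configs = list(product([+1, -1], repeat=4))
ice_states = [c for c in configs if sum(c) == 0]   # the "two in, two out" states

# Pauling's estimate: each of the N spins has 2 states, and each of the N/2
# tetrahedra independently retains only the fraction 6/16 of its configurations,
# so W = 2^N (6/16)^(N/2) and S/(N k_B) = ln 2 + (1/2) ln(6/16) = (1/2) ln(3/2).
s_per_spin = log(2) + 0.5 * log(len(ice_states) / len(configs))

print(f"{len(ice_states)} of {len(configs)} tetrahedron states obey the ice rule")
print(f"Pauling estimate: S/(N k_B) = {s_per_spin:.4f}  [(1/2) ln(3/2) = {0.5 * log(1.5):.4f}]")

This simple estimate is within a few per cent of the residual entropy measured in both water ice and spin ice.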

Actual materials that exhibit spin ice physics, such as dysprosium titanate (Dy2Ti2O7) and holmium titanate (Ho2Ti2O7), are more complicated. They involve quantum spins, ferromagnetic interactions, spin-orbit coupling, crystal fields, complex crystal structure and dipolar magnetic interactions. Chris Henley says these materials

"are well approximated as having nothing but (long-ranged) dipolar spin interactions, rather than nearest-neighbor ones. Although this model is clearly related to the “Coulomb phase,” I feel it is largely an independent paradigm with its own concepts that are different from the (entropic) Coulomb phase..."

Effective theory

Gauge fields described by equations analogous to electrostatics and magnetostatics in Maxwell’s electromagnetism are emergent in coarse-grained descriptions of spin ices. 

Consider a bipartite lattice where on each site we locate a tetrahedron. The "ice rules" require that two spins on each tetrahedron point in and two point out. We can define a field L(i) on each lattice site i which is the sum of all the spins on the tetrahedron. The magnetic field B(r) is a coarse-graining of the field L(i). The ice rules and local conservation of flux require that the coarse-grained field is divergence-free, div B(r) = 0.

The classical ground state of this model is infinitely degenerate. The emergent “magnetic” field [which it should be stressed is not a physical magnetic field] allows the presence of monopoles [magnetic charges]. These correspond to defects that do not satisfy the local ice rules in the spin system.

It can be shown that the total free energy of the system has the form F = (K/2) ∫ d^3r B(r)^2, where K is the "stiffness" or "magnetic permeability" associated with the gauge field. It is entirely of entropic origin, just like the elasticity of rubber.

[Aside: I would be curious to see a calculation of K from a microscopic model and an estimate from experiment. I have not stumbled upon one yet. Do you know of one? Henley points out that in water ice the entropic elasticity makes a contribution to the dielectric constant and this "has been long known."]

  A local spin flip produces a pair of oppositely charged monopoles. The monopoles are deconfined in that they can move freely through the lattice. They are joined together by a Dirac string.

This contrasts with real magnetism where there are no magnetic charges, only magnetic dipoles; one can view magnetic charges as confined within dipoles.

There is an effective interaction between the two monopoles [charges] that has the same form as Coulomb’s law.  There are only short-range (nearest neighbour) direct interactions between the spins. However, these act together to produce a long-range interaction between the monopoles (which are deviations from local spin order).
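
To be concrete, the effective interaction has the form V(r) proportional to q_1 q_2 / r, where q_1 and q_2 are the defect "charges" and the prefactor is set by the stiffness K of the coarse-grained free energy above (I have deliberately left the normalisation of the charges unspecified). Since the free energy is entropic, this "Coulomb" interaction is proportional to k_B T, unlike a genuine electrostatic or magnetostatic interaction.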

Universality

The novel properties of spin ice occur for both quantum and classical systems, Ising and Heisenberg spins, and for a range of lattices. The same physics occurs with water ice, magnetism, and charge order.

Modularity at the mesoscale

The system can be understood as a set of weakly interacting modular units. These include the tetrahedra of spins, the magnetic monopoles, and the Dirac strings. The measured temperature dependence of the specific heat of Dy2Ti2O7 is consistent with that calculated from Debye-Hückel theory for deconfined charges interacting via Coulomb's law, shown as the blue curve below. The figure is taken from here.

Pinch points

The gauge theory predicts that the spin correlation function (in momentum space) has a particular singular form exhibiting pinch points [also known as bow ties], which are seen experimentally.
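
For reference, the characteristic form predicted by the gauge theory (a standard result for the Coulomb phase, written here in my own notation) is a transverse projector in momentum space:

<B_i(k) B_j(-k)>  proportional to  delta_ij - k_i k_j / k^2

This correlation function is finite but non-analytic as k goes to zero: its value depends on the direction from which k approaches zero, which is what produces the bow-tie (pinch point) patterns seen in neutron scattering.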

Unpredictability

Most new states of matter are not predicted theoretically. They are discovered by experimentalists, often by serendipity. Spin ice and the magnetic Coulomb phase seem to be an exception. Please correct me if I am wrong.

Sexy magnetic monopoles or boring old electrical charges?

I am hoping a reader can clarify this issue. What is wrong with the following point of view? In the discussion above the "magnetic field" B(r) could equally well be replaced with an "electric field" E(r). Then the spin defects are just analogous to electrical charges and the "Dirac strings" become like a polymer chain with opposite electrical charges at its two ends. This is not as sexy.

Note that Chris Henley says Dirac strings are "a nebulous and not very helpful notion when applied to the Coulomb phase proper (with its smallish polarisation), for the string's path is not well defined... It is only in an ordered phase... that the Dirac string has a clear meaning."

Or is the emergent field actually "magnetic"? It describes spin defects and these are associated with a local magnetic moment. Furthermore, the long-range dipolar correlations (with associated pinch points) of the gauge field are detected by magnetic neutron scattering and so the gauge field should be viewed as "magnetic" and not "electric".

Emergent gauge fields in quantum many-body systems?

In spin ice, the emergent gauge field is classical and arises in a spin system that can be described classically. This raises two questions that have been investigated extensively by Xiao-Gang Wen. First, he has shown how certain mean-field treatments of frustrated antiferromagnets (with quantum spin liquid ground states) and of doped Mott insulators lead to emergent gauge fields. As fascinating as his work is, it needs to be stressed that there is no definitive evidence for these emergent gauge fields. They just provide appealing theoretical descriptions. This is in contrast to the emergent gauge fields for spin ice.

Second, based on Wen's success at constructing these emergent gauge fields he has pushed provocative (and highly creative) ideas that the gauge fields and fermions that are considered "fundamental" in the standard model of particle physics may be emergent entities. This is the origin of the subtitle of his 2004 book, Quantum Field Theory of Many-body Systems: From the Origin of Sound to an Origin of Light and Electrons.

To prepare this post I found the articles below helpful.

Emergent particles and gauge fields in quantum matter

Ben J. Powell

Maxwell electromagnetism as an emergent phenomenon in condensed matter

J. Rehn and R. Moessner

The “Coulomb Phase” in Frustrated Systems

Chris Henley

Friday, November 15, 2024

Emergence and protein folding

Proteins are a distinct state of matter. Globular proteins are tightly packed with a density comparable to a crystal but without the spatial regularity found in crystals. The native state is thermodynamically stable, in contrast to the globule state of synthetic polymers which is often glassy and metastable, with a structure that depends on the preparation history.

For a given amino acid sequence the native folded state of the protein is emergent. It has a structure, properties, and function that the individual amino acids do not have, nor does the unfolded polymer chain. For example, the enzyme catalase has an active site that catalyses the rapid decomposition of hydrogen peroxide (which is toxic).


Protein folding is an example of self-organisation. A key question is how the order of the folded state arises from the disorder (random configuration) of the unfolded state.

There are hierarchies of structures, length scales, and time scales associated with the folding.

The hierarchy of structures is primary, secondary, tertiary, and quaternary structure. The primary structure is the amino acid sequence in the heteropolymer. Secondary structures include alpha-helices and beta-sheets, shown in the figure above in orange and blue, respectively. The tertiary structure is the native folded state. An example of quaternary structure is hemoglobin, which consists of four myoglobin-like subunits in a particular geometric arrangement.

The hierarchy of time scales varies over more than fourteen orders of magnitude, including folding (msec to sec), helix-coil transitions (microsec), hinge motion (nanosec), and bond vibrations (10 fsecs).

Folding exhibits a hierarchy of processes, summarised in the figure below, which is taken from a paper by Masaki Sasai, George Chikenji, and Tomoki P. Terada.
Modularity 
"Protein foldons are segments of a protein that can fold into stable structures independently. They are a key part of the protein folding process, which is the stepwise assembly of a protein's native structure." (from Google AI)
See for example.

Discontinuities
The folding-unfolding transition [denaturation] is a sharp transition, similar to a first-order phase transition. This sharpness reflects the cooperative nature of the transition. There is a well-defined enthalpy and entropy change associated with this transition.


Universality
Proteins exhibit "mutational plasticity", i.e., native structures are tolerant of many mutations (changes in individual amino acids). Aspects of the folding process, such as its speed, reliability, reversibility, and modularity, appear to be universal, i.e., they hold for all proteins.

Diversity with limitations
On the one hand, there is a multitude of distinct native structures and associated biological functions. On the other hand, this diversity is much smaller than the size of the configuration space, presumably because the requirement of thermodynamic stability vastly reduces the options.

Effective interactions
These are subtle. Some of the weak ones matter as the stabilisation energy of the native state is of order 40 kJ per mole, which is quite small as there are about 1000 amino acids in the polymer chain. Important interactions include hydrogen bonding, hydrophobic, and volume exclusion. In the folded state monomers interact with other monomers that are far apart on the chain. The subtle interplay of these competing interactions produces complex structures with small energy differences, as is often the case with emergent phenomena.

Toy models
1. Wako-Saito-Munoz-Eaton model
This is an Ising-like model on a chain. A short and helpful review is

Note that the interactions are not pairwise but involve strings of "spins" between native contacts, as sketched below.
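
As I understand the standard formulation of the model (written in my own notation), each residue k carries a binary variable m_k = 1 (native) or 0 (disordered), and the energy is

H = - Σ_{i<j} ε_{ij} Δ_{ij} (m_i m_{i+1} ... m_j)

where Δ_{ij} = 1 if residues i and j are in contact in the native structure (and 0 otherwise) and ε_{ij} > 0 is the stabilisation energy of that contact. A native contact only contributes if the entire stretch of chain between i and j is ordered, which is the sense in which the interactions involve strings of "spins" rather than simple pairs.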

2. Dill's HP polymer on a lattice
This consists of a polymer which has only two types of monomer units and undergoes a self-avoiding walk on a lattice. H and P denote hydrophobic and polar amino acid units, denoted by red and blue circles, respectively, in the figure below. The relative simplicity of the model allows complete enumeration of all possible conformations for short chains. The model is simpler in two dimensions, yet still captures essential features of the folding problem.

As the H-H attraction increases the chain undergoes a relatively sharp transition to just a few conformations that are compact and have hydrophobic cores. The model exhibits much of the universality of protein folding. Although there are 20 different amino acids in real proteins, the model divides them into two classes and still captures much of the phenomenology of folding, including mutational plasticity.
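
Here is a minimal sketch of such an enumeration in two dimensions (my own toy code, with a hypothetical ten-residue sequence; it is not taken from Dill's papers). It enumerates all self-avoiding walks for the sequence and finds the conformations with the most H-H contacts.

SEQ = "HPHPPHHPHH"           # hypothetical H/P sequence
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def energy(path):
    """Energy = -(number of H-H contacts between residues not adjacent along the chain)."""
    occupied = {pos: i for i, pos in enumerate(path)}
    e = 0
    for i, (x, y) in enumerate(path):
        if SEQ[i] != "H":
            continue
        for dx, dy in MOVES:
            j = occupied.get((x + dx, y + dy))
            if j is not None and j > i + 1 and SEQ[j] == "H":
                e -= 1
    return e

best_e, best_confs = 1, []

def grow(path):
    """Recursively extend a self-avoiding walk on the square lattice."""
    global best_e, best_confs
    if len(path) == len(SEQ):
        e = energy(path)
        if e < best_e:
            best_e, best_confs = e, [list(path)]
        elif e == best_e:
            best_confs.append(list(path))
        return
    x, y = path[-1]
    for dx, dy in MOVES:
        nxt = (x + dx, y + dy)
        if nxt not in path:
            grow(path + [nxt])

grow([(0, 0), (1, 0)])        # fix the first bond to remove a trivial symmetry

print(f"lowest energy = {best_e} ({-best_e} H-H contacts)")
print(f"number of lowest-energy conformations = {len(best_confs)}")

Even for this short chain, the lowest-energy conformations are compact and bury the H residues, while the vast majority of the several thousand self-avoiding walks are extended and have few contacts.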


Co-operativity: the helical order-disorder transition is sharp.

Organising principles
Certain novel concepts such as the rugged energy landscape and the folding funnel apply at a particular scale.


This post drew on several nice papers written by Ken Dill and collaborators including

The Protein Folding Problem, H.S. Chan and K.A. Dill, Physics Today, 1993

Roy Nassar, Gregory L. Dignon, Rostam M. Razban, Ken A. Dill, Journal of Molecular Biology, 2021

Interestingly, in the 2021 article, Dill claims that the protein folding problem [which is not the prediction problem] has now essentially been solved.

Wednesday, October 30, 2024

A very effective Hamiltonian in nuclear physics

Atomic nuclei are complex quantum many-body systems. Effective theories have helped provide a better understanding of them. The best-known are the shell model, the (Aage) Bohr-Mottelson theory of non-spherical nuclei, and the liquid drop model. Here I introduce the Interacting Boson Model (IBM), which provides somewhat of a microscopic basis for the Bohr-Mottelson theory. Other effective theories in nuclear physics are chiral perturbation theory, Weinberg's theory for nucleon-pion interactions, and Wigner's random matrix theory.

The shell model has similarities to microscopic models in atomic physics. A major achievement is that it explains the origin of the magic numbers, i.e., nuclei whose number of protons or neutrons equals 2, 8, 20, 28, 50, 82, or 126 are particularly stable because they have closed shells. Other nuclei can then be described theoretically as an inert closed shell plus valence nucleons that interact with a mean-field potential due to the core nucleons and with one another via effective interactions.

For medium to heavy nuclei the Bohr-Mottelson model describes collective excitations including transitions in the shape of nuclei.

An example of the trends in the low-lying excitation spectrum to be explained is shown in the figure below. The left spectrum is for a nucleus with a nucleon number close to a magic number and the right one for an almost half-filled shell. R_4/2 is the ratio of the energy of the J=4+ state to that of the 2+ state, relative to the ground state. B(E2) is the strength of the quadrupole transition between the 2+ state and the ground state.


The Interacting Boson Model (IBM) is surprisingly simple and successful. It illustrates the importance of quasi-particles, builds on the stability of closed shells, and neglects many degrees of freedom. It describes even-even nuclei, i.e., nuclei with an even number of protons and an even number of neutrons. The basic entities in the theory are pairs of nucleons. These are taken to be either an s-wave state or a d-wave state. There are five d-wave states (corresponding to the 2J+1 possible states of total angular momentum with J=2). Each state is represented by a boson creation operator and so the Hilbert space is six-dimensional. If the states are degenerate [which they are not] the model has U(6) symmetry.

The IBM Hamiltonian is written in terms of the most general possible combinations of the boson operators. This has a surprisingly simple form.

Note that it involves only four parameters. For a given nucleus these parameters can be fixed from experiment, and in principle calculated from the shell model. The Hamiltonian can be written in a form that gives physical insight, connects to the Bohr-Mottelson model and is amenable to a group theoretical analysis that makes calculation and understanding of the energy spectrum relatively simple.
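
For concreteness, one widely used simplified form is the so-called consistent-Q Hamiltonian (my choice of parameterisation here; the four-parameter form shown in the chapter may be organised differently):

H = ε n_d − κ Q^χ · Q^χ,   with   Q^χ = (s† d~ + d† s) + χ (d† d~)^(2)

where n_d counts the number of d bosons and d~ denotes the modified (spherical-tensor) d-boson annihilation operator. Varying the parameters interpolates between the vibrational, rotational, and gamma-unstable limits associated with the subalgebra chains discussed below.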

Central to the group theoretical analysis is considering subalgebra chains as shown below
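
The three standard dynamical-symmetry chains, each ending in the rotation algebra O(3), are:

U(6) ⊃ U(5) ⊃ O(5) ⊃ O(3)   (vibrational nuclei)
U(6) ⊃ SU(3) ⊃ O(3)   (axially deformed, rotational nuclei)
U(6) ⊃ O(6) ⊃ O(5) ⊃ O(3)   (gamma-unstable nuclei)

When the Hamiltonian can be written entirely in terms of the Casimir operators of one chain, the energy spectrum follows from group theory alone.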

 

An example of an energy spectrum is shown below.

The fuzzy figures are taken from a helpful Physics Today article by Casten and Feng from 1984 (Aside: the article discusses an extension of the IBM involving supersymmetry, but I don't think that has been particularly fruitful).

The figure below connects the different parameter regimes of the model to the different subalgebra chains.


The chart of nuclides below has entries with colour shading corresponding to their parameter values for the IBM model, according to the symmetry triangle above.

The different vertices of the triangle correspond to different nuclear geometries and allow a connection to Aage Bohr's model for the surface excitations. 

This is discussed in a nice review article, which includes the figure above.

Quantum phase transitions in shape of nuclei

Pavel Cejnar, Jan Jolie, and Richard F. Casten

Aside: one thing that is not clear to me from the article concerns questions that arise because the nucleus has a finite number of degrees of freedom. Are the symmetries actually broken or is there tunneling between degenerate ground states?   

Tuesday, October 22, 2024

Colloquium on 2024 Nobel Prizes


This Friday I am giving a colloquium for the UQ Physics department.

2024 Nobel Prizes in Physics and Chemistry: from biological physics to artificial intelligence and back

The 2024 Nobel Prize in Physics was awarded to John Hopfield and Geoffrey Hinton “for foundational discoveries and inventions that enable machine learning with artificial neural networks.” Half of the 2024 Chemistry prize was awarded to Demis Hassabis and John Jumper for “protein structure prediction” using artificial intelligence. I will describe the physics background needed to appreciate the significance of the awardees' work.

Hopfield proposed a simple theoretical model for how networks of neurons in a brain can store and recall memories. Hopfield drew on his background in and ideas from condensed matter physics, including the theory of spin glasses, the subject of the 2021 Physics Nobel Prize.

Hinton, a computer scientist, generalised Hopfield’s model, using ideas from statistical physics to propose a “Boltzmann machine” that used an artificial neural network to learn to identify patterns in data, by being trained on a finite set of examples. 

For fifty years scientists have struggled with the following challenge in biochemistry: given the unique sequence of amino acids that make up a particular protein, can the native structure of the protein be predicted? Hassabis, a computer scientist, and Jumper, a theoretical chemist, used AI methods to solve this problem, highlighting the power of AI in scientific research.

I will briefly consider some issues these awards raise, including the blurring of boundaries between scientific disciplines, tensions between public and corporate interests, research driven by curiosity versus technological advance, and the limits of AI in scientific research.

Here is my current draft of the slides.

Saturday, October 19, 2024

John Hopfield on what physics is

A decade ago John Hopfield reflected on his scientific life in Annual Reviews in Condensed Matter Physics, Whatever Happened to Solid State Physics?

"What is physics? To me—growing up with a father and mother who were both physicists—physics was not subject matter. The atom, the troposphere, the nucleus, a piece of glass, the washing machine, my bicycle, the phonograph, a magnet—these were all incidentally the subject matter. The central idea was that the world is understandable, that you should be able to take anything apart, understand the relationships between its constituents, do experiments, and on that basis be able to develop a quantitative understanding of its behavior. 

Physics was a point of view that the world around us is, with effort, ingenuity, and adequate resources, understandable in a predictive and reasonably quantitative fashion. Being a physicist is a dedication to the quest for this kind of understanding."

He describes how this view was worked out in his work in solid state theory and moved into biological physics and the paper for which he was awarded the Nobel Prize. 

"Eventually, my knowledge of spin-glass lore (thanks to a lifetime of interaction with P.W. Anderson), Caltech chemistry computing facilities, and a little neurobiology led to the first paper in which I used the word neuron. It was to provide an entryway to working on neuroscience for many physicists..."

After he started working on biological physics in the late 1970s he got an offer from Chemistry and Biology at Caltech and Princeton Physics suggested he take it. 

"In 1997, I returned to Princeton—in the Molecular Biology Department, which was interested in expanding into neurobiology. Although no one in that department thought of me as anything but a physicist, there was a grudging realization that biology could use an infusion of physics attitudes and viewpoints. I had by then strayed too far from conventional physics to be courted for a position in any physics department. So I was quite astonished in 2003 to be asked by the American Physical Society to be a candidate for vice president. And, I was very happy to be elected and ultimately to serve as the APS president. I had consistently felt that the research I was doing was entirely in the spirit and paradigms of physics, even when disowned by university physics departments."

Saturday, October 12, 2024

2024 Nobel Prize in Physics

I was happy to see that John Hopfield was awarded the Nobel Prize in Physics for his work on neural networks. The award is based on this paper from 1982:

Neural networks and physical systems with emergent collective computational abilities

One thing I find beautiful about the paper is how Hopfield drew on ideas about spin glasses (many competing interactions lead to many ground states and a complex energy landscape).

A central insight is that an efficient way to store the information describing multiple objects (different collective spin states in an Ising model) is in terms of the inter-spin interaction constants (J_ij's) in the Ising model. These are the "weights" that are trained/learned in computer neural nets.
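
A minimal sketch of this idea in code (my own toy example, not Hopfield's original formulation): patterns are stored in the J_ij's by a Hebbian rule, and a corrupted pattern is recovered by repeatedly aligning each spin with its local field.

import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                                # number of spins (neurons) and stored patterns

patterns = rng.choice([-1, 1], size=(P, N))  # the "memories": random +/-1 spin configurations

# Hebbian rule: the memories are encoded in the interaction constants J_ij
J = (patterns.T @ patterns) / N
np.fill_diagonal(J, 0.0)

def recall(state, sweeps=10):
    """Zero-temperature asynchronous dynamics: each spin aligns with its local field."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Start from pattern 0 with 20% of the spins flipped and let the dynamics run downhill
noisy = patterns[0] * np.where(rng.random(N) < 0.2, -1, 1)
recovered = recall(noisy)
print("overlap with the stored pattern:", recovered @ patterns[0] / N)   # close to 1 if recalled

The stored patterns sit at (or near) local minima of an Ising-like energy function, which is exactly the spin-glass picture of many metastable states.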

It should be noted that Hopfield's motivation was not at all to contribute to computer science. It was to understand a problem in biological physics: what is the physical basis for associative memory? 

I have mixed feelings about Geoffrey Hinton sharing the prize. On the one hand, in his initial work, Hinton used physics ideas (Boltzmann weights) to extend Hopfield's ideas so they were useful in computer science. Basically, Hopfield considered a spin glass model at zero temperature and Hinton considered it at non-zero temperature. [Note: the temperature is not physical; it is just a parameter in a Boltzmann probability distribution for the different states of the neural network]. Hinton certainly deserves lots of prizes, but I am not sure a physics one is appropriate. His work on AI has certainly been helpful for physics research. But so have lots of other advances in computer software and hardware, and those pioneers did not receive a prize.
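
In symbols, the Boltzmann machine assigns to each configuration s = (s_1, ..., s_N) of the network the probability

P(s) = exp(-E(s)/T) / Z,   with   E(s) = - Σ_{i<j} J_ij s_i s_j - Σ_i b_i s_i

where Z is the normalising sum over all configurations. Hopfield's deterministic dynamics corresponds to the T → 0 limit of this distribution.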

I feel a bit like I did with Jack Kilby getting a physics prize for his work on integrated circuits. I feel that sometimes the Nobel Committee just wants to remind the world how physics is so relevant to modern technology.

Ten years ago Hopfield wrote a nice scientific autobiography for Annual Reviews in Condensed Matter Physics,

Whatever Happened to Solid State Physics?

After the 2021 Physics Nobel to Parisi, I reflected on the legacy of spin glasses, including the work of Hopfield.

Aside: I once pondered whether a chemist will ever win the Physics prize, given that many condensed matter physicists have won the chemistry prize. Well now, we have had an electronic engineer and a computer scientist winning the Physics prize.

Another aside: I think calling Hinton's network a Boltzmann machine is a scientific misnomer. I should add this to my list of people getting credit for things they did not do. Boltzmann never considered networks, spin glasses or computer algorithms. Boltzmann was a genius, but I don't think we should be attaching his name to everything that involves a Boltzmann distribution. To me, this is a bit like calling the Metropolis algorithm for Monte Carlo simulations the Boltzmann algorithm.

Monday, October 7, 2024

Mental Health for Academics

Tomorrow I am giving a talk "Mental health for academics" for the ARC Centre for Engineered Quantum Systems as part of Mental Health Week.

Here is a video recording of my planned talk. As an experiment, I recorded a practice version of my talk and uploaded it to YouTube. Feedback on both the content and the technology is welcome.

Here are the slides.

A resource I mention at the end is the blog Voices of Academia, set up by Marissa Edwards from UQ.

Thursday, September 26, 2024

The multi-faceted character of emergence (part 2)

In the previous post, I considered five different characteristics that are often associated with emergence and classified them as being associated with ontology (what is real and observable) rather than epistemology (what we believe to be true). 

Below I consider five more characteristics: self-organisation, unpredictability, irreducibility, contextuality and downward causation, and intra-stratum closure.

6. Self-organisation

Self-organisation is not a property of the system but a mechanism that a theorist says causes an emergent property to come into being. Self-organisation is also referred to as spontaneous order. 

In the social sciences self-organisation is sometimes referred to as an endogenous cause, in contrast to an exogenous cause. There is no external force or agent causing the order, in contrast to order that is imposed externally. For example, suppose that in a city there is no government policy about the price of a loaf of sliced wholemeal bread or about how many loaves bakers should produce. It is observed that prices are almost always in the range of $4 to $5 per loaf, and that rarely are there bread shortages. This outcome is a result of the self-organisation of the free market, and economists would say the price range and its stability have an endogenous cause. In contrast, if the government legislated the price range and the production levels, that would be an exogenous cause. Friedrich Hayek emphasised the role of spontaneous order in economics. In biology, Stuart Kauffman equates emergence with spontaneous order and self-organisation.

In physics, the periodicity of the arrangement of atoms in a crystal is a result of self-organisation and has an endogenous cause. In contrast, the periodicity of atoms in an optical lattice is determined by the laser physicist who creates the lattice and so has an exogenous cause.

Self-organisation shows how local interactions can produce global properties. In different words, short-range interactions can lead to long-range order. After decades of debate and study, the Ising model showed that this was possible. Other examples of self-organisation, include flocking of birds and teamwork in ant colonies. There is no director or leader but the system acts “as if” there is. 

7. Unpredictability

Ernst Mayr (This is Biology, p.19) defines emergence as “in a structured system, new properties emerge at higher levels of integration that could not have been predicted from a knowledge of the lower-level components.” Philip Ball also defines emergence in terms of unpredictability (Quanta, 2024).

More broadly, in discussions of emergence, “prediction” is used in three different senses: logical prediction, historical prediction, and dynamical prediction.

Logical prediction (deduction) concerns whether one can predict (calculate) the emergent (novel) property of the whole system solely from a knowledge of all the properties of the parts of the system and their interactions. Logical predictability is one of the most contested characteristics of emergence. Sometimes “predict” is replaced with “difficult to predict”, “extremely difficult to predict”, “impossible to predict”, “almost impossible to predict”, or “possible in principle, but impossible in practice, to predict.” 

As an aside, I note that philosophers distinguish between epistemological emergence and ontological emergence. They are associated with prediction that is "possible in principle, but difficult in practice" and "impossible in principle" respectively.

After an emergent property has been discovered experimentally sometimes it can be understood in terms of the properties of the system parts. In a sense “pre-diction” then becomes “post-diction.” An example is the BCS theory of superconductivity, which provided a posteriori, rather than a priori, understanding. In different words, development of the theory was guided by a knowledge of the phenomena that had already been observed and characterised experimentally. Thus, a keyword in the statement above about logical prediction is “solely”. 

Historical prediction. Most new states of matter discovered by experimentalists were not predicted even though theorists knew the laws that the microscopic components of the system obeyed. Examples include superconductivity (elemental metals, cuprates, iron pnictides, organic charge transfer salts, …), superfluidity in liquid 4He, antiferromagnetism, quasicrystals, and the integer and fractional quantum Hall states.

There are a few exceptions where theorists did predict new states of matter. These include Bose-Einstein condensates (BECs) in dilute atomic gases, topological insulators, the Anderson insulator in disordered metals, the Haldane phase in integer-spin quantum antiferromagnetic chains, and the hexatic phase in two dimensions. It should be noted that the prediction of BECs and topological insulators was significantly helped by the fact that theorists could start with Hamiltonians of non-interacting particles. Furthermore, all of these predictions involved working with effective Hamiltonians. None started with microscopic Hamiltonians for specific materials.

Dynamical unpredictability concerns chaotic dynamical systems, where it relates to sensitivity to initial conditions. I do not see this as an example of emergence as it can occur in systems with only a few degrees of freedom. However, some authors do associate dynamical unpredictability with complexity and emergence.

8. Irreducibility and singularities

An emergent property cannot be reduced to properties of the parts, because if emergence is defined in terms of novelty, the parts do not have the property. 

Emergence is also associated with the problem of theory reduction. Formally, this is the process where a more general theory reduces in a particular mathematical limit to a less general theory. For example, quantum mechanics reduces to classical mechanics in the limit where Planck’s constant goes to zero. Einstein’s theory of special relativity reduces to Newtonian mechanics in the limit where the speeds of massive objects become much less than the speed of light. Theory reduction is a subtle philosophical problem that is arguably poorly understood both by scientists [who oversimplify or trivialise it] and philosophers [who arguably overstate the problems it presents for science producing reliable knowledge]. Subtleties arise because the two different theories usually involve language and concepts that are "incommensurate" with one another. 

Irreducibility is also related to the discontinuities and singularities associated with emergent phenomena. As emphasised independently by Hans Primas and Michael Berry, singularities occur because the mathematics of theory reduction involves singular asymptotic expansions. Primas illustrates this by considering a light wave incident on an object and producing a shadow. The shadow is an emergent property, well described by geometrical optics, but not by the more fundamental theory of Maxwell’s electromagnetism. The two theories are related in the asymptotic limit that the wavelength of light in Maxwell’s theory tends to zero. This example illustrates that theory reduction is compatible with the emergence of novelty. Primas also considers how the Born-Oppenheimer approximation, which is central to solid state theory and quantum chemistry, is associated with a singular asymptotic expansion (in the ratio of the mass of an electron to the mass of an atomic nucleus in the system).

Berry considers several other examples of theory reduction, including going from general to special relativity, from statistical mechanics to thermodynamics, and from viscous (Navier-Stokes) fluid dynamics to inviscid (Euler) fluid dynamics. He has discussed in detail how the caustics that occur in ray optics are an emergent phenomenon and are associated with singular asymptotic expansions in the wave theory.

The philosopher of science Jeremy Butterfield showed rigorously that theory reduction occurred for four specific systems that exhibited emergence, defined by him as a novel and robust property. Thus, novelty is not sufficient for irreducibility.

9. Contextuality and downward causation

Any real system has a context. For example, it has a boundary and an environment, both in time and space. In many cases the properties of the system are completely determined by the parts of the system and their interactions. Previous history and boundaries do not matter. However, in some cases the context may have a significant influence on the state of the system. An example is Rayleigh-Bénard convection cells and turbulent flow, whose existence and nature are determined by the interaction of the fluid with the container boundaries. A biological example concerns what factors determine the structure, properties, and function that a particular protein (linear chain of amino acids) has. It is now known that the DNA sequence that encodes the amino acid sequence is not the only factor, in contradiction to some versions of the Central Dogma of molecular biology. Other factors may be the type of cell that contains the protein and the network of other proteins in which the particular protein is embedded. Context sometimes matters.

Supervenience is the idea that once the micro level is fixed, macro levels are fixed too. The examples above might be interpreted as evidence against supervenience. Supervenience is used to argue against “the possibility for mental causation above and beyond physical causation.” 

Downward causation is sometimes equated with emergence, particularly in debates about the nature of consciousness. In the context of biology, Denis Noble defines downward causation as when higher level processes can cause changes in lower level properties and processes. He gives examples where physiological effects can switch on and off individual genes or signalling processes in cells, including maternal effects and epigenetics.

10. Intra-stratum closure: informational, causal, and computational

The ideas described below were recently developed by Rosas et al. from a computer science perspective. They defined emergence in terms of universality and discussed its relationship to informational closure, causal closure, and computational closure. Each of these is given a precise technical definition in their paper. Here I give the sense of their definitions. In considering a general system they do not pre-define the micro- and macro-levels of a system but consider how they might be defined so that universality holds, i.e., so that properties at the macro-level are independent of the details of the micro-level (i.e., are universal).

Informational closure means that to predict the dynamics of the system at the macroscale an observer does not need any additional information about the details of the system at the microscale. Equilibrium thermodynamics and fluid dynamics are examples. 

Causal closure means that the system can be controlled at the macroscale without any knowledge of lower-level information. For example, changing the software code that is running on a computer allows one to reliably control the microstate of the hardware of the computer regardless of what is happening with the trajectories of electrons in the computer.

Computational closure is a more technical concept, being defined in terms of “a conceptual device called the ε-(epsilon) machine. This device can exist in some finite set of states and can predict its own future state on the basis of its current one... for an emergent system that is computationally closed, the machines at each level can be constructed by coarse-graining the components on just the level below: They are, “strongly lumpable.” "

Rosas et al. show that informational closure and causal closure are equivalent and that they are more restrictive than computational closure. It is not clear to me how these closures relate to novelty as a definition of emergence.

In summary, emergence means different things to different people. I have listed ten different characteristics that have been associated with emergent properties. They are not all equivalent and so when discussing emergence it is important to be clear about which characteristic one is using to define emergence.

Tuesday, September 24, 2024

The multi-faceted character of emergence (part 1)

There is more to emergence than novel properties, i.e., where a whole system has a property that the individual components of the system do not have. Here I focus on emergent properties, but in most cases “property” might be replaced with state, phenomenon, or entity. I now discuss ten characteristics often associated with emergence, beyond novelty. Some people include one or more of these characteristics in their definitions of emergence. However, I do not include them in my definition because as I explain some of the characteristics are contentious. Some may not be necessary or sufficient for novel system properties.

The first five characteristics discussed below might be classified as objective (i.e., observable properties of the system) and the second five as subjective (i.e., associated with how an investigator thinks about the system). In different words, the first five are mostly concerned with ontology (what is real) and the second five with epistemology (what we know). The first five characteristics concern discontinuities, universality, diversity, mesoscales, and modification of parts. The second five concern self-organisation, unpredictability, irreducibility, downward causation, and closure. 

1. Discontinuities 

Quantitative changes in the system can become qualitative changes in the system. For example, in condensed matter physics spontaneous symmetry breaking only occurs in the thermodynamic limit (i.e., when the number of particles of the system becomes infinite). More is different. Thus, as a quantitative change in the system size occurs the order parameter becomes non-zero. In a system that undergoes a phase transition at a non-zero temperature, a small change in temperature can lead to the appearance of order and to a new state of matter. For a first-order phase transition, there is a discontinuity in properties such as the entropy and density. These discontinuities define a phase boundary in the pressure-temperature diagram. For continuous phase transitions the order parameter is a continuous function of temperature, becoming non-zero at the critical temperature. However, the derivative with respect to temperature may be discontinuous and/or thermodynamic properties, such as the specific heat and the susceptibility associated with the order parameter, may diverge as the critical temperature is approached.
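
In symbols, the behaviour near the critical temperature T_c of a continuous transition has the standard form

m(T) ∝ (T_c − T)^β for T < T_c,   χ(T) ∝ |T − T_c|^(−γ),   C(T) ∝ |T − T_c|^(−α)

where m is the order parameter, χ the associated susceptibility, and C the specific heat. The critical exponents β, γ, and α are only sharply defined in the thermodynamic limit.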

Two different states of a system are said to be adiabatically connected if one can smoothly deform one state into the other and all the properties of the system also change smoothly. The case of the liquid-gas transition illustrates subtle issues about defining emergence. A discontinuity does not imply a qualitative difference (novelty). On the one hand, there is a discontinuity in the density and entropy of the system as the liquid-gas phase boundary is crossed in the pressure-temperature diagram. On the other hand, there is no qualitative difference between a gas and a liquid. There is only a quantitative difference: the density of the gas is less than that of the liquid, albeit sometimes by orders of magnitude. The liquid and gas states can be adiabatically connected. There is a path in the pressure-temperature phase diagram that can be followed to connect the liquid and gas states without any discontinuities in properties.

The ferromagnetic state also raises questions, as illustrated by a debate between Rudolf Peierls and Phil Anderson about whether ferromagnetism exhibits spontaneous symmetry breaking. Anderson argued that it did not as, in contrast to the antiferromagnetic state, a non-zero magnetisation (order parameter) occurs for finite systems and the magnetic order does not change the excitation spectrum, i.e., produce a Goldstone boson. On the other hand, singularities in properties at the Curie temperature (critical temperature for ferromagnetism) only exist in the thermodynamic limit. Also, a small change in the temperature, from just above the Curie temperature to below, can produce a qualitative change, a non-zero magnetisation.

2. Universality

Properties often referred to as emergent are universal in the sense that they are independent of many of the details of the parts of the system. There may be many different systems that can have a particular emergent property. For example, superconductivity is present in metals with a diverse range of crystal structures and chemical compositions.

Robustness is related to universality. If small changes are made to the composition of the system (for example replacing some of the atoms in the system with atoms of different chemical element) the novel property of the system is still present. In elementary superconductors, introducing non-magnetic impurity atoms has no effect on the superconductivity.

Universality is both a blessing and a curse for theory. Universality can make it easier to develop successful theories because it means that many details need not be included in a theory in order for it to successfully describe an emergent phenomenon. This is why effective theories and toy models can work even better than might be expected. Universality can make theories more powerful because they can describe a wider range of systems. For example, properties of elemental superconductors can be described by BCS theory and by Ginzburg-Landau theory, even though the materials are chemically and structurally diverse. The curse of universality for theory is that universality illustrates the problem of “under-determination of theory”, “over-fitting of data” and “sloppy theories” [Sethna et al.]. A theory can agree with experiment even when the parameters used in the theory may be quite different from the actual ones. For example, the observed phase diagram of water can be reproduced, sometimes with impressive quantitative detail, by combining classical statistical mechanics with empirical force fields that assume water molecules can be treated as being composed purely of point charges.

Suppose we start with a specific microscopic theory and calculate the macroscopic properties of the system, and they agree with experiment. It would then be tempting to think that we have the correct microscopic theory. However, universality suggests this may not be the case.

For example, consider the case of a gas of weakly interacting atoms or molecules. We can treat the gas particles as classical or quantum. Statistical mechanics gives exactly the same equation of state and specific heat capacity for both microscopic descriptions. The only difference may be the Gibbs paradox [the calculated entropy is not an extensive quantity], which is sensitive to whether or not the particles are treated as identical. Unlike the zeroth, first, and second laws of thermodynamics, the third law does require that the microscopic theory be quantum. Laughlin discusses these issues in terms of “protectorates” that hide “ultimate causes”.

In some physical systems, universality can be defined in a rigorous technical sense, making use of the concepts and techniques of the renormalisation group and scaling. These techniques provide a method to perform coarse graining, to derive effective theories and effective interactions, and to define universality classes of systems. There are also questions of how universality is related to the robustness of strata, and the independence of effective theories from the coarse-graining procedure.

3. Diversity

Even when a system is composed of a small number of different components and interactions, the large number of possible stable states with qualitatively different properties that the system can have is amazing. Every snowflake is different. Water is found in 18 distinct solid states. All proteins are composed of linear chains of 20 different amino acids. Yet in the human body there are more than 100,000 different proteins and all perform specific biochemical functions. We encounter an incredible diversity of human personalities, cultures, and languages. A stunning case of diversity is life on earth. Billions of different plant and animal species are all an expression of different linear combinations of the four base pairs of DNA: A, G, T, and C.

This diversity is related to the idea that "simple models can describe complex behaviour". One example is Conway’s Game of Life. Another example is how simple Ising models with a few competing interactions can describe a devil's staircase of ground states or the multitude of different atomic orderings found in binary alloys.

Goldenfeld and Kadanoff defined complexity [emergence] as “structure with variations”. Holland (VSI) discusses “perpetual novelty”, giving the example of the game of chess, where a typical game may involve of the order of 10^50 move sequences. “Motifs” are recurring patterns (sequences of moves) in games.

Condensed matter physics illustrates diversity with the many different states of matter that have been discovered. The underlying microscopics is “just” electrons and atomic nuclei interacting according to Coulomb’s law.

The significance of this diversity might be downplayed by saying that it is just a result of combinatorics. But such a claim overlooks the issue of the stability of the diverse states that are observed. In a system composed of many components, each of which can take on a few states, the number of possible states of the whole system grows exponentially with the number of components. For example, for a chain of ten amino acids there are 20^10 ≈ 10^13 different possible linear sequences. But this does not mean that all these sequences will produce a functional protein, i.e., a molecule that will fold rapidly (on the timescale of milliseconds) into a stable tertiary structure and perform a useful biochemical function such as catalysis of a specific chemical reaction or signal transduction.

4. Simple entities at the mesoscale 

A key idea in condensed matter physics is that of quasi-particles. A system of strongly interacting particles may have excitations, seen in experiments such as inelastic neutron scattering and Angle Resolved PhotoElectron Spectroscopy (ARPES), that can be described as weakly interacting quasi-particles. These entities are composite particles, and have properties that are quantitatively different, and sometimes qualitatively different, from the microscopic particles. Sometimes this means that the scale (size) associated with the quasi-particles is intermediate between the micro- and the macro-scales, i.e., it is a mesoscale. The existence of quasi-particles leads naturally to the technique of constructing an effective Hamiltonian [effective theory] for the system where effective interactions describe the interactions between the quasi-particles.

The economist Herbert Simon argued that a characteristic of a complex system is that the system can be understood in terms of nearly decomposable units. Rosas et al. argue that emergence is associated with there being a scale at which the system is “strongly lumpable”. Denis Noble has highlighted how biological systems are modular, i.e., composed of simple interchangeable components.

5. Modification of parts and their relationships

Emergent properties are often associated with the state of the system exhibiting patterns, order, or structure, terms that may be used interchangeably. This reflects that there is a particular relationship (correlation) between the parts which is different to the relationships in a state without the emergent property. This relationship may also be reflected in a generalised rigidity. For example, in a solid applying a force on one surface results in all the atoms in the solid experiencing a force and moving together. The rigidity of the solid defines a particular relationship between the parts of the system.

Properties of the individual parts may also be different. For example, in a crystal single-atom properties such as electronic energy levels change quantitatively compared to their values for isolated atoms. Properties of finite subsystems are also modified, reflecting a change in interactions between the parts. For example, in a molecular crystal the frequencies associated with intramolecular atomic vibrations are different to their values for isolated molecules. However, emergence is a sufficient but not a necessary condition for these modifications. In gas and liquid states, novelty is not present but there are still such changes in the properties of the individual parts.

As stated at the beginning of this section, the five characteristics above might be associated with ontology (what is real): they are objective properties of the system that an investigator observes, and they depend less on what an observer thinks about the system. The next five characteristics might be considered more subjective, being concerned with epistemology (how we determine what is true). In making this dichotomy I do not want to gloss over the fuzziness of the distinction, or the two thousand years of philosophical debate about the relationship between ontology and epistemology, or between reality and theory.

In the next post, I will discuss the remaining five characteristics: self-organisation, unpredictability, irreducibility, contextuality and downward causation, and intra-stratum closure.

Thanks for reading this far!

Friday, September 20, 2024

Steven Weinberg's radical change of mind

What is a fundamental theory? As we go to smaller and smaller distances and higher energies we keep finding new entities: atoms, electrons, nuclei, neutrons, protons, quarks, gluons, ... When will it stop?

If we look at a theory, such as a quantum field theory, at a particular energy and length scale, there may be hints that something new is going on at higher energies, such as the existence of new entities. One way to approach this problem is through the renormalisation group: look at how the coupling constants behave as the energy increases. If they start to blow up (diverge), is that a hint of something? But this requires starting with a renormalisable theory...
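
As a toy illustration of a coupling “blowing up” (my own sketch, not taken from any of the work discussed here): for an assumed one-loop beta function of the form β(g) = b g^3 with b > 0, the running coupling diverges at a finite energy scale (a Landau pole).

    # Toy one-loop running coupling, dg/dln(mu) = b*g**3 with b > 0.
    # The analytic solution 1/g(mu)^2 = 1/g0^2 - 2*b*ln(mu/mu0) reaches zero
    # at a finite scale mu_pole, where the coupling diverges (a Landau pole).
    import numpy as np

    b = 0.05     # illustrative one-loop coefficient (assumed value)
    g0 = 1.0     # coupling at the reference scale mu0
    mu0 = 1.0    # reference energy scale, arbitrary units

    def g(mu):
        inv_g2 = 1.0 / g0**2 - 2.0 * b * np.log(mu / mu0)
        return np.sqrt(1.0 / inv_g2) if inv_g2 > 0 else np.inf

    mu_pole = mu0 * np.exp(1.0 / (2.0 * b * g0**2))   # scale where 1/g^2 hits zero

    for mu in (1.0, 10.0, 100.0, 0.99 * mu_pole):
        print(f"mu = {mu:12.4g}   g(mu) = {g(mu):.3f}")
    print(f"Landau pole at mu = {mu_pole:.4g} (in units of mu0)")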

An alternative approach is to start with an effective theory that one assumes [hypothesises] is valid at some limited energy scale. This goes against a previous dogma that one should only study renormalisable theories. Amongst elementary particle theorists, led by Steven Weinberg, there was a significant shift in perspective in the 1970s.

In a paper published in 2016, Effective field theory, past and future, Steven Weinberg reflected on how he changed his mind about renormalisability being a fundamental requirement for quantum field theories and how he came to the view that the Standard Model should be viewed as an effective field theory. Here are some quotes from the article. He first describes how in the 1960s he developed a field theory to describe the interactions of nucleons and pions.

"During this whole period, effective field theories appeared as only a device for more easily reproducing the results of current algebra. It was difficult to take them seriously as dynamical theories, because the derivative couplings that made them useful in the lowest order of perturbation theory also made them nonrenormalizable, thus apparently closing off the possibility of using these theories in higher order. 

My thinking about this began to change in 1976. I was invited to give a series of lectures at Erice that summer, and took the opportunity to learn the theory of critical phenomena by giving lectures about it. In preparing these lectures, I was struck by Kenneth Wilson’s device of “integrating out” short-distance degrees of freedom by introducing a variable ultraviolet cutoff, ...

Non-renormalizable theories, I realized, are just as renormalizable as renormalizable theories.

For me in 1979, the answer involved a radical reconsideration of the nature of quantum field theory.

The advent of effective field theories generated changes in point of view and suggested new techniques of calculation that propagated out to numerous areas of physics, some quite far removed from particle physics. Notable here is the use of the power-counting arguments of effective field theory to justify the approximations made in the BCS theory of superconductivity. Instead of counting powers of small momenta, one must count powers of the departures of momenta from the Fermi surface. Also, general features of theories of inflation have been clarified by re-casting these theories as effective field theories of the inflaton and gravitational fields. 

Perhaps the most important lesson from chiral dynamics was that we should keep an open mind about renormalizability. The renormalizable Standard Model of elementary particles may itself be just the first term in an effective field theory that contains every possible interaction allowed by Lorentz invariance and the SU (3) × SU (2) × U (1) gauge symmetry, only with the non-renormalizable terms suppressed by negative powers of some very large mass M...

... we should not despair of applying quantum field theory to gravitation just because there is no renormalizable theory of the metric tensor that is invariant under general coordinate transformations. It increasingly seems apparent that the Einstein–Hilbert Lagrangian √gR is just the least suppressed term in the Lagrangian of an effective field theory containing every possible generally covariant function of the metric and its derivatives...

it is usually assumed that in the quantum theory of gravitation, when Λ reaches some very high energy, of the order of 10^15 to 10^18 GeV, the appropriate degrees of freedom are no longer the metric and the Standard Model fields, but something very different, perhaps strings...

But maybe not..."
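
Written out schematically (my notation, not a quote from Weinberg), the expansion he describes is

\[
\mathcal{L}_{\rm eff} \;=\; \mathcal{L}_{\rm SM} \;+\; \sum_{d>4}\sum_{i} \frac{c^{(d)}_{i}}{M^{\,d-4}}\,\mathcal{O}^{(d)}_{i},
\]

where the O^(d)_i are all operators of mass dimension d allowed by Lorentz invariance and the SU(3) × SU(2) × U(1) gauge symmetry, the c^(d)_i are dimensionless coefficients, and M is the very large mass that suppresses the non-renormalisable terms.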

In 2021 Weinberg gave a talk, with a similar point of view, which inaugurated an international seminar series [online during covid-19]. 

In response to that talk, Peter Woit has a blog post where he objects to Weinberg's point of view that the Standard Model is "just" an effective theory, only valid at low energies.

Reviews of Modern Physics recently published a review that discussed how Weinberg's perspective is worked out in detail.

The standard model effective field theory at work

Gino Isidori, Felix Wilsch, and Daniel Wyler

The discussion above fits naturally with an emergentist perspective: reality is stratified. Effective theories in one stratum may have singularities near the boundaries between strata, and new entities emerge, both physically and theoretically, as one moves to the next higher or lower stratum.

Sunday, September 15, 2024

Biology is about emergence in subtle ways

Biology is a field that is all about emergence. It exhibits a hierarchy of structures from DNA to proteins to cells to organs to organisms. Phenotypes emerge from genotypes. At each level of the hierarchy (stratum) there are unique entities, phenomena, principles, methods, theories, and sub-fields. But there is more to the story. 

Philip Ball is probably my favourite science writer. Earlier this year, he gave a beautiful lecture at The Royal Institution, What is Life and How does it Work?


The lecture presents the main ideas in his recent book, How Life Works: A User's Guide to the New Biology.

Here are a few things that stood out for me from the lecture.

1. The question, "What is life?" has been and continues to be notoriously difficult to answer.

2. The Central Dogma of molecular biology needs qualification. It was originally stated by Francis Crick, and some commonly assumed corollaries of it are wrong. In simple terms, the Dogma states that DNA makes RNA and RNA makes proteins. This is a unique and unidirectional process. For example, a specific code (string of the letters A, G, T, and C) will produce a specific protein (sequence of amino acids) which will naturally fold into a unique structure with a specific biochemical function.


The central dogma has undergirded the notion that genes determine everything in biology. Everything is bottom-up.
However, Ball gives several counterexamples.
A large fraction of our DNA does not code for proteins.
Many proteins are disordered, i.e., they do not have a unique folded structure.

Aside: An earlier failure of (some versions of) the central dogma was the discovery of reverse transcriptase by the obscure virus club, essential for the development of HIV drugs and covid-19 vaccines.

3. The role of emergence can be quantified in terms of information theory, helping to understand the notion of causal emergence: the cause of large-scale behaviour is not just a sum of micro-causes, i.e., the properties of and interactions between the constituents at smaller scales. Entities at the level of the phenomena are just as important as what occurs at lower levels (page 214 in the book). Causal emergence is concerned with fitting the scale of the causes to the scale of the effects.

[Figure from this paper from 2021.] The authors quantify causal emergence in protein networks in terms of mutual information (between large and small scales) and effective information (a measure of the certainty in the connectivity of a network).

Aside: These quantitative notions of emergence have been developed more in recent work by Fernando Rosas and collaborators and discussed in a Quanta article by Philip Ball.
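
To make this concrete, here is a minimal sketch (my own, not the authors' code) of one common definition of effective information used in this literature: assuming the dynamics are described by a Markov transition matrix, it is the mutual information between the current state and the next state when the current state is set ("intervened on") to be uniformly distributed over all states.

    import numpy as np

    def effective_information(tpm):
        """Effective information (in bits) of a row-stochastic transition matrix."""
        tpm = np.asarray(tpm, dtype=float)
        n = tpm.shape[0]
        p_x = np.full(n, 1.0 / n)   # uniform "intervention" distribution over states
        p_y = p_x @ tpm             # resulting distribution over next states
        ei = 0.0
        for i in range(n):
            for j in range(n):
                p_xy = p_x[i] * tpm[i, j]          # joint probability P(x=i, y=j)
                if p_xy > 0:
                    ei += p_xy * np.log2(p_xy / (p_x[i] * p_y[j]))
        return ei

    # Toy example: three micro states mix randomly among themselves, a fourth is fixed.
    micro = np.array([[1/3, 1/3, 1/3, 0.0],
                      [1/3, 1/3, 1/3, 0.0],
                      [1/3, 1/3, 1/3, 0.0],
                      [0.0, 0.0, 0.0, 1.0]])
    # Coarse-graining {0,1,2} -> A and {3} -> B gives a deterministic macro description.
    macro = np.array([[1.0, 0.0],
                      [0.0, 1.0]])

    print(effective_information(micro))   # about 0.81 bits
    print(effective_information(macro))   # 1.0 bit

The coarse-grained description having higher effective information than the micro-level one is, in this framework, the signature of causal emergence.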

4. Context matters. A particular amino acid sequence does not define a unique protein structure and function. Both may depend on the specific cell in which the protein is contained.

5. Causal spreading.  Causality happens at different levels. It does not always happen at the bottom (genetic level). Sometimes it happens at higher levels. And, it can flow up or down.

6. Levels of description matter. This is well illustrated by morphology and the question of why we have five fingers; the answer is not determined by genes.

7. Relevance to medicine. There has been a focus on the genetic origin of diseases. However, many diseases, such as cancer, do not predominantly happen at the genetic level. There has been a prejudice to focus on the genetic level, partly because that is where most tools are available. For cancer, focussing on other levels, such as the immune system, may be more fruitful.

8. Metaphors matter. Biology has been dominated by  metaphors such as living things are "machines made from genes" and "computers running a code". However, metaphors are metaphors. They have limitations, particularly as we learn more. All models are wrong, but some are useful. Ball proposes that metaphors from life, including the notion of agency, may be more fruitful.

9. The wisdom of Michael Berry. Ball ends with Berry's saying that the biggest unsolved problem in physics is not about dark matter (or some similar problem), but rather, "If all matter can be described by quantum theory, where does the aliveness of living things come from?" In other words, "Why is living matter so different from other matter?"

There is also an interesting episode of the How To Academy podcast, where Ball is interviewed about the book.
