Thursday, September 26, 2024

The multi-faceted character of emergence (part 2)

In the previous post, I considered five different characteristics that are often associated with emergence and classified them as being associated with ontology (what is real and observable) rather than epistemology (what we believe to be true). 

Below I consider five more characteristics: self-organisation, unpredictability, irreducibility, contextuality and downward causation, and intra-stratum closure.

6. Self-organisation

Self-organisation is not a property of the system but a mechanism that a theorist says causes an emergent property to come into being. Self-organisation is also referred to as spontaneous order. 

In the social sciences self-organisation is sometimes referred to as an endogenous cause, in contrast to an exogenous cause. There is no external force or agent causing the order, in contrast to order that is imposed externally. For example, suppose that in a city there is no government policy about the price of a loaf of sliced wholemeal bread or about how many loaves bakers should produce. It is observed that prices are almost always in the range of $4 to $5 per loaf, and that rarely are there bread shortages. This outcome is a result of the self-organisation of the free market, and economists would say the price range and its stability have an endogenous cause. In contrast, if the government legislated the price range and the production levels, that would be an exogenous cause. Friedrich Hayek emphasised the role of spontaneous order in economics. In biology, Stuart Kauffman equates emergence with spontaneous order and self-organisation.

In physics, the periodicity of the arrangement of atoms in a crystal is a result of self-organisation and has an endogenous cause. In contrast, the periodicity of atoms in an optical lattice is determined by the laser physicist who creates the lattice and so has an exogenous cause.

Self-organisation shows how local interactions can produce global properties. In different words, short-range interactions can lead to long-range order. After decades of debate and study, the Ising model showed that this was possible. Other examples of self-organisation include the flocking of birds and teamwork in ant colonies. There is no director or leader, but the system acts “as if” there is.
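To make this concrete, here is a minimal sketch (my own illustration, not from the original discussion) of a Metropolis Monte Carlo simulation of the two-dimensional Ising model: the interactions are purely between nearest neighbours, yet below the critical temperature the whole lattice spontaneously sustains long-range magnetic order. The lattice size, temperatures, and number of sweeps are arbitrary illustrative choices, and temperatures are in units of J/k_B.

```python
import numpy as np

def ising_order_parameter(L=24, T=1.8, J=1.0, sweeps=400, seed=0):
    """Metropolis Monte Carlo for the 2D Ising model on an L x L lattice with
    periodic boundaries. Interactions are purely nearest-neighbour (short-range),
    yet below Tc ~ 2.27 (in units of J/k_B) long-range order survives."""
    rng = np.random.default_rng(seed)
    spins = np.ones((L, L), dtype=int)            # start fully ordered
    for _ in range(sweeps * L * L):
        i, j = rng.integers(0, L, size=2)
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
              spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * J * spins[i, j] * nn           # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1
    return abs(spins.mean())                      # order parameter |m|

# Thermal fluctuations do not destroy the order below Tc, but do destroy it above Tc.
print(ising_order_parameter(T=1.8))   # close to 1 (ordered)
print(ising_order_parameter(T=3.5))   # close to 0 (disordered)
```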

7. Unpredictability

Ernst Mayr (This is Biology, p.19) defines emergence as “in a structured system, new properties emerge at higher levels of integration that could not have been predicted from a knowledge of the lower-level components.” Philip Ball also defines emergence in terms of unpredictability (Quanta, 2024).

More broadly, in discussions of emergence, “prediction” is used in three different senses: logical prediction, historical prediction, and dynamical prediction.

Logical prediction (deduction) concerns whether one can predict (calculate) the emergent (novel) property of the whole system solely from a knowledge of all the properties of the parts of the system and their interactions. Logical predictability is one of the most contested characteristics of emergence. Sometimes “predict” is replaced with “difficult to predict”, “extremely difficult to predict”, “impossible to predict”, “almost impossible to predict”, or “possible in principle, but impossible in practice, to predict.” 

As an aside, I note that philosophers distinguish between epistemological emergence and ontological emergence. They are associated with prediction that is "possible in principle, but difficult in practice" and "impossible in principle" respectively.

After an emergent property has been discovered experimentally, it can sometimes be understood in terms of the properties of the system parts. In a sense, “pre-diction” then becomes “post-diction.” An example is the BCS theory of superconductivity, which provided a posteriori, rather than a priori, understanding. In different words, development of the theory was guided by a knowledge of the phenomena that had already been observed and characterised experimentally. Thus, a keyword in the statement above about logical prediction is “solely”.

Historical prediction. Most new states of matter discovered by experimentalists were not predicted even though theorists knew the laws that the microscopic components of the system obeyed. Examples include superconductivity (elemental metals, cuprates, iron pnictides, organic charge transfer salts, …), superfluidity in liquid 4He, antiferromagnetism, quasicrystals, and the integer and fractional quantum Hall states.

There are a few exceptions where theorists did predict new states of matter. These include Bose-Einstein condensates (BECs) in dilute atomic gases, topological insulators, the Anderson insulator in disordered metals, the Haldane phase in integer-spin quantum antiferromagnetic spin chains, and the hexatic phase in two dimensions. It should be noted that the predictions of BECs and topological insulators were significantly helped by the fact that theorists could start with Hamiltonians of non-interacting particles. Furthermore, all of these predictions involved working with effective Hamiltonians. None started with microscopic Hamiltonians for specific materials.

Dynamical unpredictability concerns chaotic dynamical systems, where it relates to the sensitivity of trajectories to initial conditions. I do not see this as an example of emergence, as it can occur in systems with only a few degrees of freedom. However, some authors do associate dynamical unpredictability with complexity and emergence.
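As a simple illustration of this sensitivity (my own sketch, not taken from the post), consider the logistic map in its chaotic regime: two trajectories that start a tiny distance apart become completely uncorrelated after a few dozen iterations. The parameter value and initial conditions are arbitrary choices.

```python
import numpy as np

def logistic_trajectory(x0, r=4.0, n=60):
    """Iterate the logistic map x -> r x (1 - x), which is chaotic at r = 4."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return np.array(xs)

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)   # almost identical initial condition
# The separation grows roughly exponentially until it saturates at order one.
print(np.abs(a - b)[::10])
```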

8. Irreducibility and singularities

An emergent property cannot be reduced to properties of the parts, because if emergence is defined in terms of novelty, the parts do not have the property. 

Emergence is also associated with the problem of theory reduction. Formally, this is the process where a more general theory reduces in a particular mathematical limit to a less general theory. For example, quantum mechanics reduces to classical mechanics in the limit where Planck’s constant goes to zero. Einstein’s theory of special relativity reduces to Newtonian mechanics in the limit where the speeds of massive objects become much less than the speed of light. Theory reduction is a subtle philosophical problem that is arguably poorly understood both by scientists [who oversimplify or trivialise it] and philosophers [who arguably overstate the problems it presents for science producing reliable knowledge]. Subtleties arise because the two different theories usually involve language and concepts that are "incommensurate" with one another. 

Irreducibility is also related to the discontinuities and singularities associated with emergent phenomena. As emphasised independently by Hans Primas and Michael Berry, singularities occur because the mathematics of theory reduction involves singular asymptotic expansions. Primas illustrates this by considering a light wave incident on an object and producing a shadow. The shadow is an emergent property, well described by geometrical optics, but not by the more fundamental theory of Maxwell’s electromagnetism. The two theories are related in the asymptotic limit in which the wavelength of light in Maxwell’s theory tends to zero. This example illustrates that theory reduction is compatible with the emergence of novelty. Primas also considers how the Born-Oppenheimer approximation, which is central to solid state theory and quantum chemistry, is associated with a singular asymptotic expansion (in the ratio of the mass of an electron to the mass of an atomic nucleus in the system).
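A minimal way to see why such limits are singular (my own schematic illustration, in the spirit of Berry's argument): in the short-wavelength limit a wave field takes the form

```latex
% Semiclassical (short-wavelength) form of a wave field:
\psi(\mathbf{r}) \sim A(\mathbf{r})\, e^{\, i S(\mathbf{r})/\lambda} , \qquad \lambda \to 0 .
% The exponential has an essential singularity at lambda = 0, so the geometrical-optics
% limit is a singular asymptotic expansion rather than a convergent Taylor series in
% lambda; qualitatively new structures (sharp-edged shadows, caustics) appear in the limit.
```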

Berry considers several other examples of theory reduction, including going from general to special relativity, from statistical mechanics to thermodynamics, and from viscous (Navier-Stokes) fluid dynamics to inviscid (Euler) fluid dynamics. He has discussed in detail how the caustics that occur in ray optics are an emergent phenomenon associated with singular asymptotic expansions in the wave theory.

The philosopher of science Jeremy Butterfield showed rigorously that theory reduction occurred for four specific systems that exhibited emergence, defined by him as a novel and robust property. Thus, novelty is not sufficient for irreducibility.

9. Contextuality and downward causation

Any real system has a context. For example, it has a boundary and an environment, both in time and space. In many cases the properties of the system are completely determined by the parts of the system and their interactions; previous history and boundaries do not matter. However, in some cases the context may have a significant influence on the state of the system. An example is Rayleigh-Bénard convection cells and turbulent flow, whose existence and nature are determined by the interaction of the fluid with the container boundaries. A biological example concerns what factors determine the structure, properties, and function that a particular protein (linear chain of amino acids) has. It is now known that the DNA sequence that encodes the amino acid sequence is not the only factor, in contradiction to some versions of the Central Dogma of molecular biology. Other factors may be the type of cell that contains the protein and the network of other proteins in which the particular protein is embedded. Context sometimes matters.

Supervenience is the idea that once the micro level is fixed, macro levels are fixed too. The examples above might be interpreted as evidence against supervenience. Supervenience is used to argue against “the possibility for mental causation above and beyond physical causation.” 

Downward causation is sometimes equated with emergence, particularly in debates about the nature of consciousness. In the context of biology, Denis Noble defines downward causation as when higher level processes can cause changes in lower level properties and processes. He gives examples where physiological effects can switch on and off individual genes or signalling processes in cells, including maternal effects and epigenetics.

10. Intra-stratum closure: informational, causal, and computational

The ideas described below were recently developed by Rosas et al. from a computer science perspective. They defined emergence in terms of universality and discussed its relationship to informational closure, causal closure, and computational closure. Each of these is given a precise technical definition in their paper. Here I give the sense of their definitions. In considering a general system, they do not pre-define the micro- and macro-levels but consider how these levels might be defined so that universality holds, i.e., so that properties at the macro-level are independent of the details of the micro-level.

Informational closure means that to predict the dynamics of the system at the macroscale an observer does not need any additional information about the details of the system at the microscale. Equilibrium thermodynamics and fluid dynamics are examples. 

Causal closure means that the system can be controlled at the macroscale without any knowledge of lower-level information. For example, changing the software code that is running on a computer allows one to reliably control the behaviour of the computer, regardless of what is happening with the trajectories of individual electrons in the hardware.

Computational closure is a more technical concept, being defined in terms of “a conceptual device called the ε-(epsilon) machine. This device can exist in some finite set of states and can predict its own future state on the basis of its current one... for an emergent system that is computationally closed, the machines at each level can be constructed by coarse-graining the components on just the level below: They are, ‘strongly lumpable.’”
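A minimal sketch of the underlying idea of lumpability (my own toy example, using the standard Markov-chain notion of strong lumpability rather than the ε-machine formalism of Rosas et al.): a partition of microstates into macrostates is lumpable when the probability of jumping into each macrostate is the same for every microstate within a given macrostate, so the macro-level dynamics is itself a well-defined Markov chain. The function name, matrix, and partition below are illustrative choices.

```python
import numpy as np

def is_lumpable(P, partition, tol=1e-12):
    """Check whether a Markov chain with transition matrix P is (strongly) lumpable
    with respect to a partition of its states into macrostates (blocks).
    Condition: for each block B, the total probability of jumping into B is
    identical for all microstates within any given block."""
    for block in partition:
        into_block = P[:, block].sum(axis=1)     # P(microstate i -> anywhere in block)
        for other in partition:
            if np.ptp(into_block[other]) > tol:  # must be equal within each block
                return False
    return True

# Toy 4-state chain; lumping states {0, 1} and {2, 3} gives a valid 2-state macro chain.
P = np.array([[0.60, 0.20, 0.10, 0.10],
              [0.50, 0.30, 0.15, 0.05],
              [0.10, 0.10, 0.40, 0.40],
              [0.05, 0.15, 0.30, 0.50]])
print(is_lumpable(P, [[0, 1], [2, 3]]))   # True: the macro dynamics is closed at the coarse scale
```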

Rosas et al. show that informational closure and causal closure are equivalent and that they are more restrictive than computational closure. It is not clear to me how these closures relate to novelty as a definition of emergence.

In summary, emergence means different things to different people. I have listed ten different characteristics that have been associated with emergent properties. They are not all equivalent and so when discussing emergence it is important to be clear about which characteristic one is using to define emergence.

Tuesday, September 24, 2024

The multi-faceted character of emergence (part 1)

There is more to emergence than novel properties, i.e., where a whole system has a property that the individual components of the system do not have. Here I focus on emergent properties, but in most cases “property” might be replaced with state, phenomenon, or entity. I now discuss ten characteristics often associated with emergence, beyond novelty. Some people include one or more of these characteristics in their definitions of emergence. However, I do not include them in my definition because, as I explain below, some of the characteristics are contentious, and some may not be necessary or sufficient for novel system properties.

The first five characteristics discussed below might be classified as objective (i.e., observable properties of the system) and the second five as subjective (i.e., associated with how an investigator thinks about the system). In different words, the first five are mostly concerned with ontology (what is real) and the second five with epistemology (what we know). The first five characteristics concern discontinuities, universality, diversity, mesoscales, and modification of parts. The second five concern self-organisation, unpredictability, irreducibility, downward causation, and closure. 

1. Discontinuities 

Quantitative changes in the system can become qualitative changes in the system. For example, in condensed matter physics spontaneous symmetry breaking only occurs in the thermodynamic limit (i.e., when the number of particles in the system becomes infinite). More is different. Thus, as a quantitative change in the system size occurs, the order parameter can become non-zero. In a system that undergoes a phase transition at a non-zero temperature, a small change in temperature can lead to the appearance of order and to a new state of matter. For a first-order phase transition, there is a discontinuity in properties such as the entropy and density. These discontinuities define a phase boundary in the pressure-temperature diagram. For continuous phase transitions the order parameter is a continuous function of temperature, becoming non-zero at the critical temperature. However, its derivative with respect to temperature may be discontinuous, and thermodynamic properties such as the specific heat and the susceptibility associated with the order parameter may diverge as the critical temperature is approached.
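To summarise the standard behaviour (a textbook summary added here for illustration, not specific to any one material): at a continuous transition the order parameter vanishes continuously while response functions diverge, whereas at a first-order transition the entropy jumps and there is a latent heat.

```latex
% Continuous (second-order) transition: order parameter and diverging responses
m(T) \sim (T_c - T)^{\beta} \;\; (T < T_c), \qquad
\chi(T) \sim |T - T_c|^{-\gamma}, \qquad
C(T) \sim |T - T_c|^{-\alpha}.
% First-order transition: latent heat from the entropy discontinuity
L = T\,\Delta S, \qquad \Delta S = S_{\rm high} - S_{\rm low} \neq 0.
```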

Two different states of a system are said to be adiabatically connected if one can smoothly deform one state into the other and all the properties of the system also change smoothly. The case of the liquid-gas transition illustrates subtle issues about defining emergence: a discontinuity does not imply a qualitative difference (novelty). On the one hand, there is a discontinuity in the density and entropy of the system as the liquid-gas phase boundary is crossed in the pressure-temperature diagram. On the other hand, there is no qualitative difference between a gas and a liquid. There is only a quantitative difference: the density of the gas is less than that of the liquid, albeit sometimes by orders of magnitude. The liquid and gas states can be adiabatically connected: there is a path in the pressure-temperature phase diagram that connects the liquid and gas states without any discontinuities in properties.

The ferromagnetic state also raises questions, as illustrated by a debate between Rudolf Peierls and Phil Anderson about whether ferromagnetism exhibits spontaneous symmetry breaking. Anderson argued that it did not as, in contrast to the antiferromagnetic state, a non-zero magnetisation (order parameter) occurs for finite systems and the magnetic order does not change the excitation spectrum, i.e., produce a Goldstone boson. On the other hand, singularities in properties at the Curie temperature (critical temperature for ferromagnetism) only exist in the thermodynamic limit. Also, a small change in the temperature, from just above the Curie temperature to below, can produce a qualitative change, a non-zero magnetisation.

2. Universality

Properties often referred to as emergent are universal in the sense that they are independent of many of the details of the parts of the system. There may be many different systems that have a particular emergent property. For example, superconductivity is present in metals with a diverse range of crystal structures and chemical compositions.

Robustness is related to universality. If small changes are made to the composition of the system (for example, replacing some of the atoms in the system with atoms of a different chemical element), the novel property of the system is still present. In elemental superconductors, introducing non-magnetic impurity atoms has no effect on the superconductivity.

Universality is both a blessing and a curse for theory. It can make it easier to develop successful theories because many details need not be included in a theory in order for it to successfully describe an emergent phenomenon. This is why effective theories and toy models can work even better than might be expected. Universality can also make theories more powerful because they can describe a wider range of systems. For example, properties of elemental superconductors can be described by BCS theory and by Ginzburg-Landau theory, even though the materials are chemically and structurally diverse. The curse of universality for theory is that it illustrates the problems of “under-determination of theory”, “over-fitting of data”, and “sloppy theories” [Sethna et al.]. A theory can agree with experiment even when the parameters used in the theory are quite different from the actual ones. For example, the observed phase diagram of water can be reproduced, sometimes with impressive quantitative detail, by combining classical statistical mechanics with empirical force fields that treat water molecules as being composed purely of point charges.

Suppose we start with a specific microscopic theory and calculate the macroscopic properties of the system, and they agree with experiment. It would then be tempting to think that we have the correct microscopic theory. However, universality suggests this may not be the case.

For example, consider the case of a gas of weakly interacting atoms or molecules. We can treat the gas particles as classical or quantum. Statistical mechanics gives exactly the same equation of state and specific heat capacity for both microscopic descriptions. The only difference may be the Gibbs paradox [the calculated entropy is not an extensive quantity], which is sensitive to whether or not the particles are treated as identical. Unlike the zeroth, first, and second laws of thermodynamics, the third law does require that the microscopic theory be quantum. Laughlin discusses these issues in terms of “protectorates” that hide “ultimate causes”.
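As a concrete check (a standard textbook result, added here only for illustration): the classical partition function of N non-interacting particles already gives the full equation of state, and the quantum ideal gas reduces to exactly the same result in the non-degenerate (high-temperature, low-density) limit.

```latex
% Classical ideal gas: partition function, thermal wavelength, and equation of state
Z = \frac{1}{N!}\left(\frac{V}{\lambda_T^3}\right)^{N}, \qquad
\lambda_T = \frac{h}{\sqrt{2\pi m k_B T}}, \qquad
p = -\left(\frac{\partial F}{\partial V}\right)_T = \frac{N k_B T}{V}.
% The quantum (Bose or Fermi) ideal gas gives the same pV = N k_B T and, for a
% monatomic gas, C_V = (3/2) N k_B whenever n \lambda_T^3 << 1: at this level the
% microscopic "details" (classical versus quantum statistics) drop out.
```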

In some physical systems, universality can be defined in a rigorous technical sense, making use of the concepts and techniques of the renormalisation group and scaling. These techniques provide a method to perform coarse graining, to derive effective theories and effective interactions, and to define universality classes of systems. There are also questions of how universality is related to the robustness of strata, and the independence of effective theories from the coarse-graining procedure.
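A minimal example of such coarse graining (the standard decimation of the one-dimensional Ising model in zero field, included here for illustration): tracing out every second spin leaves a chain of the same form, but with a renormalised coupling.

```latex
% Real-space RG for the 1D Ising model, H = -J \sum_i s_i s_{i+1}, with K = J/k_B T.
% Summing over every second spin maps K onto a new coupling K':
e^{2K'} = \cosh(2K) \quad \Longleftrightarrow \quad \tanh K' = \tanh^2 K .
% Since tanh K < 1, the coupling always decreases under iteration and flows to zero,
% which is the RG way of seeing that there is no finite-temperature transition in 1D.
```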

3. Diversity

Even when a system is composed of a small number of different components and interactions, the large number of possible stable states with qualitatively different properties that the system can have is amazing. Every snowflake is different. Water is found in 18 distinct solid states. All proteins are composed of linear chains of 20 different amino acids. Yet in the human body there are more than 100,000 different proteins, and all perform specific biochemical functions. We encounter an incredible diversity of human personalities, cultures, and languages. A stunning case of diversity is life on earth. Millions of different plant and animal species are all an expression of different sequences of the four DNA bases: A, G, T, and C.

This diversity is related to the idea that "simple models can describe complex behaviour". One example is Conway’s Game of Life. Another example is how simple Ising models with a few competing interactions can describe a devil's staircase of ground states or the multitude of different atomic orderings found in binary alloys.
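To illustrate the first example (a minimal sketch of Conway's Game of Life, my own code rather than anything from the post): two simple local rules on a square grid generate gliders, oscillators, and endlessly varied patterns. The grid size and initial pattern are arbitrary choices.

```python
import numpy as np

def life_step(grid):
    """One step of Conway's Game of Life with periodic boundaries.
    A live cell survives with 2 or 3 live neighbours; a dead cell
    becomes alive with exactly 3 live neighbours."""
    neighbours = sum(np.roll(np.roll(grid, di, 0), dj, 1)
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if (di, dj) != (0, 0))
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

# A glider: a five-cell pattern that translates itself across the grid.
grid = np.zeros((12, 12), dtype=int)
for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
    grid[r, c] = 1
for _ in range(8):
    grid = life_step(grid)
print(grid.sum())   # still 5 live cells, but the pattern has moved across the grid
```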

Goldenfeld and Kadanoff defined complexity [emergence] as “structure with variations”. Holland (VSI) discusses “perpetual novelty”, giving the example of the game of chess, where a typical game may involve of the order of 10^50 move sequences. “Motifs” are recurring patterns (sequences of moves) in games.

Condensed matter physics illustrates diversity with the many different states of matter that have been discovered. The underlying microscopics is “just” electrons and atomic nuclei interacting according to Coulomb’s law.

The significance of this diversity might be downplayed by saying that it is just a result of combinatorics. But such a claim overlooks the issue of the stability of the diverse states that are observed. In a system composed of many components, each of which can take on a few states, the number of possible states of the whole system grows exponentially with the number of components. For example, for a chain of ten amino acids there are about 10^13 different possible linear sequences. But this does not mean that all these sequences will produce a functional protein, i.e., a molecule that will fold rapidly (on the timescale of milliseconds) into a stable tertiary structure and perform a useful biochemical function such as catalysis of a specific chemical reaction or signal transduction.
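To put numbers on this exponential growth (a trivial calculation, included only for illustration):

```python
import math

# Number of possible amino-acid sequences for a chain of length n,
# with 20 choices at each position: 20**n.
for n in (10, 100, 300):
    print(f"length {n}: 20^{n} is about 10^{n * math.log10(20):.0f} sequences")
```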

4. Simple entities at the mesoscale 

A key idea in condensed matter physics is that of quasi-particles. A system of strongly interacting particles may have excitations, seen in experiments such as inelastic neutron scattering and Angle Resolved PhotoElectron Spectroscopy (ARPES), that can be described as weakly interacting quasi-particles. These entities are composite particles, and have properties that are quantitatively different, and sometimes qualitatively different, from the microscopic particles. Sometimes this means that the scale (size) associated with the quasi-particles is intermediate between the micro- and the macro-scales, i.e., it is a mesoscale. The existence of quasi-particles leads naturally to the technique of constructing an effective Hamiltonian [effective theory] for the system where effective interactions describe the interactions between the quasi-particles.
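As a concrete example (the standard Landau Fermi-liquid form, added here for illustration): near the Fermi surface the quasi-particle energy looks like that of a free particle, but with a renormalised effective mass that encodes the interactions.

```latex
% Landau quasi-particle dispersion near the Fermi wavevector k_F:
\varepsilon(k) \simeq \mu + \frac{\hbar^2 k_F}{m^*}\,(k - k_F),
% where m^* is the effective mass. In a heavy-fermion metal m^* can be hundreds of
% times the bare electron mass: the quasi-particle is a composite, mesoscale entity,
% not a bare electron.
```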

The economist Herbert Simon argued that a characteristic of a complex system is that the system can be understood in terms of nearly decomposable units. Rosas et al. argue that emergence is associated with there being a scale at which the system is “strongly lumpable”. Denis Noble has highlighted how biological systems are modular, i.e., composed of simple interchangeable components.

5. Modification of parts and their relationships

Emergent properties are often associated with the state of the system exhibiting patterns, order, or structure, terms that may be used interchangeably. This reflects that there is a particular relationship (correlation) between the parts which is different to the relationships in a state without the emergent property. This relationship may also be reflected in a generalised rigidity. For example, in a solid applying a force on one surface results in all the atoms in the solid experiencing a force and moving together. The rigidity of the solid defines a particular relationship between the parts of the system.

Properties of the individual parts may also be different. For example, in a crystal single-atom properties such as electronic energy levels change quantitatively compared to their values for isolated atoms. Properties of finite subsystems are also modified, reflecting a change in interactions between the parts. For example, in a molecular crystal the frequencies associated with intramolecular atomic vibrations are different to their values for isolated molecules. However, emergence is a sufficient but not a necessary condition for these modifications. In gas and liquid states, novelty is not present but there are still such changes in the properties of the individual parts.

As stated at the beginning of this section the five characteristics above might be associated with ontology (what is real) and objective properties of the system that an investigator observes and depend less on what an observer thinks about the system. The next five characteristics might be considered to be more subjective, being concerned with epistemology (how we determine what is true). In making this dichotomy I do not want to gloss over the fuzziness of the distinction or of two thousand years of philosophical debates about the relationship between ontology and epistemology, or between reality and theory.

In the next post, I will discuss the remaining five characteristics: self-organisation, unpredictability, irreducibility, contextuality and downward causation, and intra-stratum closure.

Thanks for reading this far!

Friday, September 20, 2024

Steven Weinberg's radical change of mind

What is a fundamental theory? As we go to smaller and smaller distances and higher energies we keep finding new entities: atoms, electrons, nuclei, neutrons, protons, quarks, gluons, ... When will it stop?

If we look at a theory, such as a quantum field theory, at a particular energy and length scale, there may be hints that something is going on, such as the existence of new entities, at higher energies. One way to approach this problem is through the renormalisation group and to look at how coupling constants behave as the energy increases. If they start to blow up (diverge) is that a hint of something? But, this requires starting with a renormalisable theory...
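For example (the textbook one-loop result for QED, quoted here only to illustrate what a coupling "blowing up" means): the running coupling grows logarithmically with energy and formally diverges at the Landau pole.

```latex
% One-loop running of the QED fine-structure constant (single fermion species):
\alpha(\mu) = \frac{\alpha(\mu_0)}{1 - \dfrac{2\,\alpha(\mu_0)}{3\pi}\,\ln(\mu/\mu_0)} ,
% which diverges at the Landau pole \mu_L = \mu_0 \exp[3\pi / 2\alpha(\mu_0)],
% hinting that new degrees of freedom, or a different description, must enter
% well before that scale is reached.
```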

An alternative approach is to start with an effective theory that one assumes [hypothesises] is valid at some limited energy scale. This goes against a previous dogma that one should only study renormalisable theories. Amongst elementary particle theorists, led by Steven Weinberg, there was a significant shift in perspective in the 1970s.

In a paper published in 2016, Effective field theory, past and future, Steven Weinberg reflected on how he changed his mind about renormalisability being a fundamental requirement for quantum field theories and how he came to the view that the Standard Model should be viewed as an effective field theory. Here are some quotes from the article. He first describes how in the 1960s he developed a field theory to describe the interactions of nucleons and pions.

"During this whole period, effective field theories appeared as only a device for more easily reproducing the results of current algebra. It was difficult to take them seriously as dynamical theories, because the derivative couplings that made them useful in the lowest order of perturbation theory also made them nonrenormalizable, thus apparently closing off the possibility of using these theories in higher order. 

My thinking about this began to change in 1976. I was invited to give a series of lectures at Erice that summer, and took the opportunity to learn the theory of critical phenomena by giving lectures about it. In preparing these lectures, I was struck by Kenneth Wilson’s device of “integrating out” short-distance degrees of freedom by introducing a variable ultraviolet cutoff, ...

Non-renormalizable theories, I realized, are just as renormalizable as renormalizable theories.

For me in 1979, the answer involved a radical reconsideration of the nature of quantum field theory.

The advent of effective field theories generated changes in point of view and suggested new techniques of calculation that propagated out to numerous areas of physics, some quite far removed from particle physics. Notable here is the use of the power-counting arguments of effective field theory to justify the approximations made in the BCS theory of superconductivity. Instead of counting powers of small momenta, one must count powers of the departures of momenta from the Fermi surface. Also, general features of theories of inflation have been clarified by re-casting these theories as effective field theories of the inflaton and gravitational fields. 

Perhaps the most important lesson from chiral dynamics was that we should keep an open mind about renormalizability. The renormalizable Standard Model of elementary particles may itself be just the first term in an effective field theory that contains every possible interaction allowed by Lorentz invariance and the SU (3) × SU (2) × U (1) gauge symmetry, only with the non-renormalizable terms suppressed by negative powers of some very large mass M...

... we should not despair of applying quantum field theory to gravitation just because there is no renormalizable theory of the metric tensor that is invariant under general coordinate transformations. It increasingly seems apparent that the Einstein–Hilbert Lagrangian √gR is just the least suppressed term in the Lagrangian of an effective field theory containing every possible generally covariant function of the metric and its derivatives...

it is usually assumed that in the quantum theory of gravitation, when Λ reaches some very high energy, of the order of 10^15 to 10^18 GeV, the appropriate degrees of freedom are no longer the metric and the Standard Model fields, but something very different, perhaps strings...

But maybe not..."

In 2021 Weinberg gave a talk, with a similar point of view, which inaugurated an international seminar series [online during covid-19]. 

In response to that talk, Peter Woit has a blog post where he objects to Weinberg's point of view that the Standard Model is "just" an effective theory, only valid at low energies.

Reviews of Modern Physics recently published a review that discussed how Weinberg's perspective is worked out in detail.

The standard model effective field theory at work

Gino Isidori, Felix Wilsch, and Daniel Wyler
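Schematically (the standard form of the effective-field-theory expansion, sketched here for orientation rather than quoted from the review), the Standard Model Lagrangian is supplemented by towers of higher-dimension operators suppressed by powers of a heavy scale Lambda:

```latex
\mathcal{L}_{\rm SMEFT} = \mathcal{L}_{\rm SM}
  + \sum_{d>4} \sum_i \frac{C_i^{(d)}}{\Lambda^{\,d-4}}\, \mathcal{O}_i^{(d)} ,
% where the O_i^(d) are all operators of mass dimension d allowed by Lorentz invariance
% and the SU(3) x SU(2) x U(1) gauge symmetry, and the Wilson coefficients C_i^(d)
% parametrise the unknown physics above the scale Lambda.
```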

The discussion above fits naturally with an emergentist perspective: reality is stratified. Effective theories at one stratum may have singularities around boundaries between strata, and new entities emerge, both physically and theoretically, as one moves to the next higher or lower stratum.

Sunday, September 15, 2024

Biology is about emergence in subtle ways

Biology is a field that is all about emergence. It exhibits a hierarchy of structures from DNA to proteins to cells to organs to organisms. Phenotypes emerge from genotypes. At each level of the hierarchy (stratum) there are unique entities, phenomena, principles, methods, theories, and sub-fields. But there is more to the story. 

Philip Ball is probably my favourite science writer. Earlier this year, he gave a beautiful lecture at The Royal Institution, What is Life and How does it Work?


The lecture presents the main ideas in his recent book, How Life Works: A User's Guide to the New Biology.

Here are a few things that stood out for me from the lecture.

1. The question, "What is life?" has been and continues to be notoriously difficult to answer.

2. The Central Dogma of molecular biology is not the whole story. It was originally stated by Francis Crick, and some commonly assumed corollaries of it are wrong. In simple terms, the Dogma states that DNA makes RNA and RNA makes proteins, in a unique and unidirectional process. For example, a specific code (string of the letters A, G, T, and C) will produce a specific protein (sequence of amino acids), which will naturally fold into a unique structure with a specific biochemical function.


The central dogma has undergirded the notion that genes determine everything in biology: everything is bottom-up. However, Ball gives several counterexamples. A large fraction of our DNA does not code for proteins. Many proteins are intrinsically disordered, i.e., they do not have a unique folded structure.

Aside: An earlier failure of (some versions of) the central dogma was the discovery of reverse transcriptase by the obscure virus club, essential for the development of HIV drugs and covid-19 vaccines.

3. The role of emergence can be quantified in terms of information theory, helping to understand the notion of causal emergence: the cause of large-scale behaviour is not just a sum of micro-causes, i.e., the properties of and interactions between the constituents at smaller scales. Entities at the level of the phenomena are just as important as what occurs at lower levels (page 214 in the book). Causal emergence is concerned with fitting the scale of the causes to the scale of the effects. In a paper from 2021, the authors quantify causal emergence in protein networks in terms of mutual information (between large and small scales) and effective information (a measure of the certainty in the connectivity of a network).

Aside: These quantitative notions of emergence have been developed more in recent work by Fernando Rosas and collaborators and discussed in a Quanta article by Philip Ball.

4. Context matters.  A particular amino acid sequence does not define a unique protein structure and function. These may depend on the specific cell in which the protein is contained.

5. Causal spreading.  Causality happens at different levels. It does not always happen at the bottom (genetic level). Sometimes it happens at higher levels. And, it can flow up or down.

6. Levels of description matter. This is well illustrated by morphology and the reason that we have five fingers, which is not determined by genes.

7. Relevance to medicine. There has been a focus on the genetic origin of diseases. However, many diseases, such as cancer, do not predominantly happen at the genetic level. There has been a prejudice to focus on the genetic level, partly because that is where most tools are available. For cancer, focussing on other levels, such as the immune system, may be more fruitful.

8. Metaphors matter. Biology has been dominated by metaphors such as living things being "machines made from genes" or "computers running a code". However, metaphors are metaphors. They have limitations, particularly as we learn more. All models are wrong, but some are useful. Ball proposes that metaphors from life, including the notion of agency, may be more fruitful.

9. The wisdom of Michael Berry. Ball ends with Berry's saying that the biggest unsolved problem in physics is not about dark matter (or some similar problem), but rather, "If all matter can be described by quantum theory, where does the aliveness of living things come from?" In other words, "Why is living matter so different from other matter?"

There is also an interesting episode of the How To Academy podcast, where Ball is interviewed about the book.

Wednesday, September 11, 2024

Emergence in classical optics: caustics and rainbows

Photo by Chris Lawton on Unsplash

I love seeing patterns such as those above in bodies of water. I did not know that they are an example of emergence, according to Michael Berry, who states:

“A caustic is a collective phenomena, a property of a family of rays that is not present in any individual ray. Probably the most familiar example is the rainbow.”

Caustics are envelopes of families of rays on which the intensity diverges. They occur in media where the refractive index is inhomogeneous. In the image above, there is an interplay of the uneven air-water interface and the difference in the refractive index between air and water. For rainbows, key parameters are the refractive index of the water droplets and the size of the droplets. The caustic is not the "rainbow", i.e., the spectrum of colours, but rather the large light intensity associated with the bow. The spectrum of colours arises because of dispersion (i.e., the refractive index of water depends on the wavelength of the light).

Caustics illustrate several characteristics of emergent properties: novelty, discontinuities, irreducibility and singular limits, hierarchies, new (emergent) scales, effective theories, and universality.

Novelty. The whole system (a family of light rays) has a property (infinite intensity) that individual light rays do not.

Discontinuities. A caustic defines a spatial boundary across which there are discontinuities in properties.  

Irreducibility and singular limits. Caustics only occur in the theory of geometrical optics which corresponds to the limit where the wavelength of light goes to zero in a wave theory of light. Caustics (singularities) are not present in the wave theory.

Hierarchies. 
a. Light can be treated at the level of rays, scalar waves, and vector waves. At each level, there are qualitatively different singularities: caustics, phase singularities (vortices, wavefront dislocations, nodal lines), and polarisation singularities. 
b. Treating caustics at the level of wave theory, as pioneered by George Biddell Airy, reveals a hierarchy of non-analyticities and an interference pattern, reflected in the supernumerary arcs of a rainbow.

New (emergent) scales. An example is the universal angle of 42 degrees subtended by the rainbow, first calculated by René Descartes. Airy's wave theory showed that the spacing of the interference fringes shrinks as lambda^(2/3).
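As a sanity check (my own sketch, not taken from Berry's papers), Descartes' 42 degrees follows from Snell's law alone: the primary bow sits at the angle of minimum deviation for a ray that is refracted into a spherical droplet, internally reflected once, and refracted out. The function name and grid resolution are arbitrary choices.

```python
import numpy as np

def rainbow_angle(n=1.333):
    """Primary-rainbow angle from Descartes' construction.
    A ray enters a spherical drop (refraction), reflects once internally, and exits
    (refraction). The total deviation is D(i) = pi + 2i - 4r, with sin(i) = n sin(r).
    D has a minimum, rays pile up there, and that pile-up is the caustic (the bow)."""
    i = np.linspace(0.0, np.pi / 2, 100000)      # angle of incidence
    r = np.arcsin(np.sin(i) / n)                 # Snell's law
    deviation = np.pi + 2 * i - 4 * r
    return np.degrees(np.pi - deviation.min())   # angle of the bow from the antisolar point

print(rainbow_angle())           # about 42 degrees for water (n = 1.333)
print(rainbow_angle(n=1.343))    # blue light: a slightly smaller angle, hence the colour separation
```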

Effective theories. At each level of the hierarchy, one can define and investigate effective theories. For ray theory, the effective theory is defined by the spatially dependent refractive index n(R)  and the ray action.

Universality. Caustics exist for any kind of waves: light, sound, and matter. They exhibit "structural stability". They fall into equivalence (universality) classes that are defined by the elementary catastrophes enumerated by René Thom and Vladimir Arnold and listed in the Table below. Any two members of a class can be smoothly deformed into one another.
The first column in the Table below is the name of the class given by Thom, and the second is the symbol used by Arnold. K is the number of parameters needed to define the class and the associated polynomial, which is given in the last column. 
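For reference, and assuming the table showed the standard list from catastrophe theory, Thom's seven elementary catastrophes with K up to 4 are:

Fold                 A_2     K = 1   x^3
Cusp                 A_3     K = 2   x^4
Swallowtail          A_4     K = 3   x^5
Butterfly            A_5     K = 4   x^6
Hyperbolic umbilic   D_4^+   K = 3   x^3 + y^3
Elliptic umbilic     D_4^-   K = 3   x^3 - 3xy^2
Parabolic umbilic    D_5     K = 4   x^2 y + y^4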


For this post, I have drawn on several beautiful articles by Michael Berry.  A good place to start may be 
Nature's optics and our understanding of light (2015), which contains the figure I used above of the rainbow.

There is a beautiful description of some of the history and basic physics of the rainbow in Rainbows, Snowflakes, and Quarks: Physics and the World Around Us by Hans Christian Von Baeyer

Tuesday, September 3, 2024

Autobiography of John Goodenough (1922-2023)

 John Goodenough was an amazing scientist. He made important contributions to our understanding of strongly correlated electron materials, magnetism, solid state chemistry, and materials science and engineering. He developed materials that are widely used in computer RAMs and rechargeable lithium batteries. He kept working in the laboratory and writing papers into his early 90s. Goodenough was awarded the Nobel Prize in Chemistry in 2019. Here is his Nobel Lecture, including text, slides, and video.

In 2008 he published Witness to Grace, a brief autobiography that chronicles his personal, scientific, and spiritual journeys. It is a fascinating story. The book is now out of print and the publisher is out of business. I have scanned a copy. You can download it here. I thank David Purdy for bringing to my attention the need to preserve the book.


From Leo Szilard to the Tasmanian wilderness

Richard Flanagan is an esteemed Australian writer. My son recently gave our family a copy of Flanagan's recent book, Question 7 . It is...