Friday, July 26, 2024

Emergence, structuralism, realism, and quarks

"Structuralism as an influential intellectual movement of the twentieth century has been advocated by Bertrand Russell, Rudolf Carnap, Nicholas Bourbaki, Noam Chomsky, Talcott Parsons, Claude Levi-Strauss, Jean Piaget, Louis Althusser, and Bas van Fraassen, among many others, and developed in various disciplines such as linguistics, mathematics, psychology, anthropology, sociology, and philosophy." 

In different words, structuralism and post-structuralism have been, and still are, a really big deal in the humanities and social sciences. Structuralism is central to the rise and fall of a multitude of academic fashions, careers, and reputations.

"As a method of enquiry, it takes a structure as a whole rather than its elements as the major or even the only legitimate subject for investigations. Here, a structure is defined either as a system of stable relations among a set of elements, or as a self-regulated whole under transformations, depending on the specific subject under consideration. The structuralist maintains that the character or even the reality of a whole is mainly determined by its structuring laws, and cannot be reduced to its parts; rather, the existence and essence of a part in the whole can only be defined through its place in the whole and its relations with other parts."

In a sense, structuralism favours emergence over reductionism. But note the strong exclusivist language in the quote above, such as "the only legitimate subject" and "can only be defined". Structuralism seems to be an overreaction to extreme reductionism.

Condensed matter physics has something concrete to contribute to these debates. Consider the case of Ising models defined on a range of lattices, as I discussed in a previous post. We do not have an exclusive interest in the whole system or in the parts of the system. Rather, we want to know the relationship between macroscopic properties [different ordered states], mesoscopic properties [domains, long-range correlations, networks], and microscopic properties [the individual spins and their local interactions].

That is the main point of this post. But for more context, keep reading.
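To make the Ising example concrete, here is a minimal sketch (in Python, with arbitrary illustrative parameters) of a two-dimensional Ising model with Metropolis dynamics. The microscopic variables are the individual spins and their local interactions, a mesoscopic quantity is the nearest-neighbour spin-spin correlation (a crude proxy for domain formation), and the macroscopic property is the magnetisation per site.

import numpy as np

# Minimal sketch: 2D Ising model with Metropolis dynamics (illustrative parameters).
# Microscopic: individual spins s[i, j] = +/-1 and their local interactions.
# Mesoscopic: nearest-neighbour correlation (a crude proxy for domain formation).
# Macroscopic: magnetisation per site (the order parameter).

rng = np.random.default_rng(0)
L, T, sweeps = 32, 1.8, 2000          # lattice size, temperature (J = 1, k_B = 1), Monte Carlo sweeps
spins = rng.choice([-1, 1], size=(L, L))

def local_field(s, i, j):
    # Sum of the four nearest-neighbour spins (periodic boundary conditions).
    return s[(i + 1) % L, j] + s[(i - 1) % L, j] + s[i, (j + 1) % L] + s[i, (j - 1) % L]

for _ in range(sweeps):
    for _ in range(L * L):            # one sweep = L*L attempted spin flips
        i, j = rng.integers(L), rng.integers(L)
        dE = 2 * spins[i, j] * local_field(spins, i, j)
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

magnetisation = spins.mean()                              # macroscopic order parameter
correlation = np.mean(spins * np.roll(spins, 1, axis=0))  # mesoscopic correlation
print(f"m = {magnetisation:.3f}, nearest-neighbour correlation = {correlation:.3f}")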

The quotations above are taken from a book by Tian Yu Cao. 

From Current Algebra to Quantum Chromodynamics: A Case for Structural Realism

Cao is interested in a broad range of philosophical questions related to QCD, such as "If quarks cannot be observed in isolation, should they be considered to be real?"

He continues:

In the epistemically interesting cases involving unobservable entities, the structuralist usually argues that it is only the structure and the structural relations of its elements, rather than the elements themselves (properties or entities with properties) that are empirically accessible to us. It is obvious that such an anti-reductionist holistic stance has lent some support to phenomenalism.

However, as an effort to combat compartmentalization, which urge is particularly strong in mathematics, linguistics, and anthropology, the structuralist also tries to uncover the unity among various appearances, in addition to invariance or stable correlation under transformations, which can help discover the deep reality embodied in deep structures. Furthermore, if we accept the attribution of reality to structures, then the antirealist implications of the underdetermination thesis [which claims that since evidence cannot uniquely determine (or, worse, can even support conflicting) theoretical claims about certain unobservable entities, no theoretical entities should be taken as representation of reality], is somewhat neutralized, because then we can talk about the realism of structures, or the reality of the structural features of unobservable entities exhibited in evidence, although we cannot directly talk about the reality of the entities themselves that are engaged in the structural relations. In fact, this realist implication of structuralism was one of the starting points of current interests in structural realism.

Monday, July 22, 2024

Clarity about the relationship of emergence, complexity, predictability, and universality

Emergence means different things to different people. Except that practically everyone likes it! Or at least, likes using the word. Terms associated with emergence include novelty, unpredictability, universality, stratification, and self-organisation. We need to be clearer about what we mean by each of these terms and how they are related or unrelated. Significant progress is reported in a recent preprint.

Software in the natural world: A computational approach to hierarchical emergence

Fernando E. Rosas, Bernhard C. Geiger, Andrea I. Luppi, Anil K. Seth, Daniel Polani, Michael Gastpar, Pedro A.M. Mediano

This preprint is the subject of a nice article in Quanta Magazine.

The New Math of How Large-Scale Order Emerges by Philip Ball

Ball defines emergence in terms of unpredictability. He states: 

"Loosely, the behavior of a complex system might be considered emergent if it can’t be predicted from the properties of the parts alone."

He describes the work of Rosas et al. as follows:

"A complex system exhibits emergence, according to the new framework, by organizing itself into a hierarchy of levels that each operate independently of the details of the lower levels."

This is defining emergence in terms of universality. Rosas et al. use an analogy with software, which runs independently of the details of the hardware of the computer and does not depend on microscopic details such as electron dynamics.

There are three types of closure associated with emergence: informational, causal, and computational.

Informational closure means that to predict the dynamics of the system at the macroscale one does not need any additional  information from the microscale.

Equilibrium thermodynamics is a nice example. 

Causal closure means that the system can be controlled at the macroscale without any knowledge of lower-level information.

"Interventions we make at the macro level, such as changing the software code by typing on the keyboard, are not made more reliable by trying to alter individual electron trajectories."

"...we can use macroscopic variables like pressure and viscosity to talk about (and control) fluid flow, and knowing the positions and trajectories of individual molecules doesn’t add useful information for those purposes. And we can describe the market economy by considering companies as single entities, ignoring any details about the individuals that constitute them."

Computational closure is a more technical concept. 

"a conceptual device called the ε-(epsilon) machine. This device can exist in some finite set of states and can predict its own future state on the basis of its current one. It’s a bit like an elevator, said Rosas; an input to the machine, like pressing a button, will cause the machine to transition to a different state (floor) in a deterministic way that depends on its past history — namely, its current floor, whether it’s going up or down and which other buttons were pressed already. Of course an elevator has myriad component parts, but you don’t need to think about them. Likewise, an ε-machine is an optimal way to represent how unspecified interactions between component parts “compute” — or, one might say, cause — the machine’s future state."

Aside: epsilon-machines featured significantly in my previous post about What is a complex system? 

"Computational mechanics allows the web of interactions between a complex system’s components to be reduced to the simplest description, called its causal state."

"...for an emergent system that is computationally closed, the machines at each level can be constructed by coarse-graining the components on just the level below: They are, in the researchers’ terminology, “strongly lumpable.”"

In some sense, this may be related to the notion of quasiparticles and effective interactions in many-body physics. 
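For concreteness, the classic (Kemeny-Snell) notion of strong lumpability for a Markov chain, which is presumably what the authors' terminology builds on, can be checked in a few lines: a coarse-graining is strongly lumpable if all microstates within a block have the same total transition probability into every block. The transition matrix and partition below are invented for illustration.

import numpy as np

# Sketch: check strong lumpability of a Markov chain under a coarse-graining.
# The partition is strongly lumpable if all microstates in a block have the same
# total probability of jumping into each block; the coarse-grained process is
# then itself Markovian, independent of the microscopic details.

P = np.array([             # microscopic transition matrix (rows sum to 1)
    [0.70, 0.20, 0.05, 0.05],
    [0.20, 0.70, 0.05, 0.05],
    [0.05, 0.05, 0.60, 0.30],
    [0.05, 0.05, 0.30, 0.60],
])
blocks = [[0, 1], [2, 3]]  # candidate macrostates (the coarse-graining)

def is_strongly_lumpable(P, blocks, tol=1e-12):
    for block_to in blocks:
        into_block = P[:, block_to].sum(axis=1)        # probability of jumping into block_to
        for block_from in blocks:
            if np.ptp(into_block[block_from]) > tol:   # must agree within each block
                return False
    return True

print(is_strongly_lumpable(P, blocks))   # True for this example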

Aside: In 1962, Herbert Simon identified hierarchies as an essential feature of complex systems, both natural and artificial. A key property of a level in the hierarchy is that it is nearly decomposable into smaller units, i.e., it can be viewed as a collection of weakly interacting units. The time required for the evolution of the whole system is significantly decreased due to the hierarchical character. The construction of an artificial complex system, such as a clock, is faster and more reliable if different units are first assembled separately and then the units are brought together into the whole. Simon argues that the reduction in time scales due to modularity is why biological evolution can occur on realistic time scales.  The 1962 article is reprinted in The Sciences of the Artificial.

The paper by Rosas et al. is one of the most important ones I have encountered in the past few years. I am slowly digesting it.

The beauty of the paper is that it is mathematically rigorous. All the concepts are precisely defined and the central results are actually theorems. This replaces the vagueness of most discussions of emergence, including my own.

The paper has helpful figures and considers concrete examples including Ehrenfest's Urn, an Ising model with Glauber dynamics, and a Hopfield neural network model.
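As a taste of the first of these, here is a minimal simulation of the Ehrenfest urn (my own sketch, not the paper's): N labelled balls sit in two urns and at each step a randomly chosen ball is moved to the other urn. The macroscopic variable, the number of balls in urn A, evolves as a Markov chain whose statistics do not depend on which particular balls are where, which gives the flavour of informational closure discussed above.

import numpy as np

# Sketch of the Ehrenfest urn.
# Microstate: which urn each of the N labelled balls is in (2^N possibilities).
# Macrostate: the number of balls in urn A, itself a Markov chain whose
# transition probabilities depend only on the current count.

rng = np.random.default_rng(1)
N, steps = 20, 10_000
in_urn_A = np.ones(N, dtype=bool)        # microstate: start with all balls in urn A

counts = []
for _ in range(steps):
    ball = rng.integers(N)               # pick a labelled ball uniformly at random...
    in_urn_A[ball] = not in_urn_A[ball]  # ...and move it to the other urn
    counts.append(in_urn_A.sum())        # record the macrostate

print("mean number of balls in urn A:", np.mean(counts))   # close to N/2 = 10 at equilibrium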

I thank Gerard Milburn for bringing the Quanta article to my attention.

Tuesday, July 9, 2024

Basic realities to accept about applying for funding

The advice that follows is directed to young people who are starting out in requesting funding for a project or an annual budget. My advice is based on about thirty years of experience of writing grant applications, reviewing requests, and being involved in making final decisions about applications. My experience has involved national research funding bodies, internal university schemes, charities, and NGOs. Over the years, I have been involved with requests in the range from a few thousand dollars to a few million dollars.

Accept reality
The world is messed up. Systems are broken. They are not the way they should be. Bad decisions are made. Processes are imperfect. I am all for trying to change things. However, when you make a funding application your chances of success are best if you accept the system and engage with it as it is today. Try to change it tomorrow. 

Put yourself in the shoes of the decision-makers.
You may not respect them or think they are particularly competent or well-qualified to make decisions about your funding request. However, put that aside and consider that they may be in an unenviable position. They are working within an imperfect system. They have limited time to read and evaluate a trove of applications, many on topics they do not really understand. They have limited resources to allocate. Most want to allocate those scarce resources in a fair and equitable manner. In most contexts, the ratio of available funds to the total funds requested by all applicants is somewhere in the range of 0.03 to 0.2. This means they need to reject a lot of applications and slim down the budgets of those that are accepted. 

You have to start with small amounts of money and build up.
Trust and success are incremental. You first get a grant for a few thousand dollars. You show that you have used that well to accomplish something. You may have to do that a few times before you get tens of thousands of dollars. You then use that to accomplish something bigger. And so on it goes.
You may think you deserve to receive several hundred thousand dollars and to skip this process. However, it is highly unlikely to happen. You need to prove yourself.
In different words, do not ask in any given year for significantly more than you were budgeted the year before. 

Every budget line item must be carefully justified.
Is each item really essential for successful completion of the project? We would all like to have a better computer, more technical support, a personal assistant, lots of international travel to exotic locales, release from other responsibilities, ...
But is each item necessary? Is each item consistent with your level of seniority and experience? Or is there a cheaper option? Could someone else fund it?
These issues are not just about good use of resources but also your credibility as someone who is a team player willing to accept institutional realities and limited resources.

The greater the requested budget the greater the scrutiny of the application.
Hence, asking for less money actually increases your chance of success. If your budget is 2 or 3 times that of competing applications, the funding agency will almost always think that it is better to fund 2 or 3 groups rather than just one. 

Check your attitude.
You should have confidence that what you are doing is important and worth funding. However, that is not the same as making snide comments about competitors, stroking your ego, overselling the significance of what you are doing, or expressing grievances about perceived past slights and criticisms of your work. Exhibiting such attitudes only hurts your chances of success.

Saturday, June 29, 2024

Quantum BS: piling it higher

Hans Bachor recently gave a talk at UQ, Hype and Trust in Quantum Technologies:
Trust is a core value in science, trust in data, analysis, concepts, models. This is achieved in physics by open publishing, scientific discourse, testing, repeating experiments, asking critical questions and designing new tests. Fortunately, science is self-correcting in the long term. Hype includes predictions which sensationalise scientific discoveries and exaggerate the future impact. Increasing competition for funding, visibility or job security can make this more attractive. But it also erodes trust in science by the public and investors and has negative social effects on us the researchers. How can we balance them?
I think this problem more broadly reflects the way universities have come to imitate the social context they are embedded in, rather than being a critique of those societies.

The sociologist Christian Smith eloquently described the emergence of BS in universities, several years ago.

Friday, June 21, 2024

10 key ideas about emergence

Consider a system comprised of many interacting components. 

1. Many different definitions of emergence have been given. I take the defining characteristic of an emergent property of a system to be novelty, i.e., the individual components of the system do not have this property.

2. Many other characteristics have been associated with emergence, such as universality, unpredictability, irreducibility, diversity, self-organisation, discontinuities, and singularities. However, it has not been established whether these characteristics are necessary or sufficient for novelty.

3. Emergent properties are ubiquitous across scientific disciplines from physics to biology to sociology to computer science. Emergence is central to many of the biggest scientific challenges today and some of the greatest societal problems.

4. Reality is stratified. A key concept is that of strata or hierarchies. At each level or stratum,  there is a distinct ontology (properties, phenomena, processes, entities, and effective interactions) and epistemology (theories, concepts, models, and methods). This stratification of reality leads to semi-autonomous scientific disciplines and sub-disciplines.

5. A common challenge is understanding the relationship between emergent properties observed at the macroscopic scale (whether in societies or in solids) and what is known about the microscopic scale: the components (whether individual humans or atoms) and their interactions. Often a key (but profound) insight is identifying an emergent mesoscopic scale (i.e., a scale intermediate between the macro- and micro- scales) at which new entities emerge and interact with one another weakly.

6. A key theoretical method is the development and study of effective theories and toy models. Effective theories can describe phenomena at the mesoscopic scale and be used to bridge the microscopic and macroscopic scales. Toy models involve just a few degrees of freedom, interactions, and parameters. Toy models are amenable to analytical and computational analysis and may reveal the minimal requirements for an emergent property to occur. The Ising model is a toy model that elucidates critical phenomena and key characteristics of emergence.

7. Condensed matter physics elucidates many of the key features and challenges of emergence. Unlike brains and economies, condensed states of matter are simple enough to be amenable to detailed and definitive analysis but complex enough to exhibit rich and diverse emergent phenomena.

8. The ideas above about emergence matter for scientific strategy in terms of choosing methodologies, setting priorities, and allocating resources.

9. An emergent perspective that does not privilege the parts or the whole can address contentious issues and fashions in the humanities and social sciences, particularly around structuralism.

10. Emergence is also at the heart of issues in philosophy including the nature of consciousness, truth, reality, and the sciences.

Tuesday, June 11, 2024

The interplay of ecological and evolutionary dynamics: immigration, extinction, and chaos (and DMFT?)

"Ecological and evolutionary dynamics are intrinsically entwined. On short timescales, ecological interactions determine the fate and impact of new mutants, while on longer timescales evolution shapes the entire community."

Spatiotemporal ecological chaos enables gradual evolutionary diversification without niches or tradeoffs
Aditya Mahadevan, Michael T. Pearce, and Daniel S. Fisher

Understanding this interplay is "one of the biggest open problems in evolution and ecology."

New experimental techniques for measuring the properties of large microbial ecosystems have stimulated significant theoretical work, including from some with a background in theoretical condensed matter physics. For an excellent accessible introduction see:

Understanding chaos and diversity in complex ecosystems – insights from statistical physics

This is a nice 2.5-page article by Pankaj Mehta at the Journal Club for Condensed Matter. He clearly introduces an important problem in theoretical ecology and evolution and describes how some recent work has provided new insights using techniques adapted from Dynamical Mean-Field Theory, which was originally developed to describe strongly correlated electron systems. 

Here are just a few highlights of the article. It may be better to just read the actual article.

Fifty years ago, Robert May "argued that the more diverse an ecosystem is (roughly defined as the number of species present), the less stable it becomes." He derived this counter-intuitive result using a simple model and results from Random Matrix Theory. This is an example of an emergent property: a qualitative difference occurs as a system of interacting parts becomes sufficiently large.
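May's argument can be illustrated numerically in a few lines (my own sketch, with arbitrary parameters): build a random community matrix for S species with connectance C and interaction strength sigma, and look at the eigenvalue with the largest real part. Roughly, the equilibrium becomes unstable once sigma * sqrt(S * C) exceeds 1, so increasing the number of species destabilises the system.

import numpy as np

# Sketch of May's random-matrix stability argument (arbitrary parameters).
# Community matrix M = -I + A, where each off-diagonal pair interacts with
# probability C (the connectance), with strength drawn from a normal
# distribution of standard deviation sigma. The equilibrium is linearly stable
# if all eigenvalues of M have negative real part; May's criterion says this
# fails, roughly, once sigma * sqrt(S * C) > 1.

rng = np.random.default_rng(2)

def max_real_eigenvalue(S, C, sigma):
    A = rng.normal(0.0, sigma, size=(S, S)) * (rng.random((S, S)) < C)
    np.fill_diagonal(A, 0.0)
    return np.max(np.linalg.eigvals(-np.eye(S) + A).real)

for S in (50, 200, 800):                 # increasing diversity
    lam = max_real_eigenvalue(S, C=0.2, sigma=0.2)
    print(f"S={S}: sigma*sqrt(S*C) = {0.2*np.sqrt(S*0.2):.2f}, "
          f"max Re(eigenvalue) = {lam:.2f} ({'stable' if lam < 0 else 'unstable'})")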

"One major deficiency of May’s argument is that it does not allow for the possibility that complex ecosystems can self organize through immigration and extinction. The simplest model that contains all these processes is the Generalized [to many species] Lotka-Volterra model (GLV)".

"Despite its simplicity, this equation holds many surprises, especially when the number of species is large".

Another case of how simple models can exhibit complex behaviour.
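Here is a minimal sketch of the generalized Lotka-Volterra dynamics with random, non-reciprocal interactions and a small immigration rate (my own parameter choices, not those of the papers under review), just to show how little is needed to generate rich dynamics.

import numpy as np

# Sketch of generalized Lotka-Volterra (GLV) dynamics with random,
# non-reciprocal interactions a_ij != a_ji and a small immigration rate lam:
#   dN_i/dt = N_i * (1 - N_i - sum_j a_ij N_j) + lam
# Parameters are arbitrary illustrative choices; simple Euler integration.

rng = np.random.default_rng(3)
S, sigma, lam = 100, 0.5, 1e-8                          # species, interaction scale, immigration
a = rng.normal(0.0, sigma / np.sqrt(S), size=(S, S))    # non-reciprocal random interactions
np.fill_diagonal(a, 0.0)

N = rng.random(S)                                       # initial abundances
dt, steps = 0.01, 50_000
for _ in range(steps):
    dN = N * (1.0 - N - a @ N) + lam
    N = np.maximum(N + dt * dN, 0.0)                    # Euler step; clip at zero

print(f"species with abundance above 1e-3: {np.sum(N > 1e-3)} of {S}")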

One special case is when the interactions are reciprocal: how species i affects species j is identical to how species j affects species i. "In the presence of non-reciprocity the system can exhibit complex dynamical behavior including chaos." Understanding the non-reciprocal case was an open problem until the two papers reviewed by Mehta. For a detailed but pedagogical introduction see:

Les Houches Lectures on Community Ecology: From Niche Theory to Statistical Mechanics, Wenping Cui, Robert Marsland III, Pankaj Mehta

This is relevant to understanding the origin of the fine-grained diversity observed in sequencing experiments on microbial ecosystems.

Aside: de Pirey and Bunin "derive analytic expressions for the steady-state abundance distribution and an analogue of the fluctuation-dissipation theorem for chaotic dynamics relating static and dynamics correlation functions."

"Using a DMFT solution, they derive a number of remarkable predictions... in the chaotic system the species fall into two groups: species at high abundances and species at low abundances near the immigration floor. de Pirey and Bunin show that even in the chaotic regime, the number of high abundance species in the ecosystem will always be less than the May stability bound. This result is quite surprising since it suggests that ecosystems self-organize in such a way that the high abundance species still follow May’s diversity bound even when they are chaotic."
