Emergence means different things to different people. Except that practically everyone likes it! Or at least, likes using the word. Terms associated with emergence include novelty, unpredictability, universality, stratification, and self-organisation. We need to be clearer about what we mean by each of these terms and how they are related or unrelated. Significant progress is reported in a recent preprint.
Software in the natural world: A computational approach to hierarchical emergence
Fernando E. Rosas, Bernhard C. Geiger, Andrea I. Luppi, Anil K. Seth, Daniel Polani, Michael Gastpar, Pedro A.M. Mediano
This preprint is the subject of a nice article in Quanta Magazine.
The New Math of How Large-Scale Order Emerges by Philip Ball
Ball defines emergence in terms of unpredictability. He states:
"Loosely, the behavior of a complex system might be considered emergent if it can’t be predicted from the properties of the parts alone."
He describes the work of Rosas et al. as follows:
"A complex system exhibits emergence, according to the new framework, by organizing itself into a hierarchy of levels that each operate independently of the details of the lower levels."
This defines emergence in terms of universality. Rosas et al. use an analogy with software, which runs independently of the details of the hardware of the computer and does not depend on microscopic details such as electron dynamics.
There are three types of closure associated with emergence: informational, causal, and computational.
Informational closure means that to predict the dynamics of the system at the macroscale one does not need any additional information from the microscale.
Equilibrium thermodynamics is a nice example.
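To make this concrete, here is a minimal sketch (my own illustration, not code from the paper): a four-state "micro" Markov chain that can be coarse-grained into two "macro" states, together with an empirical check that knowing the micro state adds nothing when predicting the next macro state, i.e. H(M_{t+1} | M_t) ≈ H(M_{t+1} | X_t). The transition matrix and partition are invented for the example.

```python
# Minimal sketch of informational closure (not from the paper): a 4-state
# "micro" Markov chain whose coarse-graining into 2 "macro" states is itself
# Markovian, so micro detail adds no predictive information at the macro level.
import numpy as np

rng = np.random.default_rng(0)

# Micro-level transition matrix, constructed so that states {0,1} and {2,3}
# behave identically with respect to the macro partition.
P = np.array([[0.50, 0.20, 0.20, 0.10],
              [0.10, 0.60, 0.05, 0.25],
              [0.30, 0.10, 0.40, 0.20],
              [0.20, 0.20, 0.15, 0.45]])
macro = np.array([0, 0, 1, 1])   # coarse-graining: micro state -> macro state

# Simulate a long micro trajectory and coarse-grain it.
T = 100_000
x = np.zeros(T, dtype=int)
for t in range(1, T):
    x[t] = rng.choice(4, p=P[x[t - 1]])
m = macro[x]

def cond_entropy(target, given):
    """Empirical conditional entropy H(target | given) in bits."""
    joint = {}
    for g, s in zip(given, target):
        joint[(g, s)] = joint.get((g, s), 0) + 1
    n = len(target)
    H = 0.0
    for (g, s), c in joint.items():
        p_joint = c / n
        p_given = np.sum(given == g) / n
        H -= p_joint * np.log2(p_joint / p_given)
    return H

print("H(M_t+1 | M_t) =", cond_entropy(m[1:], m[:-1]))
print("H(M_t+1 | X_t) =", cond_entropy(m[1:], x[:-1]))
# The two numbers agree up to sampling noise: the macro level is
# informationally closed -- micro detail does not improve macro prediction.
```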
Causal closure means that the system can be controlled at the macroscale without any knowledge of lower-level information.
"Interventions we make at the macro level, such as changing the software code by typing on the keyboard, are not made more reliable by trying to alter individual electron trajectories."
"...we can use macroscopic variables like pressure and viscosity to talk about (and control) fluid flow, and knowing the positions and trajectories of individual molecules doesn’t add useful information for those purposes. And we can describe the market economy by considering companies as single entities, ignoring any details about the individuals that constitute them."
Computational closure is a more technical concept.
"a conceptual device called the ε-(epsilon) machine. This device can exist in some finite set of states and can predict its own future state on the basis of its current one. It’s a bit like an elevator, said Rosas; an input to the machine, like pressing a button, will cause the machine to transition to a different state (floor) in a deterministic way that depends on its past history — namely, its current floor, whether it’s going up or down and which other buttons were pressed already. Of course an elevator has myriad component parts, but you don’t need to think about them. Likewise, an ε-machine is an optimal way to represent how unspecified interactions between component parts “compute” — or, one might say, cause — the machine’s future state."
Aside: epsilon-machines featured significantly in my previous post about What is a complex system?
"Computational mechanics allows the web of interactions between a complex system’s components to be reduced to the simplest description, called its causal state."
"...for an emergent system that is computationally closed, the machines at each level can be constructed by coarse-graining the components on just the level below: They are, in the researchers’ terminology, “strongly lumpable.”"
In some sense, this may be related to the notion of quasiparticles and effective interactions in many-body physics.
Aside: In 1962, Herbert Simon identified hierarchies as an essential feature of complex systems, both natural and artificial. A key property of a level in the hierarchy is that it is nearly decomposable into smaller units, i.e., it can be viewed as a collection of weakly interacting units. This hierarchical character significantly reduces the time required for the whole system to evolve. The construction of an artificial complex system, such as a clock, is faster and more reliable if the different units are first assembled separately and only then brought together into the whole. Simon argues that this reduction in time scales due to modularity is why biological evolution can occur on realistic time scales. The 1962 article is reprinted in The Sciences of the Artificial.
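As a toy illustration of Simon's point (mine, not his), consider a rate matrix with strong coupling inside two blocks and weak coupling between them: the relaxation rates separate into fast within-block modes and one slow between-block mode, the timescale separation associated with near-decomposability. The numbers here are arbitrary.

```python
# Toy illustration of near-decomposability: strong coupling within two blocks,
# weak coupling between them.  The relaxation rates of the resulting
# continuous-time Markov chain split into fast within-block modes and one slow
# collective mode set by the weak inter-block coupling.
import numpy as np

strong, weak = 1.0, 0.01
# Off-diagonal entries are transition rates between the four micro states.
W = np.array([[0,      strong, weak,   weak],
              [strong, 0,      weak,   weak],
              [weak,   weak,   0,      strong],
              [weak,   weak,   strong, 0]], dtype=float)
Q = W - np.diag(W.sum(axis=0))   # generator: columns sum to zero

rates = np.sort(-np.linalg.eigvals(Q).real)
print(np.round(rates, 3))
# One zero rate (equilibrium), one slow rate governed by the weak inter-block
# coupling, and fast rates governed by the strong intra-block coupling.
```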
The paper by Rosas et al. is one of the most important ones I have encountered in the past few years. I am slowly digesting it.
The beauty of the paper is that it is mathematically rigorous. All the concepts are precisely defined and the central results are actual theorems. This replaces the vagueness of most discussions of emergence, including my own.
The paper has helpful figures and considers concrete examples including Ehrenfest's Urn, an Ising model with Glauber dynamics, and a Hopfield neural network model.
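For instance, here is a minimal simulation of the Ehrenfest urn (the model is one of the paper's examples, but this sketch is my own illustration, not the paper's code). The micro state is which labelled balls sit in the left urn; the macro state is just the count, which evolves as a Markov chain in its own right, with stationary distribution Binomial(N, 1/2).

```python
# Minimal Ehrenfest-urn sketch.  Micro state: the set of labelled balls in the
# left urn.  Macro state: just the count k.  At each step a ball is chosen
# uniformly at random and moved to the other urn, so k -> k-1 with probability
# k/N and k -> k+1 with probability (N-k)/N: the count is itself a Markov chain,
# independent of which particular balls are where.
from math import comb
import numpy as np

rng = np.random.default_rng(1)
N, T = 20, 100_000

left = set(range(N // 2))          # micro state: labels of balls in the left urn
counts = np.zeros(T, dtype=int)    # macro trajectory: number of balls on the left
for t in range(T):
    ball = rng.integers(N)         # pick a ball uniformly at random...
    if ball in left:
        left.remove(ball)          # ...and move it to the other urn
    else:
        left.add(ball)
    counts[t] = len(left)

# The stationary distribution of the macro chain is Binomial(N, 1/2);
# compare the empirical histogram with that prediction near the mode.
empirical = np.bincount(counts, minlength=N + 1) / T
binomial = np.array([comb(N, k) for k in range(N + 1)]) / 2**N
print(np.round(empirical[8:13], 3))
print(np.round(binomial[8:13], 3))
```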