What do we mean when we say a particular system is "complex"? We may have some intuition that it means there are many degrees of freedom and/or that the system is hard to understand. "Complexity" is sometimes used as a buzzword, just like "emergence." There are many research institutes that claim to be studying "complex systems" and there is something called "complexity theory". Complexity seems to mean different things to different people.
I am particularly interested in understanding the relationship between emergence and complexity. To do this we first need to be more precise about what we mean by both terms. A concrete question is the following. Consider a system that exhibits emergent properties. Often that will be associated with a hierarchy of scales. For example: atoms, molecules, proteins, DNA, genes, cells, organs, people. The corresponding hierarchy of research fields is physics, chemistry, biochemistry, genetics, cell biology, physiology, psychology. Within physics one hierarchy is quarks and leptons; nuclei and electrons; atoms; molecules; and, finally, a liquid described as a continuum fluid.
In More is Different, Anderson states that as one goes up the hierarchy the system scale and complexity increase. This makes sense when complexity is defined in terms of the number of degrees of freedom in the system (e.g., the size of the Hilbert space needed to describe the complete state of the system). On the other hand, the system state and its dynamics become simpler as one goes up the hierarchy. The state of the liquid can be described completely in terms of the density, the temperature, and the equation of state. The dynamics of the fluid can be described by the Navier-Stokes equation. Although that is hard to solve in the regime of turbulence, the system is still arguably a lot simpler than quantum chromodynamics (QCD)! Thus, we need to be clearer about what we mean by complexity.
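To see just how few variables are involved, the incompressible Navier-Stokes equation can be written in the standard form (my notation, not taken from the article, and neglecting external forces)

\[
\rho \left( \frac{\partial \mathbf{v}}{\partial t} + (\mathbf{v} \cdot \nabla) \mathbf{v} \right) = -\nabla p + \mu \nabla^2 \mathbf{v}, \qquad \nabla \cdot \mathbf{v} = 0,
\]

where \rho is the density, \mathbf{v} the velocity field, p the pressure, and \mu the viscosity. Only a handful of macroscopic fields and parameters appear, even though the fluid contains of order 10^23 interacting molecules.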
To address these issues I found the following article very helpful and stimulating.
What is a complex system? by James Ladyman, James Lambert, and Karoline Wiesner
It was published in 2013 in a philosophy journal, has been cited more than 800 times, and is co-authored by two philosophers of science and a physicist.
[I just discovered that Ladyman and Wiesner published a book with the same title in 2020. It is an expansion of the 2013 article.]
In 1999 the journal Science had a special issue that focussed on complex systems, with an Introduction entitled, Beyond Reductionism. Eight survey articles covered complexity in physics, chemistry, biology, earth science, and economics.
Ladyman et al. begin by pointing out how each of the authors of these articles chooses different properties with which to characterise complexity. These characteristics include non-linearity, feedback, spontaneous order, robustness and lack of central control, emergence, hierarchical organisation, and numerosity.
The problem is that these characteristics are not equivalent. If we do choose a specific definition of a complex system, the difficult problem remains of determining whether each of the characteristics above is necessary, sufficient, both, or neither for the system to be complex (as defined). This is similar to what happens with attempts to define emergence.
Information content is sometimes used to quantify complexity. Shannon entropy and Kolmogorov complexity (Sections 3.1, 3.2) are discussed. The latter is also known as algorithmic complexity: the length of the shortest computer program (algorithm) that can be written to produce the entity as output. A problem with both of these measures is that they are non-computable.
Deterministic complexity is different from statistical complexity (Section 3.3). A deterministic measure treats a completely random sequence of 0s and 1s as having maximal complexity. A statistical measure treats a completely random sequence as having minimal complexity. Both Shannon and algorithmic complexity are deterministic.
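To make this concrete, here is a small Python sketch (my own illustration, not from the article) that estimates the Shannon entropy per symbol of a binary sequence from the frequencies of short blocks. A sequence of fair coin flips comes out near the maximal value of 1 bit per symbol, while a perfectly periodic sequence comes out low, heading to zero as the block length grows. This is the sense in which a deterministic measure ranks the random sequence as maximally complex, whereas a statistical measure ranks it as minimally complex.

import random
from collections import Counter
from math import log2

def block_entropy_rate(bits, block_len=8):
    # Estimate Shannon entropy per symbol from the frequencies of
    # length-block_len blocks (a crude estimate of the entropy rate).
    blocks = [tuple(bits[i:i + block_len])
              for i in range(len(bits) - block_len + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    block_entropy = -sum((c / n) * log2(c / n) for c in counts.values())
    return block_entropy / block_len

random_bits = [random.randint(0, 1) for _ in range(100000)]  # fair coin flips
periodic_bits = [i % 2 for i in range(100000)]               # 010101... pattern

print(block_entropy_rate(random_bits))    # close to 1 bit per symbol
print(block_entropy_rate(periodic_bits))  # 0.125 here; tends to 0 for longer blocks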
Section 4 makes some important and helpful distinctions about different measures of complexity.
Three targets of measures: the methods used, the data obtained, or the system itself.
Three types of measures: difficulty of description, difficulty of creation, or degree of organisation.
They then review three distinct measures that have been proposed: logical depth (Charles Bennett), thermodynamic depth (Seth Lloyd and Heinz Pagels), and effective complexity (Murray Gell-Mann).
Logical depth and effective complexity are complementary quantities. The Mandelbrot set is an example of a system (a set of data) that exhibits an intricate structure that appears to have a high information content. Producing it in detail requires a long computation, and so it has a large logical depth.
[Image of the Mandelbrot set, created by Wolfgang Beyer with the program Ultra Fractal 3.]
On the other hand, the effective complexity of the set is quite small since it can be generated using the simple equation
\[ z_{n+1} = c + z_n^2, \]
where c is a complex number, the iteration starts from $z_0 = 0$, and the Mandelbrot set is the set of values of c for which the iterates remain bounded.
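As a rough illustration (a minimal sketch of my own, not code from the article), a few lines suffice to test whether a given c belongs to the set, which is the sense in which the effective complexity is small; the long computation needed to resolve the set's boundary in detail is what gives it a large logical depth.

def in_mandelbrot(c, max_iter=1000, escape_radius=2.0):
    # Iterate z_{n+1} = c + z_n^2 from z_0 = 0 and report whether the
    # orbit stays bounded for max_iter steps (a practical membership test).
    z = 0j
    for _ in range(max_iter):
        z = c + z * z
        if abs(z) > escape_radius:
            return False
    return True

print(in_mandelbrot(-1 + 0j))  # True: the orbit cycles between 0 and -1
print(in_mandelbrot(1 + 0j))   # False: the orbit escapes to infinity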
Ladyman et al. prefer the two accounts of a complex system given below, but do acknowledge their limitations.
(Physical account) A complex system is an ensemble of many elements which are interacting in a disordered way, resulting in robust organisation and memory.
(Data-driven account) A system is complex if it can generate data series with high statistical complexity.
What is statistical complexity? It relates to degrees of pattern and something they refer to as causal state reconstruction. It is applied to data sets, not systems or methods. Central to their definition is the idea of the epsilon-machine, something introduced in a long and very mathematical article from 2001, Computational Mechanics: Pattern and Prediction, Structure and Simplicity, by Shalizi and Crutchfield.
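The full construction is mathematically involved, but the basic idea can be sketched: group past histories together whenever they lead to (approximately) the same conditional distribution for the next symbol; these groups are the causal states of the epsilon-machine, and the Shannon entropy of the distribution over causal states is the statistical complexity. The toy Python code below is my own illustration of that idea for binary sequences and fixed-length histories; it is not the Shalizi-Crutchfield reconstruction algorithm.

import random
from collections import Counter, defaultdict
from math import log2

def statistical_complexity_sketch(bits, history_len=2, tol=0.05):
    # Toy causal-state reconstruction for a binary sequence: histories of
    # length history_len are merged when their empirical next-symbol
    # distributions agree within tol. Returns the entropy (in bits) of the
    # distribution over the resulting groups of histories ("causal states").
    next_counts = defaultdict(Counter)
    for i in range(len(bits) - history_len):
        history = tuple(bits[i:i + history_len])
        next_counts[history][bits[i + history_len]] += 1

    def next_dist(history):
        counts = next_counts[history]
        total = sum(counts.values())
        return {s: counts[s] / total for s in (0, 1)}

    # Greedily merge histories with (approximately) equal predictive distributions.
    states = []  # each state is a list of histories sharing a predictive distribution
    for history in next_counts:
        for state in states:
            rep = next_dist(state[0])
            if all(abs(next_dist(history)[s] - rep[s]) <= tol for s in (0, 1)):
                state.append(history)
                break
        else:
            states.append([history])

    # Statistical complexity: Shannon entropy of the distribution over causal states.
    history_freq = {h: sum(c.values()) for h, c in next_counts.items()}
    total = sum(history_freq.values())
    state_probs = [sum(history_freq[h] for h in state) / total for state in states]
    return -sum(p * log2(p) for p in state_probs if p > 0)

random_bits = [random.randint(0, 1) for _ in range(100000)]
periodic_bits = [i % 2 for i in range(100000)]

print(statistical_complexity_sketch(random_bits))    # ~0 bits: one causal state
print(statistical_complexity_sketch(periodic_bits))  # ~1 bit: two causal states

Note that the ranking is the reverse of the Shannon entropy example above: the random sequence has a single causal state and hence minimal statistical complexity, while the periodic sequence needs two.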
The article concludes with a philosophical question. Do patterns really exist? This relates to debates about scientific realism versus instrumentalism. The authors advocate something known as "rainforest realism", which has been advanced by Daniel Dennett, Don Ross, and David Wallace. A pattern is real if one can construct an epsilon-machine that can simulate the phenomenon and predict its behaviour.