Monday, November 25, 2013

The emergence of "sloppy" science

Here "sloppy" science is good science!

How do effective theories emerge?
What is the minimum number of variables and parameters needed to describe some emergent phenomena?
Is there a "blind"/"automated" procedure for determining what the relevant variables and parameters are?

There is an interesting paper in Science
Parameter Space Compression Underlies Emergent Theories and Predictive Models
Benjamin B. Machta, Ricky Chachra, Mark K. Transtrum, James P. Sethna

These issues are not just relevant in physics, but also in systems biology. The authors state:
important predictions largely depend only on a few “stiff” combinations of parameters, followed by a sequence of geometrically less important “sloppy” ones... This recurring characteristic, termed “sloppiness,” naturally arises in models describing collective data (not chosen to probe individual system components) and has implications similar to those of the renormalization group (RG) and continuum limit methods of statistical physics. Both physics and sloppy models show weak dependence of macroscopic observables on microscopic details and allow effective descriptions with reduced dimensionality.
The following idea is central to the paper.
The sensitivity of model predictions to changes in parameters is quantified by the Fisher Information Matrix (FIM). The FIM forms a metric on parameter space that measures the distinguishability between a model with parameters θ_m and a nearby model with parameters θ_m + δθ_m.
The authors show that for several specific models, the eigenvalue spectrum of the FIM is dominated by just a few large eigenvalues. The corresponding eigenvectors define the few "stiff" parameter combinations that matter for the effective theory; the remaining "sloppy" directions barely affect predictions.
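This is easy to see in a toy example. The sketch below (my own illustration, not code from the paper) uses the classic sloppy model of a sum of decaying exponentials, y(t) = Σ_m exp(-θ_m t). For least-squares fitting with unit noise, the FIM reduces to JᵀJ, where J is the Jacobian of the predictions with respect to the parameters. The particular decay rates and time points are arbitrary choices for illustration.

```python
import numpy as np

# Toy "sloppy" model: y(t) = sum_m exp(-theta_m * t),
# observed at a handful of time points with unit Gaussian noise.
# In that case the Fisher Information Matrix is FIM = J^T J,
# where J[i, m] = d y(t_i) / d theta_m.

theta = np.array([0.5, 1.0, 2.0, 4.0])   # assumed decay rates
t = np.linspace(0.1, 5.0, 50)            # assumed observation times

# Derivative of exp(-theta_m * t) with respect to theta_m is -t * exp(-theta_m * t)
J = np.stack([-t * np.exp(-th * t) for th in theta], axis=1)

fim = J.T @ J                            # Fisher Information Matrix
eigvals = np.sort(np.linalg.eigvalsh(fim))[::-1]

print("FIM eigenvalues:", eigvals)
print("stiffest / sloppiest ratio:", eigvals[0] / eigvals[-1])
```

The eigenvalues span several orders of magnitude: one or two "stiff" directions dominate, and the rest are "sloppy". That hierarchy is the fingerprint of sloppiness that the paper identifies across many models.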

I have one minor quibble with the first sentence of the paper.
"Physics owes its success (1) in large part to the hierarchical character of scientific theories (2)."
Reference (1) is Eugene Wigner's famous 1960 essay "The Unreasonable Effectiveness of Mathematics in the Natural Sciences".
Reference (2) is Anderson's classic "More is different".

I think Wigner's paper is largely about something quite different from emergence, which is the focus of Anderson's paper. Wigner is primarily concerned with the even more profound philosophical question of why nature can be described by mathematics at all. I see no scientific answer on the horizon.

Update: the authors have also published longer papers on the same subject.
Mark K. Transtrum; Benjamin B. Machta; Kevin S. Brown; Bryan C. Daniels; Christopher R. Myers; James P. Sethna 
(2015)

Katherine N Quinn, Michael C Abbott, Mark K Transtrum, Benjamin B Machta and James P Sethna
(2022)

1 comment:

  1. A good introduction to the Fisher matrix, the associated metric (Fisher-Rao?), and their relationship to e.g. the Bures length in quantum information theory is here:
    http://math.ucr.edu/home/baez/information/

