Thursday, February 3, 2011

Laws of complexity are emergent

In 1995 Phil Anderson organised a colloquium, "Physics: The Opening to Complexity", for the US National Academy of Sciences. His introduction is worth reading. I reproduce an extract below because it gives such a clear discussion of emergence.
At this frontier [of complexity], the watchword is not reductionism but emergence. Emergent complex phenomena are by no means in violation of the microscopic laws, but they do not appear as logically consequent on these laws. That this is the case can be illustrated by two examples which show that a complex phenomenon can follow laws independently of the detailed substrate in which it is expressed.
(i) The "Bardeen-Cooper-Schrieffer (BCS)" phenomenon of broken gauge symmetry in dense Fermion liquids has at least three expressions: electrons in metals, of course, where it is called "superconductivity"; 3He atoms, which become a superfluid when liquid 3He is cooled below 1-3 mK; and nucleons both in ordinary nuclei (the pairing phenomenon explained by Bohr, Mottelson, and Pines) and in neutron stars, on a giant scale, where superfluidity is responsible for the "glitch" phenomenon. All of these different physical embodients obey the general laws of broken symmetry that are the canonical example of emergence in physics.
(ii) One may make a digital computer using electrical relays, vacuum tubes, transistors, or neurons; .... the rules governing computation do not vary depending on the physical substrate in which they are expressed; hence, they are logically independent of the physical laws governing that substrate.
The picture is of the ENIAC.
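
Anderson's second example can be made concrete with a toy sketch. The few lines of Python below are my own illustration, not anything from Anderson's talk: a tiny Turing-style machine whose "program" is just a transition table. The table never mentions relays, vacuum tubes, transistors, or neurons, which is the sense in which the rules of computation are logically independent of whatever substrate happens to execute them.

# A minimal Turing-machine interpreter: the transition table is the "program",
# and nothing in it refers to the physics of the device running it.
def run_tm(rules, tape, state="start", blank="_", max_steps=10_000):
    """rules maps (state, symbol) -> (new_state, new_symbol, move), move in {-1, 0, +1}."""
    tape = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, tape[head], move = rules[(state, symbol)]
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Example table: increment a binary number written most-significant bit first.
increment = {
    ("start", "0"): ("start", "0", +1),   # scan right to the end of the number
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),   # step back onto the last digit
    ("carry", "1"): ("carry", "0", -1),   # 1 plus carry -> 0, carry moves left
    ("carry", "0"): ("halt",  "1",  0),   # 0 plus carry -> 1, done
    ("carry", "_"): ("halt",  "1",  0),   # ran off the left edge: new leading 1
}

print(run_tm(increment, "1011"))   # prints "1100", the successor of binary 1011

The same table, followed faithfully by any device at all, gives the same answer; that is all the "universality" claim requires.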

5 comments:

  1. I really do not agree with the second example. The rules of computation are not totally independent of the substrate! To summarise: a computer has a finite number of binary elements in which to store the quantities it operates on, and storage in those binary elements is not perfect. The first point creates many difficulties, since mathematics tends to assume infinite precision in numbers; the second basically wreaks havoc with precision, so one has to be very careful about the "fuzziness" in one's calculations, even (especially) on modern machines. Try programming an IBM Cell or a BlueGene, an Nvidia GPU, and an Intel Xeon: each will have quite a different opinion about a simple task like biorthogonalization if you are not careful and do not know the "substrate" well enough (the floating-point sketch just after this comment makes the point concrete).
    It's like saying "we were using Fortran on the ENIAC, and we are still using Fortran today", ignoring the fact that what you called Fortran back then was quite different from what you call Fortran today.
    Another point I would like to make: yes, it is true that you can rely on classical mechanics pretty much independently of the particular composition of the material, but, and this is a very big one, the particular composition of the material is already there, parametrized and hidden under concepts such as inertial mass, elasticity, center of mass and so on. The fact that you have parametrized and hidden them quite efficiently does not make the concept under discussion independent of the substrate, does it?

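The point about finite precision above can be made concrete. The short sketch below (sizes and values are arbitrary, chosen only so that rounding is visible; it assumes NumPy is available) sums the same list of numbers in different precisions and in different orders. Floating-point addition is not associative, so the answers disagree, and how badly they disagree depends on the word length the hardware gives you.

import numpy as np

# Numbers spanning several orders of magnitude, so rounding errors accumulate.
rng = np.random.default_rng(42)
values = rng.uniform(1e-6, 1e6, size=100_000)

def naive_sum(xs, dtype):
    """Accumulate left to right in the given precision, as a simple loop would."""
    total = dtype(0.0)
    for v in xs:
        total = dtype(total + dtype(v))
    return float(total)

print("float64, forward order:", naive_sum(values, np.float64))
print("float32, forward order:", naive_sum(values, np.float32))
print("float32, sorted order :", naive_sum(np.sort(values), np.float32))
print("is (0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3)?", (0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))

The two float32 results typically differ both from each other and from the float64 reference, and the last line prints False on any machine using standard IEEE 754 doubles: exactly the kind of "fuzziness" the comment describes.
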
  2. I have a somewhat different issue with the second example. What 'rules of computation' do those different systems express, but the ones we have constructed them to follow?

  3. Thanks for the comments.
    I think Anderson is talking about "universal rules of computation", such as those discussed by Turing:
    en.wikipedia.org/wiki/Universal_Turing_machine
    These are independent of both hardware and software (e.g. Fortran).

  4. Actually, that was the point I was trying to make. I think it is either a misconception or a very optimistic approximation to think that the Turing concept, applied to numerical computation, is independent of hardware and/or software. The problem is that most such theories start from, or contain, "computability". If you transfer the concepts of mathematics to a machine, the very concept of "computability" changes. For a "cheap" example, anything that involves the number pi would require a literally infinite amount of time if you applied the Turing correspondence between mathematics and computers exactly to the word. Thus approximations come into play, and where there are approximations there is fuzziness, and where there is fuzziness the hardware and software matter a lot (see the sketch just after this comment).

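The pi example is easy to make concrete. The sketch below uses Machin's formula and Python's decimal module (the 50-digit target is an arbitrary choice); whatever precision is requested, the loops halt after finitely many terms, so the machine only ever holds a truncated approximation of pi, never pi itself.

from decimal import Decimal, getcontext

def arctan_inv(x, digits):
    """Taylor series for arctan(1/x), truncated once terms drop below the target precision."""
    getcontext().prec = digits + 10              # working precision with guard digits
    total = Decimal(0)
    power = Decimal(1) / x                       # 1 / x**(2k+1), starting at k = 0
    k, sign = 0, 1
    cutoff = Decimal(10) ** -(digits + 5)
    while power > cutoff:
        total += sign * power / (2 * k + 1)
        power /= x * x
        sign = -sign
        k += 1
    return total

def pi_to(digits):
    """Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    getcontext().prec = digits + 10
    pi = 16 * arctan_inv(5, digits) - 4 * arctan_inv(239, digits)
    getcontext().prec = digits
    return +pi                                   # unary plus rounds to the context precision

print(pi_to(50))   # should print 3.1415926535897932384626433832795028841971693993751

Asking for more digits just moves the cutoff; it never removes it. In that limited sense any physical computation involving pi always comes with an approximation attached.
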
  5. The cases in Anderson's first paragraph (low-temperature superconductivity, superfluid He-3, nucleons) can all be explained by Art Winfree's theory of coupled oscillators. Physicists don't seem to be aware of Winfree's theory, which has been developed further by Strogatz and others. As to BCS, synchronized phonons (oscillations) act as a metronome for electron oscillations (spin, orbit), causing some electrons to advance their phases a little and other electrons to retard their phases a little, until the electron oscillations synchronize--antisynchronously. Cooper pairs are then the inevitable result. This is quite similar to Landau damping, but in a context where the oscillating waves do not decay and instead remain constant. As to superfluid He-3, the oscillations couple exactly synchronously (think up, up, rather than up, down). Those are the two forms of coupling that Winfree allows in two-oscillator systems.

    A recent experiment by Cavalleri at Oxford, involving laser light pulses that created a brief instance of superconductivity in a cuprate material that was normally not superconducting, is instructive. The laser pulses acted as a metronome: synchronized photons replaced the synchronized phonons of low-temperature, BCS superconductivity. Again, there was a metronome around which the electrons synchronized their oscillations. This Oxford experiment indicates--to me, at least--that Winfree's theory of coupled oscillators can provide a unified explanation for low-temperature and high-temperature superconductivity (a toy synchronization sketch follows the comments).

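For readers who want to play with the coupled-oscillator picture, the Kuramoto model (a standard simplification descended from Winfree's work and analysed at length by Strogatz) shows the generic phenomenon the comment appeals to: above a critical coupling strength, oscillators with different natural frequencies pull one another into phase. The sketch below (assuming NumPy; all parameter values are arbitrary) illustrates only that synchronization transition, and is not a model of Cooper pairing or of the Oxford experiment.

import numpy as np

def kuramoto_order(n=200, coupling=2.0, dt=0.01, steps=5000, seed=0):
    """Euler-integrate the mean-field Kuramoto model and return r(t) in [0, 1]."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0, n)           # natural frequencies
    theta = rng.uniform(0.0, 2 * np.pi, n)    # initial phases
    r_history = []
    for _ in range(steps):
        z = np.exp(1j * theta).mean()         # complex order parameter
        r, psi = np.abs(z), np.angle(z)
        # dtheta_i/dt = omega_i + K * r * sin(psi - theta_i)
        theta += dt * (omega + coupling * r * np.sin(psi - theta))
        r_history.append(r)
    return np.array(r_history)

r = kuramoto_order()
print(f"order parameter: initial r = {r[0]:.2f}, final r = {r[-1]:.2f}")

With the coupling above its critical value (roughly 1.6 for a unit-variance frequency distribution) the order parameter climbs from near zero toward a steady, partially synchronized value; below it, r stays small.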