Posts

Showing posts from January, 2024

Emergence and the Ising model

The Ising model is emblematic of the “toy models” that have been proposed and studied to understand and describe emergent phenomena. Although originally proposed to describe ferromagnetic phase transitions, variants of it have found application in other areas of physics, and in biology, economics, sociology, neuroscience, complexity theory, … Quanta Magazine had a nice article marking the model's centenary. In the general model there is a set of lattice sites {i}, each carrying a “spin” sigma_i = +/-1, and a Hamiltonian

H = - sum_{ij} J_ij sigma_i sigma_j - h sum_i sigma_i

where h is the strength of an external magnetic field and J_ij is the strength of the interaction between the spins on sites i and j. The simplest models are those where the lattice is regular and the interaction is uniform and non-zero only for nearest-neighbour sites. The Ising model illustrates many key features of emergent phenomena. Given the relative simplicity of the model, exhaustive studies since its proposal in 1920 have given definitive answers to questions often debated …
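The simplest nearest-neighbour case described above can be simulated in a few lines. Here is a minimal sketch of Metropolis Monte Carlo for the 2D Ising model on a square lattice with periodic boundaries, assuming uniform coupling J and field h (the function name and parameter defaults are my own, not from the post):

```python
import numpy as np

def ising_metropolis(L=16, T=2.0, J=1.0, h=0.0, sweeps=50, seed=0):
    """Metropolis Monte Carlo for the 2D nearest-neighbour Ising model.

    Spins sigma_i = +/-1 on an L x L lattice with periodic boundaries;
    energy H = -J * sum_<ij> sigma_i sigma_j - h * sum_i sigma_i.
    """
    rng = np.random.default_rng(seed)
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps):
        for _ in range(L * L):  # one sweep = L*L attempted flips
            i, j = rng.integers(0, L, size=2)
            # Sum of the four nearest neighbours (periodic boundaries).
            nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                  + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            # Energy change if spin (i, j) is flipped.
            dE = 2.0 * spins[i, j] * (J * nn + h)
            # Accept the flip with the Metropolis probability min(1, e^{-dE/T}).
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] *= -1
    return spins

spins = ising_metropolis(L=16, T=1.5)
m = abs(spins.mean())  # |magnetization| per site; grows below T_c ~ 2.269 J
```

Running this below the critical temperature shows the emergence the post is about: the spins spontaneously order even though no single term in the Hamiltonian prefers a global direction.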

David Mermin on his life in science: funny, insightful, and significant

David Mermin has posted a preprint with the modest title Autobiographical Notes of a Physicist. There are many things I enjoyed and found interesting about his memories. A few of the stories I knew, but most I did not. He reminisces about his interactions with Ken Wilson, John Wilkins, Michael Fisher, Walter Kohn, and of course, Neil Ashcroft. Mermin is a gifted writer and can be amusing and mischievous. He is quite modest and self-deprecating about his own achievements. He explains why we should refer to the Hohenberg-Mermin-Wagner theorem, not Mermin-Wagner. One of his Reference Frame columns in Physics Today stimulated Paul Ginsparg to start the arXiv. I was struck by how Mermin's career belongs to a different era. The community was smaller and more personal. Doing physics was fun. Time was spent savouring the pleasure of learning new things and explaining them to others. Colleagues were friends rather than competitors. His research was curiosity-driven. This led to Mermin …

Wading through AI hype about materials discovery

Discovering new materials with functional properties is hard, very hard. We need all the tools we can get, from serendipity to high-performance computing to chemical intuition. At the end of last year, two back-to-back papers appeared in the luxury journal Nature. The first is Scaling deep learning for materials discovery. All the authors are at Google. They claim to have discovered more than two million new materials with stable crystal structures using DFT-based methods and AI. On Doug Natelson's blog there are several insightful comments about why to be skeptical of AI/DFT-based "discovery". Here are a few of the reasons my immediate response to this paper is one of skepticism. It is published in Nature; almost every "ground-breaking" paper I force myself to read is disappointing when you read the fine print. It concerns a very "hot" topic that is full of hype in both the science and business communities. It is a long way from discovering a st…

Certain benefits of Bayes

Best wishes for the New Year! One thing I hope to achieve this year is an actual understanding of things "Bayesian". I am particularly interested because it gives a way to be more quantitative and precise about some of the intuitions that I use in science. For example, I tend to be skeptical of new experimental results (often hyped) that claim to go against well-established theories, regardless of how good the "statistics" of the touted result are. In this vein, Phil Anderson argued that Bayesian methods should have been used to rule out the significance of "discoveries" such as the 10 keV neutrino and the fifth force. In 1992 he wrote a Physics Today column on the subject. An interesting metric for a mathematical formula is the ratio of the breadth and profundity of its implications to the simplicity of the formula and its derivation. I suspect that Bayes' formula for conditional probabilities would win first place!

P(A|B) = P(B|A) P(A) / P(B)

P(A|B) denotes the probability of A given B. The pr…
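Anderson's point can be made quantitative with a toy calculation. Below is a minimal sketch of Bayes' rule applied to a claimed anomaly; the numbers are hypothetical, chosen only to illustrate how a low prior swamps good statistics:

```python
def posterior(prior, p_data_given_true, p_data_given_false):
    """Bayes' rule: P(A|B) = P(B|A) P(A) / P(B), with
    P(B) = P(B|A) P(A) + P(B|not A) P(not A)."""
    evidence = (p_data_given_true * prior
                + p_data_given_false * (1.0 - prior))
    return p_data_given_true * prior / evidence

# Hypothetical numbers: the experiment would flag the effect with
# probability 0.9 if it were real, and only 0.001 as a fluke --
# impressive "statistics". But if the prior that such an exotic
# effect exists is 1 in 10,000, the posterior stays small.
p = posterior(prior=1e-4, p_data_given_true=0.9, p_data_given_false=1e-3)
# p is about 0.08: the "discovery" is still probably a false alarm.
```

With these (made-up) numbers the posterior probability that the effect is real is only about 8%, despite a false-positive rate of one in a thousand. That is the Bayesian version of "extraordinary claims require extraordinary evidence".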