Friday, January 26, 2018

A spicy scientific scandal

I am often on the lookout for interesting molecules and solids which involve short hydrogen bonds, particularly biomolecules where this bond may play a key role in functionality. Such bonds are of interest from a physics point of view because the quantum motion of the proton then matters.
Consequently, the following paper (published in October 2016) caught my attention.

Proton Probability Distribution in the O···H···O Low-Barrier Hydrogen Bond: A Combined Solid-State NMR and Quantum Chemical Computational Study of Dibenzoylmethane and Curcumin
Xianqi Kong, Andreas Brinkmann, Victor Terskikh, Roderick E. Wasylishen, Guy M. Bernard, Zhuang Duan, Qichao Wu, and Gang Wu

The authors state their motivation.
Curcumin was selected in our study, in part because it is being touted as a wonder drug and is of intense interest to the pharmaceutical and medical community [31-33].
This sounds quite exciting. Could low barrier hydrogen bonds be important in curing cancer?
Curcumin is a major ingredient of turmeric, the yellow spice that features heavily in Asian cooking.
This got me Googling and it turns out the claims of a "wonder drug" are dubious.

Experimental studies of curcumin turn out to be particularly problematic, as explained in a blog post
Curcumin will waste your time by Derek Lowe. It is worth reading because it highlights the need for replication studies and publication of null results.

But it gets worse. References 31 and 32 have the same last author, Bharat Aggarwal, who it turns out has been the major proponent of the "wonder drug". In 2015 he "retired" from the University of Texas, following allegations of scientific fraud. By August 2016, eighteen published papers by him had been withdrawn.

To illustrate the problem of metrics, in 2016 Aggarwal had an h-index of 160, and in 2015, Thomson Reuters (ISI Web of Science) listed him among the World's Most Influential Scientific Minds.

I should stress that none of this invalidates the results of the hydrogen bonding paper that got me on this trail.

Tuesday, January 23, 2018

Emergent stories

Steve Blundell has written a very nice article
Emergence, causation and storytelling: condensed matter physics and the limitations of the human mind

The article is lucid, creative, and stimulating.
He explores some issues of particular interest to philosophers, such as the difference between "weak" and "strong" emergence, which are sometimes called "epistemological" and "ontological" emergence, respectively.

Part of his argument is based on the fact that human minds are finite and constrained by the physical world and that "information is physical". Unlike the philosophers, he argues that emergence always has both an ontological and an epistemological character.

To illustrate his arguments Steve uses several beautiful examples.

"To work, stories have to be succinct, told well, have a point and express some truth."
This is to accommodate the physical limitations of the human mind.

Number theory.
Integers are defined by the rules of a very simple algebra. Yet rich phenomena emerge, such as the statistics of the zeros of the Riemann zeta function [which encode the asymptotic distribution of the prime numbers] being described by random matrix theory.
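The contrast between the simplicity of the rules and the richness of what emerges can be seen in a few lines of code: the prime counting function pi(x), computed by nothing more sophisticated than a sieve, slowly approaches the smooth asymptotic form x/ln(x) of the prime number theorem. A minimal Python sketch (the sieve and the test values are my illustration, not from Blundell's article):

```python
import math

def prime_pi(x):
    """Count the primes <= x with a sieve of Eratosthenes."""
    sieve = [True] * (x + 1)
    sieve[0] = sieve[1] = False
    for n in range(2, math.isqrt(x) + 1):
        if sieve[n]:
            # Mark all multiples of n (starting at n^2) as composite.
            for m in range(n * n, x + 1, n):
                sieve[m] = False
    return sum(sieve)

# Prime number theorem: pi(x) ~ x / ln(x), so the ratio tends to 1.
for x in (10**3, 10**4, 10**5):
    ratio = prime_pi(x) / (x / math.log(x))
    print(x, prime_pi(x), round(ratio, 3))
```

Nothing in the sieve mentions x/ln(x); the regularity only appears when one looks at the primes collectively, which is the sense in which the distribution is emergent.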

Conway's game of life.
He considers the "scattering" and the creation and destruction of "objects" such as spaceships, "Canada geese" (shown below), and "pulsars".
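The point of this example is how little input is needed: the complete update rule fits in a dozen lines, yet self-propagating "objects" emerge. A minimal Python sketch, using the glider (the simplest spaceship) rather than the larger Canada goose (the set-based representation and the example are my choices, not from the article):

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life.
    `live` is a set of (x, y) cells. Rules: a live cell with 2 or 3
    live neighbours survives; a dead cell with exactly 3 is born."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The glider translates itself one cell diagonally every 4 generations:
# nothing in the rules mentions "motion", yet a moving object emerges.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
after4 = glider
for _ in range(4):
    after4 = step(after4)
# after4 is the same glider shifted by (+1, +1)
```

The spaceships, geese, and pulsars Blundell discusses are all patterns of exactly this kind, just with larger initial sets of cells.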

How emergence comes into play is described by the figure below.

This reminds me a bit of particle physics experiments. New entities emerge from the underlying rules encoded in the Standard Model.

Spin ice.
Emergent gauge field and magnetic monopoles.
This is also discussed as an example of emergence in a 2016 article by Rehn and Moessner.

Friday, January 19, 2018

Observation of renormalised quasi-particle excitations

A central concept of quantum-many body theory is that of coherent quasi-particles. Their key property is a well-defined relationship between energy and momentum (dispersion relation). Prior to the rise of ARPES (Angle-Resolved Photo-Emission Spectroscopy) over the past three decades, the existence of electronic quasi-particles was only inferred indirectly.

A very nice paper just appeared which shows a new way of measuring quasi-particle excitations in a
strongly correlated electron system. Furthermore, the experimental results are compared quantitatively to state-of-the-art theory, showing several subtle many-body effects.

Coherent band excitations in CePd3: A comparison of neutron scattering and ab initio theory 
Eugene A. Goremychkin, Hyowon Park, Raymond Osborn, Stephan Rosenkranz, John-Paul Castellan, Victor R. Fanelli, Andrew D. Christianson, Matthew B. Stone, Eric D. Bauer, Kenneth J. McClellan, Darrin D. Byler, Jon M. Lawrence

The mixed valence compound studied is of particular interest because with increasing temperature it exhibits a crossover from a Fermi liquid with coherent quasi-particle excitations to incoherent excitations, an example of a bad metal.

The figure below shows a colour intensity plot of the dynamical magnetic susceptibility
at a fixed energy ω, as a function of the wavevector Q. The top three panels are from DFT+DMFT (Density Functional Theory + Dynamical Mean-Field Theory) calculations.

The bottom three panels are the corresponding results from inelastic neutron scattering.
A and B [D and E] are both at ω = 35 meV, in two different momentum planes. C [F] is at ω = 55 meV.
The crucial signature of coherence (i.e. dispersive quasi-particles) is the shift of the maxima from the Γ and R points at 35 meV to the M and X points at 55 meV.

It should be stressed that these dispersing excitations are not due to single (charged) quasi-particles, but rather spin excitations which are particle-hole excitations.

The figure below shows how the dispersion [coherence] disappears as the temperature is increased from 6 K (top) to 300 K (bottom). The solid lines are theoretical curves.
The figure below shows that the irreducible vertex corrections associated with the particle-hole are crucial to the quantitative agreement of theory and experiment. The top (bottom) panel in the figure below shows the calculation at low (high) temperatures. The black (blue) curves are with (without) vertex corrections. The red curves are a rescaling of the blue curves by a numerical factor.
The correction has two effects: First, it smooths out some of the fine structure in the energy dependence of the spectra while broadly preserving both the Q variation and the overall energy scale; and second, it produces a strong enhancement of the intensity that is both energy and temperature dependent, for example, by a factor of ~6.5 at ω = 60 meV at 100 K. This shows that the Q dependence of the scattering is predominantly determined by the one-electron joint density of states, as expected for band transitions, whereas the overall intensity is amplified by the strong electron correlations.
This landmark study is only possible due to recent parallel advances in theory, computation, and experiment. 
On the theory side, it is not just DMFT but also the inclusion of particle-hole interactions within DMFT.
On computation, it is new DMFT algorithms and increasing computer speed. 
On the experimental side, it is pulsed neutron sources, and improvements in the sensitivity and spatial and energy resolution of neutron detectors.

Monday, January 15, 2018

The emergence of BS in universities

The Chronicle of Higher Education has an excellent (but depressing) article, Higher Education is Drowning in BS, by Christian Smith.

In both scope and eloquence, this article goes far beyond my post, The rise of BS in science and academia. Furthermore, as a sociologist, Smith argues that one of the challenges is to think about the problem in collective (dare I say emergent!) terms, rather than just individualistic terms.
Essential to realize in all of this is that most of the BS is produced not by pernicious individuals, but instead by complex dysfunctions in institutional systems. It is easy to be a really good academic or administrator and still actively contribute to the BS. So we need to think not individualistically, but systemically, about culture, institutions, and political economies. Pointing fingers at individual schools and people is not helpful here. Sociological analysis of systems and their consequences is.
Smith also spells out the broader moral and political implications of the problems.

In the end, a key issue, central to the problem, is that there are many competing ideas and interests concerning what a university is actually for. That ultimately comes from different values and world views, leading to different ethical, moral, and political perspectives. Nevertheless, the core mission should be clear: thinking about the world and training students to think.

I agree with Smith,
BS is the failure of leaders in higher education to champion the liberal-arts ideal — that college should challenge, develop, and transform students’ minds and hearts so they can lead good, flourishing, and socially productive lives — and their stampeding into the "practical" enterprise of producing specialized workers to feed The Economy.
Aside: One interesting feature of the comments on the article is how much the problem of students using cell phones in class gets discussed.

I thank Mike Karim for bringing the article to my attention.

Wednesday, January 10, 2018

Should we be concerned about irreproducible results in condensed matter physics?

The problem of the irreproducibility of many results in psychology and medical research is getting a lot of attention. There is even a Wikipedia page about the Replication Crisis. In the USA the National Academies have just launched a study of the problem.

This naturally raises the question of how big the problem is in physics and chemistry.

One survey showed that many chemists and physicists could not reproduce results of others. 

My anecdotal experience is that, for both experiments and computer simulations, there is a serious problem. Colleagues will often tell me privately that they cannot reproduce the published results of others. Furthermore, this particularly seems to be a problem for "high impact" results, published in luxury journals. A concrete example is the case of USOs [Unidentified Superconducting Objects]. Here is just one specific case.

A recent paper looks at the problem for the case of a basic measurement in a very popular class of materials.

How Reproducible Are Isotherm Measurements in Metal–Organic Frameworks? 
 Jongwoo Park, Joshua D. Howe, and David S. Sholl
We show that for the well-studied case of CO2 adsorption there are only 15 of the thousands of known MOFs for which enough experiments have been reported to allow strong conclusions to be drawn about the reproducibility of these measurements.
Unlike most university press releases [which are too often full of misleading hype] the one from Georgia Tech associated with this paper is actually quite informative and worth reading.

A paper worth reading is that by John Ioannidis, "Why most published research findings are false", as it contains some nice basic statistical arguments as to why people should be publishing null results. He also makes the provocative statement:
The hotter a scientific field (with more scientific teams involved) the less likely the research findings are to be true.
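One ingredient of Ioannidis's argument is simple Bayesian arithmetic: if the prior odds R that a tested relationship is true are low, as in a "hot" field where many teams chase a few real effects, then most statistically significant findings are false positives. A minimal sketch (this is the standard positive predictive value, without the bias terms Ioannidis also considers; the numerical values of R are illustrative assumptions, not from his paper):

```python
def ppv(prior_odds, alpha=0.05, power=0.8):
    """Positive predictive value: probability that a statistically
    significant finding is actually true.
    prior_odds R = (true relationships) / (false relationships) tested."""
    true_pos = power * prior_odds  # real effects correctly detected
    false_pos = alpha              # null effects wrongly "detected"
    return true_pos / (true_pos + false_pos)

# A "hot" field where few tested hypotheses are true (R = 1/50),
# versus a mature field with well-motivated hypotheses (R = 1/2):
print(round(ppv(1 / 50), 2))  # -> 0.24: most positive findings are false
print(round(ppv(1 / 2), 2))   # -> 0.89: most positive findings are true
```

This also makes clear why publishing null results matters: without them, the denominator of false positives in the literature is invisible.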
I thank Sheri Kim and David Sholl for stimulating this post.

How serious do you think this problem is? What are the best ways to address the problem?