Wednesday, April 16, 2014

A definitive experimental signature of short hydrogen bonds in proteins: isotopic fractionation

I have written several posts about the controversial issue of low-barrier hydrogen bonds in proteins and whether they play any functional role, particularly in enzyme catalysis.

A basic issue is to first identify short hydrogen bonds, i.e., finding a reliable method to measure bond lengths.
I recently worked through a nice article,
NMR studies of strong hydrogen bonds in enzymes and in a model compound
T.K. Harris, Q. Zhao, A.S. Mildvan

Surely, these bond lengths can just be determined with X-ray crystallography? No.
the standard errors in distances determined by protein X-ray crystallography are 0.1–0.3 times the resolution. For a typical 2.0 Å X-ray structure of a protein, the standard errors in the distances are ±0.2–0.6 Å, precluding the distinction between short, strong and normal, weak hydrogen bonds. 
[Aside: I also wonder whether the fact that X-ray crystal structures are refined with classical molecular dynamics using force fields that are parametrised for weak bonds is also a problem. Such refinements will naturally bias towards weak bonds, i.e., the longer bond lengths that are common in proteins. I welcome comment on this.]

The authors then discuss how NMR can be used for bond length determinations. One of these NMR "rulers" involves isotopic fractionation, where one measures how much the relevant protons exchange with deuterium in the solvent.

Essentially, the relative fraction [ratio of concentrations] in thermodynamic equilibrium,

is determined by the relative zero-point energy (ZPE) of a D relative to an H in the enzyme. As described in a key JACS article the ratio is given by a formula such as
where T is the temperature.
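In sketch form [my reconstruction of the standard expressions; the exact form in the JACS paper may differ], the fractionation factor phi is the equilibrium D/H ratio at the enzyme site relative to the solvent, and in the zero-point-energy (harmonic) approximation it becomes

```latex
\phi \;=\; \frac{\left([\mathrm{D}]/[\mathrm{H}]\right)_{\text{enzyme}}}{\left([\mathrm{D}]/[\mathrm{H}]\right)_{\text{solvent}}}
\;\simeq\;
\exp\!\left(-\,\frac{\sum_i \left[\,h\left(\nu_{i,\mathrm{D}}^{\text{enz}}-\nu_{i,\mathrm{H}}^{\text{enz}}\right)-h\left(\nu_{i,\mathrm{D}}^{\text{solv}}-\nu_{i,\mathrm{H}}^{\text{solv}}\right)\right]}{2k_B T}\right)
```

where the sum runs over the vibrational modes of the exchangeable proton, each mode contributing a zero-point energy of h nu/2.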

If Planck's constant were zero, this ratio would always be one. It would also be one if there were no change in the vibrational frequencies of the H/D when they move from the solvent to the enzyme. Generally, as the H-bond strengthens [R gets shorter] the frequency change gets larger and so the difference between H/D gets larger [see this preprint for an extensive discussion], and phi gets smaller. However, for very short bonds the frequencies harden and phi will get larger, i.e. there will be a non-monotonic dependence on R, the distance between the donor and acceptor. This was highlighted in an extensive review which contains the following sketch.
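The trend just described can be made concrete with a minimal harmonic-oscillator estimate. This is my own sketch, not the parametrisation used by Harris, Zhao, and Mildvan: it keeps only the O-H stretch, takes the O-D frequency as nu_H/sqrt(2), and uses illustrative frequencies.

```python
import math

H = 6.626e-34   # Planck's constant (J s)
KB = 1.381e-23  # Boltzmann constant (J/K)
C = 2.998e10    # speed of light (cm/s), to convert cm^-1 to Hz

def phi(nu_site_H, nu_solvent_H, T=300.0):
    """Fractionation factor in the harmonic, zero-point-energy-only approximation.

    nu_site_H, nu_solvent_H: O-H stretch frequencies (cm^-1) at the enzyme
    site and in bulk water. The O-D frequency is taken as nu_H / sqrt(2)
    (reduced-mass ratio for a harmonic stretch).
    """
    def zpe_gap(nu_H_cm):
        nu_H = nu_H_cm * C                 # Hz
        nu_D = nu_H / math.sqrt(2.0)
        return 0.5 * H * (nu_D - nu_H)     # ZPE(D) - ZPE(H), negative

    dE = zpe_gap(nu_site_H) - zpe_gap(nu_solvent_H)
    return math.exp(-dE / (KB * T))

# A strong (short) H-bond softens the O-H stretch relative to bulk water
# (~3400 cm^-1), so D preferentially stays in the solvent and phi < 1.
print(phi(2500.0, 3400.0))  # softened stretch: phi below one
print(phi(3400.0, 3400.0))  # no frequency change: phi equals one
```

If Planck's constant were zero, dE would vanish and phi would be one, as stated above.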

Harris, Zhao, and Mildvan consider a particular parametrisation of the H-bond potential to connect the observed fractionation ratio with bond lengths in a range of proteins. They generally find reasonable agreement with other methods of determining the length [e.g., NMR chemical shift]. In particular, the precision is much better than that of X-ray crystallography.

Tuesday, April 15, 2014

Roaming: a distinctly new dynamic mechanism for chemical reactions

Until recently, it was thought that the dynamics of breaking a chemical bond could occur via one of two mechanisms. The first is simply that one stretches a single bond until the relevant atoms are a long way apart. The second mechanism is via a transition state [a saddle point on a potential energy surface], where the geometry of the molecule is rearranged so that it is "half way" to the products of the chemical reaction. The energy of the transition state relative to the reactants determines the activation energy of the reaction. Transition state theory establishes this connection. Catalysts work by lowering the energy of the transition state. Enzymes work brilliantly because they are particularly good at lowering this energy barrier. An earlier post considered the controversial issue of whether it is necessary to go beyond transition state theory to explain some enzyme dynamics.
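The connection between barrier height and rate can be illustrated with the Eyring form of transition state theory. This is a minimal sketch with illustrative numbers (the prefactor and barrier values are my assumptions, not from the article):

```python
import math

KB = 1.381e-23  # Boltzmann constant (J/K)
H = 6.626e-34   # Planck's constant (J s)
R = 8.314       # gas constant (J/mol/K)

def tst_rate(barrier_kj_mol, T=300.0):
    """Transition-state-theory rate constant (s^-1), Eyring form:
    k = (kB*T/h) * exp(-dG / (R*T)), where dG is the free-energy
    barrier from reactants to the transition state."""
    return (KB * T / H) * math.exp(-barrier_kj_mol * 1e3 / (R * T))

# Lowering the barrier by ~34 kJ/mol, as a good catalyst or enzyme might,
# speeds the reaction up by roughly a factor of a million at room temperature.
k_uncat = tst_rate(100.0)
k_cat = tst_rate(66.0)
print(k_cat / k_uncat)
```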

I have been struggling through an interesting Physics Today article Roaming reactions: the third way by Joel Bowman and Arthur Suits.

What is roaming?
It is a large-amplitude trajectory on the potential energy surface that begins as a bond stretch; instead of dissociating, the nearly-free fragment wanders ("roams") around the rest of the molecule before reacting.
It is best illustrated by watching videos such as this one which is an animation of a picture similar to that below.

What are the experimental signatures of roaming?
The key is to be able to resolve the distribution of the energy and angular momentum of the product molecules. It seems that proceeding via a conventional transition state puts severe constraints on these distributions.

I was very happy that this week the UQ Chemistry seminar was given by Scott Kable who has pioneered recent experimental studies of roaming.
An interesting anecdote is that Scott's 2006 PNAS paper about roaming is actually based on data he took when he was a postdoc in 1990. He never published it because he did not understand it. I wonder how often this happens in science. Years ago, I wrote a short post arguing that experimentalists should not have to be able to explain their data in order to publish it.

Scott's work involves a nice experiment-theory collaboration with Meredith Jordan, who has calculated the relevant potential energy surfaces that are used in the dynamical calculations. Without the calculations it would be hard to definitively establish that roaming was an actual mechanism.

There has been an important unanticipated consequence of this research. It may have solved a long-standing mystery in atmospheric chemistry: the origin of organic acids in the troposphere. [See this Science paper]. This is a nice example of how basic "pure" research can lead to solutions to "applied" problems.

Roaming has now been clearly established in several gas phase reactions for small molecules.
An outstanding question is whether roaming can play a significant role in condensed phase reactions. It may be that molecular collisions will mess up the roaming trajectory. But, it may be just a matter of relative timescales. I look forward to seeing how all this develops.

Friday, April 11, 2014

How 5 years of blogging has changed me

Last month marked the 5 year anniversary of this blog. My first post was a tribute to Walter Kauzmann. In hindsight, after almost 1500 posts, I think that was a fitting beginning. Kauzmann represented many of the themes of the blog: careful and thorough scholarship, theory closely connected to experiment, simple understanding before computation, hydrogen bonding, fruitful interaction between chemistry and physics, ….

Reflecting on this anniversary I realised that writing the blog has had a significant influence on me. Writing posts forces one to be more reflective. I think I have a greater appreciation of
  • good science: solid and reproducible, influential, ...
  • how important it is to do good science, rather than just publishing papers
  • how hard it is to do good science
  • how, today, the practice of science is increasingly broken
  • the bleak long-term job prospects for most young people in science
  • the danger and limitations of metrics for measuring research productivity and impact
  • the importance of simple models and physical pictures
  • diabatic states as a powerful conceptual, model building, and computational tool in chemistry
  • the importance of Dynamical Mean-Field Theory (DMFT)
  • bad metals as a unifying concept for strongly correlated metals
I thank all my readers, and particularly those who write comments.
I greatly value the feedback.
I do want to see more comments and discussion!

Tuesday, April 8, 2014

What role does reasoning by analogy have in science?

Two weeks ago I went to an interesting history seminar by Dalia Nassar that considered a debate between the philosopher Immanuel Kant and his former student Johann Gottfried von Herder.
Kant considered that thinking by analogy had no role in science whereas Herder considered it did. Apparently, for this reason Kant thought that biology [natural history] could never be a real science. Thinking objects were fundamentally different from non-thinking objects.

One of the reasons I like going to these seminars is that they stimulate my thinking in new directions. For example, a seminar last year helped me understand that one of my "problems" is that I view science as a vocation rather than a career, perhaps in the tradition of Robert Boyle and the Christian virtuoso.

After the seminar I had a brief discussion with some of my history colleagues about what scientists today think about analogy. I think it plays a very important role, because it can help us understand new systems and phenomena in terms of things we already understand. But where people sometimes come unstuck is when they start to assume that the analogy is reality or the complete picture. Here are a few important historical examples.

   * Electromagnetic radiation. The analogy of light waves with sound and water waves helped. But it went awry when people thought there must be a medium, i.e. the aether.

  * Quantum mechanics. Particles and waves. Again the analogy helped understand interference and quantisation of energy levels. But I also think that pushing the partial analogies with classical mechanics and classical waves too hard is the source of some of the confusion about quantum measurement and the quantum-classical crossover.

  * Quantum field theory and many-particle physics. Feynman diagrams, path integrals, renormalisation, symmetry breaking, Higgs boson,…. there is a lot of healthy cross-fertilisation.

 * Imaginary time quantum theory and classical statistical mechanics. Path integral = Partition function.

Coincidentally, yesterday when I was in the library [yes, the real physical library not the virtual one!] trying to track down Wigner's quote I stumbled across a 1993 Physics Today review by Tony Leggett of Grigory Volovik's book Exotic properties of superfluid 3He. Leggett expresses his reservations about analogies.
As to the correspondences with particle physics, being the kind of philistine who does not feel that, for example, his understanding of the Bloch equations of nmr is particularly improved by being told that they are a consequence of Berry's phase, I have to confess to greeting the news that the "spin-orbit waves" of 3He-A are the analog of the W boson and the "clapping" modes the analog of the graviton with less than overwhelming excitement. These analogies no doubt display a certain virtuosity, but it is not clear that they actually help our concrete understanding of either the condensed matter or the particle-physics problems very much, especially when they have to be qualified as heavily as is done here.
What do you think? Does analogy have an important role to play? When does it cause problems?

Monday, April 7, 2014

Giant polarisability of low-barrier hydrogen bonds

An outstanding puzzle concerning simple acids and bases is their very broad infrared absorption, as highlighted in this earlier post. The first to highlight this problem was Georg Zundel. His solution involved two important new ideas:
  • the stability of H5O2+ in liquid water, [the Zundel cation]
  • that such complexes involving shared protons via hydrogen bonding have a giant electric polarisability, several orders of magnitude larger than typical molecules.
Both ideas remain controversial. A consequence of the second is that the coupling of the complex to electric field fluctuations of the surrounding solvent will produce a large range of vibrational energies, leading to the continuous absorption.

Later I will discuss the relative merits of Zundel's explanation. Here I just want to focus on understanding the essential physics behind the claimed giant polarisability. The key paper appears to be a 1972 JACS

Extremely high polarizability of hydrogen bonds
R. Janoschek , E. G. Weidemann , H. Pfeiffer , G. Zundel

[The essential physics seems to be in a 1970 paper I don't have electronic access to].
If one considers the one-dimensional potential for proton transfer within a Zundel cation with O-O distance of 2.5 Angstroms it looks like the double well potential below.

The two lowest vibrational eigenstates are separated in energy by a small tunnel splitting omega of the order of 10-100 cm^-1. These two states can be viewed as symmetric and anti-symmetric combinations of oscillator states approximately localised in the two wells. The transition dipole moment p between these two states is approximately the well separation [here roughly 0.5 Angstroms] times the proton charge.

At zero temperature the electric polarisability is approximately p^2/omega. Omega is at least an order of magnitude smaller than a typical bond stretching frequency and p^2 can be an order of magnitude larger than for a typical covalent bond.
Hence, the polarisability can be orders of magnitude larger than that for a typical molecule.
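The p^2/omega estimate follows from second-order perturbation theory for a two-level system. In sketch form [my notation: tunnel splitting hbar omega, transition dipole p, applied field E]:

```latex
\hat{H} = \frac{\hbar\omega}{2}\,\hat{\sigma}_z - p\,\mathcal{E}\,\hat{\sigma}_x
\quad\Longrightarrow\quad
E_0(\mathcal{E}) \simeq -\frac{\hbar\omega}{2} - \frac{p^2\mathcal{E}^2}{\hbar\omega}
\quad\Longrightarrow\quad
\alpha \equiv -\frac{\partial^2 E_0}{\partial \mathcal{E}^2} = \frac{2p^2}{\hbar\omega}
```

The small denominator (the tunnel splitting) and the large numerator (the large transition dipole of the shared proton) together produce the giant polarisability.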

A few consequences of this picture are that the polarisability will vary significantly with
  • isotopic [H/D] substitution
  • temperature (on the scale of the tunnel splitting)
  • the donor acceptor distance.
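Plugging in the numbers quoted above makes the "giant" claim, and its temperature dependence, concrete. This is my own sketch of the two-level estimate, using a finite-temperature polarisability alpha(T) = (2 p^2/dE) tanh(dE/2 kB T), which reduces to 2p^2/dE at zero temperature; the splitting and separation are the illustrative values from the text.

```python
import math

HBAR = 1.055e-34      # reduced Planck constant (J s)
KB = 1.381e-23        # Boltzmann constant (J/K)
E_CHARGE = 1.602e-19  # proton charge (C)
C = 2.998e10          # speed of light (cm/s)
FOUR_PI_EPS0 = 1.113e-10  # 4*pi*epsilon_0 (C^2 J^-1 m^-1)

def zundel_polarisability(splitting_cm=50.0, separation_ang=0.5, T=10.0):
    """Polarisability volume (Angstrom^3) of the two-level double-well model:
    alpha(T) = (2 p^2 / dE) * tanh(dE / (2 kB T)),
    with transition dipole p = e * (well separation) and tunnel splitting dE."""
    dE = 2.0 * math.pi * HBAR * C * splitting_cm   # h c nu, in J
    p = E_CHARGE * separation_ang * 1e-10          # C m
    alpha = (2.0 * p**2 / dE) * math.tanh(dE / (2.0 * KB * T))
    return alpha / FOUR_PI_EPS0 * 1e30             # polarisability volume, Angstrom^3

# Typical molecular polarisability volumes are ~1-10 Angstrom^3; the shared
# proton gives values orders of magnitude larger, and the value falls as the
# temperature rises past the tunnel-splitting scale (50 cm^-1 ~ 70 K).
print(zundel_polarisability(T=10.0))
print(zundel_polarisability(T=300.0))
```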

Did Wigner actually say this?

It is folklore that Eugene Wigner said
"It is nice to know that the computer understands the problem. But I would like to understand it too."
But did he actually say it? Where and when?

I have been trying to track it down. The earliest reference I can find is in the beginning of Chapter 5 of a 1992 book by Nussenzweig, which just says it is attributed to Wigner.

It is a great quote, so it would be nice to know that Wigner actually said it. I welcome any further information.

Friday, April 4, 2014

The grand challenge of wood stoves and the rural poor

Today I went to a very interesting talk, presented by the UQ Energy Initiative, by Gautam Yadama.
He described the "Wicked problem" of the use of wood stoves by the rural poor in the Majority world.

This causes a multitude of problems including deforestation, climate change, household pollution, disability due to respiratory problems,…. Yet solutions are elusive, particularly because of poverty, cultural obstacles, gender inequality, technical problems, …. In particular, previous "top down" "solutions" such as the wide-scale free distribution of 35 million gas stoves by the Indian government in the 70s and 80s [largely funded by the World Bank] have been complete failures. He described his multi-disciplinary research involving social scientists, engineers, and medical experts. Yadama emphasized the importance of community involvement and programs that are "evidence based" using randomised trials [similar to those featured in Poor Economics].

Yadama has just published a "coffee table" book Fires, Fuel, and the Fate of 3 Billion that describes the problem and features striking photos that capture some of its human dimension.