Monday, November 10, 2025

Why is the state of universities such an emotional issue for me?

It is all about values!

Universities have changed dramatically over the course of my lifetime. Australian universities are receiving increasing media attention due to failures in management and governance. But there is a lot more to the story, particularly at the grassroots level of the everyday experience of students and faculty. It is all about the four M's: management, marketing, metrics, and money. Learning, understanding, and discovering things for their own sake are alien and marginalised. I have stopped writing posts about this. So why come back to it?

I am often struck by how emotional this issue is for me and how hard it sometimes is to talk about, particularly with those whose views differ from mine. Writing blog posts (e.g., this one) about it has been a somewhat constructive outlet, rather than exploding in anger at an overpaid and unqualified "manager" or one of their many multiplying minions.

A few weeks ago, I listened to three public lectures by the Australian historian Peter Harrison. [He is my former UQ colleague. We are now both Emeritus. I benefited from excellent seminars he ran at UQ, some of which I blogged about].

The lectures helped me understand what has happened to universities and also why it is a sensitive subject for me. Briefly, it is all about values and virtues.

The lectures are nicely summarised by Peter in the short article, 

How our universities became disenchanted: Secularisation, bureaucracy and the erosion of value

Reading the article rather than this blog post is recommended. I won't try to summarise it, but rather highlight a few points and then make some peripheral commentary.

I agree with Peter's descriptions of the problems we see on the surface (bureaucracy, metrics, and management feature significantly). His lectures are a much deeper analysis of underlying cultural changes and shifting worldviews that have occurred over centuries, leading universities to evolve into their current mangled form.

A few things to clarify to avoid potential misunderstanding of Peter's arguments.

Secularisation is defined broadly. It does not just refer to the decline in the public influence of Christianity in the Western world. It is also about Greek philosophy, particularly Aristotle, and the associated emphasis on virtues and transcendence. Peter states:

"The intrinsic motivations of teachers, researchers and scholars can be understood in terms of virtues or duties. According to virtue ethics, the “good” of an activity is related to the way it leads to a cultivation and expression of particular virtues. These, in turn, are related to a particular conception of natural human ends or goals. (Aristotle’s understanding of human nature, which informs virtue ethics, proposes that human beings are naturally oriented towards knowledge, and that they are fulfilled as persons to the extent that they pursue those goals and develop the requisite intellectual virtues.)"

The virtue ethics of Aristotle [and Alasdair MacIntyre] conflicts with competing ethical visions, including duty-oriented (deontological) ethics, consequentialist ethics, and particularly utilitarianism. This led to a shift away from intrinsic goods to what things are "good for", i.e., what practical outcomes they produce. For example, is scientific research "good" and does it have "value" because it cultivates curiosity, awe, and wonder, or because it will lead to technology that will stimulate economic growth?

Peter draws significantly on Max Weber's ideas about secularisation, institutions, and authority. Weber argued that a natural consequence of secularisation was disenchantment (the loss of magic in the world). This is not simply "people believe in science rather than magic". Disenchantment is a loss of a sense of awe, wonder, and mystery.

Now, a few peripheral responses to the lectures.

Is secularisation the dominant force that has created these problems for universities? In question time, Peter was asked whether capitalism was more important: that is, are universities treated as businesses and students as customers? He agreed that capitalism is a factor, but also pointed out how Weber emphasised that capitalism was connected to the secularising effects of the Protestant Reformation.

I think that two other factors to consider are egalitarianism and opportunism. These flow from universities being "victims" of their own success. Similar issues may also be relevant to private schools, hospitals, and charities. They have often been founded by people of "charisma" [in the sense used by Weber] motivated by virtue ethics. Founders were not concerned with power, status, or money. What they were doing had intrinsic value to them and was "virtuous". In the early stages, these institutions attracted people with similar ideals. The associated energy, creativity, and common vision led to "success." Students learnt things, patients were healed, and poverty was alleviated. But this success attracted attention, and the institution then had power, money, status, and influence.

The opportunists then move in. They are attracted to the potential to share in the power, money, status, and influence. The institution then takes on a life of its own, and the ideals and virtue ethics of the founders are squeezed out. In some sense, opportunism might be argued to be a consequence of secularisation. 

[Aside: two old posts considered a similar evolution, motivated by a classic article about the development of businesses.]

One indicator of the "success" of universities is how their graduates join the elite and hold significant influence in society. [Aside: this ignores the problem of distinguishing correlation and causality. Do universities actually train students well, or just select those who would succeed anyway?] Before (around) 1960, (mostly) only the children of the elite got to attend university. Demands arose that more people should have access to this privilege. This led to "massification" and an explosion in the number of students, courses, and institutions. This continues today, globally. Associated with this was more bureaucracy. Furthermore, the "iron triangle" of cost, access, and quality presents a challenge for this egalitarianism: if access increases, cost rises and quality falls, unless you spend even more. It is wonderful that universities have become more diverse and accessible. On the other hand, I fear that for every underprivileged student admitted whose mind is expanded and life enriched, many more rich, lazy, and entitled students suck the life out of the system.

Metrics are pseudo-rational

Peter rightly discussed how the proliferation of metrics to measure value is problematic and reflects the "rationalisation" associated with bureaucracy (described by Weber). Even if one embraces the idea that "rational" and "objective" assessment is desirable, my observation is that in practice, metrics are invariably used in an irrational way. For example, managers look at the impact factor of journals, but are blissfully oblivious to the fact that the citation distribution for any journal is so broad and long-tailed that the mean is close to meaningless. The underlying problem is that too many of the people doing assessments suffer from some mixture of busyness, intellectual laziness, and arrogance. Too many managers are power-hungry and want to make the decisions themselves, and do not trust the faculty who may actually understand the intellectual merits and weaknesses of the work being assessed.
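To illustrate why the mean of a heavy-tailed distribution describes almost no individual paper, here is a small sketch using synthetic citation counts drawn from a lognormal distribution (a common stand-in for citation data; the parameters are hypothetical, not fitted to any real journal):

```python
import numpy as np

# Synthetic "journal": 10,000 papers with lognormally distributed citation counts.
# The parameters (mean=0, sigma=1.5) are illustrative only.
rng = np.random.default_rng(0)
citations = rng.lognormal(mean=0.0, sigma=1.5, size=10_000)

print(f"mean   = {citations.mean():.1f}")
print(f"median = {np.median(citations):.1f}")
print(f"share of papers below the mean = {(citations < citations.mean()).mean():.0%}")
```

With these (hypothetical) parameters, roughly three-quarters of papers sit below the mean, which is the point: an impact factor is a mean, and a mean of a long-tailed distribution is dominated by a few outliers.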

The problems are just as great for the sciences as the humanities

On the surface, the humanities are doing worse than the sciences, for example, in declining student numbers, threats of job cuts, political criticism, and status within the university. This is because science is associated with technology, which is associated with jobs and economic growth. However, if you look at pure science that is driven by curiosity, awe, and wonder, then there is reason for concern. There is an aversion to attacking difficult and risky problems, particularly those that require long-term investment or have been around for a while. The emphasis is on low-hanging fruit and the latest fashion. Almost all physics and chemistry research is framed in terms of potential applications, not fundamental understanding. Sometimes I feel some of my colleagues are doing engineering, not physics. In a similar vein, biochemists frame research in terms of biomedical applications, not the beauty and wonder of how biological systems work.

Are universities destined for bureaucratic self-destruction?

Provocatively, Peter considered the potential implications of the arguments of historian and anthropologist Joseph Tainter concerning the collapse of complex societies. On the technical side, this reminded me of a famous result in ecology by Robert May, that as the complexity of a system (the number of components and interactions) increases, it can become unstable.
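May's result can be seen in a few lines of code. A randomly assembled "ecosystem" of N species, each self-regulating (diagonal entries -1), with interactions of strength sigma present with probability (connectance) C, is almost surely stable when sigma·sqrt(NC) < 1 and almost surely unstable above that threshold. A minimal sketch (parameter values are illustrative):

```python
import numpy as np

def max_real_eigenvalue(N, C, sigma, seed=0):
    """Largest real part of the eigenvalues of a random community matrix."""
    rng = np.random.default_rng(seed)
    # Interactions: Gaussian of width sigma, present with probability C.
    A = rng.normal(0.0, sigma, size=(N, N)) * (rng.random((N, N)) < C)
    np.fill_diagonal(A, -1.0)  # each species damps its own fluctuations
    return np.linalg.eigvals(A).real.max()

N, C = 250, 0.2
print(max_real_eigenvalue(N, C, sigma=0.05))  # sigma*sqrt(NC) ~ 0.35: stable (negative)
print(max_real_eigenvalue(N, C, sigma=0.30))  # sigma*sqrt(NC) ~ 2.1: unstable (positive)
```

The underlying mathematics is the circular law for random matrices: the eigenvalues fill a disc of radius sigma·sqrt(NC) centred at -1, so once the disc crosses zero the system is unstable.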

I don't think universities as institutions will collapse. They are too integrated into the fabric of modern capitalism. What may collapse is the production of well-educated (in the Renaissance sense) graduates and research that is beautiful, original, and awe-inspiring. This leads naturally into the following question.

Is the age of great discoveries over?

Peter briefly raised this issue. On the one hand, we are victims of our own success. It is amazing how much we now know and understand. Hence, it is harder to discover truly new and amazing things. On the other hand, because of emergence we should expect surprises.

There is hope on the margins

Peter did not just lament the current situation but made some concrete suggestions for addressing the problems, even though we are trapped in Weber's "iron cage" of bureaucracy.

  • Re-balancing the structures of authority
  • Finding a place for values discourse in the universities
  • Developing ways of resolving differences with Alasdair MacIntyre's sense of rationality in mind
On the first, I note the encouraging work of the ANU Governance Project.

Peter also encouraged people to work on the margins. I also think that this is where the most significant scholarship and stimulus for reform will happen. A nice example is the story that Malcolm Gladwell tells in a podcast episode, The Obscure Virus Club.




Monday, November 3, 2025

Overdoped cuprates are not Fermi liquids

They are anisotropic marginal Fermi liquids.

A commenter on my recent AI blog post mentioned the following preprint, with a very different point of view.

Superconductivity in overdoped cuprates can be understood from a BCS perspective!

B.J. Ramshaw, Steven A. Kivelson

The authors claim:

" a theoretical understanding of the "essential physics" is achievable in terms of a conventional Fermi-liquid treatment of the normal state...

...observed features of the overdoped materials that are inconsistent with this perspective can be attributed to the expected effects of the intrinsic disorder associated with most of the materials being solid state solutions"

On the latter point, they mention two papers that found the resistivity versus temperature can have a linear component. But there is much more.

The authors appear unaware of the experimental data and detailed theoretical analysis showing that the overdoped cuprates are anisotropic marginal Fermi liquids. 

Angle-dependent magnetoresistance measurements by Nigel Hussey's group, reported in 2006, were consistent with a Fermi surface anisotropy in the scattering rate.

Papers in 2011 and 2012 pushed the analysis further.

Consistent Description of the Metallic Phase of Overdoped Cuprate Superconductors as an Anisotropic Marginal Fermi Liquid, J. Kokalj and Ross H. McKenzie

Transport properties of the metallic state of overdoped cuprate superconductors from an anisotropic marginal Fermi liquid model, J. Kokalj, N. E. Hussey, and Ross H. McKenzie 

The self-energy is the sum of two terms with characteristic dependencies on temperature, frequency, location on the Fermi surface, and doping. The first term is isotropic over the Fermi surface, independent of doping, and has the frequency and temperature dependence characteristic of a Fermi liquid. 

The second term is anisotropic over the Fermi surface (vanishing at the same points as the superconducting energy gap), strongly varies with doping (scaling roughly with Tc, the superconducting transition temperature), and has the frequency and temperature dependence characteristic of a marginal Fermi liquid.
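Schematically, the two contributions described above can be written as follows. This is a sketch consistent with the verbal description, not the fitted form from the papers; the coefficients a and b are placeholders:

```latex
-\mathrm{Im}\,\Sigma(\phi,\omega,T) \;\approx\;
\underbrace{a\left[\omega^{2} + (\pi k_{B}T)^{2}\right]}_{\text{isotropic, Fermi liquid}}
\;+\;
\underbrace{b\,\cos^{2}(2\phi)\,\sqrt{\omega^{2} + (\pi k_{B}T)^{2}}}_{\text{anisotropic, marginal Fermi liquid}}
```

Here φ is the angle around the Fermi surface; the factor cos²(2φ) vanishes at the same (nodal) points as the d-wave superconducting gap, and b scales roughly with Tc.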

The first paper showed that this self-energy can describe a range of experimental data including angle-dependent magnetoresistance and quasiparticle renormalizations determined from specific heat, quantum oscillations, and angle-resolved photoemission spectroscopy. 

The second paper showed, without introducing new parameters and neglecting vertex corrections, that this model self-energy can give a quantitative description of the temperature and doping dependence of a range of reported transport properties of Tl2Ba2CuO6+δ samples. These include the intralayer resistivity, the frequency-dependent optical conductivity, the intralayer magnetoresistance, and the Hall coefficient. The temperature dependences of the latter two are particularly sensitive to the anisotropy of the scattering rate and to the shape of the Fermi surface.

For a summary of all of this, see slides from a talk I gave at Stanford back in 2013.

I am curious whether the authors can explain the anisotropic part of the self-energy in terms of disorder in samples.

Wednesday, October 29, 2025

Rodney Baxter (1940-2025): Mathematical Physicist

I recently learnt that Rodney Baxter died earlier this year. He was adept at finding exact solutions to two-dimensional lattice models in statistical mechanics. He had a remarkably low public profile. But, during my lifetime, he was one of the Australian-based researchers who made the most significant and unique contributions to physics, broadly defined. Evidence of this is the list of international awards he received.

On Baxter's scientific achievements, see the obituary from the ANU, and earlier testimonials from Barry McCoy in 2000, and by Vladimir Bazhanov on the award of the Henri Poincaré Prize to Baxter in 2021.

Exact solutions of "toy models" are important in understanding emergent phenomena. Before Onsager found an exact solution to the two-dimensional Ising model in 1944, there was debate about whether statistical mechanics could describe phase transitions and the associated discontinuities and singularities in thermodynamic quantities. 

Exact solutions provide benchmarks for approximation schemes and computational methods. They have also guided and elucidated key developments such as scaling, universality, the renormalisation group and conformal field theory.

Exact solutions guided Haldane's development of the Luttinger liquid and our understanding of the Kondo problem.

I mention the specific significance of a few of Baxter's solutions. His Exact solution of the eight-vertex model in 1972 gave continuously varying critical exponents that depended on the interaction strength in the model. This surprised many because it seemed to be against the hypothesis of the universality of critical exponents. This was later reconciled in terms of connections to the Berezinskii-Kosterlitz-Thouless (BKT) phase transition, which was discovered at the same time. I am not sure who explicitly resolved this.

It might be argued that Baxter independently discovered the BKT transition. For example, consider the abstract of a 1973 paper, Spontaneous staggered polarization of the F-model

"The “order parameter” of the two-dimensional F-model, namely the spontaneous staggered polarization P0, is derived exactly. At the critical temperature P0 has an essential singularity, both P0 and all its derivatives with respect to temperature vanishing."

Following earlier work by Lieb, Baxter explored the connection of two-dimensional classical models with one-dimensional quantum lattice models. For example, the solution of the XYZ quantum spin chain is related to the eight-vertex model. Central to this is the Yang-Baxter equation. Alexander B. Zamolodchikov connected this to integrable quantum field theories in 1+1 dimensions. [Aside: the Yang is C.N. Yang, of Yang-Mills and Yang-Lee fame, who died last week.]

Baxter's work had completely unanticipated consequences beyond physics. Mathematicians discovered profound connections between his exact solutions and the theory of knots, number theory, and elliptic functions. It also stimulated the development of quantum groups.

I give two personal anecdotes on my own interactions with Baxter. I was an undergraduate at the ANU from 1979 to 1982. This meant I was completely separated from the half of the university known as the Institute for Advanced Studies (IAS), where Baxter worked. Faculty in the IAS did no teaching, did not have to apply for external grants, and had considerable academic freedom. Most Ph.D. students were in the IAS. By today's standards, the IAS was a cushy deal, particularly if faculty did not get involved in internal politics. As an undergraduate, I really enjoyed my courses on thermodynamics, statistical mechanics, and pure mathematics. My honours supervisor, Hans Buchdahl, suggested that I talk to Baxter about possibly doing a Ph.D. with him. I found him quiet, unassuming, and unambitious. He had only supervised a few students. He wisely cautioned me that Ph.D. students might not be involved in finding exact solutions but might just be comparing exact results to series expansions.

In 1987, when I was a graduate student at Princeton, Baxter visited, hosted by Elliott Lieb, and gave a Mathematical Physics Seminar. This visit was just after he received the Dannie Heineman Prize for Mathematical Physics from the American Physical Society. These seminars generally had a small audience, mostly people in the Mathematical Physics group. However, for Baxter, many string theorists (Witten, Callan, Gross, Harvey, ...) attended. They had a lot of questions for Baxter. But, from my vague recollection, he struggled to answer them, partly because he wasn't familiar with the language of quantum field theory.

I was told that he got nice job offers from the USA. He could have earned more money and achieved a higher status. For personal reasons, he turned down the offer of a Royal Society Research Professorship at Cambridge.  But he seemed content puttering away in Australia. He just loved solving models and enjoyed family life down under.

Baxter wrote a short autobiography, An Accidental Academic. He began his career and made his big discoveries in a different era in Australian universities. The ANU had generous and guaranteed funding. Staff had the freedom to pursue curiosity-driven research on difficult problems that might take years to solve. There was little concern with the obsessions of today: money, metrics, management, and marketing. It is wonderful that Baxter was able to do what he did. It is striking that he says he retired early so he would not have to start making grant applications!

Saturday, October 25, 2025

Can AI solve quantum-many body problems?

I find it difficult to wade through all the hype about AI, along with the anecdotes about its failures to reliably answer basic questions.

Gerard Milburn kindly brought to my attention a nice paper that systematically addresses whether AI is useful as an aid (research assistant) for solving basic (but difficult) problems that condensed matter theorists care about.

CMT-Benchmark: A Benchmark for Condensed Matter Theory Built by Expert Researchers

The abstract is below.

My only comment is one of perspective. Is the cup half full or half empty? Do we emphasise the failures or the successes?

The optimists among us will claim that the success in solving a number of these difficult problems shows the power and potential of AI. It is just a matter of time before LLMs can solve most of these problems, and we will see dramatic increases in research productivity (e.g., a reduction in the time taken to complete a project).

The pessimists and the sceptically inclined will claim that the failures highlight the limitations of AI, particularly when training data sets are small. We are still a long way from replacing graduate students with AI bots (or at least from using AI to train students in the first year of their PhD).

What do you think? Should this study lead to optimism, pessimism, or just wait and see?

----------

Large language models (LLMs) have shown remarkable progress in coding and math problem-solving, but evaluation on advanced research-level problems in hard sciences remains scarce. To fill this gap, we present CMT-Benchmark, a dataset of 50 problems covering condensed matter theory (CMT) at the level of an expert researcher. Topics span analytical and computational approaches in quantum many-body, and classical statistical mechanics. The dataset was designed and verified by a panel of expert researchers from around the world. We built the dataset through a collaborative environment that challenges the panel to write and refine problems they would want a research assistant to solve, including Hartree-Fock, exact diagonalization, quantum/variational Monte Carlo, density matrix renormalization group (DMRG), quantum/classical statistical mechanics, and model building. We evaluate LLMs by programmatically checking solutions against expert-supplied ground truth. We developed machine-grading, including symbolic handling of non-commuting operators via normal ordering. They generalize across tasks too. Our evaluations show that frontier models struggle with all of the problems in the dataset, highlighting a gap in the physical reasoning skills of current LLMs. Notably, experts identified strategies for creating increasingly difficult problems by interacting with the LLMs and exploiting common failure modes. The best model, GPT5, solves 30% of the problems; average across 17 models (GPT, Gemini, Claude, DeepSeek, Llama) is 11.4±2.1%. Moreover, 18 problems are solved by none of the 17 models, and 26 by at most one. These unsolved problems span Quantum Monte Carlo, Variational Monte Carlo, and DMRG. Answers sometimes violate fundamental symmetries or have unphysical scaling dimensions. We believe this benchmark will guide development toward capable AI research assistants and tutors.

Monday, October 20, 2025

Undergraduates need to learn about the Ising model

A typical undergraduate course on statistical mechanics is arguably misleading because (unintentionally) it does not tell students several important things (related to one another).

Statistical mechanics is not just about how to calculate thermodynamic properties of a collection of non-interacting particles.

A hundred years ago, many physicists did not believe that statistical mechanics could describe phase transitions. Arguably, this lingering doubt only ended fifty years ago with Wilson's development of renormalisation group theory.

It is about emergence: how microscopic properties are related to macroscopic properties.

Leo Kadanoff commented, "Starting around 1925, a change occurred: With the work of Ising, statistical mechanics began to be used to describe the behaviour of many particles at once."

When I came to UQ 25 years ago, I taught PHYS3020 Statistical Mechanics a couple of times. To my shame, I never discussed the Ising model. There is a nice section on it in the course textbook, Thermal Physics: An Introduction, by Daniel Schroeder. I guess I did not think there was time to "fit it in" and back then, I did not appreciate how important the Ising model is. This was a mistake.

Things have changed for the better due to my colleagues Peter Jacobson and Karen Kheruntsyan. They now include one lecture on the model, and students complete a computational assignment in which they write a Monte Carlo code to simulate the model.
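For readers curious about the kind of code involved, here is a minimal Metropolis Monte Carlo sketch for the 2D Ising model on a periodic L x L square lattice (J = 1, k_B = 1). This is my own illustrative sketch, not the actual assignment code; the lattice size and sweep counts are arbitrary:

```python
import numpy as np

def sweep(spins, beta, rng):
    """One Monte Carlo sweep: L*L single-spin-flip Metropolis attempts."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Sum of the four nearest neighbours (periodic boundaries).
        nn = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nn  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

def magnetisation(L=16, T=1.5, sweeps=200, seed=1):
    """Absolute magnetisation per spin after equilibration at temperature T."""
    rng = np.random.default_rng(seed)
    spins = np.ones((L, L), dtype=int)  # start fully aligned
    for _ in range(sweeps):
        sweep(spins, 1.0 / T, rng)
    return abs(spins.mean())

print(magnetisation(T=1.5))  # well below Tc ~ 2.27: large, near 1
print(magnetisation(T=4.0))  # well above Tc: small, near 0
```

Even this crude simulation shows the phase transition: the magnetisation is close to 1 below the critical temperature Tc ≈ 2.27 and close to 0 above it.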

This year, I am giving the lecture on the model. Here are my slides and what I will write on the whiteboard or document viewer in the lecture.

Friday, October 17, 2025

One hundred years of Ising

In 1925, Ising published his paper on the solution of the model in one dimension. An English translation is available here: https://www.hs-augsburg.de/~harsch/anglica/Chronology/20thC/Ising/isi_fm00.html

Coincidentally, next week I am giving a lecture on the Ising model to an undergraduate class in statistical mechanics. To flesh out the significance and relevance of the model, here are some of the interesting articles I have been looking at:

The Ising model celebrates a century of interdisciplinary contributions, Michael W. Macy, Boleslaw K. Szymanski and Janusz A. Hołyst

This mostly discusses the relevance of the model to understanding basic problems in sociology, including its relation to the classic Schelling model for social segregation.

The Ising model: highlights and perspectives, Christof Külske

This mostly discusses how the model is central to some work in mathematical physics and probability theory.

The Fate of Ernst Ising and the Fate of his Model, Thomas Ising, Reinhard Folk, Ralph Kenna, Bertrand Berche, Yurij Holovatch.

This includes some nice memories of Ising from his son, Thomas.

Aside: I wanted a plot of the specific heat for the one-dimensional model. According to Google AI, "In a 1D Ising model with no external magnetic field, the specific heat is zero at all temperatures." This is wrong.
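The exact result (from the standard transfer-matrix solution, with k_B = 1) is C/N = (J/T)² sech²(J/T), which is manifestly nonzero and has a broad, Schottky-like maximum near T ~ J. A few lines suffice to generate the plot I wanted:

```python
import numpy as np

def specific_heat(T, J=1.0):
    """Exact specific heat per spin of the zero-field 1D Ising model (k_B = 1)."""
    x = J / np.asarray(T, dtype=float)
    return x**2 / np.cosh(x)**2

T = np.linspace(0.1, 5.0, 200)
C = specific_heat(T)
print(f"peak C/N ~ {C.max():.3f} at T ~ {T[C.argmax()]:.2f}")
```

The peak (roughly C/N ≈ 0.44 near T ≈ 0.83 J) reflects the thermal excitation of domain walls, even though there is no phase transition in one dimension.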

Wednesday, October 8, 2025

2025 Nobel Prize in Physics: Macroscopic quantum effects

John Clarke, Michel H. Devoret, and John M. Martinis received the prize “for the discovery of macroscopic quantum mechanical tunnelling and energy quantisation in an electric circuit.”

The work was published in three papers in PRL in 1984 and 1985. The New York Times has a nice discussion of the award, including comments from Clarke, Martinis, Tony Leggett, and Steve Girvin.

There is some rich, subtle, and beautiful physics here. As a theorist, I comment on the conceptual and theoretical side, but don't want to minimise that doing the experiments was a technical breakthrough.

The experiments were directly stimulated by Tony Leggett, who, beginning in the late 70s, championed the idea that Josephson junctions and SQUIDs could be used to test whether quantum mechanics was valid at the macroscopic level. Many in the quantum foundations community were sceptical. Leggett and Amir Caldeira performed some beautiful, concrete, realistic calculations of the effect of decoherence and dissipation on quantum tunneling in SQUIDs. The results suggested that macroscopic tunneling should be observable.

Aside: Leggett rightly received a Nobel in 2003 for his work on the theory of superfluid 3He. Nevertheless, I believe his work on quantum foundations is even more significant.

Subtle point 1. What do we mean by a macroscopic quantum state?

It is commonly said that superconductors and superfluids are in a macroscopic quantum state. Signatures are the quantisation of magnetic flux in a superconducting cylinder and how the current through a Josephson junction oscillates as a function of the magnetic flux through the junction. I discuss this in the chapter on Quantum Matter in my Very Short Introduction.

Leggett argued that these experiments are explained by the Josephson equations, which treat the phase of the superconducting order parameter as a classical variable. For example, in a SQUID, it satisfies a classical dynamical equation. 

If the state is truly quantum, then the phase variable should be quantised.
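As a sketch of what "quantising the phase" means (this is the standard textbook form, not drawn from the prize papers themselves): for a junction with capacitance C and critical current I_c, the phase φ and the charge Q are treated as conjugate operators,

```latex
H = \frac{Q^{2}}{2C} - E_{J}\cos\varphi ,
\qquad E_{J} = \frac{\hbar I_{c}}{2e} ,
\qquad [\varphi, Q] = 2ie .
```

The junction then behaves like a fictitious particle in a cosine potential, with discrete energy levels spaced roughly by ħω_p, where ω_p = √(8 E_J E_C)/ħ and E_C = e²/2C is the charging energy. Observing those discrete levels, and the tunnelling of the "particle" out of a potential well, is what signals that the phase is genuinely quantum.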

Aside: a nice microscopic derivation of the effective action describing the quantum dynamics, starting from BCS theory and using path integrals, was given in 1982 by Vinay Ambegaokar, Ulrich Eckern, and Gerd Schön.

Subtle point 2. There are different signatures of quantum theory: energy level quantisation, tunnelling, coherence (interference), and entanglement.

In 1984-5, Clarke, Devoret, and Martinis observed the first two. Macroscopic quantum coherence is harder to detect and was only observed in 2000.

In a nice autobiographical article, Leggett commented in 2020:
Because of the strong prejudice in the quantum foundations community that it would never be possible to demonstrate characteristically quantum-mechanical effects at the macroscopic level, this assertion made us [Leggett and Garg, 1985] the target of repeated critical comments over the next few years. Fortunately, our experimental colleagues were more open-minded, and several groups started working toward a meaningful experiment along the lines we had suggested, resulting in the first demonstrations (29, 30) of MQC [Macroscopic Quantum Coherence] in rf SQUIDs (by then rechristened flux qubits) at the turn of the century. However, it would not be until 2016 that an experiment along the lines we had suggested (actually using a rather simpler protocol than our original one) was carried out (31) and, to my mind, definitively refuted macrorealism at that level.  
I find it rather amusing that nowadays the younger generation of experimentalists in the superconducting qubit area blithely writes papers with words like “artificial atom” in their titles, apparently unconscious of how controversial that claim once was.

Two final comments on the sociology side.

Superconductivity and superfluidity have now been the basis for Nobel Prizes in six and four different years, respectively.

The most widely cited of the three PRLs that were the basis of the Prize is the one on quantum tunnelling, with about 500 citations on Google Scholar. (In contrast, Devoret has more than 20 other papers that are more widely cited.) From 1986 to 1992 it was cited about a dozen times per year. Between 1993 and 2001 it was cited only about 30 times in total. Since 2001 it has been cited about 20 times per year.

This is just one more example of how citation rates are a poor measure of the significance of work and a predictor of future success.
