One way to answer this question is to look at the reports prepared every decade by the National Academies in the USA. I have recently been looking through the 2019 report, Frontiers of Materials Research: A Decadal Survey.
There are several reasons why I like to look at these reports. A previous post mentioned a similar 2007 report prepared for the U.S. Department of Energy.
I can learn a lot about materials science and engineering. See, for example, the figure below.
The reports help put condensed matter physics in the broader context of research in materials science and engineering.
[Previously, I have argued that CMP is a particular approach to materials research and is distinct from materials physics. Although there is a significant overlap in the materials studied and some of the methods used, the driving questions are distinctly different].
The reports provide choice quotes for grant applications. Here is one from pages 24-25.
Key Finding: Basic research in fundamental science directions, meaning work that neither anticipates nor seeks a specific outcome, is the deep well that both satisfies our need to understand our universe and feeds the technological advances that drive the modern world. It lays the groundwork for future advances in materials science as in other fields of science and technology. Discoveries without immediate obvious application often represent great technical challenges for further development (e.g., high-Tc superconductivity, carbon nanotubes) but can also lead to very important advances, often years in the future.
Key Recommendation: It is critically important that fundamental research remains a central component of the funding portfolio of government agencies that support materials research. Paradigm-changing advances often come from unexpected lines of work.
Here is one from page 6.
Key Finding: Quantum materials science and engineering, which can include superconductors, semiconductors, magnets, and two-dimensional and topological materials, represents a vibrant area of fundamental research. New understanding and advances in materials science hold the promise of enabling transformational future applications, in computing, data storage, communications, sensing, and other emerging areas of technology. This includes new computing directions outside Moore’s law, such as quantum computing and neuromorphic computing, critical for low-energy alternatives to traditional processors. Two of NSF’s “10 big ideas” specifically identify support of quantum materials (see The Quantum Leap: Leading the Next Quantum Revolution and Midscale Research Infrastructure).
The reports are based on the consensus of a range of experts. Hence, they are arguably more objective than survey articles written in luxury journals by individuals hyping their own field.
But, right now the reason I am reading this report is that I am writing the last chapter of Condensed Matter Physics: A Very Short Introduction, and need to address the question of where CMP is heading. Some earlier preliminary thoughts are here.
Here are a few of my thoughts about this report. I would love to hear the perspectives of others.
First, I should give some important caveats. I have only skimmed the report. It was written by people who know much more than I. Writing a report that is based on a diverse community of interests and perspectives is extremely difficult. The main audience for such reports is not scientists themselves but rather funding agencies and policymakers.
The Summary begins with "The past decade has seen extraordinary advances in materials research" (page 3). Chapter 2 describes "significant advances" from the past decade. There is no doubt there have been many advances. It is great to read about them. Section 2.4 concerns Quantum Materials and Strongly Correlated Systems. Most of the advances described there are incremental advances on discoveries made before 2010, such as topological insulators. This haunts me with a nagging concern that CMP has not seen a big discovery in the past decade. For quantum materials, is superconductivity in twisted bilayer graphene the leading candidate? Other suggestions?
A lot of attention is given to the potential of computational materials science, including when combined with data science methods (e.g. machine learning), topological matter, and quantum information processing in solid-state devices. However, I remain skeptical about the hype associated with these subjects, particularly with regard to technological applications. Big data need big theory too.
Significant attention is given to the relevance of materials research to U.S. defense, national security, and economic competitiveness. I wonder if this is because the report is being pitched to a MAGA government. Although I agree on the relevance, for many of us that is not the motivation for our interest in materials.
Update. In a comment below, David Sholl pointed out that NSF is not happy with the report. The background given there is also worth reading.
Despite the hype, I do think quantum computing has the potential to generate significant progress in condensed matter and materials research. This certainly does not remove the need for theory and simple models. But in many cases, even the simple models are intractable for interesting and experimentally relevant regimes (e.g. Hubbard). And it appears that quantum computers could start to answer some of the questions about such models in the relatively near term.
I remain skeptical. Could you give a reference where someone makes a realistic estimate of how many qubits and how many gate operations will be needed to answer a specific question?
For comparison, here is an estimate for quantum chemistry.
https://www.pnas.org/content/114/29/7555.short
It looks like, even in the best-case scenarios, at least 100,000 qubits are required.
This paper: https://quantum-journal.org/papers/q-2020-07-16-296/ estimates the resources for simulating the Hubbard model and jellium. It still requires a few hundred thousand physical qubits, but the gate count is several orders of magnitude lower than in the PNAS paper.
In the nearer term, this paper: https://arxiv.org/abs/2012.00921 studies single-particle states for an 18-site 1D chain on an actual quantum computer. While that is certainly not beyond the reach of classical calculations, it seems not too difficult to extend their approach to many-body systems and larger sizes (at least, it would be much easier than achieving fault tolerance, as needed for the other papers).
It is interesting to note that the NSF, which sponsored this report, has been publicly critical of it: https://www.aip.org/fyi/2019/materials-research-decadal-survey-falls-flat-nsf
This is a very unusual outcome for these National Academies reports.
You should mention the reason they are critical: the report fails to identify important future areas of research worth funding, except in a non-committal, generic way that could equally have been stated in the 2010s.
Prof. Sholl, Thanks for the comment. I did not know that background. It is rather extraordinary.
Anon, Thanks for pointing that out. I agree that the report is a bit generic, particularly with regard to recommendations. I fear that is the problem with reports written by committees. They may have the potential to be less biased and more representative of a community. On the other hand, diversity leads to a lack of consensus. This is where the materials research community is behind the astronomy and elementary particle physics communities. The latter tend to hammer out their differences behind closed doors and then present a united front to funding agencies. Then again, their case is simpler because everyone in the community gains from a big new telescope or particle accelerator.
This report and the NSF's reaction to it are to be expected given trends in the US.
First, it was done by the "National Academies" (of Science and Engineering). These are no longer what they were 30 years ago, at the zenith of science's production of breakthroughs. I have watched as they have become far too focused on current political trends (read: diversity), to the rather strong exclusion of "the best".
One expects blah reports from committees of such people.
Second, the emphasis of the NSF itself is no longer on quality, breakthrough-producing research. The mechanism for this is not funding such research until it has already achieved its breakthrough. This leads to an obvious impossibility. When I started my research 50 years ago as a starting assistant professor, the NSF fully funded my extremely large funding request on the first application. It worked well beyond my dreams. Today that is literally, absolutely, impossible. There is too much emphasis on group projects that really aren't.
You mention astronomy and physics, specifically high-energy "big project" physics. The agencies love that, because it generates lots of workers and, in reality, only one contact point between the agency and the work ... wonderful for bureaucrats. And this has done fairly well for astronomy, but has failed utterly for particle physics. The SSC was cancelled, and CERN did the measurements, finding absolutely nothing interesting. Note that the money saved by cancelling the SSC was not even partially spent elsewhere. The expected terrestrial breakthroughs in (e.g.) neutrino or muon physics have not (yet?) happened, as promised report dates come, pass unfulfilled, and are forgotten.
All this, of course, is exactly what the better people expected, except for not finding SUSY or strings.