Thursday, April 27, 2017

Is it an Unidentified Superconducting Object (USO)?

On the arXiv and in Nature journals there is a continuing stream of papers claiming to observe superconductivity in some new material.
There is a long history of such claims, and it is worth considering the wise observations that Robert Cava made back in 1997 in a tutorial lecture.
It would have been useful indeed in the early days of the field [cuprate superconductors] to have set up a "commission" to set some minimum standard of data quality and reproducibility for reporting new superconductors. An almost countless number of "false alarms" have been reported in the past decade, some truly spectacular. Koichi Kitazawa from the University of Tokyo coined these reports "USOs", for Unidentified Superconducting Objects, in a clever cross-cultural double entendre likening them to UFOs (Unidentified Flying Objects, which certainly are their equivalent in many ways) and to "lies" in the Japanese translation of USO. 
These have caused great excitement on occasion, but more often distress. It is important, however, to keep in mind what a report of superconductivity at 130K in a ceramic material two decades ago might have looked like to rational people if it came out of the blue sky with no precedent. That having been said, it is true that all the reports of superconductivity in new materials which were later confirmed to be true did conform to some minimum standard of reproducibility and data quality. I have tried to keep up with which of the reports have turned out to be true and which haven't. 
There have been two common problems: 
1. Experimental error- due, generally, to inexperienced investigators unfamiliar with measurement methods or what is required to show that a material is superconducting. This has become more rare as the field matures. 
[n.b. you really need to observe both zero resistivity and the Meissner effect].
2. "New" superconductors are claimed in chemical systems already known to have superconductors containing some subset of the components. This is common even now, and can be difficult for even experienced researchers to avoid. The previously known superconductor is present in small proportions, sometimes in lower Tc form due to impurities added by the experimentalist trying to make a new compound. In a particularly nasty variation on this, sometimes extra components not intentionally added are present - such as Al from crucibles or CO2 from exposure to air some time during the processing. I wish I had a dollar for every false report of superconductivity in a Nb containing oxide where the authors had unintentionally made NbN in small proportions.
There is also an interesting article about the Schon scandal, in which Paul Grant claims:
During my research career in the field of superconducting materials, I have documented many cases of an 'unidentified superconducting object' (USO), only one of which originated from an industrial laboratory, eventually landing in Physical Review Letters. But USOs have had origins in many universities and government laboratories. Given my rather strong view of the intrinsic checks and balances inherent in industrial research, the misconduct that managed to escape notice at Bell Labs is even more singular.

Monday, April 24, 2017

Have universities lost sight of the big questions and the big picture?

Here are some biting critiques of some of the "best" research at the "best" universities, by several distinguished scholars.
The large numbers of younger faculty competing for a professorship feel forced to specialize in narrow areas of their discipline and to publish as many papers as possible during the five to ten years before a tenure decision is made. Unfortunately, most of the facts in these reports have neither practical utility nor theoretical significance; they are tiny stones looking for a place in a cathedral. The majority of ‘empirical facts’ in the social sciences have a half-life of about ten years.
Jerome Kagan [Harvard psychologist], The Three Cultures Natural Sciences, Social Sciences, and the Humanities in the 21st Century
[I thank Vinoth Ramachandra for bringing this quote to my attention].
[The distinguished philosopher Alasdair] MacIntyre provides a useful tool to test how far a university has moved to this fragmented condition. He asks whether a wonderful and effective undergraduate teacher who is able to communicate how his or her discipline contributes to an integrated account of things – but whose publishing consists of one original but brilliant article on how to teach – would receive tenure. Or would tenure be granted to a professor who is unable or unwilling to teach undergraduates, preferring to teach only advanced graduate students and engaged in ‘‘cutting-edge research.’’ MacIntyre suggests if the answers to these two inquiries are ‘‘No’’ and ‘‘Yes,’’ you can be sure you are at a university, at least if it is a Catholic university, in need of serious reform. I feel quite confident that MacIntyre learned to put the matter this way by serving on the Appointment, Promotion, and Tenure Committee of Duke University. I am confident that this is the source of his understanding of the increasing subdisciplinary character of fields, because I also served on that committee for seven years. During that time I observed people becoming ‘‘leaders’’ in their fields by making their work so narrow that the ‘‘field’’ consisted of no more than five or six people. We would often hear from the chairs of the departments that they could not understand what the person was doing, but they were sure the person to be considered for tenure was the best ‘‘in his or her field."
Stanley Hauerwas, The State of the University, page 49.

Are these reasonable criticisms of the natural sciences?

Wednesday, April 19, 2017

Commercialisation of universities

I find the following book synopsis rather disturbing.
Is everything in a university for sale if the price is right? In this book, the author cautions that the answer is all too often "yes." Taking the first comprehensive look at the growing commercialization of our academic institutions, the author probes the efforts on campus to profit financially not only from athletics but increasingly, from education and research as well. He shows how such ventures are undermining core academic values and what universities can do to limit the damage. 
Commercialization has many causes, but it could never have grown to its present state had it not been for the recent, rapid growth of money-making opportunities in a more technologically complex, knowledge-based economy. A brave new world has now emerged in which university presidents, enterprising professors, and even administrative staff can all find seductive opportunities to turn specialized knowledge into profit. 
The author argues that universities, faced with these temptations, are jeopardizing their fundamental mission in their eagerness to make money by agreeing to more and more compromises with basic academic values. He discusses the dangers posed by increased secrecy in corporate-funded research, for-profit Internet companies funded by venture capitalists, industry-subsidized educational programs for physicians, conflicts of interest in research on human subjects, and other questionable activities. 
While entrepreneurial universities may occasionally succeed in the short term, reasons the author, only those institutions that vigorously uphold academic values, even at the cost of a few lucrative ventures, will win public trust and retain the respect of faculty and students. Candid, evenhanded, and eminently readable, Universities in the Marketplace will be widely debated by all those concerned with the future of higher education in America and beyond.
What is most disturbing is that the author of Universities in the Marketplace: The Commercialization of Higher Education is Derek Bok, former President of Harvard, the richest university in the world!

There is a helpful summary and review of the book here. A longer review compares and contrasts the book to several others addressing similar issues.

How concerned should we be about these issues?

Thursday, April 13, 2017

Quantum entanglement technology hype


Last month The Economist had a cover story and large section on commercial technologies based on quantum information.

To give the flavour here is a sample from one of the articles
Very few in the field think it will take less than a decade [to build a large quantum computer], and many say far longer. But the time for investment, all agree, is now—because even the smaller and less capable machines that will soon be engineered will have the potential to earn revenue. Already, startups and consulting firms are springing up to match prospective small quantum computers to problems faced in sectors including quantitative finance, drug discovery and oil and gas. .... Quantum simulators might help in the design of room-temperature superconductors allowing electricity to be transmitted without losses, or with investigating the nitrogenase reaction used to make most of the world’s fertiliser.
I know people are making advances [which are interesting from a fundamental science point of view], but it seems to me that we are still a very long way from doing anything cheaper [both financially and computationally] than a classical computer.

Doug Natelson noted that at the last APS March Meeting, John Martinis said that people should not believe the hype, even from him!

Normally The Economist gives a hard-headed analysis of political and economic issues. I might not agree with it [it is too neoliberal for me], but at least I trust it to be rigorous and accurate. This section I found quite disappointing. I hope uncritical readers don't start throwing their retirement funds into start-ups promising to develop the "quantum internet" because they believe this will be as important as the transistor (a claim the article ends with).

Maybe I am missing something.
I welcome comments on the article.

Tuesday, April 11, 2017

Should we fund people or projects?

In Australia, grant reviewers are usually asked to score applications according to three aspects: investigator, project, and research environment. These are usually weighted by something like 40%, 40%, and 20%, respectively. Previously, I wrote how I think the research environment aspect is problematic.

I struggle to see why investigator and project should have equal weighting. For example, consider the following caricatures.

John writes highly polished proposals with well defined projects on important topics. However, he has limited technical expertise relevant to the ambitious goals in the proposal. He also tends to write superficial papers on hot topics.

Joan is not particularly well organised and does not write polished proposals. She does not plan her projects but lets her curiosity and creativity lead her. Although she does not write a lot of papers she has a good track record of moving into new areas and making substantial contributions.
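
To make the point concrete, here is a toy calculation [the scores are made up purely for illustration] showing that the 40/40/20 weighting can leave these two caricatures indistinguishable, even though their track records are very different.

```python
# Toy illustration: composite grant scores under a 40/40/20 weighting.
# All scores (out of 100) are invented for the example.
weights = {"investigator": 0.4, "project": 0.4, "environment": 0.2}

def composite(scores):
    """Weighted sum of the component scores."""
    return sum(weights[k] * scores[k] for k in weights)

john = {"investigator": 60, "project": 90, "environment": 80}  # polished project, weaker track record
joan = {"investigator": 90, "project": 60, "environment": 80}  # strong track record, unpolished project

print(composite(john), composite(joan))  # 76.0 76.0 - equal weighting cannot separate them
```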

This raises the question of whether we should forget the project dimension of funding altogether. Suppose you had the following extreme system: you just give the "best" people a grant for three years and they can do whatever they want. Three years later they apply again and are evaluated on what they have produced. This would encourage more risk-taking and save a lot of time in grant preparation and evaluation.

Are there any examples of this kind of "no strings attached" funding? The only examples I can think of are MacArthur Fellows and Royal Society Professorships. However, these are really for stellar senior people.

What do you think?

Thursday, April 6, 2017

Do you help your students debug codes?

Faculty vary greatly in their level of involvement with the details of the research projects of the undergrads, Ph.D. students, and postdocs they supervise. Here are three different examples, based on real senior people.

A. gives the student or postdoc a project topic and basically does not want to talk to them again until they bring a draft of a paper.

B. talks to their students regularly but boasts that they have not looked at a line of computer code since they became a faculty member. It is the sole responsibility of students to write and debug code.

C. is very involved. One night before a conference presentation they stayed up until 3 am trying to debug a student's code in the hope of getting some more results to present the next day.

Similar issues arise with analytical calculations or getting experimental apparatus to work.

What is an appropriate level of involvement?
On the one hand, it is important that students take responsibility for their projects and learn to solve their own problems.
On the other hand, faculty can speed things along and sometimes find "bugs" quickly because of their experience. Also, a more "hands-on" approach gives a better feel for how well the student knows what they are doing and whether they are checking things.
It is fascinating and disturbing to me that in the Schon scandal, Batlogg confessed that he never went into the lab and so did not realise there was no real experiment.

I think there is no clear cut answer. Different people have different strengths and interests (both supervisors and students). Some really enjoy the level of detail and others are more interested in the big picture.
However, I must say that I think A. is problematic.
Overall, I am closer to B. than to C., but this has varied depending on the person involved, the project, and the technical problems.

What do you think?

Tuesday, April 4, 2017

Some awkward history

I enjoyed watching the movie Hidden Figures. It is based on a book that recounts the little-known history of the contributions of three African-American women to NASA and the first manned space flights in the 1960s. The movie is quite entertaining and moving while raising significant issues about racism and sexism in science. I grimaced at some of the scenes. On the one hand, some would argue we have come a long way in fifty years. On the other hand, we should be concerned about how the rise of Trump will play out in science.


One minor question I have: how much of the math on the blackboards is realistic?



Something worth considering is the extent to which the movie fits the too-common white savior narrative, as highlighted in a critical review by Marie Hicks.

Saturday, April 1, 2017

A fascinating thermodynamics demonstration: the drinking bird

I am currently helping teach a second-year undergraduate course, Thermodynamics and Condensed Matter Physics. For the first time, I am helping out in some of the lab sessions. Two of the experiments are based on the drinking bird.



This illustrates two important topics: heat engines and liquid-vapour equilibria.
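
To get a feel for the liquid-vapour side, here is a back-of-envelope estimate [a sketch using approximate numbers for dichloromethane, the usual working fluid] of the vapour pressure difference produced by a small temperature difference between head and bulb, via the Clausius-Clapeyron relation.

```python
# Back-of-envelope estimate with approximate numbers for dichloromethane
# (the usual working fluid in drinking birds).
L_vap = 28.8e3   # latent heat of vaporisation, J/mol (approximate)
R = 8.314        # gas constant, J/(mol K)
T = 298.0        # ambient temperature, K
P_sat = 57e3     # saturation vapour pressure near room temperature, Pa
rho = 1330.0     # liquid density, kg/m^3
g = 9.81         # gravitational acceleration, m/s^2

# Clausius-Clapeyron (ideal vapour): dP/dT = L * P / (R * T^2)
dP_dT = L_vap * P_sat / (R * T**2)

dT = 1.0              # suppose the head is only ~1 K cooler than the bulb
dP = dP_dT * dT       # vapour pressure difference between bulb and head
h = dP / (rho * g)    # height of liquid column this difference can support

print(f"dP/dT ~ {dP_dT:.0f} Pa/K; a {dT:.0f} K difference supports ~{100*h:.0f} cm of liquid")
```

This gives roughly 2 kPa per kelvin, enough to push liquid up the ~10 cm neck with only a degree or so of evaporative cooling.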

Here are a few observations, in random order.

* I still find it fascinating to watch. Why isn't it a perpetual motion machine?

* Several things are more surprising:
a. it operates on such a small temperature difference,
b. a temperature difference between the head and the bulb arises at all,
c. it is so sensitive to perturbations such as warming with your fingers or changes in humidity.

* It took me quite a while to understand what is going on, which makes me wonder about the students doing the lab. How much are they following the recipe and saying the mantra...

* I try to encourage the students to think critically and scientifically about what is going on, asking some basic questions, such as "How do you know the head is cooler than the bulb? What experiment can you do right now to test your hypothesis? How can you test whether evaporative cooling is responsible for cooling the head?" Such an approach is briefly described in this old paper.

* Understanding and approximately quantifying the temperature of the head involves the concepts of humidity and wet-bulb temperature, and a psychrometric chart (see the sketch after this list). Again, I find this challenging.

* This lab is a great example of how you don't necessarily need a lot of money and fancy equipment to teach a lot of important science and skills.
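
If you want a rough number for the head temperature without reading a psychrometric chart, Stull's (2011) empirical fit gives the wet-bulb temperature directly from the air temperature and relative humidity; here is a minimal sketch.

```python
import math

def wet_bulb_stull(T, RH):
    """Wet-bulb temperature (deg C) from air temperature T (deg C) and
    relative humidity RH (percent), using Stull's (2011) empirical fit.
    Valid roughly for 5% < RH < 99% and -20 C < T < 50 C."""
    return (T * math.atan(0.151977 * math.sqrt(RH + 8.313659))
            + math.atan(T + RH)
            - math.atan(RH - 1.676331)
            + 0.00391838 * RH**1.5 * math.atan(0.023101 * RH)
            - 4.686035)

# A 20 C lab at 50% relative humidity:
print(f"{wet_bulb_stull(20.0, 50.0):.1f} C")  # ~13.7 C
```

For a 20 C lab at 50% relative humidity this gives about 13.7 C, so evaporative cooling can plausibly hold the head several degrees below the ambient temperature of the bulb.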