At one point, Koonin read an extract from the IPCC’s Fifth Assessment Report released the previous year. Computer model-simulated responses to forcings—the term used by climate scientists for changes of energy flows into and out of the climate system, such as changes in solar radiation, volcanic eruptions, and changes in the concentrations of greenhouse gases in the atmosphere—“can be scaled up or down.” This scaling included greenhouse gas forcings.
Some forcings in some computer models had to be scaled down to match the simulations to actual climate observations. But when it came to the centennial projections on which governments rely and which drive climate policy, the scaling factors were removed, probably resulting in a 25 to 30 percent over-prediction of the warming by 2100.
- Dr. Koonin: But if the model tells you that you got the response to the forcing wrong by 30 percent, you should use that same 30 percent factor when you project out a century.
- Dr. Collins: Yes. And one of the reasons we are not doing that is we are not using the models as [a] statistical projection tool.
- Dr. Koonin: What are you using them as?
- Dr. Collins: Well, we took exactly the same models that got the forcing wrong and which got sort of the projections wrong up to 2100.
- Dr. Koonin: So, why do we even show centennial-scale projections?
- Dr. Collins: Well, I mean, it is part of the [IPCC] assessment process.
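The arithmetic behind that 30 percent figure is straightforward; the numbers below are illustrative rather than taken from the exchange. If a model must be scaled down by roughly a factor of 1/1.3 to reproduce the observed historical warming, and that bias is assumed to carry over linearly to the century-scale run, then dropping the scaling inflates the 2100 projection by about 30 percent relative to the corrected value:

$$
s=\frac{\Delta T_{\text{obs}}}{\Delta T_{\text{model}}}\approx\frac{1.0}{1.3}\approx 0.77,
\qquad
\frac{\Delta T^{\text{unscaled}}_{2100}-s\,\Delta T^{\text{unscaled}}_{2100}}{s\,\Delta T^{\text{unscaled}}_{2100}}=\frac{1-s}{s}\approx 30\%.
$$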
On becoming BP’s chief scientist in 2004, Koonin became part of the wider climate change milieu. Assignments included explaining the physics of man-made global warming to Prince Philip at a dinner in Buckingham Palace. In 2009, Koonin was appointed an under-secretary at the Department of Energy in the Obama administration.
Koonin’s indictment of The Science starts with its reliance on unreliable computer models. Usefully describing the earth’s climate, writes Koonin, is “one of the most challenging scientific simulation problems.” Models divide the atmosphere into pancake-shaped boxes roughly 100 km across and one kilometer deep. But the upward flow of energy from tropical thunderclouds, which is more than thirty times larger than that from human influences, occurs at scales smaller than those boxes. This forces climate modelers to make assumptions about what happens inside them. As one modeler confesses, “it’s a real challenge to model what we don’t understand.”
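A rough calculation makes the scale problem concrete. The Python sketch below is a back-of-the-envelope illustration: only the 100 km by one kilometer box size comes from the review, while the Earth-geometry and storm-size figures are assumptions added here. It counts the grid boxes implied by that resolution and compares one box with a typical tropical thunderstorm cell.

```python
# Back-of-the-envelope sketch of the grid-resolution problem described above.
# Only the 100 km x 1 km box size comes from the review; the other figures
# are rough illustrative assumptions.

EARTH_SURFACE_KM2 = 5.1e8      # approximate surface area of the Earth (km^2)
ATMOSPHERE_DEPTH_KM = 50       # assumed depth of the modeled atmosphere
BOX_WIDTH_KM = 100             # horizontal size of a grid box (from the review)
BOX_DEPTH_KM = 1               # vertical size of a grid box (from the review)
STORM_WIDTH_KM = 10            # assumed width of a tropical thunderstorm cell

horizontal_boxes = EARTH_SURFACE_KM2 / BOX_WIDTH_KM**2
vertical_levels = ATMOSPHERE_DEPTH_KM / BOX_DEPTH_KM
total_boxes = horizontal_boxes * vertical_levels

print(f"grid boxes covering the surface: {horizontal_boxes:,.0f}")
print(f"vertical levels:                 {vertical_levels:.0f}")
print(f"total grid boxes:                {total_boxes:,.0f}")

# A thunderstorm is far narrower than a single grid box, so the model cannot
# resolve it; its contribution to the energy flow has to be assumed
# (parameterized) rather than computed from first principles.
print(f"storm width as a fraction of box width: {STORM_WIDTH_KM / BOX_WIDTH_KM:.2f}")
```

Even with generous assumptions, the storm fits an order of magnitude inside a single box, which is why its effects must be filled in by hand.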
Inevitably, this leaves considerable scope for modelers’ subjective views and preferences. A key quantity climate models are meant to estimate is the equilibrium climate sensitivity (ECS): how much the global temperature rises in response to a doubling of carbon dioxide in the atmosphere. Yet in 2020, climate modelers from Germany’s Max Planck Institute admitted to tuning their model by targeting an ECS of about 3°C. “Talk about cooking the books,” Koonin comments.
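What “targeting” a sensitivity involves can be seen from a textbook energy-balance relation that the review itself does not spell out: the equilibrium sensitivity is the forcing from doubled carbon dioxide divided by the net feedback parameter, so fixing a target ECS pins down the feedbacks the tuned model must end up with.

$$
\mathrm{ECS}=\frac{F_{2\times \mathrm{CO_2}}}{\lambda}\approx\frac{3.7\ \mathrm{W\,m^{-2}}}{\lambda},
\qquad
\mathrm{ECS}\approx 3\,^{\circ}\mathrm{C}\ \Longrightarrow\ \lambda\approx 1.2\ \mathrm{W\,m^{-2}\,K^{-1}}.
$$

Adjusting uncertain cloud and convection parameters until the model’s feedbacks land near that value is, in practice, what tuning to a target amounts to.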
The proof of the pudding, as they say, is in the eating. Self-evidently, computer projections can’t be tested against a future that’s yet to happen, but they can be tested against climates present and past. Climate models can’t even agree on what the current global average temperature is. “One particularly jarring feature is that the simulated average global surface temperature,” Koonin notes, “varies among models by about 3°C, three times greater than the observed value of the twentieth century warming they’re purporting to describe and explain.”
Another embarrassing feature of climate models concerns the earlier of the twentieth century’s two warmings, from 1910 to 1940, when human influences were much smaller. On average, models give a warming rate about half of what was actually observed. That failure to warm fast enough in those decades suggests, Koonin argues, that it is possible, even likely, that internal climate variability is a significant contributor to the warming of recent decades. “That the models can’t reproduce the past is a big red flag—it erodes confidence in their projections of future climates.” Neither is it reassuring that, for the years after 1960, the latest generation of climate models shows a larger spread and greater uncertainty than earlier ones, implying that, far from advancing, The Science has been going backwards. That is not how science is meant to work.
The second part of Koonin’s indictment concerns the distortion, misrepresentation, and mischaracterization of climate data to support a narrative of climate catastrophism based on an increasing frequency of extreme weather events. As an example, Koonin takes a “shockingly misleading” claim, and its associated graph, in the United States government’s 2017 Climate Science Special Report: that the number of high-temperature records set in the past two decades far exceeds the number of low-temperature records across the 48 contiguous states. Koonin demonstrates that the sharp uptick in highs over the last two decades is an artifact of a methodology chosen to mislead. When he re-runs the data, record highs show a clear peak in the 1930s but no significant trend over the 120 years of observations starting in 1895, or even since 1980, when human influences on the climate grew strongly. The number of record cold temperatures, in contrast, has declined over more than a century, with the trend accelerating after 1985.
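Part of the reason record counts are so easy to misread is statistical. The toy simulation below, which is emphatically not Koonin’s re-analysis and uses made-up station data, shows why: even a temperature series with no trend at all keeps setting new records, at a rate that falls off like 1/n with the length of the record, so comparing recent record counts with earlier ones requires careful handling.

```python
# Toy illustration (not Koonin's re-analysis): a trendless, stationary series
# still sets new record highs, and the expected number of records in year n
# falls like 1/n. Raw record counts therefore need care before being read
# as evidence of a trend.
import random

random.seed(0)
YEARS = 120        # roughly the 120-year span of observations cited above
STATIONS = 1000    # hypothetical number of independent stations

records_per_year = [0] * YEARS
for _ in range(STATIONS):
    running_max = float("-inf")
    for year in range(YEARS):
        temp = random.gauss(0.0, 1.0)   # trendless "annual high" at one station
        if temp > running_max:
            running_max = temp
            records_per_year[year] += 1

# For an i.i.d. series the expected number of records per station in year n is 1/n.
for year in (0, 9, 49, 119):
    expected = STATIONS / (year + 1)
    print(f"year {year + 1:3d}: simulated {records_per_year[year]:4d}, expected ~{expected:6.1f}")
```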
Notes Koonin, “temperature extremes in the contiguous U.S. have become less common and somewhat milder since the late nineteenth century.” Similarly, a key message of the 2014 National Climate Assessment, repeated in the 2017 assessment, that hurricanes are trending upward in frequency and intensity, is contradicted 728 pages later by a statement buried in an appendix: there has been no significant trend in the global number of tropical cyclones, “nor has any trend been identified in the number of U.S. land-falling hurricanes.”
Koonin also has sharp words for the policy side of the climate change consensus, which asserts that although climate change is an existential threat, solving it by totally decarbonizing society is straightforward and relatively painless. “Two decades ago, when I was in the private sector,” Koonin writes, “I learned to say that the goal of stabilizing human influences on the climate was ‘a challenge,’ while in government it was talked about as ‘an opportunity.’ Now back in academia, I can forthrightly call it ‘a practical impossibility.’”
Unlike many scientists and most politicians, Koonin displays a sure grasp of the split between developed and developing nations; for the latter, decarbonization is a luxury good they can’t afford. The fissure dates back to the earliest days of the U.N. climate process at the end of the 1980s. Indeed, it’s why developing nations insisted on the U.N. route rather than the intergovernmental one that produced the 1987 Montreal Protocol on ozone-depleting substances. “The economic betterment of most of humanity in the coming decades will drive energy demand even more strongly than population growth,” Koonin says. “Who will pay the developing world not to emit? I have been posing that simple question to many people for more than fifteen years and have yet to hear a convincing answer.”
At the outset of “Unsettled,” Feynman’s axiom of absolute intellectual honesty is contrasted with climate scientist Stephen Schneider’s “double ethical bind.” On the one hand, scientists are ethically bound by the scientific method to tell the truth. On the other, they are human beings who want to reduce the risk of potentially disastrous climate change. “Each of us has to decide what the right balance is between being effective and being honest,” Schneider said.
The moratorium on asking questions represents the death of science as understood and described by Popper, a victim of the conflicting requirements of political utility and scientific integrity. Many scientists take this lying down. Koonin won’t. With forensic skill and a gift for making his findings accessible to non-specialists, Koonin has written the most important book on climate science in decades.