An expanded version of my recent Times column on ice ages:
Record cold in America has brought temperatures as low as minus 44C in North Dakota, frozen sharks in Massachusetts and iguanas falling from trees in Florida. Al Gore blames global warming, citing one scientist to the effect that this is “exactly what we should expect from the climate crisis”. Others beg to differ: Kevin Trenberth, of America’s National Center for Atmospheric Research, insists that “winter storms are a manifestation of winter, not climate change”.
Forty-five years ago a run of cold winters caused a “global cooling” scare. “A global deterioration of the climate, by order of magnitude larger than any hitherto experienced by civilised mankind, is a very real possibility and indeed may be due very soon,” read a letter to President Nixon in 1972 from two scientists reporting the views of 42 “top” colleagues. “The cooling has natural causes and falls within the rank of the processes which caused the last ice age.” The administration replied that it was “seized of the matter”.
In the years that followed, newspapers, magazines and television documentaries rushed to sensationalise the coming ice age. The CIA reported a “growing consensus among leading climatologists that the world is undergoing a cooling trend”. The broadcaster Magnus Magnusson pronounced on a BBC Horizon episode that “unless we learn otherwise, it will be prudent to suppose that the next ice age could begin to bite at any time”.
Newsweek ran a cover story that read, in part: “The central fact is that, after three quarters of a century of extraordinarily mild conditions, the Earth seems to be cooling down. Meteorologists disagree about the cause and extent of the cooling trend, as well as over its specific impact on local weather conditions. But they are almost unanimous in the view that the trend will reduce agricultural productivity for the rest of the century.”
This alarm about global cooling has largely been forgotten in the age of global warming, but it has not entirely gone away. Valentina Zharkova of Northumbria University has suggested that a quiescent sun presages another Little Ice Age like that of 1300-1850. I’m not persuaded. Yet the argument that the world is slowly slipping back into a proper ice age after 10,000 years of balmy warmth is in essence true. Most interglacial periods, or times without large ice sheets, last about that long, and ice cores from Greenland show that each of the past three millennia was cooler than the one before.
However, those ice cores, and others from Antarctica, can now put our minds to rest. They reveal that interglacials start abruptly with sudden and rapid warming but end gradually with many thousands of years of slow and erratic cooling. They have also begun to clarify the cause. It is a story that reminds us how vulnerable our civilisation is. If we aspire to keep the show on the road for another 10,000 years, we will have to understand ice ages.

The oldest explanation for the coming and going of ice was based on carbon dioxide. In 1895 the Swede Svante Arrhenius, one of the scientists who first championed the greenhouse theory, suggested that the ice retreated because carbon dioxide levels rose, and advanced because they fell. If this were true, he thought, then industrial emissions could head off the next ice age.
Burning coal, Arrhenius said, was therefore a good thing: “By the influence of the increasing percentage of carbonic acid in the atmosphere, we may hope to enjoy ages with more equable and better climates.”
There is indeed a correlation in the ice cores between temperature and carbon dioxide: there is less CO2 in the air when the world is colder and more when it is warmer. An ice core drilled at Vostok in Antarctica showed in the late 1990s that CO2 moves in lock-step with temperature: more CO2, warmer; less CO2, colder. As Al Gore put it sarcastically in his 2006 film An Inconvenient Truth, looking at the Vostok graphs: “Did they ever fit together? Most ridiculous thing I ever heard.” So was Arrhenius right? Is the CO2 level the driver of ice ages?
Well, not so fast. Inconveniently, the timing implies causation the wrong way round: at the end of an interglacial, such as the Eemian period, over 100,000 years ago, carbon dioxide levels remained high for many thousands of years while temperature fell steadily. Eventually CO2 followed temperature downward. Here is a chart showing that. If carbon dioxide were a powerful cause, it would not show such a pattern: the world could not cool down while CO2 remained high.
In any case, what causes the carbon dioxide levels to rise and fall? In 1990 the oceanographer John Martin came up with an ingenious explanation. During ice ages, there is lots of dust blowing around the world, because the continents are dry and glaciers are grinding rocks. Some of that dust falls in the ocean, where its iron-rich composition fertilises plankton blooms, whose increased photosynthesis draws down the carbon dioxide from the air. When the dust stops falling, the plankton blooms fail and the carbon dioxide levels rise, warming the planet again.
Neat. But almost certainly too simplistic. We now know, from Antarctic ice cores, that in each interglacial rapid warming began when CO2 levels were very low. Temperature and carbon dioxide rise together, and there is no evidence for a pulse of CO2 before any warming starts; if anything, the reverse. Well, all right, said scientists, but carbon dioxide is a feedback factor – an amplifier. Something else starts the warming, but carbon dioxide reinforces it. Yet the ice cores show that in each interglacial, cooling returned when CO2 levels were very high, and they remained high for tens of thousands of years as the cooling continued. Even as a feedback, carbon dioxide looks feeble.
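The lead–lag relationship described above can be illustrated with a toy calculation. This is a sketch on synthetic data, not real ice-core records: it shows how a lag between two series is diagnosed by sliding one against the other and looking for the correlation peak (the names `best_lag`, `temp` and `co2` are invented for the example).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration (not real ice-core data): "temperature" is a
# random walk, and "CO2" follows it with a lag of 8 samples plus noise.
n, true_lag = 500, 8
temp = np.cumsum(rng.normal(size=n))
co2 = np.roll(temp, true_lag) + rng.normal(scale=0.5, size=n)
co2[:true_lag] = co2[true_lag]  # pad the wrapped-around start

def best_lag(leader, follower, max_lag=50):
    """Return the shift (in samples) of `follower` behind `leader`
    that maximises the Pearson correlation."""
    lags = range(-max_lag, max_lag + 1)
    corrs = [np.corrcoef(leader[max_lag:-max_lag],
                         np.roll(follower, -k)[max_lag:-max_lag])[0, 1]
             for k in lags]
    return list(lags)[int(np.argmax(corrs))]

print(best_lag(temp, co2))  # recovers a lag close to the true 8
```

On real ice-core data the same sliding-correlation idea is what underlies the finding that CO2 trails temperature rather than leading it.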
Here is an essay by Willis Eschenbach discussing this issue. He comes to five conclusions as to why CO2 cannot be the main driver and why the feedback effect is probably small:
1. The correspondence with log(CO2) is slightly worse than that with CO2.
2. The CO2 change is about what we’d expect from oceanic degassing.
3. CO2 lags temperature in the record.
4. Temperature Granger-causes CO2, not the other way round.
5. (Proof by contradiction) If the CO2 were controlling temperature, the climate sensitivity would be seven degrees per doubling, for which there is no evidence.
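The last of those conclusions is simple arithmetic, and can be sketched as follows. The input figures here are illustrative assumptions (roughly 4°C of global mean glacial–interglacial warming and a 190→280 ppmv CO2 rise), not necessarily Eschenbach’s exact numbers:

```python
import math

# Back-of-envelope check (illustrative figures): if CO2 alone drove the
# glacial-to-interglacial warming, what sensitivity per doubling is implied?
delta_t = 4.0             # deg C, assumed global mean glacial-interglacial warming
co2_glacial = 190.0       # ppmv, typical glacial minimum in Antarctic cores
co2_interglacial = 280.0  # ppmv, pre-industrial interglacial level

doublings = math.log2(co2_interglacial / co2_glacial)  # ~0.56 doublings
sensitivity = delta_t / doublings
print(f"{sensitivity:.1f} deg C per doubling")  # roughly 7
```

A sensitivity of about seven degrees per doubling is far above anything supported by the instrumental record, which is the contradiction the argument rests on.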
Now, the standard response from AGW supporters is that the CO2, when it comes along, is some kind of positive feedback that makes the temperature rise more than it would be otherwise. Is this possible? I would say sure, it’s possible … but that we have no evidence that that is the case. In fact, the changes in CO2 at the end of the last ice age argue that there is no such feedback. You can see in Figure 1 that the temperatures rise and then stabilize, while the CO2 keeps on rising. The same is shown in more detail in the Greenland ice core data, where it is clear that the temperature fell slightly while the CO2 continued to rise.
As I said, this does not negate the possibility that CO2 played a small part. Further inquiry into that angle is not encouraging, however. If we assume that the CO2 is giving 3° of warming per doubling, per the IPCC hypothesis, the problem is that this raises the implied rate of thermal outgassing to 17 ppmv per degree of warming instead of 15 ppmv. This is in the wrong direction, given that the value cited in the literature is lower, at 12.5 ppmv.
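The bookkeeping behind those outgassing figures can be sketched like this. The round numbers below are assumptions for illustration and do not reproduce the essay’s exact 15 and 17 ppmv values, but they show why crediting CO2 with some of the warming pushes the implied outgassing rate up rather than down:

```python
import math

# Sketch with assumed round numbers: the glacial-interglacial CO2 rise
# attributed entirely to temperature-driven ocean degassing.
delta_co2 = 90.0  # ppmv rise, roughly 190 -> 280 ppmv
delta_t = 6.0     # deg C of warming driving the degassing (assumed)

rate_no_feedback = delta_co2 / delta_t  # ppmv of outgassing per deg C

# If instead we credit CO2 with 3 deg C per doubling (the IPCC central
# value), part of the warming is CO2's own doing, so the temperature
# available to drive the outgassing shrinks and the implied rate rises.
co2_warming = 3.0 * math.log2(280.0 / 190.0)  # ~1.7 deg C
rate_with_feedback = delta_co2 / (delta_t - co2_warming)

print(rate_no_feedback, rate_with_feedback)
```

Whatever the exact inputs, the implied rate moves further away from the literature value of 12.5 ppmv per degree, not closer.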
A more logical explanation for the inverse correlation between dust and CO2 can be seen through the effect that CO2 concentrations have on plant life. Fig. 8 also shows that CO2 levels during each ice age came all the way down to 190–180 ppm, which is approaching dangerously low levels for plants using the C3 photosynthesis pathway. CO2 is a vital component of the atmosphere because it is an essential plant food, and without CO2 all plants die. In her comprehensive analysis of plant responses to reduced CO2 concentrations, Gerhart says of this fundamental issue:
It is clear that modern C3 plant genotypes grown at low CO2 (180–200 ppm) exhibit severe reductions in photosynthesis, survival, growth, and reproduction … Such findings beg the question of how glacial plants survived during low CO2 periods … Studies have shown that the average biomass production of modern C3 plants is reduced by approximately 50% when grown at low (180–220 ppm) CO2, when other conditions are optimal … (The abortion of all flower buds) suggested that 150 ppm CO2 may be near the threshold for successful completion of the life cycle in some C3 species (Gerhart and Ward, 2010 Section II).
It is clear that a number of plant species would have been under considerable stress when world CO2 concentrations fell to 200 or 190 ppm during the glacial maximum, especially if moisture levels in those regions were low (Gerhart and Ward, 2010; Pinto et al., 2014). Palaeontological discoveries at the La Brea tar pits in southern California have confirmed this: oxygen and carbon isotope analysis of preserved Juniperus wood dating from 50 kyr ago through to the Holocene interglacial showed that ‘glacial trees were undergoing carbon starvation’ (Ward et al., 2005). And yet these stresses and biomass reductions do not appear to become lethal until CO2 concentrations reach 150 ppm, which the glacial maxima did not reach – unless we add altitude and falling CO2 partial pressure into the equation.
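That altitude caveat is easy to quantify. A minimal sketch, assuming an 8.4 km atmospheric scale height and treating plant CO2 availability as proportional to partial pressure (the function name is invented for the example):

```python
import math

# Effective CO2 availability at altitude: what matters to a plant is the
# partial pressure of CO2, which falls with total air pressure. Using the
# barometric formula with an assumed 8.4 km scale height:
def effective_co2_ppm(ppm_at_sea_level, altitude_m, scale_height_m=8400.0):
    """Sea-level-equivalent CO2 concentration at a given altitude."""
    return ppm_at_sea_level * math.exp(-altitude_m / scale_height_m)

# A glacial-minimum 180 ppm atmosphere, experienced by a tree at 3,000 m:
print(round(effective_co2_ppm(180.0, 3000.0)))  # ~126 ppm equivalent
```

On these assumptions a 180 ppm glacial atmosphere at 3,000 m delivers the CO2 equivalent of roughly 126 ppm at sea level, well below the ~150 ppm threshold quoted above.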