In a remarkable example of human-centeredness, Stanford University geochemist Richard Nevle blames Christopher Columbus for a sharp reduction in atmospheric CO2 during the 16th and 17th centuries. Believers in man-made warming, it seems, never tire of telling us how powerful humans are, usually for the worse, in our ability to change nature.
Nevle claims that the deaths of American Indians, caused by the sudden spread of European diseases after Columbus landed, would have halted the Indians' practice of burning forests to enhance their hunting. That, he says, would naturally have led to the reforestation of a land area at least as big as California. He estimates that the billions of tons of CO2 withdrawn from the atmosphere as the new trees grew should just about explain a sudden drop in atmospheric CO2 between 1500 and 1700 AD, as measured in the Antarctic ice cores.
If Dr. Nevle can “read” the deaths of the American Indians in the Antarctic ice record, has he checked for the impact of the Black Death in Europe and the Near East during the 14th century? Roughly half the population of Europe died then, along with vast numbers of people across the Near East. It is on the record that huge tracts of European land were allowed to revert from farm to forest during this period. The Near East got 300 years of persistent drought in the same time frame. Even the scruffy environment in North Africa and Syria is capable of changing the earth’s reflectance of sunlight if its people die of plague and the vegetation dries up.
I would think a geochemist, especially one from Stanford, would understand that the oceans hold about 70 times more CO2 than the atmosphere does. He would also understand that colder water absorbs more gas from its surroundings. Thus, if a weakening sun suddenly put less heat into the earth's oceans, the oceans would take more CO2 out of the air. That CO2 reduction would register in the Antarctic ice cores and in temperatures around the globe.
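A back-of-the-envelope sketch of that solubility effect can be made with the van 't Hoff form of Henry's law. The reference constant of about 0.034 mol/(L·atm) at 25 degrees C and the temperature coefficient of roughly 2400 K are standard textbook values for CO2 in water, not figures from this column, and real seawater chemistry (salinity, carbonate buffering) is ignored here:

```python
import math

def henry_constant(T_kelvin, kH_ref=0.034, T_ref=298.15, C=2400.0):
    """CO2 solubility coefficient in water, in mol/(L*atm),
    from the van 't Hoff form of Henry's law.
    kH_ref is the reference constant at T_ref; C is the
    temperature coefficient in kelvin (textbook values)."""
    return kH_ref * math.exp(C * (1.0 / T_kelvin - 1.0 / T_ref))

# Cooling surface water from 15 C to 13 C (a 2 degree C drop,
# like the Sporer-era Sargasso figure cited above) raises CO2
# solubility by roughly 6 percent:
warm = henry_constant(288.15)  # 15 C
cold = henry_constant(286.15)  # 13 C
print(round(cold / warm, 3))   # prints 1.06
```

The direction of the effect is the point: colder water holds measurably more dissolved CO2, so a cooling ocean pulls the gas out of the atmosphere.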
During the 16th and 17th centuries, the middle of the Little Ice Age, the sun had two extremely long "quiet periods" with very few sunspots. During these minima, the earth's temperatures were slammed down to their lowest levels since the last big Ice Age. The Spörer Minimum lasted from 1460 to 1550 and dropped the temperatures in the subtropical Sargasso Sea by 2 degrees C. The Maunder Minimum lasted from 1645 to 1715 and dropped the Sargasso temperatures by 3 degrees C. In all, it meant nearly 200 years of declining temperatures in zillions of tons of water around the world, which then dutifully pulled CO2 out of the air, a drop duly recorded in the Antarctic ice.
We’ve known about the Dansgaard-Oeschger 1,500-year solar cycle of warming and cooling since 1984, and we’ve now found its evidence in ice cores, cave stalagmites, seabed sediments and fossil pollen—worldwide. The cycle is so strong that it persists even during the big Ice Ages that hit every 100,000 years and drop Antarctic temperatures by nearly 10 degrees C.
Could it be that Dr. Nevle is again overestimating humanity’s importance? Should we be paying more attention to our currently very quiet sun? Maybe the lack of warming over the past 15 years is trying to tell him that CO2 is a minor trace gas, whose correlation with our temperatures over the past 160 years is a puny 22 percent.
January 1, 2012
~ The Author ~
Dennis T. Avery is a senior fellow at the Hudson Institute in Washington, D.C., and director of its Center for Global Food Issues. He was formerly a senior analyst for the Department of State. Readers may write him at Post Office Box 202, Churchville, VA 24421.