Global Warming - New and Real, or Old Hat?

By Lee G. Madland
Volume 19, Number 4 (Summer 2009)
Issue theme: "Progressives for Immigration Reform"


An issue that won’t go away, and probably not anytime soon, is that of Global Warming: Is it in fact occurring? If so, will it continue, and by how much and how rapidly? Is present human technological input either a major cause or an exacerbating factor—and if so, can its major undesired effects be stopped or at least slowed by large-scale changes in how technology is applied worldwide? Even aside from actually attempting the latter, do we know what changes in the world’s climates will take place? If not, what unintended consequences will our intended remedies produce? Can known climatic events of the past give us clues as to what the future holds?

Here we’ll concentrate on that last question.

The Impact of Past Climates

Weather and climate are obviously interrelated, weather being our ordinary day-to-day and month-to-month changes, and climate the summation of weather as averaged over greater periods; that is, years, decades, centuries, or much longer. The fact that meteorologists—specialists on weather—are so often frustrated in their most earnest efforts to predict weather accurately as little as a week ahead should give pause to scientists who project climate changes over much longer periods using their current favored tool, the computer model (properly, GCM or General Circulation Model). In both cases, there are complicating factors beyond—far beyond—what scientists are now equipped to handle with confidence, or will likely be able to do for a very long time, if ever. Curiously, however, for really long periods involving geologic time scales, there are certain predictions that can confidently be made, as we’ll see.

Apart from the scientific question as to whether global warming is or is not occurring today, here we’ll focus on an approach based on Earth history, which has the virtue of resting on known facts about past climates—facts from scientific research that has uncovered evidence, not doubted in scientific circles, of many major events long before human historical records began. We’ll first look at some relevant examples from recent history and go back from there.

Last year I was looking over some sheets of the detailed World Aeronautical Chart series to trace a route taken by an older brother-in-law in 1942, who was then supply officer of a naval vessel bringing men and equipment to construct an airfield on Greenland’s west coast for use as a more secure refueling stop for supplying beleaguered Britain, in case the one then being used in Iceland should be attacked by German forces. That site, where an early freeze forced him and his ship to overwinter, lies near the inland end of a very long fjord (Sondre Stromfjord) close to the vast ice sheet’s edge. It is today one of Greenland’s few major airports, used as a regional transport center and jumping-off–place for ongoing scientific research on the ice-cap. While perusing the map, I noticed that not so far to the southeast, two particular areas, each some dozens of miles wide, are prominently shown on the ice-cap near its fringes with numerous summer meltwater ponds, along with their drainage courses complete with directional arrows. I was immediately struck upon realizing that this represented exactly the situation shown in probably the most alarming scene of Al Gore’s recent movie: meltwater torrents gouging huge gullies in the ice with people looking on, probably in that very same region. But an inconvenient truth for Gore’s filmed example is that the map in my hands was published back in 1950, using data compiled years earlier. So, scenes like that one—presented in the film as a frightening new phenomenon of ice-cap melting produced by global warming—were routine seasonal occurrences even at that time during a period of global cooling, and could have been photographed then from the air and doubtless were, to be mapped with such precision.

Has this summer melting near the edges of the Greenland ice-cap increased in recent decades? It does seem so, though reliable measurements or estimates cover a rather short span, much too short to confidently project into the longer future. And no matter how good the measurements may be, there is always the question: Are we seeing the beginning of a longer trend that may last hundreds of years or much more, or a short fluctuation of a few decades or so? (A very impressive recent air photo of that seasonal melting near the edge of the ice-cap, and another taken at ground level, both almost surely in the same area just noted, are shown in the June 2007 issue of National Geographic as part of a pointedly worried cover story.)

But then, near the opposite Pole, what about the oft-noted simultaneous cooling and thickening of ice in much of the interior of the greater Antarctic ice-cap—which contains fully ten times Greenland’s volume of ice? It’s worth noting here that the substantial regional warming and consequent breakup of several ice shelves fringing the long and narrow Antarctic Peninsula—the subject of so much present concern and publicity—occur too far north, away from the Pole and in warmer climes, to be considered typical of the rest of Antarctica, and that this so-much-watched peninsula accounts for only two percent of the continent’s total area.

In any case, it is clear that in terms of a feared raising of sea level, any melting of Greenland’s ice-cap cannot trump a widely cited half-century of cooling and ice gain of the far greater ice sheet covering East Antarctica.1

However, a recent cover article in the journal Nature indicates that such cooling may have been overestimated, and that according to the authors’ reconstructions this huge East region has even warmed slightly over their studied 50-year period of 1957–2006 (despite a very slight cooling they show for the 1969–2000 period)—and moreover, that the smaller but still substantial-sized West Antarctic region has warmed more than has been supposed, with an average net gain of 0.17 °C per decade over the above-noted 50 years. East Antarctica weighed in at a smaller net gain of 0.10 ± 0.07 °C per decade, a considerable possible-error estimate but even so a result on the net warming side. The continent-wide mean trend over the same 50-year period is given as a warming of 0.12 °C per decade, with the same error tolerance.2
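To make those per-decade figures concrete, here is a simple back-of-the-envelope multiplication over the paper’s 50-year (five-decade) study period. The arithmetic is mine, not the authors’, and it assumes the quoted uncertainties apply to the decadal trends and scale with the multiplier:

```latex
% Cumulative change implied by the per-decade trends over 1957-2006 (5 decades);
% the stated uncertainties are simply scaled by the same factor of 5.
\[
\begin{aligned}
\text{West Antarctica:} &\quad 0.17\,^{\circ}\mathrm{C/decade} \times 5 \approx 0.85\,^{\circ}\mathrm{C} \\
\text{East Antarctica:}  &\quad (0.10 \pm 0.07)\,^{\circ}\mathrm{C/decade} \times 5 \approx 0.50 \pm 0.35\,^{\circ}\mathrm{C} \\
\text{Continent-wide:}   &\quad (0.12 \pm 0.07)\,^{\circ}\mathrm{C/decade} \times 5 \approx 0.60 \pm 0.35\,^{\circ}\mathrm{C}
\end{aligned}
\]
```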

The authors also mention a common scientific opinion that any cooling of East Antarctica might be linked to the ozone hole in the upper atmosphere centered above that continent, and that this “hole” may be healed by the middle of the present century as the ban on further release of CFCs (chlorofluorocarbons) takes effect to restore the ozone layer, thereby reversing any recent cooling. However, we may have to wait decades to know whether, or how much, this effect will apply.

The main point here, however, is not so much to note present trends, whose duration can be little more than speculation based on fragmentary information, as to look at definitely known events in Earth history. Such an approach may be more revealing than computer models of global circulation that attempt to project future events on the basis of general assumptions and incomplete data, which routinely must be filled in with speculative “fudge factors” to produce a general conclusion. We’ll begin with recent well-known climatic events and go back from there.

Some centuries ago, what climatologists call the “Little Ice Age”—generally dated in Europe and the North Atlantic as the half-millennium or so from roughly 1300 to around 1850 (some mark its end at about 1900 but no sharp boundaries apply)—wreaked havoc on European and Near East civilizations, well described in a book of that title by archeologist Brian Fagan.3

That cold period was preceded by what is called the Medieval Warm Period, during which average temperatures often reached levels warmer than today in most of the then known world. From data based on studies of ocean-floor sediments as well as historical reports, this warm spell ran from the mid-800s to 1300 or so, with one peak in the late 800s and a warmer peak about 1100. The 1000s through 1200s marked the most productive development and major population growth of the two Norse settlement areas on southern Greenland’s west coast—although this has been overblown by some recent commentators such as a prominent conservative talk-show host who in his eagerness to debunk supposed modern Global Warming inadvertently showed his limited historical knowledge of that period and region by offhandedly speaking of rich midlatitude crops in Greenland, including wine grapes. (Actually, the farthest north that wine grapes have ever been raised on a major commercial scale—during those warmer times, to be sure—was in the South of England, which in the 1100s and 1200s even exported much wine to France. Norse Greenland in those warmer times supported lush summer grasses used for animal feed, but nary a grape—ever.)4

Looking at earlier centuries in Europe and the Near East, those same ocean-floor sediment studies in the North Atlantic show two temperature peaks considerably warmer than today’s, centered around 1000 BC and 500 BC (with a sharply colder trough between), followed by several centuries of warmth rather similar to today’s on both sides of Year 1 of the Christian or Common era. But later came a longer and definitely colder climatic trough, its long bottom stretching through the centuries from about AD 200 to 500. (Some can’t help noticing that this period marked the later stages of the Roman Empire during its decline and fall.) There was another plunge in the 600s to early 800s, though not as severe.

In any case, whatever the causes of historic climate change in earlier centuries and millennia may have been, obviously none of these changes can be attributed to any significant human input.

We now delve into prehistory. Modern science has revealed that for the last 10,000 years humans have been living in a prolonged overall warm period that saw an erratic but generally accelerating rise of civilization. This is the interglacial time known to science as the Holocene, the most recent (and continuing) “epoch” and by far the thinnest so-labeled slice of geologic time. But a closer look at these ten millennia, such as provided by the Greenland ice cores, shows not nice flat temperature levels but periodic, often sharp ups and downs that have often equaled and sometimes exceeded any of the fluctuations noted during the more recent span of written human records. The earliest such records found, inscribed or impressed on clay tablets, were unearthed from Sumerian Mesopotamia and have been dated back to about 3400 BC—i.e., 5,400 years ago. Is it pure happenstance that the first known appearance of writing, used to record in-kind temple tax collections of crops and animals during difficult times, coincided with one of the coldest temperature plunges of the last 10,000 years?

The time scales of such climatic wobbles varied from millennia in duration to, more typically, a few centuries or even just decades. Fagan has discussed these wobbles and their effects on humans, starting with the ending of the last Ice Age 10,000 years ago, in his book The Long Summer. One of the longer-lasting and more spectacular examples of climate change during that “summer”: a retreating Sahara Desert with expanding lakes in its southern and central parts that supported hippos and crocodiles in regions such as northern Mali between about 10,000 and 4,600 years ago, with normal rainfall during that time reaching as much as 6 to 16 inches annually in a region that today averages as little as a quarter of an inch. The final drying-up of the Sahara lakes occurred roughly at the time Egyptian pharaohs were building the first pyramids.5

A quite different major wobble, some 8,200 years ago, was a global cold snap caused by a sudden collapse (after ten millennia of slow off-and-on shrinkage) of much of the great North American ice sheet centered over today’s Hudson Bay. (Even after thousands of years of melting, the sheet still covered an area two to three times greater than that bay’s present extent.) At that time an immense build-up of meltwaters undermined the ice sheet’s southern parts and sent enormous torrents of floodwaters bounding down the Mississippi Valley into the Gulf of Mexico, while another giant freshwater outflow rushed eastward directly into the North Atlantic. All this triggered a 400-year “Mini Ice Age” by disrupting ocean currents—shutting down the Gulf Stream in the Atlantic to bring cold and drought to huge regions such as Europe and North Africa and even causing tropical ocean cooling of fully 3 °C (5.4 °F) in the Pacific Warm Pool off Indonesia, on the opposite side of the Earth. What’s more, the outflow caused worldwide rising sea levels of up to two inches per year, which transformed Britain and Ireland from part of the European mainland to the islands they are today.6

These and many other swings have been explored by Lloyd Keigwin of the Woods Hole Oceanographic Institution, using the earlier-mentioned sediment cores taken from the midocean North Atlantic floor by Keigwin and his team in the mid-1990s. He was one of the first to identify a roughly 1,500-year cycle of major climate changes from warm to cold and back to warm, reconstructing the ancient sea-surface temperatures from oxygen isotope ratios in the remains of tiny planktonic organisms buried in the sea-floor sediments.7

The existence of that cycle was corroborated by other researchers, as reported by Richard Kerr of the journal Science. Citing paleo-oceanographer Gerard Bond of Columbia University’s Lamont-Doherty Earth Observatory and others, Kerr observed that new evidence appears to confirm that the long cold snap (the Little Ice Age) was nothing exceptional. Instead, it was only the most recent swing in a climate oscillation that has been alternately warming and cooling the North Atlantic region, if not the globe, for ages upon ages.

Based partly on bottom cores, containing ice-rafted bits of rock, that Bond and his team raised well off Newfoundland, he found that the 1,000- to 2,000-year oscillation runs “through the Holocene and right into the Little Ice Age. The Little Ice Age was not an isolated event.”8

But what could cause such repeated cycles of more or less 1,500 years? Bond noted that evidence of these cycles is not limited to studies in the northern Atlantic. Similar links and timing have been cited by Bond and others from different kinds of evidence found in such diverse regions as other Atlantic basins off West Africa and Venezuela, the Arabian Sea off Pakistan, and the Sulu Sea off the Philippines, and also from land-based climatic evidence in places such as Germany, Yucatan, Oman, equatorial East Africa, lake sediments in southwestern Alaska, and the deep ice cores from Greenland and Antarctica. The links are global, not local.

The main cause of the 1,500-year climate cycle now seems likely to be rooted in cycles of solar output, whose potential importance has been given much scientific credence only in the present decade, since around the year 2000. Until satellite data became available in the late 1960s, the Sun’s output was regarded as a steady “solar constant,” and variations of only a fraction of a percent—even as low as 0.1 percent—had been thought too small to affect Earth’s climate cycles significantly. Bond suggested that such cycles, carrying through both a full Ice Age of over 100,000 years and our present much shorter warm interglacial of the last 10,000 years, including the recent Little Ice Age and the warming since, are “a pervasive feature of the climate system.”9
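For a sense of scale, the following rough calculation (mine, not Bond’s) converts that 0.1 percent figure into watts per square meter, using the commonly cited satellite-era value of the solar constant and Earth’s approximate reflectivity, neither of which is given in the article:

```latex
% A 0.1 percent change in total solar irradiance
% (S ~ 1361 W/m^2, the commonly cited satellite-era value):
\[
\Delta S \approx 0.001 \times 1361\ \mathrm{W/m^2} \approx 1.4\ \mathrm{W/m^2}
\]
% Averaged over the whole sphere (a sphere has four times the area of its
% cross-sectional disk) and allowing for Earth's roughly 30 percent albedo,
% the direct change in absorbed sunlight is:
\[
\Delta F \approx \frac{1.4}{4} \times (1 - 0.3) \approx 0.24\ \mathrm{W/m^2}
\]
```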

Since well before the end of the Little Ice Age somewhat over a century ago, scientists have known that extended cold periods coincide with low sunspot activity, though no one could explain what the connection, if any, might be or what it might mean; they basically just reported the bare fact. Only recently has it come to be realized that heightened sunspot activity indicates those very small fractional increases in solar output mentioned above, and that fewer or no sunspots for an appreciable period are a dead giveaway for diminished solar activity.

But how could such slight changes in solar output cause major fluctuations such as the Little Ice Age, flanked by warmings before and since? While these are dramatic changes to our human senses and greatly affect our land use, remember that to the whole Earth they are small. Another aspect to keep in mind is that such cooling or warming affects especially the higher latitudes, with smaller temperature changes in the tropics (where extended drought is a clearer indicator than cooling, and wetter periods a better clue than warming).

A pair of noted scientists, astrophysicist Nir Shaviv at the Racah Institute of Physics in Jerusalem and geologist Ján Veizer at the University of Ottawa in Canada, came up with some seminal and intriguing answers early in the 2000s. These are based on Veizer’s studies of Earth’s early temperatures, derived from isotopes in seashells going back 545 million years (about as long as shelled marine life has existed to leave fossils), and on Shaviv’s investigations of cosmic rays—electrically charged particles from interstellar space that continually reach our Solar System as we periodically pass back and forth through large arms of the Milky Way in 100- to 150-million-year cycles. Each arm has concentrations of billions of stars, which include exploding stars—supernovas—that shoot cosmic rays in all directions, including toward our Solar System. Shaviv and Veizer found a striking match: eras of high cosmic-ray bombardment strongly corresponded with cold eras on Earth such as ice ages, and eras of low cosmic-ray entry corresponded with warm eras. Shaviv and Veizer concluded that “once we introduce this cosmic-ray variance as a ‘driver’...we can explain up to 75 percent of the Earth’s paleo-[early] temperature variability.” They also found very little correlation of temperature with carbon dioxide levels in the atmosphere—notable given that, during the 545-million-year period of their study, CO2 levels were at times many times higher than today’s (in contrast with the recent quite small fractional increases now causing so much concern). Also, suggesting that cosmic-ray entry rates apply to short-term climatic cycles as well, they concluded that “we can rule out with a high confidence level models that do not include the effects of a variable CRF” [cosmic ray flux, or flow]. (Emphasis is theirs.)10

But then, just how would varying levels of cosmic-ray entry into the Solar System affect temperatures on Earth? While Shaviv and Veizer suggested that cloud cover would be a likely candidate, this question has since been investigated in depth by an eminent astrophysicist, Henrik Svensmark, Director of the Center for Sun-Climate Research in the Danish National Space Center in Copenhagen.

Svensmark’s major findings and conclusions can be summarized briefly. The Sun radiates not only rays of electromagnetic energy such as heat and light, but also a stream of minute charged particles known as the solar wind, which varies in strength with cycles of solar activity. When the solar wind is strong, it intercepts and scatters many of the incoming cosmic-ray particles from interstellar space entering the Solar System, thereby weakening their effect. But when the solar wind is weak, more cosmic rays reach Earth. The Sun’s activity thus largely controls the strength of cosmic-ray activity affecting Earth. When levels of cosmic rays penetrating the Earth’s atmosphere are high, they stimulate cloud formation by ionizing air molecules, helping to form the tiny particles that “seed” low-level clouds (those mostly below about 10,000 feet). He notes that “low-level clouds cover more than a quarter of the Earth and exert a strong cooling effect at the surface...which is not trivial.” He also reminds us that “Cloud tops have a high albedo [reflectivity] and exert their cooling effect by scattering back into the cosmos much of the sunlight that could otherwise warm the surface.”11

“A shiny Earth is cool” is a succinct summary in Svensmark/Calder of the effect of cosmic rays in cloud formation on a world scale. To follow up on his theory, Svensmark and his team conducted in 2005 a key laboratory experiment using a two-meter-high boxlike ion-reaction chamber financed initially by private benefactors and later boosted by Danish government funds. While no more than a highly simplified outline can be given here, clean air of oceanic composition was subjected to ultraviolet bursts to simulate solar activity. The natural cosmic rays continually penetrating the building produced free charged electrons (ions), causing millions of sulfuric acid vapor particles to clump together as “seeds” around which clusters of water droplets can condense, just as in the lower atmosphere. The result was a quick and visible forming of millions of water droplets in the chamber, like those producing actual clouds in the atmosphere.

A leading modeler at the chief U.S. agency for atmospheric research has frankly admitted that “Climate models do not do clouds well—they are perhaps the biggest problem we have in using climate models to make predictions about global warming.” This may explain why modelers usually dismiss clouds as passive participants in Global Warming.12

To recapitulate: A strong Sun produces a strong solar wind, repelling cosmic rays so that fewer of them reach Earth. This reduces low-cloud seeding and thus causes more surface warming, since less cloud cover allows more sunshine to get through to the Earth’s surface. But a weak Sun, and thus a weak solar wind, allows more cosmic rays to reach Earth, stimulating low-cloud seeding and causing surface cooling—since more clouds reflect more sunshine back into outer space. This ties solar activity to the heating and cooling of Earth’s surface in both short-term and very long-term cycles. It also magnifies the effect of small changes in solar activity. Svensmark has estimated this effect of cloud-forcing at four times that of small fluctuations in the Sun’s radiation.13
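Taking that fourfold estimate at face value, here is a purely illustrative continuation of the earlier arithmetic (again my own, using the same assumed irradiance and albedo values, which are not from the article):

```latex
% Direct globally averaged forcing from a ~0.1 percent solar fluctuation,
% as computed in the earlier sketch:
\[
\Delta F_{\mathrm{direct}} \approx 0.24\ \mathrm{W/m^2}
\]
% Applying Svensmark's estimated fourfold amplification via low-cloud cover:
\[
\Delta F_{\mathrm{effective}} \approx 4 \times 0.24\ \mathrm{W/m^2} \approx 1\ \mathrm{W/m^2}
\]
```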

If the findings of Keigwin, Bond, Shaviv, Veizer, Svensmark, and others are correct, the implications for climate science might be described as the reverse of Shakespeare’s famous observation on human affairs—that in this case, the answer may lie not in ourselves but in the stars.

Continuing our journey back in time, the ending stages of the recent 100,000-year full Ice Age were a chaotic period of truly wild but by no means unique global temperature swings. The maximum extent of the ice sheets, covering half of both North America and Europe, was reached 18,000 years ago; but it took more than 10,000 years of off-and-on-again warming for them to melt down to more or less their present extent in Greenland and Antarctica. There was first an erratic cooling to full ice-age temperatures, which at about 15,000 years ago was followed by a dramatic warming that briefly peaked 500 years later at close to present levels, and then by an equally erratic chilling that culminated in the “Younger Dryas” event, a deep chill lasting more than a millennium, dated at 12,800 to 11,600 years before the present. At its low point it rivaled the coldest times of the great Ice Age then ending.14

But after that big freeze, there began a pronounced warming (punctuated by further fluctuations, to be sure) that by 10,000 years ago had brought climates even a bit warmer than today’s, inaugurating our present relatively stable Holocene, or Recent, times.15

This time, in which we are still living and which has seen the unprecedented and still-ongoing explosion of human material and intellectual development, has been set into a longer view and strikingly clarified in simple, direct terms by an eminent geologist-paleontologist, Peter D. Ward at the University of Washington.

First, Ward puts it into context by noting that, according to current evolutionary ideas and knowledge, humans equal in intelligence to any on Earth today have probably been on this planet for a minimum of 100,000 years. How many Einsteins and Newtons must have lived during our species’s long existence, and why couldn’t they figure out that putting a seed into the ground causes a plant, a food plant, to grow? Why did we—Homo sapiens sapiens—spend a minimum of 100,000 years (and as much as 200,000 years) living in the open or in caves, living at low population numbers, living by hunting and gathering, without benefit of anything but the most rudimentary technology, and most importantly, without agriculture?

For at least 90,000 years our forebears and intellectual equals seem to have stared stoically through campfires at predators and scavengers, cold and starvation. And then, about 10,000 years ago, the nature of life on Earth radically changed. As the last of the Ice Age megamammals went extinct, we as a species began to multiply and reach population numbers never seen before. Within a few short millennia we had begun to craft complex tools, to smelt metal, to domesticate animals, and to build villages and towns, and finally cities. And most important of all, humans discovered agriculture at about the same time as the last mammoths and mastodons of North and South America died out.

Large-scale agriculture first appeared in Europe and the Middle East about 9,000 years ago and in East Asia 8,500 years ago. What trigger event opened the door to agriculture and set the scene for a revolution in human lifestyle? Clues to this mystery seem to lie in the thick glacial storehouse that is Greenland.

Great scientific discoveries usually come from the most unexpected sources, and such was the case in 1993.... In that year, after twenty years of boring, bringing to the surface, and analyzing layer by layer the Greenland ice cores and their patterns of ancient wind-deposited oxygen and carbon isotopes—cores only inches wide, raised from depths down to around two miles and dating back more than 200,000 years—European and American scientists were expecting to find evidence of stable climates broken only by epochs of slow temperature changes to match advances and retreats of the ice sheets. Ward continues:

They found nothing of the sort. The numbers emerging from the great mass spectroscopes across the world showed that fluctuations of Earth’s climate have been far more severe, and have occurred much more abruptly than any scientist had postulated—until 10,000 years ago, that is. This new discovery makes possible an entirely new interpretation of the rise of human civilization, and it certainly shows that our present-day weather—one of the prime bases for the concept of Uniformitarianism—is in fact very aberrant. We are currently in a state of calm, a period that has lasted 10,000 years.

Before that things were anything but calm.

For much of the last 2.5 million years, crystals of ice in the Greenland ice cap have faithfully adsorbed minute quantities of oxygen and carbon isotopes, and in the process they have created a record of the Earth’s climate. By looking at isotopic ratios of oxygen, we can deduce ancient temperatures. The analysis of oxygen isotopes from the Greenland ice cores have shown that, contrary to popular scientific belief, the climate over the past 250,000 years has changed frequently and abruptly; the magnitude of the global temperature changes has been far greater, and their intervals far shorter, than anyone imagined.

Dr. James White of the Institute of Arctic and Alpine Research at the University of Colorado noted in a recent summary of the project that between 200,000 and 10,000 years ago, average global temperature changed as much as 18 °F in a few decades. The current average global temperature is 59 °F. Imagine that it suddenly shot up to 75 °F or sank to 40 °F in a century or less. Another of the researchers working on this problem, Dr. Minze Stuever of the University of Washington, has told me that such dramatic changes could have taken place in as little as 5 years. We have no experience of such a world; such sudden perturbations in temperature would enormously affect the atmospheric circulation patterns, the great gyres that redistribute Earth’s heat. At a minimum, these sudden changes would create catastrophic storms of unbelievable magnitude and fury. Yet such changes were common until 10,000 years ago. Imagine a world where storms that dwarf Hurricane Andrew lash the continents not once a century but several times each year, every year. Imagine a world where tropical belts are suddenly assaulted by snow each year. This was our world until 10,000 years ago, when, according to the new studies from Greenland, a miracle happened: The sudden shifts of weather stopped.

In 1993 it was discovered that 10,000 years ago, intense global weather changes that had been the norm for the past 2.5 million years suddenly disappeared; the weather entered a 10-millennium calm. Very soon after the start of this calm, we as a species began to build villages and then cities. We learned to smelt metal. And most important, we learned how to tame crops and domesticate animals. Human population numbers began to soar.... Of one thing I am sure: There must be a connection between the cessation of mad temperature swings, 10,000 years ago, and the rise of human agriculture and civilization. And as we learned to sow and reap, surely our numbers rose as never before....16

It must be emphasized that Earth’s climate during the roughly 10,000 years of our present warm interglacial period that Ward describes has been “stable” only as compared with the truly chaotic ice age(s) that preceded it. Even so, that ten-millennium stability has, as we have seen, included many climatic changes of magnitudes great enough that—for both good and ill—they crucially affected and tested the course of human affairs. In our short career of 130,000 years as Homo sapiens by one definition, we have not only struggled through a full ice age, but in our most recent 10,000 years of relative warmth, climatic stability, and greatly accelerated progress we have quickly become, by our own measure, the dominant species on Earth. But will this always be so?

Looking back yet farther, prior to the most recent full Ice Age and much of the previous one, takes our perspective to an ongoing series of ice ages—each bringing enormous ice sheets many thousands of feet thick to periodically blanket the higher latitudes of the continents—with far shorter warm periods between, averaging more or less the length of our current period of interglacial warmth.

So far we have been looking back first in terms of decades, then centuries, then millennia, whose most consistent climatic feature has been persistent change—often major in terms of its effects on the struggle of species, humans included, to survive. Since the Greenland ice-cap does not reach back far enough to yield cores that indicate climates much before the most recent Ice Age, for a similarly detailed record of earlier times we must turn to the 12,000-foot-long ice core bored near the Russian Vostok station in Antarctica. It provides a record of the last 420,000 years and four separate ice ages lasting roughly 100,000 years each, separated by warm interglacial periods of more or less 10,000 years—only a tenth as long—whose warmth reached levels similar to the one we are still experiencing today.

Other, still earlier glaciations have been postulated by less precise means. It now appears that the present round of ice ages has numbered altogether at least eighteen, and more likely well into the twenties, all within what we might call the greater glacial age still continuing—which has now lasted about 2.5 million years.

Those worried about today’s presumed warming might reflect that the warm interglacial stage we still enjoy, now some 10,000 years along, is due at any time to reverse itself with a quick return to yet another ice age lasting 100,000 years or thereabouts.

In a subsequent work Peter Ward and his co-author, astronomer Donald Brownlee, comment:

We humans are blinded by the moment we live in, the brief ten thousand years of aberrant calm and warmth that marks this present interglacial. The reality is that such moments are rare and quickly pass, to be replaced by, on average, ninety-thousand years of numbing cold, ice, dust, and drought. Enjoy this summer. The forecast is for a long, brutal, and seemingly never-ending winter.17

(No need to panic yet—while its onset could come within the lifetime of anyone reading this, it could also still be hundreds, or possibly a couple thousand or so, years in the future. More precise estimates are not possible at this time.)

On earlier climate change, one of the sources listed here wryly observes that long ago, even if our species did not come close to extinction, abrupt changes of climate due to sudden changes in the Sun’s mood repeatedly plagued our ancestors. The bursts of warmth or cold could take effect during one human lifetime. They acted like a long series of intelligence tests, favoring the survival of clever and adaptable people through the opportunities of warm periods and the hazards of the cold.

Archaeologists have still to trace the many links between genetics, migrations, technologies, and climate change. But among the thousands of human generations, ours may be the first that was ever frightened by a warming.18

How much longer will this by now 2.5-million-year climatic seesaw—from long ice age to disturbingly short warm interglacial to yet another long ice age—last? No one knows, but Ward and Brownlee hazard a very rough guess of another 2 to 10 million years before long-term warming resumes in earnest.19

And resume it will, they emphasize. After citing episodes of even more intense chills hundreds of millions to a couple of billion years ago (including two known episodes of “Snowball Earth” when the oceans froze from the poles to the Equator), they point out that most of the time the Earth was warmer—to our human sensibilities often much warmer (if we could survive such torrid levels at all)—than today despite a then weaker Sun, on account of atmospheric carbon dioxide levels as high as 15–20 times today’s. What ultimately changed this, starting about 400 million years ago, was the rise of vascular land plants (ferns, shrubs, trees, etc.), which sucked up CO2 and locked huge amounts of carbon in the earth’s crust as decayed vegetation (a small fraction of which in time became coal, oil, and gas beds). With such enormous amounts of carbon dioxide thereby taken out of the atmosphere, one consequence was that the Earth began to cool despite a slowly but steadily warming Sun, thus enabling the possibility of serial ice ages.20

All this brings up a paradox: While climatic changes on timescales of human lifetimes, civilizations, and even species and broader life forms much older than the human cannot be predicted with great confidence or detail, for the really distant future our science now possesses sufficient knowledge to do so with virtual certainty. The present pattern of recurring ice ages will surely continue, though we’re not sure just how much longer. We are certain that continental drift from tectonic plate movement will continue, most likely reversing to form by 250 million years from now a giant supercontinent comparable to the Pangaea of 250 million years ago, all of which will produce huge world climatic changes whose exact nature will depend on that land’s configuration.

Independently of this, we know that the Sun has been continually getting hotter from its interior thermonuclear fires since they first ignited some 4.6 billion years ago, and that this is sure to continue until, within the next billion years, all complex life on Earth will very likely have vanished due to the searing heat. In a few more billion years the Sun itself will precipitously expand into a Red Giant whose bloated surface will approach or even reach Earth itself, melting its rocks and either swallowing and vaporizing our planet or frying it until nothing is left but a burnt-out cinder. Now, that’s real global warming.21

How can we be so sure this will in fact happen? Our galaxy and other galaxies provide astronomers with a grand theater of stars in all stages. The nuclear processes that make them shine are straightforward and well known. And all stars of the Sun’s mass and type follow this pattern on a predictable timescale. While there are many mysteries in cosmic evolution, this is not one of them.

But on such time-scales no sensible person is seriously going to worry about it. Nothing material is eternal. All things are created, flourish for a time, and ultimately die, even the Sun along with the billions and billions of other stars.

None of what has been said here is to imply that we now have all the answers to these questions—far from it. But some recent developments certainly challenge today’s conventional view, widely labeled Global Warming, that the current warming is both recent and caused primarily by human activities in the last century or so.

This view is currently supported by many political organizations—from the United Nations’ Intergovernmental Panel on Climate Change (IPCC) to many levels of national and regional governments, as well as a good many scientific organizations themselves, who have sadly become dependent on grants from such sources. In the end we may discover to our chagrin that human activity is a puny contributor indeed to the planetary climate compared with the Earth’s own dynamics, as well as solar, stellar, galactic, and for that matter still-undiscovered basic laws of the Universe.

A certain humility would seem to be in order.

In present-day terms: What does all this mean for our current global warming fears? Simply that everything is in a state of change, rapid or slow depending on one’s perspective. Our planet’s climate is always changing in one direction or another, often in several directions at once. No reliable general conclusions can be drawn from specific weather or climate events, such as a run of hot summers in the eastern U.S., a warm winter in Europe, or yearly or decadal changes in hurricane activity; or, for that matter, from the fact that a couple of years ago the West Siberian industrial city of Barnaul suffered its most frigid winter in a century, with one cold snap shutting it down for weeks when it ran out of fuel; or that subtropical Buenos Aires during the winter of mid-2007 had its first snowfall in living memory, just one part of an unusually cold winter in the southern hemisphere affecting South Africa, Australia, and New Zealand as well. Stasis, desired or not, does not exist.

Meteorologists, who study fickle daily weather changes, realize this more than most. Other scientists, fascinated by their recently developed computerized global circulation models, tend to over-rely on them—even though no models yet devised can begin to capture the incredible complexity of the interrelated factors affecting the Earth’s climate, many of which are very poorly resolved or unknown.

Where clear periodicities have been established—such as the roughly 100,000-year recurring cycles of full ice ages, the last four identified in great detail by means of ice-core analysis, with those ice ages separated by only some 7,000 to 12,000 years of warm interglacials averaging around the 10,000 years that our present one has already lasted, all this during the latter stages of the so far 2.5-million-year greater glacial age that saw the slow and unsteady rise of the genus Homo that led eventually to modern human beings like ourselves—our known Earth history may be more reliable than models.

Any present warming, if sustained, might be no more than an ongoing recovery from the extremely recent 500 years of the Little Ice Age.

Also, we might consider that that brief “age” in itself—and other recent fluctuations such as the Medieval Warm Period/Climatic Optimum, including the now so-often cited slight warming after the close of the 1970s—just could themselves turn out to be forerunners of an impending full Ice Age. (Let’s not yet be so dismissive of those scientists who feared exactly that well into the 1980s as a result of the four-decade global cooling episode that took place from roughly 1940 to the late 1970s, many of whom have since “converted” to belief in Global Warming—they could still turn out to have been right in the first place.)

If that should happen—and based on long-term known timing and repeated prior performance, it’s more likely at this point than any sustained warming—will civilization be snuffed out? Conceivably.

Certainly it would be most sorely tested.

But humans have been through such changes before, and in general seem to have improved their ability to cope with them. Implicit in the exhortations of some who cling to the Panglossian view that ours is the best of all possible worlds is the notion that any challenge, any change, not only must be feared but also, at their direction, can be stopped.

However, one thing is certain: natural climatic change is constant and unending, on any time-scale one may cite. And its magnitudes can—and many times have—far exceeded the dire predictions now proffered by those who suspect our climate change is all due to modern human overreaching. The Earth and the Sun, let alone the stars, are infinitely more powerful engines than anything mankind has produced. Has awe of our high-tech abilities gone to our heads? We, or our descendants, shall see. ■

Endnotes

1. The great regions known as “East” and “West” Antarctica are effectively separated by the Trans-antarctic Mountains, or, if preferred, by a line drawn between the deep indentations of the Ross and Weddell seas and their associated ice shelves. East Antarctica comprises a good four-fifths of the continent’s area. Both regions are covered by an enormous ice sheet with an average thickness of over 5,600 feet, and 15,600 feet at its thickest. The Antarctic ice sheet alone accounts for roughly 90 percent of the world’s ice (and 70 percent of its fresh water). By contrast the Greenland ice-cap, huge as it is, contains “only” about 9 percent of Earth’s ice, leaving just some 1 percent for all the smaller ice caps and mountain glaciers around the world.

Most of the general public have a very vague conception of the size and shape of Antarctica, since it’s necessarily shown stretched across the bottom of virtually all world maps in highly distorted form. To see its true shape one must either upend a globe, or consult a map centered on that continent (found in many world atlases), or a vertical satellite photo. Its area equals that of all 50 states of the U.S. plus all of Mexico. (Or, perhaps more easily visualized, Mexico and the Alaska-sized Gulf of Mexico plus the 48 contiguous U.S. states without Alaska.)

2. Nature, v. 457, pp. 459–62 (Jan. 22, 2009). “Warming of the Antarctic ice-sheet surface since the 1957 International Geophysical Year,” lead author Eric J. Steig at the University of Washington, Seattle. The third of five coauthors is Michael E. Mann of the famous “hockey stick” graph used in the U.N.’s Climate Change 2001 IPCC report, which scarcely indicates the Medieval Warm Period or Little Ice Age but depicts a sudden, steep 20th-century warming to unparalleled heights. In 2004 its math was found to be in fundamental error. The previous IPCC report of 1995 had clearly graphed and labeled both the Medieval Warm Period and the Little Ice Age, with the 20th century shown as cooler than the MWP and leveling off by 1980.

3. Brian Fagan, The Little Ice Age: How Climate Made History. Basic Books, 2000. Fagan’s three books on the general topic of historic climate change, most recently The Great Warming, focusing on the Medieval Warm Period (Bloomsbury Press, 2008), are of especial value on account of his linking such natural fluctuations to both human progress and setbacks. In doing so, Fagan for the most part stops short of any doctrinaire environmental determinism. Just where that line should be drawn, of course, is a matter of continual debate.

4. The lack of grapevines in Greenland is amply indicated in the Medieval Norse sagas by how astonished and impressed the Greenlander Leif Eiriksson and his party were in about the year 1000 to find wild grapes growing either in today’s New England or in the most northerly possible locale given the description, New Brunswick (but not in Newfoundland as some contend), resulting in his naming the country Vinland, meaning Wine Land.

5. Fagan, The Long Summer: How Climate Changed Civilization. Basic Books, 2004, pp. 151–52. The title refers to the period between the ending of the most recent full Ice Age about 10,000 years ago and about AD 1300 when the Little Ice Age began.

6. Ibid., pp. 107–08 (referring to the “Mini Ice Age”).

7. Lloyd D. Keigwin, “The Little Ice Age and Medieval Warm Period in the Sargasso Sea.” Science, v. 274, pp. 1504–08 (Nov. 29, 1996). Two charts are relevant here. First, his Figure 4-B (p. 1507), showing the major ups and downs of likely hemispheric temperatures for the last 3,000 years, is based on bottom deposits from the Bermuda Rise, northern Sargasso Sea. A second similar chart (Figure 2, p. 1506) extends this back to 10,000 years ago—i.e., the full length of our own interglacial period or Holocene, shown more roughly but with similar swings.

8. Richard A. Kerr, “The Little Ice Age—Only the Latest Big Chill.” Science, v. 284, p. 2069 (June 25, 1999). The periodicities of full cycles from the onset of a warm period through the end of the following cold period are reckoned at 1,000 to 2,000 years, averaging about 1,500 years apart. Kerr noted Bond’s significant finding (in his investigation of 140,000 years of climate cycles in the northern Atlantic) that rock debris raised from northern Atlantic bottoms, deposited by ice-age glaciers, “jumped in abundance every 1,500 years...as the great ice sheets surged toward the sea. The oscillations continued after the ice age ended 10,000 years ago, although at greatly reduced levels.” (Emphasis mine.) This last indicates that the 1,500-year cycle had continued through at least the last full Ice Age (which began some 120,000 years ago), and with climate swings at much greater intensities than in today’s warm interglacial—as Peter Ward, to be quoted presently, has described.

9. Gerard Bond et al., “Persistent Solar Influence on North Atlantic Climate During the Holocene.” Science, v. 294, pp. 2130–36 (Dec. 7, 2001). Also see Kerr’s advance summary and comment on Bond’s paper, “A Variable Sun Paces Millennial Climate,” Science, v. 294, pp. 1431–33 (Nov. 16, 2001).

10. Nir J. Shaviv and Ján Veizer, “Celestial driver of Phanerozoic Climate?” GSA Today (Geological Society of America), v. 13, pp. 4–10 (July 2003).

11. Henrik Svensmark, “Cosmoclimatology: A New Theory Emerges.” A&G (Astronomy & Geophysics), v. 48 (Feb. 2007), pp. 1.19–1.24. Svensmark’s scientific papers are more readable than most for nonspecialists, without sacrifice of scientific accuracy or detail.

12. Svensmark and Nigel Calder, The Chilling Stars: A Cosmic View of Climate Change, Icon Books Ltd., Thriplow, Cambridge, U.K., 2nd ed., 2008. In this book Svensmark and Calder explain the development of the theory and the workings of these processes in Earth history—and their implications concerning climate change—in terms laudably free from abstruse scientific jargon and which most general readers can follow. Quotation and comments on clouds and climate models on pp. 63–66. The ion-box experiment is the topic of chapter 4, pp. 99–131.

13. Another overview and assessment of these developments is in S. Fred Singer and Dennis T. Avery, Unstoppable Global Warming: Every 1,500 Years, Rowman & Littlefield, updated and expanded edition, 2008 (original edition published 2007). Singer’s summaries on the 1,500-year cycle, solar activity, and the solar wind’s blocking of cosmic rays are on pp. 4–6 of the Introduction and in chapters 1 and 2, pp. 15–38, of the 2008 edition.

14. Fagan, The Long Summer, p. 24. Well over a hundred ups and downs of climate change, from 22,000 years ago during the late stages of the recent full Ice Age up to the present, as revealed by analyses of the Greenland ice cores, are here charted in a zigzag graph—including both the big chill of the “Younger Dryas” and the sudden North American meltwater release of 8,200 years ago that brought on the 400-year “Mini Ice Age.”

15. For a different and revealing overview of change from the recent Ice Age to our warm interglacial period, including several maps showing the shrinking of the great North American ice sheets, see E. C. Pielou, After the Ice Age: The Return of Life to Glaciated North America (University of Chicago Press, 1991), pp. 5–18.

16. Peter D. Ward, The Call of Distant Mammoths. (Copernicus, 1997), pp. 198–99 and 201. (Noteworthy is that Dr. Minze Stuever, whom Ward casually mentions here, is known as the world’s second-most-cited scientist in geosciences.)

17. Ward and Donald Brownlee, The Life and Death of Planet Earth (Times Books, Henry Holt & Co., 2002), p. 86. Also, the 420,000-year temperature record from the Vostok ice core in Antarctica, which shows the four most recent full ice ages separated by five disturbingly short warm interglacials including our still-ongoing one, is charted on page 76. (Shown similarly in Fagan’s The Long Summer, p. 25.)

18. I leave it to you, Dear Reader, to guess from which of our sources this quote is taken. (Hint: It’s not Ward or Brownlee, but shouldn’t be too hard.)

19. Ward and Brownlee, ibid. (likewise on p. 86).

20. Ibid., pp. 62–66 on earlier greatly elevated atmospheric CO2 levels and the causes of their later drop; on “Snowball Earth,” pp. 64 and 75. See, too, Svensmark in his paper “Cosmoclimatology” (see note 11 above), p. 1.23; also his recent and wide-ranging discussion in chapter 6 of The Chilling Stars (2008). Both episodes of such worldwide superfreezes, about 2.3 billion years ago and some 700 million years ago, occurred before animals or complex plants existed.

21. Ibid., pp.23–34, 157–65. Brownlee and Ward here take the general reader on a concise and well-written journey from the birth of the Sun and Earth, through our planet’s final incineration by an expanding but dying Sun. A twelve-hour “clock” diagram on page 23 represents the Sun’s 12-billion-year career as an active star consuming its hydrogen fuel and slowly heating until near its end. On that clock’s face, the 1-billion-year time of complex plant and animal life on Earth occupies only the single hour between 4 and 5 o’clock. The present time—our “Now”—is just past 4:30. As the Sun continues to heat up, by 5 o’clock virtually all Earth’s plants and animals will be gone, with only microbial life left. By 8 o’clock the very oceans will have totally evaporated into space as the Sun heats further. Even the simplest microbes are now gone. Soon after 11 o’clock the Sun begins its enormous expansion, becoming what is called a Red Giant. At about 12 o’clock the Sun’s surface will closely approach or even reach the Earth’s orbit, to incinerate or swallow our planet completely. In its final acts as an active star, the Sun’s unimaginably dense core will undergo a series of “helium flashes,” followed by blowing half its total mass into outer space and destroying the outer planets as well.

(Our Sun will never undergo the far greater explosive stage of a supernova, a fate reserved for stars at least 8–10 times more massive than the Sun; the Supernova of 1987, which appeared in the Large Magellanic Cloud just outside our Milky Way galaxy and was the first visible to the naked eye since 1604, came from a star of some 18 solar masses.) After a cosmically brief time as a Red Giant, what’s left of our once life-giving Sun will then collapse to a superdense White Dwarf star of one percent its present diameter, merely Earth-size, and cool slowly over eons until it’s no more than a black cinder lost in space.

This process, and its inevitability for stars like our Sun, is described in detail in numerous astronomy works at both popular and professional levels.

About the author

Lee G. Madland is currently a consultant in Missoula, Montana. His doctorate in geography is from the University of California at Los Angeles, and he has taught at the University of Nevada, Las Vegas.