In general, science is strongest when it deals with repeatable, observable and limited phenomena. I explained in Genesis and Genes that
The third adjective [i.e. limited] refers to the fact that the phenomenon under study is to some extent isolated; it is more-or-less modular, and so can be studied independently, without having to take account of numerous interactions… studying the climate would not fit into this category because the climate is influenced by a huge number of factors ranging from the local density of trees to the cyclic appearance of sunspots.
Genesis and Genes is not about climate change. But I did devote substantial space to discussing how informed consumers of science should approach scientific research which is diffuse (i.e. the opposite of limited), and climate research is a paradigmatic example of such research. [Also of interest is my post The Wager, at https://torahexplorer.com/2013/01/22/the-wager/.]
Over the past century, climate research has blown hot and cold, with climatologists alternately declaring that we are going to freeze or fry. Readers who would like to explore the subject can begin by reading a brief but detailed history of climate change scares entitled Fire and Ice, available online.
Global Cooling: 1890s-1930s
Around 1850, America and Europe emerged from a lengthy period of cooling, called the Little Ice Age. So when The New York Times warned of new cooling in 1895, it was a prediction taken seriously. On February 24, 1895, the paper announced “Geologists Think the World May Be Frozen Up Again.” On October 7, 1912, the paper reported on page 1 that, “Prof. [Nathaniel] Schmidt [of Cornell University] Warns Us of an Encroaching Ice Age.” The same day the Los Angeles Times ran an article about Schmidt as well, entitled Fifth ice age is on the way. It was subtitled Human race will have to fight for its existence against cold. “Scientist says Arctic ice will wipe out Canada,” declared a front-page Chicago Tribune headline on August 9, 1923. “Professor Gregory” of Yale University stated that “another world ice-epoch is due.” He was the American representative to the Pan-Pacific Science Congress and warned that North America would disappear as far south as the Great Lakes, and huge parts of Asia and Europe would be “wiped out”. Switzerland would be “entirely obliterated,” and parts of South America would be “overrun”. In a New York Times article from September 20, 1922, a penguin found in France was viewed as an “ice-age harbinger”.
The Atlanta Constitution also commented on the impending ice age on July 21, 1923. It reported on the great increase of glaciers in the Arctic. Even allowing for “the provisional nature of the earlier surveys,” glacial activity had greatly augmented, “according to the men of science.” The Christian Science Monitor reported on the potential ice age as well, on July 3, 1923. “Captain MacMillan left Wicasset, Maine, two weeks ago for Sydney, the jumping-off point for the north seas, announcing that one of the purposes of his cruise was to determine whether there is beginning another ‘ice age,’ as the advance of glaciers in the last 70 years would seem to indicate.”
Swedish scientist Rutger Sernander also predicted a new ice age. He headed a Swedish committee of scientists studying “climatic development”. According to the Los Angeles Times on April 6, 1924, he claimed that the conditions “when all winds will bring snow, the sun cannot prevail against the clouds, and three winters will come in one, with no summer between,” had already begun.
Global Warming: 1930s-1970
Today’s global warming advocates – and certainly the general public – rarely realize how unoriginal their claims are. The USA entered the “longest warm spell since 1776,” according to a March 27, 1933, New York Times headline. One year earlier, the paper reported that “the earth is steadily growing warmer” in its May 15 edition. The Washington Post felt the heat as well and titled an article simply ‘Hot Weather’ on August 2, 1930. The Los Angeles Times beat both papers to the heat on March 11, 1929: “Most geologists think the world is growing warmer, and that it will continue to get warmer.”
Meteorologist J. B. Kincer of the federal weather bureau published a scholarly article on the warming world in the September 1933 Monthly Weather Review. The article began by discussing the “wide-spread and persistent tendency toward warmer weather” and asked “Is our climate changing?” Kincer proceeded to document the warming trend. Out of 21 winters examined from 1912-33 in Washington, D.C., 18 were warmer than normal and all of the past 13 were mild. New Haven, Connecticut, experienced warmer temperatures, with evidence from records that went “back to near the close of the Revolutionary War,” claimed the analysis. Using records from various other cities, Kincer showed that the world was warming.
Five years later, in 1938, British amateur meteorologist G. S. Callendar made a bold claim that many would recognize today: humanity, he argued, was responsible for heating up the planet with carbon dioxide emissions. He published an article in the Quarterly Journal of the Royal Meteorological Society. “In the following paper I hope to show that such influence is not only possible, but is actually occurring at the present time,” Callendar wrote. He went on the lecture circuit describing carbon-dioxide-induced global warming.
On November 6 the following year, The Chicago Daily Tribune ran an article titled Experts puzzle over 20 year mercury rise. It began, “Chicago is in the front rank of thousands of cities throughout the world which have been affected by a mysterious trend toward warmer climate in the last two decades.”
The trend continued into the 1950s. The New York Times reported that “we have learned that the world has been getting warmer in the last half century” on August 10, 1952. The following year, the paper reported that studies confirmed summers and winters were getting warmer. “Arctic Findings in Particular Support Theory of Rising Global Temperatures,” announced the paper during the middle of winter, on February 15, 1959. Glaciers were melting in Alaska and the “ice in the Arctic ocean is about half as thick as it was in the late nineteenth century.” A decade later, the New York Times reaffirmed its position that “the Arctic pack ice is thinning and that the ocean at the North Pole may become an open sea within a decade or two,” according to polar explorer Colonel Bernt Balchen in the February 20, 1969, piece.
Global Cooling: 1950s-1970s
The first Earth Day was celebrated on April 22, 1970, amidst hysteria about the dangers of a new ice age. The media had been spreading warnings of a cooling period since the 1950s, but those alarms grew louder in the 1970s. Three months before, on January 11, 1970, the Washington Post told readers to “get a good grip on your long johns, cold weather haters – the worst may be yet to come”, in an article titled Colder Winters Herald Dawn of New Ice Age. The article quoted climatologist Reid Bryson, who said “there’s no relief in sight”.
Fortune magazine won a Science Writing Award from the American Institute of Physics for its own analysis of the danger. “As for the present cooling trend a number of leading climatologists have concluded that it is very bad news indeed,” Fortune announced in February 1974. The article emphasized Bryson’s extreme doomsday predictions. “There is very important climatic change going on right now, and it’s not merely something of academic interest.” Bryson continued, “It is something that, if it continues, will affect the whole human occupation of the earth – like a billion people starving. The effects are already showing up in a rather drastic way.” [Reality check: the world population has increased by 2.5 billion since that warning.]
Fortune had been emphasizing the cooling trend for 20 years. In 1954, it picked up on the idea of a frozen Earth and ran an article titled Climate – the Heat May Be Off. The story informed its readers that “despite all you may have read, heard, or imagined, it’s been growing cooler – not warmer – since the Thirties.”
The claims of global catastrophe were chilling indeed (double-entendre intended). “The cooling has already killed hundreds of thousands of people in poor nations,” wrote Lowell Ponte in his 1976 book The Cooling. If the proper measures weren’t taken, he cautioned, then the cooling would lead to “world famine, world chaos, and probably world war, and this could all come by the year 2000.” The November 15, 1969, issue of Science News quoted meteorologist Dr. J. Murray Mitchell Jr.: “How long the current cooling trend continues is one of the most important problems of our civilization,” he said. Six years later, the periodical reported that “the cooling since 1940 has been large enough and consistent enough that it will not soon be reversed.” In 1975, Nigel Calder, a former editor of New Scientist, said that “The threat of a new ice age must now stand alongside nuclear war as a likely source of wholesale death and misery for mankind.” His analysis came from “the facts [that] have emerged” about past ice ages, according to the July/August issue of International Wildlife Magazine.
The New York Times ran warming stories into the late 1950s, but it too came around to the new fears. In 1975, the paper reported that “A Major Cooling [is] Widely Considered to Be Inevitable.”
Global Warming: 1980s-Present
I don’t think it is necessary to impress on readers the extent to which we have been bombarded with the most recent version of the climate Armageddon, so I won’t elaborate. Global warming has replaced the media’s ice age claims, but the results somehow have stayed the same – the deaths of millions or even billions of people, widespread devastation and starvation. The recent slight increase in temperature could, “quite literally, alter the fundamentals of life on the planet”, argued the January 18, 2006, issue of the Washington Post.
The warm currents of the Gulf Stream, according to a 2005 study by the National Oceanography Centre in Southampton, U.K., have decreased 30 percent. This has raised “fears that it might fail entirely and plunge the continent into a mini ice age,” as the Gulf Stream regulates temperatures in Europe and the eastern United States. This has “long been predicted” as a potential ramification of global warming.
The above brief description of climate predictions (and there is much more in Genesis and Genes about the subject) suffices, I believe, to convince reasonable people that very little credibility can be assigned to long-term scientific predictions about the climate and, especially, about the interaction of humanity with the global climate. Climatology is a worthy scientific subject, to be sure. But the climate is influenced by hundreds of factors, and its study involves massive extrapolations, relies on complex mathematical models, is largely devoid of large-scale rigorous experimentation and is subject to massive doses of political interference. The correct posture to adopt under these conditions is one of extreme scepticism. [The implications for evolutionary biology and cosmogony/cosmology are obvious.]
I emphasised in Genesis and Genes that one of the failures of contemporary science education is its preoccupation with test tubes, microscopes and partial differential equations at the expense of even a modicum of science history. If climate researchers (and the public!) were better informed about the history of their subject, the shocking shilly-shallying of the past century would be less likely – or so we hope. Sadly, most people who do science know almost nothing about Science, and that is partly due to the fact that they have no training in the history and philosophy of science. Santayana’s adage about those who cannot remember the past being condemned to repeat it applies in science as often as it does in politics.
In chapter 1 of Genesis and Genes, I focussed on systemic features of contemporary science which should be taken into consideration by informed consumers of science when assessing the credibility of various scientific claims. But I did not discuss one extremely important source of obfuscation which mediates the interaction between science and the public – the press.
Despite all the vacillation in climate predictions over the past century, some reporters refuse to adopt a neutral stance when it comes to the climate. CBS reporter Scott Pelley went so far as to compare climate change sceptics with Holocaust deniers. “If I do an interview with [Holocaust survivor] Elie Wiesel, am I required as a journalist to find a Holocaust denier?” he said in an interview with CBS News’ PublicEye blog. He added that the whole idea of impartial journalism just didn’t work for climate stories. “There comes a point in journalism where striving for balance becomes irresponsible,” he said. This ridiculous comment ignores an essential point: 30 years ago, the media were equally certain about the prospect of a new ice age. And, as we saw, that was only the most recent iteration of a century-long process of intellectual oscillation.
When one reads statements such as Pelley’s, one is immediately struck by how familiar his attitude is to informed consumers of science. Pelley can equate doubts about climatology with Holocaust denial because the intolerance he preaches emanates from science itself: senior scientists refer to the advocates of unpopular theories as “quasi-scientists”, “crackpots”, and purveyors of “utter, damned rot”. [The references are to Nobel winners Dan Shechtman and Robin Warren and to the pioneering geophysicist and meteorologist Alfred Wegener.] Those who are confident in their knowledge should not need to stoop to this type of vulgar vituperation; let them refute their opponents by presenting better evidence.
So don’t be surprised if, when pointing out the numerous, severe shortcomings of evolutionary biology, you are declared ignorant, stupid or insane [courtesy of Richard Dawkins]. And don’t be surprised if, when pointing out the gigantic lacunae of modern cosmology, you are deemed a reactionary, fanatic or fundamentalist. Eventually, the truth will prevail.