Posted on 11/21/2003 1:27:36 AM PST by neverdem
"New View of Data Supports Human Link to Global Warming," the New York Times reported yesterday. Well, perhaps.
It is a scientifically established fact that, all other things being equal, extra carbon dioxide in the atmosphere will tend to trap heat radiating from the earth's surface and warm our planet. But the real question is how much the carbon dioxide added to the atmosphere by burning fossil fuels will actually warm the earth.
Senator James Inhofe (R-Okla.) has called global warming a "hoax" perpetrated by extreme environmentalists. Of course, Inhofe is from a big oil-producing state and is derided by activists as a know-nothing yahoo, but perhaps he's on to something. Buried in the Times article was an acknowledgement that "the new research is showing that, at least so far, the influence of greenhouse gases appears to have been more modest than some climate experts once predicted."
The article cites a Remote Sensing Systems study, spearheaded by Frank Wentz, that re-analyzed 25 years' worth of satellite temperature data. Using 1979-2002 numbers measuring the atmosphere from the surface up to the stratosphere, Wentz's team found a warming trend of 0.115 degrees centigrade per decade. This contrasts with data from the rival team led by climatologist John Christy at the University of Alabama in Huntsville, which indicates that the same portion of the atmosphere is warming at a rate of 0.032 degrees centigrade per decade.
Christy notes that if one were to consider just the lower troposphere (the part of the atmosphere closest to the earth's surface), Wentz's temperature trend would be about 0.15 degrees centigrade per decade, in contrast to Christy's trend of 0.074 degrees centigrade per decade. The disparity between the two figures arises from the differing ways the two teams handle errors in the data sets, Christy says, and he believes both methods are statistically defensible. However, he contends that a long-running series of weather-balloon measurements strongly and independently confirms his temperature trends.
But let's set those statistical arcana aside for the moment, and just consider what either set of trends is telling us about the future of global warming. "I don't like to extrapolate," says Christy, "but we do have 25 years of good temperature data that are telling us something about how the atmosphere reacts to carbon dioxide."
So assume that Wentz's team has gotten it right and the lower troposphere is warming at a rate of 0.15 degrees centigrade per decade. Extrapolated linearly, that would make the earth about 1.5 degrees centigrade warmer in 2100 than it is today. If Christy is right, he believes, "We might see a degree of warming over the next century. Neither one of those temperature increases is going to cause much of a catastrophe."
One should also take into account the somewhat spotty temperature record compiled from surface thermometers, which suggests that the earth is warming by 0.17 degrees centigrade per decade. Extrapolating that trend yields an increase of 1.7 degrees centigrade by 2100. Again, no catastrophe.
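To make the extrapolations above explicit, here is a minimal sketch (my own illustration, not from the article) that multiplies each quoted per-decade trend by the roughly ten decades between now and 2100, assuming the warming continues in a straight line:

```python
# Rough linear extrapolation of the per-decade trends quoted above.
# Assumes straight-line warming; the article rounds the horizon to ten decades.

trends_c_per_decade = {
    "Wentz (RSS), lower troposphere": 0.15,
    "Christy (UAH), lower troposphere": 0.074,
    "Surface thermometer record": 0.17,
}

DECADES_TO_2100 = 10  # the article's round-number horizon

for source, trend in trends_c_per_decade.items():
    warming = trend * DECADES_TO_2100
    print(f"{source}: {trend:.3f} C/decade -> roughly {warming:.1f} C by 2100")
```

The article's round figures (1.5 degrees, roughly 1 degree, and 1.7 degrees) fall straight out of that simple linear assumption.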
Keep in mind that the earth's atmosphere warmed by 0.6 degrees centigrade over the past century, and most climatologists agree that most of that warming was not the result of higher levels of carbon dioxide in the atmosphere. If most of that 0.6 degrees had causes other than greenhouse gases, then the contribution of extra carbon dioxide and other greenhouse gases to the warming we have seen so far must have been modest.
In 2001, the United Nations Intergovernmental Panel on Climate Change (IPCC) cited various climate models predicting that the world's climate could warm by between 1.4 and 5.8 degrees centigrade (2.5 to 10.4 degrees Fahrenheit) by 2100. Of course, the high end of that range was the figure featured in headlines and cited by activists. Now, according to the Wentz and Christy data, the low end of the IPCC's predicted range looks like the more likely outcome. (And if you think the IPCC's climate models are a bit questionable, you really ought to look at how bad its economic models are, according to The Economist.)
But how heavily should we rely on these temperature data sets? Thomas Karl, director of the National Climatic Data Center, suggests in the article that in order to get a true picture of future global warming, we should look at a variety of data, such as melting glaciers and sea-surface temperatures, rather than "rely on a lone line of data." Karl likens the process to taking a school test. "Any conclusion will ultimately have to look like the results of a 100 question test. If you get a 90, you're probably on track."
Christy counters that not all the questions on Karl's notional climate test will have equal weight. "Measuring the temperature trends of the bulk of the atmosphere is like a fifty-point question, while many of the others are just half-point questions," he says. If you get the atmospheric temperature trends right, then you're well on your way to acing the test.
The New York Times correctly notes that satellite data trends now more closely match the predictions of climate models. It fails to note that this is largely because the refined models now predict lower temperature trends. The planet seems to be telling us that the climate models most sensitive to changes in carbon dioxide have gotten it wrong and need to be revised. So OK, global warming is not a "hoax," but the danger it poses to humanity and to nature is being exaggerated by activists. There is indeed a small amount of man-made global warming, but the scientific evidence is growing stronger that it's not much of a crisis.
Ronald Bailey, Reason's science correspondent, is the editor of Global Warming and Other Eco Myths (Prima Publishing) and Earth Report 2000: Revisiting the True State of the Planet (McGraw-Hill).
I'm almost positive that these idiots have never considered the fact that the sun might be responsible for global warming. "No, son, intelligent thinking like that could do irreparable damage to the earth." </sarcasm>
One should also take into account the somewhat spotty temperature record compiled from surface thermometers suggesting that the earth is warming by 0.17 degrees centigrade per decade. Extrapolating that trend yields an increase of 1.7 degrees centigrade by 2100.
Hmmm! The UN/IPCC relies on the GHCN surface data series for its evaluation of Global Disaster, and is dependent upon maintaining the credibility of those surface datasets.
Unfortunately for the Global Warming folks there are some big problems in those datasets:
McKitrick, Ross R., "An Economist's Perspective on Climate Change and the Kyoto Protocol," pp. 5-6. Presentation to the Department of Economics Annual Fall Workshop, The University of Manitoba, November 7, 2003.
http://www.uoguelph.ca/~rmckitri/research/econ-persp.pdf

In the early 1990s, the collapse of the Soviet Union and the budget cuts in many OECD economies led to a sudden sharp drop in the number of active weather stations.
***
Figure 3 shows the total number of stations in the GHCN [Global Historical Climatology Network] and the raw (arithmetic) average of temperatures for those stations. Notice that at the same time as the number of stations takes a dive (around 1990) the average temperature (red bars) jumps. This is due, at least in part, to the disproportionate loss of stations in remote and rural locations, as opposed to places like airports and urban areas where it gets warmer over time because of the build-up of the urban environment.
This poses a problem for users of the data. Someone has to come up with an algorithm for deciding how much of the change in average temperature post-1990 is due to an actual change in the climate and how much is due to the change in the sample. When we hear over and over about records being set after 1990 in observed global temperatures this might mean the climate has changed, or it means an inadequate adjustment is being used, and there is no formal way to decide between these.
Nevertheless, confident assertions are routinely made about changes in the global temperature on the order of tenths of a degree C per decade. The confidence masks pervasive uncertainty in the underlying concepts and data quality.
Figure 3. Number of stations in GHCN collection (diamonds, right axis); Average temperature of annual sample (bars, left axis in C). Source: see Taken By Storm chapter 4.
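To illustrate the sampling problem McKitrick describes above, here is a toy sketch of my own (invented numbers, not his data or his method): two fixed pools of stations, one colder and rural, one warmer and urban, where no individual site warms at all, yet the raw arithmetic average jumps simply because most of the rural pool stops reporting after about 1990.

```python
import random

random.seed(0)

# Toy illustration only: invented station climatologies, not GHCN data.
N_RURAL, N_URBAN = 60, 40
rural = [random.uniform(-5.0, 5.0) for _ in range(N_RURAL)]    # colder, remote sites
urban = [random.uniform(10.0, 20.0) for _ in range(N_URBAN)]   # warmer, urban/airport sites

def raw_average(temps):
    """Plain arithmetic mean over whatever stations happen to be reporting."""
    return sum(temps) / len(temps)

# Before ~1990: the full network reports.
print(f"Full network average:    {raw_average(rural + urban):6.2f} C")

# After ~1990: two-thirds of the rural stations stop reporting.
surviving_rural = rural[: N_RURAL // 3]
print(f"Reduced network average: {raw_average(surviving_rural + urban):6.2f} C")

# The second number is several degrees higher even though every station's
# climate is unchanged: the "warming" here is purely a sampling artifact,
# which is why some adjustment for the post-1990 station loss is needed.
```

Whether the adjustments actually applied to the GHCN record handle this well is exactly the question McKitrick raises.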
Note how well the instrumental global surface temperature series tracked the tropospheric balloon measurements up until about 1989, and how the two diverge after that, while the number of remote surface stations continues to decline:
http://www.uoguelph.ca/~rmckitri/research/trc.html
Somebody's got some 'splainin' tah do.