Until recently, questions about the accepted global warming models and other modeling strategies appeared only in fringe journals; there was no funding for differing points of view, the discussion was declared over, and only approved research was funded. Not even the most fantastic claims were questioned. Thus I enjoyed reading George F. Will’s editorial comments in the Washington Post yesterday, in which he said, “Barack Obama …promises that U.S. emissions in 2050 will be 83 percent below 2005 levels. If so, 2050 emissions will equal those in 1910, when there were 92 million Americans. But there will be 420 million Americans in 2050, so Obama's promise means that per capita emissions then will be about what they were in 1875.” It does put that promise in perspective.
The hacked emails from the University of East Anglia's Climatic Research Unit (CRU), a collaborator with the U.N.'s Intergovernmental Panel on Climate Change (IPCC), reveal some researchers' willingness to suppress or massage data, rig the peer-review process, and control the publication of scholarly work. The emails have also exposed weaknesses in the climate data and models used to forecast global warming, along with some rather questionable behavior. Richard Lindzen, a meteorology professor at MIT, recently wrote about his questions about those climate models.
It is likely that global average temperature has increased about 1.5 degrees Fahrenheit since the middle of the 19th century, but he points out that the quality of the data is poor and, because the changes are small, it is easy to nudge such data a few tenths of a degree in either direction. Several of the hacked CRU emails discuss how to do just that in order to support the prevailing theory of global warming.
Though these days when people say greenhouse gases most think of carbon dioxide, the main greenhouse substances in the earth's atmosphere are water vapor and clouds. Carbon dioxide represents less than 0.04% of the atmosphere, and its increase over the past hundred years or so is no doubt due to man’s impact on the earth. According to Professor Lindzen, even a doubling of CO2 would upset the original balance between incoming solar radiation and outgoing heat radiation by only about 2%, an imbalance he describes in terms of “climate forcing.”
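As a rough back-of-the-envelope check (my own illustration, not from Lindzen's article or the IPCC), the widely used simplified expression for CO2 forcing, about 5.35 × ln(C/C0) watts per square meter, compared against roughly 240 W/m² of outgoing radiation, lands in the same ballpark as that "about 2%" figure:

```python
import math

# Back-of-the-envelope check of the "~2%" figure.
# Assumptions (not from the post): the commonly cited simplified forcing
# formula dF = 5.35 * ln(C/C0) W/m^2, and roughly 240 W/m^2 of mean
# outgoing longwave radiation at the top of the atmosphere.
forcing_doubling = 5.35 * math.log(2.0)  # ~3.7 W/m^2 for a CO2 doubling
outgoing_radiation = 240.0               # W/m^2, approximate global mean

fraction = forcing_doubling / outgoing_radiation
print(f"Forcing from doubling CO2: {forcing_doubling:.1f} W/m^2")
print(f"As a share of outgoing radiation: {fraction:.1%}")  # about 1.5%
```

Under those assumptions the change works out to roughly 1.5%, consistent with the small perturbation Lindzen describes.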
The main statement publicized after the last IPCC Scientific Assessment two years ago was that it was likely that most of the warming since 1957 (an anomalously cold year) was due to man’s activity. This was based on the fact that the various climate models could not reproduce the warming from about 1978 to 1998 without some added climate forcing. The conclusion was that the forcing must be due to man, rather than to a fault in the models’ assumptions and understanding of climate.
Yet articles from major modeling centers have acknowledged that these models failed to anticipate the absence of warming over the past dozen years because they do not account for natural variability. So unaccounted-for variability is the reason the climate models did not predict the earth's failure to warm in the past decade, while that same unaccounted-for variability is offered as the evidence that man’s activity caused the warming of the previous twenty-year period.
Recall that, according to Professor Lindzen, a doubling of CO2 would change the balance between incoming and outgoing radiation by only about 2%; on its own, that corresponds to roughly a two degree Fahrenheit increase in global temperature. The potential for alarm over increasing CO2 levels rests on the concept of climate sensitivity: will an increase in CO2 be amplified into a much larger increase in global temperature?
Current climate models predict very high sensitivities of climate to changes in CO2. They do so because, in these models, the main greenhouse substances (water vapor and clouds) act to amplify anything that CO2 does; this is referred to as positive feedback. But as the IPCC notes, clouds remain a major source of uncertainty in current models. Low sensitivity is entirely compatible with the small warming observed over the past 12 years. The high-sensitivity models reproduce the currently small response to the steep increase in CO2 by using aerosols to arbitrarily cancel as much greenhouse warming as needed to match the data, with each model choosing a different degree of cancellation according to its sensitivity.
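To make the feedback point concrete, here is a minimal sketch (my own illustration, not a model from the post or from Lindzen) of how a no-feedback warming of roughly one degree Celsius per doubling is amplified or damped depending on the assumed net feedback factor:

```python
# Minimal illustration of how the assumed feedback changes climate sensitivity.
# Assumptions (not from the post): a no-feedback response of about 1.1 C per
# CO2 doubling (roughly the 2 F figure above), scaled by a simple gain
# 1 / (1 - f), where f is the net feedback factor: positive for amplifying
# water vapor/cloud feedbacks, negative for damping ones.
NO_FEEDBACK_WARMING_C = 1.1  # deg C per CO2 doubling

def equilibrium_warming(feedback_factor: float) -> float:
    """Warming per CO2 doubling for a given net feedback factor f < 1."""
    return NO_FEEDBACK_WARMING_C / (1.0 - feedback_factor)

for f in (-0.5, 0.0, 0.4, 0.65):
    print(f"feedback f = {f:+.2f} -> {equilibrium_warming(f):.1f} C per doubling")
```

Under these assumptions, the roughly 3 degree Celsius sensitivities of the high-feedback models require a net feedback factor near 0.65, while a small or negative net feedback keeps the response near or below the no-feedback value.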
Many of the disasters associated with global warming are simply normal occurrences recast as evidence of warming and taken as omens portending doom due to our carbon footprint. As George Will stated, “Some climate scientists …seem to suppose themselves a small clerisy entrusted with the most urgent truth ever discovered. On it, and hence on them, the planet's fate depends. So some of them consider it virtuous to embroider facts, exaggerate certitudes, suppress inconvenient data, and manipulate the peer-review process to suppress scholarly dissent and, above all, to declare that the debate is over.”
Now the debate should truly begin.