Dr. Noor van Andel, former head of research at Akzo Nobel, has a new paper out arguing that the data available to date contradict the notion of greenhouse-gas-induced global warming or 'climate change.' He notes that while there have been extensive efforts to 'prove' the 'greenhouse' warming theory by bringing computer models and observations into agreement, this has been done "strangely only by adjusting the measurements instead of adjusting the models," in other words, via unscientific means. Dr. van Andel instead finds that ocean oscillations and the cosmic ray theory of Svensmark et al. best explain climate change.
CO2 and climate change
Noor van Andel, noor@xs4all.nl, 17-01-2011.
Abstract: It is shown that tropical Pacific sea surface temperature anomalies are closely congruent to global temperature anomalies, and have been for more than a century. Once we understand the cooling mechanism over the tropical Pacific, and especially its CO2 dependency, we can draw conclusions about the global CO2 climate sensitivity.
It is shown that the cooling of the tropics, or trade-wind belt, is by deep convection, i.e. by a few thousand concentrated tropical thunderstorms that carry all the sensible and latent heat swept up by the trade winds all the way up to the tropopause. The physics of deep convection has been formulated since 1958 and is based on sound thermodynamics and measurements on location.
Temperature trends in the high atmosphere over the last half century are strongly negative, starting at the height the convection reaches. That means more CO2 has a cooling effect there rather than a warming effect: cloud tops radiate much more intensely than the thin air at this height, and this is as much the cause of the cooling as the CO2 increase.
This cooling trend is clearly at odds with the “greenhouse-gas-induced global warming” theory, but it is quite in accord with increasing deep convection. Adjusting these temperature measurements to bring them more into line with the climate models leads to unphysical conditions and processes. The response of upper-atmosphere temperature to volcanic eruptions also fits the deep-convection theory, but not the mainstream theory.
Not the CO2 increase, but two other parameters are the cause of climate change: ENSO, the El Niño Southern Oscillation (a large change in the cold-water upwelling along the coast of South America), correlates well with short-term climate change; and changes in the intensity of hard, deeply penetrating galactic cosmic radiation, well documented by 10Be deposits and 14C levels, correlate very well with long-term climate change, including ice ages.
My conclusion is that climate changes are not caused by greenhouse gases.
selected excerpt from the paper:
The global warming started in 1976 with the “big climate shift”; the trend stopped in 1999, but the climate stayed warm until 2010. We see that in the warming period 1979-2009 not only is the warming trend at the surface higher, but the cooling trend in the high tropical troposphere is more clearly enhanced. We even see a cooling trend over 1979-2009 replacing the 1958-2009 warming trend at the tropical 500-800 hPa height. We could even conclude that more CO2 cools the climate, because it cools the upper regions that the deep convection reaches, increasing the effective lapse rate over the whole height by 0.07 K/decade; over 2 decades and 12 km that means 0.07*2/12 = 0.012 K/km, not much, but we see in the table that a 0.1 K/km lapse-rate increase at an SST of 302 K raises the convection top by 1.5 km. So this CO2 cooling trend over 2 decades brings the convection top (1.5 km / 0.1 K/km) * 0.012 K/km = 180 m higher, which is not negligible.
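Before continuing the excerpt, a quick arithmetic check of the passage above (my own sketch; it uses only the figures that appear in the quoted calculation itself):

```python
# Re-running the excerpt's back-of-the-envelope numbers.
lapse_trend = 0.07          # K per decade (the value used in the quoted arithmetic)
decades, depth_km = 2, 12   # trend period and layer depth
sensitivity_km_per_lapse = 1.5 / 0.1   # km of convection-top rise per (K/km) of lapse-rate increase

lapse_increase = lapse_trend * decades / depth_km               # ~0.012 K/km over two decades
top_rise_m = sensitivity_km_per_lapse * lapse_increase * 1000   # ~175 m, i.e. the ~180 m quoted

print(f"lapse-rate increase ~{lapse_increase:.3f} K/km, convection top rises ~{top_rise_m:.0f} m")
```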
This behavior has been a problem for many, as it contradicts the global-warming-by-greenhouse-gases theory. So there has been a large activity to bring models and observations into agreement, strangely only by adjusting the measurements instead of adjusting the models.
and from the paper's conclusion:
Our present climate is due to an increased length of the last interglacial period, more than 10000 years, due to a low level of GCR [galactic cosmic rays] that maintains a low cloud cover, a low albedo, more absorbed sunshine and a pleasant climate. In the very long run, we need not worry about CO2 or global warming, but rather about higher GCR activity and global cooling. There is no way we can influence GCR activity, originating in active black holes and imploding supernovae.
h/t climategate.nl
see also Dr. van Andel's highly recommended presentation to the Dutch Meteorological Institute
Sunday, January 30, 2011
Friday, January 28, 2011
New Study: Subtropical North Atlantic Ocean has cooled "strongly" since 1998 to temperatures cooler than in 1981
According to climate scientist Roger Pielke Sr., ocean heat content provides the most appropriate metric for diagnosing global warming, rather than the conventional use of ground and atmospheric temperatures. A paper recently published in the Journal of Physical Oceanography finds that the upper subtropical North Atlantic Ocean has cooled strongly since 1998, with more than half of the upper-ocean warming over the 41 years from 1957 to 1998 erased by "strong" cooling over only 7 years, from 1998 to 2004. As shown in the graph below, upper-ocean temperatures in 3 different depth ranges were also found to be relatively stable since 2004, and each of the 3 depth ranges is cooler than it was in 1981. These data are the opposite of climate model predictions of a steady, accelerating rise in ocean heat content in response to rising greenhouse gas levels.
Vélez-Belchí, Pedro, Alonso Hernández-Guerra, Eugenio Fraile-Nuez, Verónica Benítez-Barrios, 2010: Changes in Temperature and Salinity Tendencies of the Upper Subtropical North Atlantic Ocean at 24.5°N. J. Phys. Oceanogr., 40, 2546–2555. doi: 10.1175/2010JPO4410.1
Abstract: Strong interest in multidecadal changes in ocean temperature and heat transport has resulted in the occupation of the North Atlantic Ocean hydrographic transect along 24.5°N five times since 1957, more than any other transoceanic section in the world. This latitude is chosen because it is where the northward ocean transport of heat in the Atlantic reaches its maximum. An analysis of the five oceanographic cruises at this latitude shows that there has been a significant cooling of −0.15°C in the upper ocean (600–1800-dbar range) over the last 7 years, from 1998 to 2004, which is in contrast to the warming of 0.27°C observed from 1957 to 1998. Salinity shows a similar change in tendency, with freshening since 1998. For the upper ocean at 24.5°N, 1998 was the warmest and saltiest year since 1957. Data from the Argo network are used to corroborate the strong cooling and freshening since 1998, showing a −0.13°C cooling in the period between 1998 and 2006 and revealing interannual variability between 2005 and 2008 to be much smaller than the decadal variability estimated using the transect. The results also demonstrate that Argo is an invaluable tool for observing the oscillations in the tendencies of the ocean.
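A quick check of the rates implied by the abstract's figures (my own arithmetic, using only the numbers quoted above):

```python
# Decadal rates and the "more than half erased" claim, from the abstract's figures only.
warming_total, warming_years = 0.27, 41     # deg C gained, 1957-1998 (as quoted)
cooling_total, cooling_years = 0.15, 7      # deg C lost, 1998-2004 (as quoted)

warming_rate = warming_total / warming_years * 10    # ~0.066 deg C per decade
cooling_rate = cooling_total / cooling_years * 10    # ~0.21 deg C per decade
fraction_erased = cooling_total / warming_total      # ~0.56, i.e. more than half

print(f"warming ~{warming_rate:.3f} C/decade, cooling ~{cooling_rate:.2f} C/decade")
print(f"fraction of the 1957-1998 warming erased: {fraction_erased:.0%}")
```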
A Skeptic's Guide to the Greenhouse Effect
A Skeptic's Guide to the Greenhouse Effect is a recommended new blog in town and appears to be a thought-provoking site, encouraging the interested reader to think outside the box that climate theorists constructed decades, even centuries, ago. Here's the latest entry, which ties into an earlier one, On the temperature profile of an ideal gas under the force of gravity:
Roy Spencer defends the Greenhouse Effect
In a blog post published some time ago, Roy Spencer defends the existence of a natural greenhouse effect. The entire post can be read here. I will pick out a few passages that concern the second law of thermodynamics. Spencer writes the following:
"A second objection has to do with the Second Law of Thermodynamics. It is claimed that since the greenhouse effect depends partly upon cooler upper layers of the atmosphere emitting infrared radiation toward the warmer, lower layers of the atmosphere, that this violates the 2nd Law, which (roughly speaking) says that energy must flow from warmer objects to cooler objects, not the other way around."
There is indeed a formulation of the 2nd law that states the following:
Heat flows spontaneously from higher to lower temperature
He goes on:
"There are different ways to illustrate why this is not a valid objection. First of all, the 2nd Law applies to the behavior of whole systems, not to every part within a system, and to all forms of energy involved in the system…not just its temperature. And in the atmosphere, temperature is only one component to the energy content of an air parcel."
What Spencer wants to say with this is somewhat obscure. The formulation I stated above can be found in standard textbooks on thermodynamics and is pretty straightforward. Furthermore, he states that the 2nd law applies to all forms of energy, not just temperature. First of all, temperature is not a form of energy; it relates to energy and entropy by the formula
1/T = dS/dE.
Secondly, what is the energy he refers to that is not included in the "temperature"? Does he mean the potential energy? Probably. The potential energy can indeed be folded into the heat capacity of the gas: in statistical mechanics one finds that the mean energy per particle of an ideal gas column in a gravitational field rises from (3/2)kT to (5/2)kT, so the heat capacity per particle increases from (3/2)k to (5/2)k. This accounts for the potential energy of the gas.
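Where the (5/2)kT comes from can be shown in a few lines (a sketch of the standard textbook calculation for an isothermal ideal-gas column under gravity, not something taken from Spencer's post):

```latex
% Mean energy per particle of an isothermal ideal-gas column under gravity,
% with heights distributed by the barometric (Boltzmann) factor e^{-mgz/kT}, z >= 0.
\begin{align*}
\langle E_{\mathrm{kin}} \rangle &= \tfrac{3}{2}kT \quad \text{(equipartition)} \\
\langle E_{\mathrm{pot}} \rangle &= \langle mgz \rangle
   = \frac{\int_0^{\infty} mgz\, e^{-mgz/kT}\, dz}{\int_0^{\infty} e^{-mgz/kT}\, dz} = kT \\
\langle E \rangle &= \tfrac{3}{2}kT + kT = \tfrac{5}{2}kT
   \;\Rightarrow\; c \equiv \frac{d\langle E \rangle}{dT} = \tfrac{5}{2}k
\end{align*}
```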
Furthermore he writes:
"Secondly, the idea that a cooler atmospheric layer can emit infrared energy toward a warmer atmospheric layer below it seems unphysical to many people. I suppose this is because we would not expect a cold piece of metal to transfer heat into a warm piece of metal. But the processes involved in conductive heat transfer are not the same as in radiative heat transfer. A hot star out in space will still receive, and absorb, radiant energy from a cooler nearby star…even though the NET flow of energy will be in the opposite direction. In other words, a photon being emitted by the cooler star doesn’t stick its finger out to see how warm the surroundings are before it decides to leave."
This is even more obscure. What precisely is the difference between conductive heat transfer and radiative heat transfer? Is it that when a hot metal plate loses heat to a colder plate, it does so because it has first measured the temperature of the colder plate and concluded that it was lower than its own?
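To put numbers on the radiative-exchange point being debated, here is a minimal sketch (my own illustration, not taken from Spencer's post or this blog) of two ideal blackbody surfaces: each radiates according to its own temperature, and the net exchange still runs from warm to cold, as the 2nd law requires.

```python
# Gross and net radiative exchange between two ideal blackbody plates. The cold
# plate still radiates toward the hot one, but the NET flow is hot -> cold.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def blackbody_flux(T):
    """Emitted flux of an ideal blackbody surface at temperature T (kelvin)."""
    return SIGMA * T**4

T_hot, T_cold = 300.0, 280.0          # assumed example temperatures
hot_to_cold = blackbody_flux(T_hot)   # ~459 W/m^2 emitted by the warm plate
cold_to_hot = blackbody_flux(T_cold)  # ~348 W/m^2 emitted by the cold plate
net = hot_to_cold - cold_to_hot       # net transfer is still warm -> cold (~111 W/m^2)

print(f"hot emits {hot_to_cold:.0f} W/m^2, cold emits {cold_to_hot:.0f} W/m^2, "
      f"net hot->cold {net:.0f} W/m^2")
```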
We will return to discuss this issue at length later. Stay tuned..
Thursday, January 27, 2011
Himalayan glaciers not melting because of climate change, report finds
Himalayan glaciers are actually advancing rather than retreating, claims the first major study since a controversial UN report said they would be melted within a quarter of a century.
The Telegraph 1/27/11
By Dean Nelson and Richard Alleyne
Researchers have discovered that contrary to popular belief half of the ice flows in the Karakoram range of the mountains are actually growing rather than shrinking.
The discovery adds a new twist to the row over whether global warming is causing the world's highest mountain range to lose its ice cover.
It further challenges claims made in a 2007 report by the UN's Intergovernmental Panel on Climate Change that the glaciers would be gone by 2035.
Although the head of the panel Dr Rajendra Pachauri later admitted the claim was an error gleaned from unchecked research, he maintained that global warming was melting the glaciers at "a rapid rate", threatening floods throughout north India.
The new study by scientists at the Universities of California and Potsdam has found that half of the glaciers in the Karakoram range, in the northwestern Himalaya, are in fact advancing and that global warming is not the deciding factor in whether a glacier survives or melts.
Dr Bodo Bookhagen, Dirk Scherler and Manfred Strecker studied 286 glaciers from the Hindu Kush on the Afghan-Pakistan border to Bhutan, taking in six areas.
Their report, published in the journal Nature Geoscience, found the key factor affecting their advance or retreat is the amount of debris – rocks and mud – strewn on their surface, not the general nature of climate change.
Glaciers surrounded by high mountains and covered with more than two centimetres of debris are protected from melting.
Debris-covered glaciers are common in the rugged central Himalaya, but they are almost absent in subdued landscapes on the Tibetan Plateau, where retreat rates are higher.
In contrast, more than 50 per cent of observed glaciers in the Karakoram region in the northwestern Himalaya are advancing or stable.
"Our study shows that there is no uniform response of Himalayan glaciers to climate change and highlights the importance of debris cover for understanding glacier retreat, an effect that has so far been neglected in predictions of future water availability or global sea level," the authors concluded.
Dr Bookhagen said their report had shown "there is no stereotypical Himalayan glacier" in contrast to the UN's climate change report which, he said, "lumps all Himalayan glaciers together."
Dr Pachauri, head of the Nobel prize-winning UN Intergovernmental Panel on Climate Change, has remained silent on the matter since he was forced to admit his report's claim that the Himalayan glaciers would melt by 2035 was an error and had not been sourced from a peer-reviewed scientific journal. It came from a World Wildlife Fund report.
He angered India's environment minister and the country's leading glaciologist when he attacked those who questioned his claim as purveyors of "voodoo science".
The environment Minister Jairam Ramesh had cited research indicating some Himalayan glaciers were advancing in the face of the UN's claim.
Abstract located here:
Spatially variable response of Himalayan glaciers to climate change affected by debris cover
Authors: Dirk Scherler, Bodo Bookhagen, Manfred R. Strecker
Nature Geoscience
Controversy about the current state and future evolution of Himalayan glaciers has been stirred up by erroneous statements in the fourth report by the Intergovernmental Panel on Climate Change [1,2]. Variable retreat rates [3-6] and a paucity of glacial mass-balance data [7,8] make it difficult to develop a coherent picture of regional climate-change impacts in the region. Here, we report remotely-sensed frontal changes and surface velocities from glaciers in the greater Himalaya between 2000 and 2008 that provide evidence for strong spatial variations in glacier behaviour which are linked to topography and climate. More than 65% of the monsoon-influenced glaciers that we observed are retreating, but heavily debris-covered glaciers with stagnant low-gradient terminus regions typically have stable fronts. Debris-covered glaciers are common in the rugged central Himalaya, but they are almost absent in subdued landscapes on the Tibetan Plateau, where retreat rates are higher. In contrast, more than 50% of observed glaciers in the westerlies-influenced Karakoram region in the northwestern Himalaya are advancing or stable. Our study shows that there is no uniform response of Himalayan glaciers to climate change and highlights the importance of debris cover for understanding glacier retreat, an effect that has so far been neglected in predictions of future water availability [9,10] or global sea level [11].
Climate change: Barack Obama less interested than Bush
Obama made no mention of climate change in his state of the union speech, appearing to signal a shift by the White House
The Guardian 1/26/11
Barack Obama has paid less attention to climate change in his State of the Union addresses than any other president in the past 20 years, an analysis by a British researcher has found.
Obama made no mention of the words climate change, global warming or environment in his hour-long speech on Tuesday night – when presidents typically employ the pomp and ceremony of the annual occasion to put forward their priorities before an American television audience in the tens of millions.
[Chart caption: Aggregate mentions of 'climate change', 'global warming' and 'the environment' in state of the union addresses since 1990.]

The omission was in stark contrast to the presidential candidate who campaigned in 2008 warning of the existential threat posed by climate change.
Even before the speech, however, Obama was exhibiting a reluctance to use the state of the union to make an explicit reference to the issue of climate change, Matthew Hope, a researcher in American politics at the University of Bristol, found.
In last night's speech, Obama did devote several minutes to the economic opportunities presented by innovations in clean energy, and the convenience that would come through developing high-speed rail. He repeated his 2009 commitment – endorsed by all G20 leaders – to end fossil fuel subsidies. "Instead of subsidising yesterday's energy, let's invest in tomorrow's," Obama said.
But in his three such addresses since becoming president, he has on average made fewer mentions of climate change or the environment than Bill Clinton or even George Bush.
"Clearly they have decided climate change is a no-go area," Hope said.
The choice of language for the most recent speech appears to signal a strategic shift by the White House. In a conversation with reporters today, Nancy Sutley, chair of the White House Council on Environmental Quality, avoided mention of climate change, though she offered assurances that Obama remained committed to the cause of clean energy. The White House has also removed references to climate change from its website.
On average, Obama has mentioned the words environment, climate change and global warming only once in his state of the union speeches. Clinton had an average of six mentions, while the former oil man Bush – who famously used his 2006 speech to lament America's addiction to oil – mentioned climate change and environment on average twice.
The researcher rated the speeches only on the mention of the terms environment, climate change, and global warming. He did not track mention of related issues such as green jobs, or clean tech.
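As a rough illustration of the kind of tally Hope's analysis implies (my own sketch; the function, the handling of the term list, and the sample text are illustrative assumptions, not Hope's actual code or data):

```python
import re

# Hypothetical sketch of counting term mentions in a speech transcript.
# The terms mirror those the article says were tracked; the text is a stand-in.
TERMS = ["climate change", "global warming", "environment"]

def count_mentions(speech_text: str) -> dict:
    """Return how many times each tracked term appears (case-insensitive)."""
    text = speech_text.lower()
    return {term: len(re.findall(r"\b" + re.escape(term) + r"\b", text)) for term in TERMS}

sample = "Instead of subsidising yesterday's energy, let's invest in tomorrow's."
print(count_mentions(sample))   # {'climate change': 0, 'global warming': 0, 'environment': 0}
```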
In his study, Hope goes on to note that Obama appears to be on a downward trajectory in his mentions of climate change.
Robert Brulle, a professor of sociology and environmental science at Drexel University, said the administration appeared to be following the advice of ecoAmerica and the Breakthrough Institute, which argue that reframing the problem of climate change as an energy quest would be more popular with voters.
"In my opinion, this approach has several major drawbacks," Brulle wrote in an email. He called the White House strategy intellectually dishonest and short-sighted.
"The only real reason to transform our energy systems is to address GHG emissions. But by failing to even acknowledge the threat posed by climate change, the reasoning for an energy transformation is very thin."
Despite his choice of language, Obama to date has done more than Bush or Clinton to address global warming. But Brulle warned: "Taking a technology-only approach without meaningful mechanisms to drive adoption of renewable energy means further delay in initiating the massive GHG reductions that are needed to deal with climate change."
Liberal bloggers suggested this morning that it was naive of Obama to think he could persuade Americans to act on climate change without talking about it. "I do continue to think that it is both pointless and foolish, catastrophically so, in fact, for him to refuse to talk about global warming or climate change with so much of America watching," said Joe Romm, who runs the Climate Progress blog at the Center for American Progress.
There has been increasing concern among environmental organisations that Obama is prepared to give up on greenhouse gas measures so as to try to build better relationships with Republicans in Congress and the business community.
Such fears were amplified by Obama's failure to use his speech to signal his support for the Environmental Protection Agency, which is under assault from Republicans.
Obama instead reaffirmed a pledge last week to do away with overly complicated environmental regulations, making a joke about the bureaucracy involving salmon.
The announcement before the speech that Carol Browner, the energy and climate change adviser, is to leave the White House has also heightened fears that Obama has given up on his campaign promise to take action on climate change.
go to source for images and links
Saturday, January 22, 2011
Does Helping the Planet Hurt the Poor?
Yes, if We Listen to Green Extremists
By Bjørn Lomborg WSJ.com 1/22/11
Peter Singer poses an interesting and important question: Can we afford to both reduce poverty and clean up the environment? From an empirical standpoint, the answer is definitely yes. The developed world is sufficiently rich that doing both should be well within our means.
The key, of course, is being smart about how we tackle these big problems. Right now, the only legally binding climate policy, the European Union's 20-20 policy, will cost its members $250 billion in lost economic growth every year over the next century (according to research by the noted climate economist Richard Tol). Yet the net effect will be an almost immeasurable reduction in global temperatures of just 0.1 degrees Fahrenheit by 2100. If spent smartly, the same resources really could fix both global warming and poverty.
In a curious way, Mr. Singer's essay is an example of one of the stumbling blocks to making smarter policy decisions. He starts out saying we want to do a variety of good things, but almost reflexively he ends up focusing on green issues—and doing so in a very predictable way: The developed world has sinned and needs to atone.
Mr. Singer correctly points out that concerns over the environment and poverty are often linked. But he thinks about this only in terms of how poverty is bad for the environment, since poorer, less educated people tend to have more children, which puts more pressure on such things as forests and biodiversity.
But his argument can—and should—be taken further. As we get richer and such immediate concerns as water, food and health become less of an issue, we become more open to environmental concerns. Among other things, we become more willing to pay extra for technology that pollutes less and to accept more costly regulations to limit pollution.
We've already seen the results of this "greening" of society in the developed world, where for a number of decades air and water pollution has been dropping steadily. In London, which keeps the best statistics, air pollution maxed out in 1890 and has been declining ever since—to the point where the air is now cleaner than it has been at any time since 1585. In similar fashion, in some of the better-off developing countries, the focus has shifted from creating to cleaning up pollution. Today the air in both Mexico City and Santiago, Chile, is getting healthier.
Mr. Singer also evades the awkward point that an excessively green approach can actually make the environment more imperiled. Consider the fate of the world's forests. As we get richer and more environmentally conscious, our growing passion for organic farming and antipathy to genetically modified crops inevitably leads us to accept decreased agricultural yields. An obvious consequence is that we end up converting more wilderness to agricultural use.
We've seen similar unintended consequences from the use of inefficient first-generation biofuels such as ethanol. As a result of pressure from environmentalists and lobbying by agricultural interests, use of these fuels was made mandatory by many governments in the industrialized world. Diverting farm products into our gas tanks has driven up food prices, resulting in more starvation and wasted resources and causing still more forests to be razed.
Mr. Singer criticizes the use of cost-benefit analysis because it doesn't value human lives at the same rate in developed and developing countries. As uncomfortable as it may be, the reality is that we don't actually think of all people as equal. If we did, we would be building all of our new hospitals in developing countries. Mr. Singer may regard this fact as shameful, but ignoring the ethical judgment of nearly everyone makes his analysis less helpful.
Similarly, Mr. Singer criticizes the way that discounting is used by economists to make future costs comparable to values in the present. He argues that we should give "equal weight to the interests of future generations." Once again, this may sound admirable. But think about the consequences of heeding Mr. Singer's advice. By choosing a discount rate close to zero, we effectively say that the desires of infinite numbers of future generations are vastly more important than our own, meaning that we should save the great bulk of our resources for the future and consume just enough to survive. Essentially, our generation should eat porridge, while we leave virtually all benefits to the future.
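To make the role of the discount rate concrete, here is a minimal numerical sketch (my own illustration; the specific rates are assumptions chosen only to span the range discussed):

```python
# Present value of a cost of $100 incurred 100 years from now, under different
# annual discount rates. Near-zero rates make far-future costs loom almost as
# large as present ones; higher rates shrink them dramatically.
def present_value(future_cost: float, rate: float, years: int) -> float:
    return future_cost / (1.0 + rate) ** years

for rate in (0.001, 0.014, 0.04):   # illustrative: near-zero, Stern-like, conventional
    pv = present_value(100.0, rate, 100)
    print(f"discount rate {rate:.1%}: present value ${pv:.2f}")
# roughly $90, $25 and $2 respectively
```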
This was what the economist Nicholas Stern concluded in the controversial 2006 review of climate change that he conducted for the British government. Mr. Stern said, in effect, that we should be saving 97.5% of all our wealth for future generations. The silliness of this view becomes apparent when we realize that, by this logic, our children and grandchildren also would be expected to continue the cycle of bowing to future generations, leaving almost everything to their progeny and pushing forward an ever larger mountain of resources that are never to be consumed.
We don't behave this way. Partly because we are selfish and partly because we expect that future generations are likely to be much better off than we are. Compared with the future, we are the poor generation, and it is hardly moral to have the poor generation pay the most. Rather, it makes sense to leave generalized assets, such as knowledge and technology, to future generations. This gives them a much greater capacity to tackle problems that come their way. Our actual financial savings for the future tend to be about 15% of income. We could debate whether the number should be 10% or 20%, but it is far-fetched to suggest that it should be 97.5%. We all recognize that we should care for the future, but at the same time we should care for ourselves.
Mr. Singer falls into the trap of saying that global warming is so terrible that dealing with it should take priority over all other concerns. This is simply wrong. Global warming is a problem that we must confront, but according to economic modeling by Carlo Carraro of the University of Venice, its damage is likely to cost something on the order of 2% to 5% of GDP by the end of the century.
At the same time, it is helpful to recall that our fossil-fuel economy has created amazing opportunities for almost everyone in the world, lifting hundreds of millions of people out of poverty. The United Nations climate panel estimates that economic growth will increase per capita GDP in developing countries by some 2,400% over the course of the century.
Global Warming saves 200,000 lives per year
Mr. Singer claims that problems related to climate change (such as an increased incidence of malaria) cause 140,000 deaths a year. Let's put aside for the moment the fact that rising temperatures are likely to do more good than harm on this score, preventing so many cold-related fatalities that the net effect of global warming is likely to be a total of about 200,000 fewer people dying each year.
Even if we accept Mr. Singer's concerns, is fighting global warming through drastic carbon cuts really the best way to help people with malaria? By implementing the Kyoto protocol (at a cost of $180 billion a year), we could reduce the number of annual malaria deaths by 1,400. But we could prevent 850,000 malaria deaths a year at a cost of just $3 billion simply by providing adequate supplies of mosquito nets and medicine. For every potential malaria victim saved through climate policy, we could save 36,000 people through smarter, cheaper remedies for malaria.
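The 36,000-to-1 comparison follows directly from the figures quoted above; a quick sketch using only those numbers:

```python
# Cost per malaria death averted, using only the figures quoted in the article.
kyoto_cost, kyoto_deaths_averted = 180e9, 1_400        # $/year, deaths/year
nets_cost, nets_deaths_averted = 3e9, 850_000          # $/year, deaths/year

cost_per_life_kyoto = kyoto_cost / kyoto_deaths_averted     # ~ $129 million
cost_per_life_nets = nets_cost / nets_deaths_averted        # ~ $3,530
ratio = cost_per_life_kyoto / cost_per_life_nets            # ~ 36,000

print(f"Kyoto: ${cost_per_life_kyoto:,.0f} per death averted")
print(f"Nets/medicine: ${cost_per_life_nets:,.0f} per death averted")
print(f"Ratio: roughly {ratio:,.0f} to 1")
```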
From Mr. Singer's initial question of whether we can afford to both reduce poverty and clean up the environment, he ends up focusing on global warming and arguing that we simply need to "use less air-conditioning and less heat, fly and drive less, and eat less meat." This is a poor prescription, not only for those of us in developed nations but for developing countries and for future generations as well. It is an incredibly expensive way to achieve very little—and it won't happen.
Fortunately, there is a more sensible way forward that could use the same $250 billion that the European Union is expecting to waste annually on ineffective global warming policies. First, we should spend about $100 billion a year on research and development to make green energy cheaper and more widely available. Mr. Singer argues that it is not ethically defensible just to hope for a "technological miracle" that will allow us to end our reliance on fossil fuels. He is right. We must invest much more in green energy research and development, and it is the most politically realistic and economically efficient way to combat global warming.
This would leave $50 billion a year to develop adaptations for dealing with the impact of global warming and $100 billion a year for the world's poor, a sum that, according to the U.N., would go a long way toward providing them with clean drinking water, sanitation, food, health and education.
We are perfectly capable today of tackling the problems of both poverty and environmental pollution. But to do so, we must think clearly and rationally, and we must carefully weigh the costs and benefits of the approaches available to us.
—Mr. Lomborg is the author of "The Skeptical Environmentalist" and "Cool It." He directs the Copenhagen Consensus Center and is an adjunct professor at Copenhagen Business School.
Cost-Benefit Analysis
In 2008, Mr. Lomborg's Copenhagen Consensus Center convened some of the world's top economists to evaluate how $75 billion could be best used to solve global problems.
At the top:
Micronutrient supplements for children (vitamin A and zinc)
For an annual cost of $60.4 million, the economists projected a yield of more than $1 billion in benefits.
Tuberculosis management
In 22 countries with a high incidence of TB, diagnosis and treatment would yield $1.7 trillion in benefits for a cost of $18.3 billion.
At the bottom:
Global-warming mitigation
Spending $800 billion on carbon taxes was found to generate only $685 billion worth of benefit.
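Read as benefit-cost ratios, the sidebar's figures rank themselves; a quick sketch using only the numbers quoted above:

```python
# Benefit-cost ratios implied by the sidebar's figures (costs and benefits as quoted).
projects = {
    "Micronutrient supplements": (60.4e6, 1e9),       # (cost, benefit) in dollars
    "Tuberculosis management": (18.3e9, 1.7e12),
    "Global-warming mitigation (carbon taxes)": (800e9, 685e9),
}
for name, (cost, benefit) in projects.items():
    print(f"{name}: benefit/cost ~ {benefit / cost:.1f}")
# Micronutrients ~16.6, TB ~92.9, carbon taxes ~0.9 (benefits below cost)
```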
Wednesday, January 19, 2011
Plant Responses to Global Warming
NIPCC Report Repost:
Reference: Lin, D., Xia, J. and Wan, S. 2010. Climate warming and biomass accumulation of terrestrial plants: a meta-analysis. New Phytologist 188: 187-198.
Lin et al. (2010) introduce the subject of their study by saying "most models predict that climate warming will increase the release of carbon dioxide from the terrestrial biosphere into the atmosphere, thus triggering positive climate-terrestrial carbon feedback which leads to a warmer climate." However, they state that the "stimulation of biomass accumulation and net primary productivity of terrestrial ecosystems under rising temperature (Rustad et al., 2001; Melillo et al., 2002; Luo et al., 2009) may enhance carbon sequestration and attenuate the positive feedback between climate warming and the terrestrial biosphere." So which view is correct?
In an effort to find out, Lin et al. conducted a meta-analysis of pertinent data they obtained from 127 individual studies that were published prior to June 2009, in order to determine if the overall impact of a substantial increase in the air's CO2 concentration on terrestrial biomass production would likely be positive or negative.
The three scientists report that for the totality of terrestrial plants included in their analysis, "warming significantly increased biomass by 12.3%," while noting there was a "significantly greater stimulation of woody (+26.7%) than herbaceous species (+5.2%)." They also found that the warming effects on plant biomass production "did not change with mean annual precipitation or experimental duration," and that "other treatments, including CO2 enrichment, nitrogen addition, drought and water addition, did not alter warming responses of plant biomass."
The Chinese researchers conclude, in their words, that "results in this and previous meta-analyses (Arft et al., 1999; Rustad et al., 2001; Dormann and Woodin, 2001; Walker et al., 2006) have revealed that warming generally increases terrestrial plant biomass, indicating enhanced terrestrial carbon uptake via plant growth and net primary productivity." Thus, we can logically expect that (1) the ongoing rise in the air's CO2 content will soften its own tendency to increase global temperatures, while at the same time (2) blessing earth's terrestrial vegetation with greater growth rates and biomass production, both in the agricultural arena and throughout the planet's many natural ecosystems.
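Meta-analyses of this kind typically pool effects as log response ratios and report the back-transformed percentage change; whether Lin et al. used exactly this estimator is an assumption here, but the conversion between the two forms is straightforward, as this sketch shows:

```python
import math

# Converting between a percent biomass stimulation and the log response ratio
# commonly pooled in ecological meta-analyses (assumed estimator, for illustration).
def percent_to_lnrr(percent_change: float) -> float:
    return math.log(1.0 + percent_change / 100.0)

def lnrr_to_percent(lnrr: float) -> float:
    return (math.exp(lnrr) - 1.0) * 100.0

for pct in (12.3, 26.7, 5.2):   # overall, woody, and herbaceous stimulations quoted above
    print(f"+{pct}% biomass  <->  lnRR = {percent_to_lnrr(pct):.3f}")
```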
Monday, January 17, 2011
Richard Lindzen: A Case Against Precipitous Climate Action
The Global Warming Policy Foundation, 15 January 2011
The notion of a static, unchanging climate is foreign to the history of the earth or any other planet with a fluid envelope. The fact that the developed world went into hysterics over changes in global mean temperature anomaly of a few tenths of a degree will astound future generations. Such hysteria simply represents the scientific illiteracy of much of the public, the susceptibility of the public to the substitution of repetition for truth, and the exploitation of these weaknesses by politicians, environmental promoters, and, after 20 years of media drum beating, many others as well.
Climate is always changing. We have had ice ages and warmer periods when alligators were found in Spitzbergen. Ice ages have occurred in a hundred thousand year cycle for the last 700 thousand years, and there have been previous periods that appear to have been warmer than the present despite CO2 levels being lower than they are now. More recently, we have had the medieval warm period and the little ice age. During the latter, alpine glaciers advanced to the chagrin of overrun villages. Since the beginning of the 19th Century these glaciers have been retreating. Frankly, we don’t fully understand either the advance or the retreat.
For small changes in climate associated with tenths of a degree, there is no need for any external cause. The earth is never exactly in equilibrium. The motions of the massive oceans where heat is moved between deep layers and the surface provides variability on time scales from years to centuries. Recent work (Tsonis et al, 2007), suggests that this variability is enough to account for all climate change since the 19th Century.
For warming since 1979, there is a further problem. The dominant role of cumulus convection in the tropics requires that temperature approximately follow what is called a moist adiabatic profile. This requires that warming in the tropical upper troposphere be 2-3 times greater than at the surface. Indeed, all models do show this, but the data doesn't and this means that something is wrong with the data. It is well known that above about 2 km altitude, the tropical temperatures are pretty homogeneous in the horizontal so that sampling is not a problem. Below two km (roughly the height of what is referred to as the trade wind inversion), there is much more horizontal variability, and, therefore, there is a profound sampling problem. Under the circumstances, it is reasonable to conclude that the problem resides in the surface data, and that the actual trend at the surface is about 60% too large. Even the claimed trend is larger than what models would have projected but for the inclusion of an arbitrary fudge factor due to aerosol cooling. The discrepancy was reported by Lindzen (2007) and by Douglass et al (2007). Inevitably in climate science, when data conflicts with models, a small coterie of scientists can be counted upon to modify the data. Thus, Santer, et al (2008), argue that stretching uncertainties in observations and models might marginally eliminate the inconsistency. That the data should always need correcting to agree with models is totally implausible and indicative of a certain corruption within the climate science community.
It turns out that there is a much more fundamental and unambiguous check of the role of feedbacks in enhancing greenhouse warming that also shows that all models are greatly exaggerating climate sensitivity. Here, it must be noted that the greenhouse effect operates by inhibiting the cooling of the climate by reducing net outgoing radiation. However, the contribution of increasing CO2 alone does not, in fact, lead to much warming (approximately 1 deg. C for each doubling of CO2).
Full essay
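Lindzen's point above about the moist adiabatic profile can be illustrated numerically. The sketch below integrates a pseudoadiabat for a saturated surface parcel at 300 K and again at 301 K and compares the warming aloft with the 1 K surface warming; the constants, the Bolton saturation-vapour-pressure formula, and the 1000-150 hPa integration range are standard textbook choices, not taken from the essay, and treating the parcel as saturated from the surface is a simplification.

```python
import math

# Rough illustration of moist-adiabatic amplification: integrate a pseudoadiabat
# for a saturated tropical surface parcel at 300 K and at 301 K, then compare the
# temperature difference aloft with the 1 K difference at the surface.
Rd = 287.04     # gas constant for dry air, J kg^-1 K^-1
cpd = 1005.7    # specific heat of dry air, J kg^-1 K^-1
Lv = 2.501e6    # latent heat of vaporisation, J kg^-1
eps = 0.622     # molar-mass ratio, water vapour / dry air

def e_sat(T):
    """Saturation vapour pressure in hPa (Bolton 1980), T in K."""
    Tc = T - 273.15
    return 6.112 * math.exp(17.67 * Tc / (Tc + 243.5))

def r_sat(T, p):
    """Saturation mixing ratio (kg/kg) at temperature T (K) and pressure p (hPa)."""
    es = e_sat(T)
    return eps * es / (p - es)

def dT_dp(T, p):
    """Pseudoadiabatic lapse rate dT/dp (K per hPa)."""
    rs = r_sat(T, p)
    return (Rd * T + Lv * rs) / (p * (cpd + Lv**2 * rs * eps / (Rd * T**2)))

def moist_adiabat(T_surf, p_surf=1000.0, p_top=150.0, dp=-1.0):
    """Euler-integrate temperature along a pseudoadiabat; return {pressure: T}."""
    profile, T, p = {}, T_surf, p_surf
    while p > p_top:
        profile[round(p)] = T
        T += dT_dp(T, p) * dp
        p += dp
    return profile

cool = moist_adiabat(300.0)  # reference surface temperature
warm = moist_adiabat(301.0)  # the same parcel with a 1 K warmer surface
for level in (500, 300, 200):
    print(f"{level} hPa: {warm[level] - cool[level]:.2f} K aloft per 1 K at the surface")
```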
Monday, January 10, 2011
Climate Models Differ on CO2 Warming Effect by over 32°F
A paper published today in the journal Climate of the Past illustrates the magnitude of confusion in climate science regarding the 'settled' 'basic physics' of the CO2 'greenhouse effect.' The climate model results of this paper are compared to 2 other recent peer-reviewed papers and show that the 3 climate models differ by over 32 degrees F (18°C) in explaining the 'greenhouse warming' effect of CO2 during the period of time when the entire Earth was covered by ice (the "snowball Earth"). This huge difference dwarfs the IPCC-claimed computer-modeled 0.6°C of anthropogenic global warming during the industrial age and the IPCC-claimed 3°C global warming prediction for doubled CO2 concentrations derived from the same family of computer models. As this study gingerly points out, these are "large differences" between climate models, resulting from differing "assumptions" of the "model physics," in other words, due to whatever fudge factors one chooses to plug in for the 'greenhouse effect' of CO2. All claims of catastrophic anthropogenic global warming rest upon the shaky scientific foundations and gross assumptions of these same climate models.
Three climate models compared for the global temperature claimed to result from a 0.2 bar CO2 atmospheric level (conversions checked in the sketch after the list):
1. Hu et al. finds 268 K = -5.15°C = 22.73°F
2. Pierrehumbert (2004, 2005) finds 255 K = -18.15°C = -0.67°F
3. Le Hir et al. finds (for half the CO2, i.e. 0.1 bar) 270 K + 3 K (the temperature increase claimed for doubled CO2 per the IPCC) = 273 K = -0.15°C = 31.73°F
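The unit conversions in that list are straightforward to verify. A minimal sketch, with the model labels used only as shorthand for the papers cited in the abstract below:

```python
# Quick check of the Kelvin / Celsius / Fahrenheit conversions in the list above.
def kelvin_to_celsius(t_k):
    return t_k - 273.15

def celsius_to_fahrenheit(t_c):
    return t_c * 9.0 / 5.0 + 32.0

results_K = {
    "Hu et al. (CAM3), 0.2 bar CO2": 268.0,
    "Pierrehumbert (2004, 2005), 0.2 bar CO2": 255.0,
    "Le Hir et al. (2007), 0.1 bar CO2 + 3 K for doubling": 273.0,
}

for model, t_k in results_K.items():
    t_c = kelvin_to_celsius(t_k)
    print(f"{model}: {t_k:.0f} K = {t_c:.2f} C = {celsius_to_fahrenheit(t_c):.2f} F")

spread = max(results_K.values()) - min(results_K.values())
print(f"Spread between models: {spread:.0f} K = {spread * 9.0 / 5.0:.1f} Fahrenheit degrees")
```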
Final Revised Paper (PDF, 470 KB)
Clim. Past, 7, 17-25, 2011 www.clim-past.net/7/17/2011/ doi:10.5194/cp-7-17-2011
Model-dependence of the CO2 threshold for melting the hard Snowball Earth
Y. Hu, J. Yang, F. Ding, and W. R. Peltier
Abstract. One of the critical issues of the Snowball Earth hypothesis is the CO2 threshold for triggering the deglaciation. Using Community Atmospheric Model version 3.0 (CAM3), we study the problem for the CO2 threshold. Our simulations show large differences from previous results (e.g. Pierrehumbert, 2004, 2005; Le Hir et al., 2007). At 0.2 bars of CO2, the January maximum near-surface temperature is about 268 K, about 13 K higher than that in Pierrehumbert (2004, 2005), but lower than the value of 270 K for 0.1 bar of CO2 in Le Hir et al. (2007). It is found that the difference of simulation results is mainly due to model sensitivity of greenhouse effect and longwave cloud forcing to increasing CO2 [in other words the 'basic physics' of the CO2 'greenhouse effect']. At 0.2 bars of CO2, CAM3 yields 117 Wm−2 of clear-sky greenhouse effect and 32 Wm−2 of longwave cloud forcing, versus only about 77 Wm−2 and 10.5 Wm−2 in Pierrehumbert (2004, 2005), respectively. CAM3 has comparable clear-sky greenhouse effect to that in Le Hir et al. (2007), but lower longwave cloud forcing. CAM3 also produces much stronger Hadley cells than that in Pierrehumbert (2005).
Effects of pressure broadening and collision-induced absorption are also studied using a radiative-convective model and CAM3. Both effects substantially increase surface temperature and thus lower the CO2 threshold. The radiative-convective model yields a CO2 threshold of about 0.21 bars with surface albedo of 0.663. Without considering the effects of pressure broadening and collision-induced absorption, CAM3 yields an approximate CO2 threshold of about 1.0 bar for surface albedo of about 0.6. However, the threshold is lowered to 0.38 bars as both effects are considered.
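To put the abstract's radiation numbers side by side, one can sum each model's clear-sky greenhouse effect and longwave cloud forcing and compare the gap with the roughly 3.7 W m−2 of forcing commonly cited for one CO2 doubling; the 3.7 figure is an outside reference value, not taken from the abstract. A rough sketch:

```python
# Side-by-side totals of the longwave numbers quoted in the abstract (W m^-2),
# plus a scale comparison against the ~3.7 W m^-2 commonly cited for one CO2
# doubling (an outside reference value, not from the abstract).
cam3 = {"clear_sky_greenhouse": 117.0, "longwave_cloud_forcing": 32.0}
pierrehumbert = {"clear_sky_greenhouse": 77.0, "longwave_cloud_forcing": 10.5}

total_cam3 = sum(cam3.values())            # 149.0 W m^-2
total_ph = sum(pierrehumbert.values())     # 87.5 W m^-2
gap = total_cam3 - total_ph                # 61.5 W m^-2
forcing_per_doubling = 3.7                 # W m^-2, approximate canonical value

print(f"CAM3 total longwave trapping:          {total_cam3:.1f} W m^-2")
print(f"Pierrehumbert total longwave trapping: {total_ph:.1f} W m^-2")
print(f"Inter-model gap:                       {gap:.1f} W m^-2")
print(f"Gap in units of one CO2 doubling:      {gap / forcing_per_doubling:.0f}x")
```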
Christopher Booker: The Met Office Fries While The Rest Of The World Freezes
The Sunday Telegraph, 9 January 2011
First it was a national joke. Then its professional failings became a national disaster. Now, the dishonesty of its attempts to fight off a barrage of criticism has become a real national scandal. I am talking yet again of that sad organisation the UK Met Office, as it now defends its bizarre record with claims as embarrassingly absurd as any which can ever have been made by highly-paid government officials.
Let us begin with last week’s astonishing claim that, far from failing to predict the coldest November and December since records began, the Met Office had secretly warned the Cabinet Office in October that Britain was facing an early and extremely cold winter. In what looked like a concerted effort at damage limitation, this was revealed by the BBC’s environmental correspondent, Roger Harrabin, a leading evangelist for man-made climate change. But the Met Office website – as reported by the blog Autonomous Mind – still contains a chart it published in October, predicting that UK temperatures between December and February would be up to 2C warmer than average.
So if the Met Office told the Government in October the opposite of what it told the public, it seems to be admitting that its information was false and misleading. But we have no evidence of what it did tell the Government other than its own latest account. And on the model of the famous Cretan Paradox, how can we now trust that statement?
Then we have the recent claim by the Met Office’s chief scientist, Professor Julia Slingo OBE, in an interview with Nature, that if her organisation’s forecasts have shortcomings, they could be remedied by giving it another £20 million a year for better computers. As she put it, “We keep saying we need four times the computing power.”
Yet it is only two years since the Met Office was boasting of the £33 million supercomputer, the most powerful in Britain, that it had installed in Exeter. This, as Prof Slingo confirmed to the parliamentary inquiry into Climategate, is what provides the Met Office both with its weather forecasting and its projections of what the world’s climate will be like in 100 years (relied on, in turn, by the UN’s Intergovernmental Panel on Climate Change). Prof Slingo fails to recognise that the fatal flaw of her computer models is that they assume that the main forcing factor determining climate is the rise in CO2 levels. So giving her yet more money would only compound the errors her computers come up with.
In another interview, just before Christmas, when the whole country was grinding to a halt in ice and snow, Prof Slingo claimed that this was merely a local event, “very much confined to the UK and Western Europe”. Do these Met Office experts ever look beyond those computer models which tell them that 2010 was the second hottest year in history? Only a few days after she made this remark, the east coast of the USA suffered one of the worst snowstorms ever recorded. There have been similar freezing disasters in south China, Japan, central Russia and right round the northern hemisphere.
The only evidence the Met Office and its warmist allies can adduce to support their belief in the warmth of 2010 is that in certain parts of the world, such as Greenland, Baffin Island and the southern half of Hudson Bay, it was warmer than average. Yet even there temperatures are currently plummeting: Hudson Bay and Baffin Island are rapidly freezing, at well below zero.
The desperate attempt to establish 2010 as an outstandingly warm year also relies on increasingly questionable official data records, such as that run by Dr James Hansen, partly based on large areas of the world which have no weather stations (more than 60 per cent of these have been lost since 1990). The gaps are filled in by the guesswork of computer models, designed by people who have an interest in showing that the Earth is continuing to warm.
It is this kind of increasingly suspect modelling that the Met Office depends on for its forecasts and the IPCC for its projections of climate a century ahead. And from them our politicians get their obsession with global warming, on which they base their schemes to spend hundreds of billions of pounds on a suicidal energy policy, centred on building tens of thousands of grotesquely expensive and useless windmills.
A vivid little reflection of how our whole official system has gone off the rails was the award in the New Year’s Honours List of a CBE, one rank lower than a knighthood, to Robert Napier, the climate activist and former head of the global warming pressure group WWF-UK, who is now the Met Office’s chairman. The more the once-respected Met Office gets lost in the greenie bubble into which it has been hijacked, the worse it becomes at doing the job for which we pay it nearly £200 million a year, and the more our Government showers it with cash and honours.
Meanwhile, in the real world, another weather-related disaster is unfolding in the Sea of Okhotsk, off the coast of Russia north of Japan, where the BBC last week reported that a group of Russian “fishing trawlers” had got stuck in “30 centimetres” (a foot) of ice. It didn’t sound anything too serious. But, as my colleague Richard North has been reporting on his EU Referendum blog, the BBC underestimated the scale of what is happening by several orders of magnitude.
Although several smaller ships have now escaped, the two largest are still trapped in up to six feet (two metres) of ice – including one of the world’s biggest factory ships, the 32,000-ton Sodruzhestvo. They still have more than 400 men on board. Three Russian ice-breakers, including two huge 14,000-tonners, are engaged in what looks like a forlorn bid to free them. A 14,000-ton ice-breaker can scarcely clear the way for a ship well over twice its size. And as the weather worsens, with gales, blizzards and visibility often reduced to zero, the chances of helicoptering the men to safety seem sadly remote.
The mystery is why the Russians should, in the middle of winter, have allowed such a fleet of ships into a stretch of sea known as “the factory of ice”. This is because all the rivers which empty into it from the Russian coast lower its salinity, making it prone to rapid freezing. But the Sea of Okhotsk has long been held out by the world’s warmists as an example, like the Arctic, of waters which, thanks to global warming, will soon be ice-free.
As we know from Prof Slingo, however, all this cold weather we are having at the moment is a local event, “very much confined to the UK and Western Europe”. Perhaps the Russian fishing fleet took the word of the Met Office, assuming that ice was a thing of the past. As the ice-breakers struggle to reach the hundreds of trapped men, and still-thickening ice threatens to start crushing the hulls of their ships, it seems that, short of a miracle like that which saved the Chilean miners, a major tragedy could be unfolding.
Meanwhile, the sad little nonentities in charge of our Met Office prattle on, extending their begging bowls – and our politicians who have put them there remain smugly and inanely oblivious to anything happening out there in the real world.
Friday, January 7, 2011
Natural Variability, Not CO2, Accounts for Late 20th Century Warming
An article posted today on the NIPCC website cites recent research finding that natural variability, such as ocean oscillations, is strong enough to overwhelm any signal of global warming from anthropogenic CO2 in the historical sea surface temperature record:
A critical but difficult question is how much of the warming of the past 100 years is due to human activity. When multiple forcings are varying and poorly characterized, and there is also internal variation (such as ocean oscillations), this question becomes even more difficult to answer. In this paper, the authors use a spatial fingerprinting technique in an attempt to accomplish this.
Specifically, a set of climate models run in "control" or unforced mode were used to develop a 300 year dataset of spatial ocean temperature data. It was found that an internal pattern, detectable using a spatial fingerprinting technique, could be identified in the simulated data. This spatial pattern of ocean temperature anomalies was labeled the Internal Multidecadal Pattern (IMP). It was found that this pattern is highly coherent with the Atlantic Multidecadal Oscillation (AMO) historical patterns and predicted the Pacific Decadal Oscillation (PDO), suggesting that the models were able to match the internal dynamics of the real Earth system.
Next, the authors extracted, also with discriminant fingerprinting, the forced component of the spatial patterns produced in the absence of the IMP as an orthogonal function, which they demonstrated has only a minor effect (less than 1/7 amplitude) on the IMP. They then used historical sea surface temperature data to evaluate the relative importance of the forced vs. IMP components of change from 1850.
In considering the latter portion of the record (1946-2008), results indicated that the internal variability component of climate change (the IMP) operated in a cooling mode between 1946 and 1977, but switched to a warming mode thereafter (between 1977 and 2008), suggesting that the IMP is strong enough to overwhelm any anthropogenic signal. Of this the authors state: "Specifically, the trend due to only the forced component is statistically the same in the two 32-year periods and in the 63-year period. That is, the forced part is not accelerating. Taken together, these results imply that the observed trend differs between the periods 1946-1977 and 1977-2008 not because the forced response accelerated, but because internal variability lead to relative cooling in the earlier period and relative warming in the later period" [italics added].
With respect to the entire record, the authors state that the 150 year-long trend of temperature is not explained by the IMP. In their Figure 4, it is seen that the forced component spatial fingerprint began to deviate from no trend sometime after 1920. But, this type of analysis does not distinguish between types of forcing (e.g., solar vs. anthropogenic). Nevertheless, the results in this paper suggest that simple extrapolations of rates of warming from 1980 onward overestimate the forced component of warming. Using this period without factoring out internal variability will likely lead to unrealistic values of climate sensitivity.
Reference: DelSole, T., Tippett, M.K., Shukla, J. 2010. A significant component of unforced multidecadal variability in the recent acceleration of global warming. Journal of Climate doi: 10.1175/2010JCLI3659.1.
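The paper's central point, that a non-accelerating forced trend plus a multidecadal oscillation can yield very different observed trends over 1946-1977 and 1977-2008, is easy to illustrate with synthetic numbers. The sketch below uses an entirely hypothetical trend and oscillation (not the paper's IMP), chosen only so the oscillation is roughly in a cooling phase over the first period and a warming phase over the second:

```python
import numpy as np

# Synthetic illustration only: a constant (non-accelerating) "forced" trend plus a
# hypothetical ~65-year oscillation standing in for the paper's Internal Multidecadal
# Pattern. None of these numbers come from DelSole et al.
years = np.arange(1900, 2009)
forced = 0.007 * (years - 1900)                                # K, fixed linear trend
internal = 0.12 * np.sin(2.0 * np.pi * (years - 1915) / 65.0)  # K, multidecadal wiggle
observed = forced + internal

def trend_per_decade(t, y):
    """Least-squares linear trend in K per decade."""
    return np.polyfit(t, y, 1)[0] * 10.0

for lo, hi in [(1946, 1977), (1977, 2008)]:
    m = (years >= lo) & (years <= hi)
    print(f"{lo}-{hi}: observed trend {trend_per_decade(years[m], observed[m]):+.3f} K/decade, "
          f"forced trend {trend_per_decade(years[m], forced[m]):+.3f} K/decade")
```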
Sunday, January 2, 2011
Hottest Year Ever Update: Germany has 3rd coldest December on record & coldest year since 1996
[Chart: December temperature anomalies in Germany, 1900-2010]
1 January 2011 Google translation from Readers Edition
A December, and a year, with a difference: not warm, but cold.
The average temperature in December 2010 was -3.5°C, some 4.3 degrees below the long-term mean of 0.8°C, making it Germany's coldest December since 1969 (-4.7°C).
At many monitoring stations the DWD experts recorded new monthly records for maximum snow depth: for example, 70 cm at Gera-Leumnitz on the 26th. Across Germany, people celebrated a white Christmas for the first time since 1981.
The weather station at Potsdam Telegrafenberg (DWD), situated directly beside the Potsdam Institute for Climate Impact Research (PIK), also set many new records. On 28 December 2010 a snow depth of 41 cm was measured in Potsdam, something never before seen in any December since records began in 1893, and 18 cm more than the previous highest December snow cover of 23 cm, measured on 31 December 1913. At -4.4°C it was also the second-coldest December recorded in Potsdam since 1893; only December 1969 was colder there.
Many climate researchers had promised us ever warmer and snow-free winters. Now they have the third cold winter in a row, with the result right on their doorstep. The winter before last was 0.4°C below, and last winter 1.5°C below, the long-term average, and the start of the current winter has so far surpassed both in every respect. December 2010 was the third-coldest since 1900; only December 1969 and December 1933 were colder.
And not only December was far too cold: 2010 as a whole was too cold in Germany. For the first time in 14 years (since 1996), a year came out colder than the long-term average (the 1961-90 mean of 8.2°C). The average temperature in 2010 was 7.9°C, 0.3°C below that long-term mean.
All in all, a year that does not fit the predictions of the climate scientists, who had forecast one thing above all: a continuously rising temperature.
[Chart: Germany monthly temperature anomalies show no significant temperature increase, let alone an accelerated temperature rise, over the last 20 years]