Harold Hamm, discoverer of the Bakken fields of the northern Great Plains, on America's oil future and why OPEC's days are numbered.
By STEPHEN MOORE WSJ.com 10/1/11
Harold Hamm, the Oklahoma-based founder and CEO of Continental Resources, the 14th-largest oil company in America, is a man who thinks big. He came to Washington last month to spread a needed message of economic optimism: With the right set of national energy policies, the United States could be "completely energy independent by the end of the decade. We can be the Saudi Arabia of oil and natural gas in the 21st century."
"President Obama is riding the wrong horse on energy," he adds. We can't come anywhere near the scale of energy production to achieve energy independence by pouring tax dollars into "green energy" sources like wind and solar, he argues. It has to come from oil and gas.
You'd expect an oilman to make the "drill, baby, drill" pitch. But since 2005 America truly has been in the midst of a revolution in oil and natural gas, which is the nation's fastest-growing manufacturing sector. No one is more responsible for that resurgence than Mr. Hamm. He was the original discoverer of the gigantic and prolific Bakken oil fields of Montana and North Dakota that have already helped move the U.S. into third place among world oil producers.
How much oil does Bakken have? The official estimate of the U.S. Geological Survey a few years ago was between four and five billion barrels. Mr. Hamm disagrees: "No way. We estimate that the entire field, fully developed, in Bakken is 24 billion barrels."
Friday, September 30, 2011
Report: North American oil production can double by 2035
The Lessons of the Shale Gas Revolution WSJ.com 9/30/11
By LUCIAN PUGLIARESI
In response to a 2009 request from Secretary of Energy Steven Chu, the National Petroleum Council (NPC) reported earlier this month that oil production in North America could double by 2035—to 20 million barrels per day.
Where can all this oil come from? For one, the hydraulic fracturing (fracking) technique used in shale gas production is now being applied to extract oil. The vast oil reserves in Canada's Alberta Province are increasingly being tapped. There is more oil to be had with greater access to federal lands in Alaska and the western U.S., and accelerated drilling in the deep waters in the Gulf of Mexico.
But to realize the enormous potential outlined in the NPC report, we need to understand how the policies of the federal government act as a serious brake on access to the reserves and the exploitation of new technologies to tap them.
The shale gas revolution started in Texas, migrated quickly to Arkansas, Oklahoma, Virginia, West Virginia and Pennsylvania and then leaped to North Dakota—where the technology for producing shale gas was applied to oil development. Even New York Gov. Andrew Cuomo, no longer wishing to miss out on the economic opportunity for his state, has pulled back from his state's comprehensive ban on hydraulic fracturing and horizontal drilling for shale gas.
What do these states all have in common besides interesting geology? Their federal land holdings are extremely small and mineral rights are in private hands.
Thus landowners were not prohibited from coming to terms with oil and gas companies, providing immediate opportunities to test new drilling technologies. Knowledge gained in one region could move quickly to another. Regulatory and environmental reviews were largely the responsibilities of state and local governments, and disagreements could often be resolved at the local level.
Contrast the shale gas revolution to oil and gas development on the vast lands owned by the federal government. There access to reserves is burdened by endless federal environmental reviews, congressional oversight, permitting delays and bureaucrats who insist that oil and gas resources do not exist in areas of interest to oil and gas companies.
Shell Oil, the winning bidder on a federal lease sale in Alaska, has spent over four years and billions of dollars and is only now getting the final permits to proceed with exploratory drilling in the Arctic Ocean's Beaufort Sea. Further court challenges remain likely.
Shell USA President Marvin Odum has stated that his board members in The Hague (Shell USA is a subsidiary of Royal Dutch Shell) are now raising serious concerns over political and regulatory risk attached to investment in the United States. Court challenges over the adequacy of environmental reviews, as well as other interventions not permitted on private lands, make the process of bringing new oil and gas production from federal lands to market both slow and costly.
President Obama's criticism of the federal oil and gas leasing program, and his call for "use it or lose it" when referring to undeveloped leases on federal lands, are the exact opposite of what is needed. We need to open more lands and minimize the regulatory burden to ensure that the oil and gas potential outlined by the NPC can be realized.
Those proponents of "peak oil" who claim the NPC report is unrealistic need only revisit our recent history with shale gas. Natural gas production has surged by more than 25% in the last four years. Yet just a few years ago, government reports and long hours of expert testimony on Capitol Hill outlined the need for the U.S. to take action to address a growing shortage of natural gas.
A crash program was called for to build receiving facilities to import foreign supplies of liquefied natural gas (LNG). Many receiving facilities were built at a cost of billions of dollars as investors bought into the government assessments. Today these facilities are operating at less than 10% capacity.
Ample supplies of oil and gas, combined with taxpayer fatigue over green subsidies, means that a range of costly and uncompetitive technologies such as biofuels and electric cars now face the prospect of financial failure. To be sure, investments in the oil and gas industry are not immune from surprises and technology advances. LNG receiving facilities in the U.S. are suffering large financial losses. The good news is that unlike the bankrupt Solyndra solar plant that received over $500 million in federal loans, losses at the LNG receiving facilities will not be picked up by the taxpayers.
Mr. Pugliaresi is president of the Energy Policy Research Foundation and a former staff member of the National Security Council under President Reagan.
Wednesday, September 28, 2011
New apparent record for govt. waste: $24.4 million per 'green job'
What Solyndra Fiasco?
Review & Outlook WSJ.com 9/28/11
The Department of Energy keeps shoveling out taxpayer money
If you thought the $535 million Solyndra scandal had chastened the fearless venture capitalists of the Obama Administration, think again. The Department of Energy shoveled out $1.1 billion in new loan guarantees to solar projects in Nevada and Arizona Wednesday, and more deals are pending before the $18 billion program funded by the 2009 stimulus expires Friday.
We'll go out on a limb and say the rush raises questions about how carefully these outlays are being vetted, especially in light of solar-panel-maker Solyndra's August bankruptcy. The FBI, Treasury Department and Congress are all investigating who approved the politically connected California company's loan guarantee and why. The case is an embarrassment for the White House, which touted Solyndra as a model for its green jobs agenda.
Yet the Department of Energy seems oddly removed from the uproar. In a statement yesterday, Secretary Steven Chu said: "If we want to be a player in the global clean energy race, we must continue to invest in innovative technologies that enable commercial-scale deployment of clean, renewable power like solar." Translation: China is throwing taxpayer money into solar, so Americans should, too.
That comparison isn't straightforward; without a free media, it's impossible to know how many Solyndras Beijing is creating, much less how many are making any money. We doubt most Americans want their government to get in the business of competing dollar-for-subsidy-dollar with the politically directed credit decisions of the Chinese Communist Party. If solar energy collection technology has a chance to be a commercial winner, someone will invest in it. If no one does, there may be a very good reason.
One of those reasons may be this: The Energy Information Administration estimates that new natural gas-fired plants will create electricity at a cost of $63.10 per megawatt hour, compared to the Administration's "green" favorites, offshore wind and solar thermal plants—like the one in Nevada funded yesterday—which cost $243.20 and $311.80 per megawatt hour, respectively.
Even if you believe in the "green job" mantra, here's some more math: Yesterday's $737 million loan guarantee to Tonopah Solar Energy will create "600 construction jobs and 45 permanent jobs," according to the company. The $337 million loan guarantee to Sempra Energy "will fund up to 300 construction jobs." That's $1.1 billion for 45 permanent jobs. [or $24.4 million per permanent 'green job']
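For reference (this worked figure is an addition, not part of the editorial), the bracketed per-job number is simply the combined loan guarantees divided by the number of permanent positions:

$$\frac{\$737\ \text{million} + \$337\ \text{million}}{45\ \text{permanent jobs}} \approx \frac{\$1.1\ \text{billion}}{45} \approx \$24\ \text{million per permanent job}$$

(about $24.4 million per job if the rounded $1.1 billion total is used).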
By comparison, the proposed Keystone XL pipeline to carry crude oil from Western Canada to refineries on the U.S. Gulf Coast would create some 13,000 union jobs and around 118,000 "spin-off" jobs—if the U.S. State Department ever gets around to approving it. And taxpayers wouldn't have to risk a dime.
It's always possible that some of the Energy Department's latest investments will turn out to be winners, but if they do then the profits will go to the private shareholders. If they fail like Solyndra, then taxpayers will get stuck with the bill. Come to think of it, that really isn't all that different from China's political business model, a free press and democratic Congress aside.
Tuesday, September 27, 2011
MUST SEE: Video exposes the 'greenhouse effect' myth
A recommended video, "Greenhouse in A Bottle Experiment - Reconsidered," explains in very simple terms why the so-called 'greenhouse effect' is due to the compression of the atmosphere by gravity, and why adding 'greenhouse gases' will not warm the planet.
Script from the video
A related post, Shattering the Greenhouse Effect, by Swedish climatologist Dr. Hans Jelbring offers a high school through advanced level debunking of the so-called 'greenhouse effect' using only the physics of pressure, gravity, volume, and the adiabatic lapse rate.
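For reference (this note is an addition, not from the video or Dr. Jelbring's post), the adiabatic lapse rate invoked above is the rate at which a dry parcel of air cools as it rises, and it follows directly from gravity and the heat capacity of air:

$$\Gamma_d = \frac{g}{c_p} \approx \frac{9.8\ \mathrm{m\,s^{-2}}}{1005\ \mathrm{J\,kg^{-1}\,K^{-1}}} \approx 9.8\ \mathrm{K/km}$$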
Monday, September 26, 2011
Straight from the horse's mouth: Hansen says global warming of late 20th century was NOT due to CO2 or fossil fuel burning
In a paper published in PNAS in 2000, global warming religion high priest James Hansen argues that "rapid warming in recent decades has been driven mainly by non-CO2 greenhouse gases... not by the products of fossil fuel burning, CO2 and aerosols..." Furthermore, Hansen correctly points out "The growth rate of non-CO2 greenhouse gases has declined in the past decade." The summary of the paper states,
"Non-CO2 Greenhouse Gases: These gases are probably the main cause of observed global warming, with CH4 [methane] causing the largest net climate forcing."Hansen appears unaware methane represents an extremely tiny 0.0000017 mole fraction of the atmosphere and contributes almost nothing to the posited 'greenhouse' effect.
H/T to Bill Wallace of Global Hot Air for this discovery
Global warming in the twenty-first century: An alternative scenario
- National Aeronautics and Space Administration Goddard Institute for Space Studies; Center for Climate Systems Research, Columbia University Earth Institute; and Center for Environmental Prediction, Rutgers University, 2880 Broadway, New York, NY 10025
- Contributed by James Hansen
Abstract
A common view is that the current global warming rate will continue or accelerate. But we argue that rapid warming in recent decades has been driven mainly by non-CO2 greenhouse gases (GHGs), such as chlorofluorocarbons, CH4, and N2O, not by the products of fossil fuel burning, CO2 and aerosols, the positive and negative climate forcings of which are partially offsetting. The growth rate of non-CO2 GHGs has declined in the past decade. If sources of CH4 and O3 precursors were reduced in the future, the change in climate forcing by non-CO2 GHGs in the next 50 years could be near zero. Combined with a reduction of black carbon emissions and plausible success in slowing CO2 emissions, this reduction of non-CO2 GHGs could lead to a decline in the rate of global warming, reducing the danger of dramatic climate change. Such a focus on air pollution has practical benefits that unite the interests of developed and developing countries. However, assessment of ongoing and future climate change requires composition-specific long-term global monitoring of aerosol properties.
Add it to The List: Evil trace gas CO2 is affecting artistic expression
Add it to The List: the 0.65°C of global warming recovery from the Little Ice Age has been found to affect artistic expression in a study of classical music:
Musical Weather Shows Climate Influence
ScienceDaily (Sep. 26, 2011) — Scientists at the Universities of Oxford and Reading have catalogued and analysed depictions of weather in classical music from the 17th Century [during the Little Ice Age] to the present day to help understand how climate affects how people think.
Dr Karen Aplin, of Oxford University's Department of Physics, and Dr Paul Williams, from Reading University's Meteorology Department, both combine careers as atmospheric scientists with a love of classical music. Dr Aplin was inspired by the regular portrayal of weather-related phenomena in orchestral music she has played, she said: 'as all music lovers know, the hint of a distant storm from a drum roll can be just as evocative as the skies depicted by Constable and Monet.'
The researchers were so convinced that classical music is influenced by climate that they pursued this pilot study in their own spare time, outside of their normal scientific work. A report of their study is published on 23 September in the journal Weather.
Dr Williams said: 'We found that composers are generally influenced by their own environment in the type of weather they choose to represent. As befits the national stereotype, British composers seem disproportionately keen to depict the UK's variable weather patterns and stormy coastline.'
The research showed British composers easily lead the way with musical weather, followed by the French and the Germans.
Generally, the most popular type of weather represented in music is the storm, presumably because of the use of storms by composers as an allegory for emotional turbulence, such as in Benjamin Britten's Four Sea Interludes from the opera Peter Grimes.
Wind was found to be the second most popular type of weather to feature in music. Wind can have a variety of characters, from a gentle breeze rustling the trees, as in the beginning of the third movement of Berlioz's Symphonie Fantastique, to a full-blown Antarctic gale, as in Sinfonia Antarctica by Vaughan Williams.
The research also charts the development of musical instruments as aids to evoke a particular sound, for example a thunder sheet or wind machine, and the effect weather had on composers.
Strauss needed both sunshine and the Alpine landscape to inspire him. Several other composers, such as Berlioz, Schubert and Wagner, were also dependent on fair weather conditions, associated with high pressure, for their best output. Wagner, for example, referred to 'bad-weather unemployment' and wrote: 'This is awful weather. My work has been put aside for two days, and the brain is stubbornly declining its services.'
The study provides a baseline of cultural responses to weather before climate change. It seems inevitable that our changing climate will influence artistic expression: will UK composers writing music for a 2050 Proms programme still be interested in representing our warmer, wetter weather? The team believe their research will provide a basis for comparison.
'Meteorological phenomena in Western classical orchestral music', by Karen Aplin and Paul Williams, is published in the journal Weather on 23 September 2011.
There He Goes Again, Version 4.0: Mann Claims His Hockey Stick was Affirmed by the NAS
In a letter to the editor published this month in 'Scientific' American, spinmeister Michael Mann fires off ad hominem attacks and yet again claims his "hockey stick" was affirmed by the National Academy of Sciences. The NAS report did nothing of the sort, and in fact validated all of the significant criticisms of McIntyre & McKitrick (M&M) and the Wegman Report:
1. The NAS indicated that the hockey stick method systematically underestimated the uncertainties in the data (p. 107).
2. In subtle wording, the NAS agreed with the M&M assertion that the hockey stick had no statistical significance, and was no more informative about the distant past than a table of random numbers. The NAS found that Mann's methods had no validation (CE) skill significantly different from zero, whereas in the past it had always been claimed that the method has a significant nonzero validation skill. Methods without validation skill are usually considered useless. Mann's data set does not have enough information to verify its 'skill' at resolving the past, and has such wide uncertainty bounds as to be no better than the simple mean of the data (p. 91). M&M said that the appearance of significance was created by ignoring all but one type of test score, thereby failing to quantify all the relevant uncertainties. The NAS agreed (p. 110), but, again, did so in subtle wording. (A generic sketch of how such a CE verification statistic is computed appears just after this list.)
3. M&M argued that the hockey stick relied for its shape on the inclusion of a small set of invalid proxy data (called bristlecone, or "strip-bark," records). If they are removed, the conclusion that the 20th century is unusually warm compared to the pre-1450 interval is reversed. Hence the conclusion of unique late 20th century warmth is not robust—in other words, it does not hold up under minor variations in data or methods. The NAS panel agreed, saying Mann's results are "strongly dependent" on the strip-bark data (pp. 106-107), and they went further, warning that strip-bark data should not be used in this type of research (p. 50).
4. The NAS said "Mann et al. used a type of principal component analysis that tends to bias the shape of the reconstructions," i.e., one that can produce hockey sticks from baseball statistics, telephone book numbers, and Monte Carlo random numbers.
5. The NAS said Mann downplayed the "uncertainties of the published reconstructions...Even less confidence can be placed in the original conclusions by Mann et al. (1999) that 'the 1990s are likely the warmest decade, and 1998 the warmest year, in at least a millennium.'"
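As referenced in point 2 above, "validation (CE) skill" refers to the coefficient of efficiency statistic used to verify a reconstruction against withheld observations. Below is a minimal, generic sketch of that statistic (an illustration only, not Mann's or the NAS panel's actual code); a CE at or below zero means the reconstruction does no better than simply using the mean of the verification data.

```python
import numpy as np

def coefficient_of_efficiency(observed, reconstructed):
    """CE = 1 - SSE / SST, where SSE is the squared error of the
    reconstruction and SST is the squared deviation of the observations
    from their own (verification-period) mean. CE > 0 indicates skill
    beyond the simple mean; CE <= 0 indicates none."""
    observed = np.asarray(observed, dtype=float)
    reconstructed = np.asarray(reconstructed, dtype=float)
    sse = np.sum((observed - reconstructed) ** 2)
    sst = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - sse / sst

# Illustrative check with made-up numbers: an uninformative "reconstruction"
# (pure noise) scores a CE near or below zero against noisy "observations".
rng = np.random.default_rng(0)
observations = rng.normal(size=50)
noise_reconstruction = rng.normal(size=50)
print(coefficient_of_efficiency(observations, noise_reconstruction))
```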
Mann never mentions that a subsequent House Energy and Commerce Committee report chaired by Edward Wegman totally destroyed the credibility of the ‘hockey stick’ and devastatingly ripped apart Mann’s methodology as ‘bad mathematics’. Furthermore, when Gerald North, the chairman of the NAS panel -- which Mann claims ‘vindicated him’ – was asked at the House Committee hearings whether or not they agreed with Wegman’s harsh criticisms, he said they did:
CHAIRMAN BARTON: Dr. North, do you dispute the conclusions or the methodology of Dr. Wegman’s report?
DR. NORTH [Head of the NAS panel]: No, we don’t. We don’t disagree with their criticism. In fact, pretty much the same thing is said in our report.
DR. BLOOMFIELD [of the Royal Statistical Society]: Our committee reviewed the methodology used by Dr. Mann and his co-workers and we felt that some of the choices they made were inappropriate. We had much the same misgivings about his work that was documented at much greater length by Dr. Wegman.
WALLACE [of the American Statistical Association]: 'the two reports [Wegman's and NAS] were complementary, and to the extent that they overlapped, the conclusions were quite consistent.'

Mann uses the 5 rules of propaganda in his defense, including the rule of orchestration: endlessly repeating the same messages in different variations and combinations [e.g. the NAS gave my hockey stick a clean bill of health].
STICKING TO CLIMATE SCIENCE by Michael Mann
As an undergraduate physics major in the mid-1980s at the University of California, Berkeley, I knew about Richard Muller—the physics professor who was the subject of Michael D. Lemonick’s interview, “‘I Stick to the Science’”—and his controversial theory that a “death star” was responsible for major mass extinctions. Later, as a graduate student studying climate, I became aware of Muller’s work attempting to overthrow the traditional Earth orbital theory of the ice ages—that, too, didn’t pan out. To be clear, there is nothing wrong in science with putting forth bold hypotheses that ultimately turn out to be wrong. Indeed, science thrives on novel, innovative ideas that—even if ultimately wrong—may lead researchers in productive new directions.
One might hope, however, that a scientist known for big ideas that didn’t stand the test of time might be more circumspect when it comes to his critiques of other scientists. Muller is on record accusing climate scientists at the University of East Anglia Climatic Research Unit of hiding data—a charge that was rejected in three separate [whitewashed] investigations. In his interview, Muller even maligned my own work on the “hockey stick” reconstruction of past temperatures. He falsely claimed “the hockey-stick chart was in fact incorrect” when in fact the National Academy of Sciences affirmed our findings in a major 2006 report that Nature summarized as “Academy affirms hockey-stick graph.” Scientific American itself recently ran [pre-climategate] an article it billed as “Novel analysis confirms climate ‘hockey stick’ graph” [“Still Hotter Than Ever,” by [uber-warmist] David Appell, News Scan; Scientific American, November 2009].
Rather than providing a platform for Muller to cast aspersions on other scientists, Lemonick could have sought some introspection from him. How, for example, have the lessons learned from his past failures influenced the approach he has taken in his more recent forays into the science of human-caused climate change? More than anything else, the interview was simply a lost opportunity. Not only can Scientific American do better, it will need to.
Michael E. Mann
Pennsylvania State University
Professor Nasif Nahle Publishes New Paper Discrediting Basis of Theory of Man-Made Global Warming
The fundamental basis of the theory of catastrophic man-made global warming is the notion that colder 'greenhouse' gases like CO2 'back-radiate' infrared capable of heating the hotter Earth surface. Professor Nasif Nahle has a new paper out explaining why this notion is false and unphysical.
Abstract: Through a series of real time measurements of thermal radiation from the atmosphere and surface materials during nighttime and daytime, I demonstrate that warming backradiation emitted from Earth’s atmosphere back toward the earth’s surface and the idea that a cooler system can warm a warmer system are unphysical concepts.
http://principia-scientific.org/publications/New_Concise_Experiment_on_Backradiation.pdf
EPA insiders accuse EPA of undermining the security & reliability of US electric power
Inside the EPA Review & Outlook WSJ.com 9/26/11
Memos show that even other regulators worry about its rule-making.
The Environmental Protection Agency claims that the critics of its campaign to remake U.S. electricity are partisans, but it turns out that they include other regulators and even some in the Obama Administration. In particular, a trove of documents uncovered by Congressional investigators reveals that these internal critics think the EPA is undermining the security and reliability of the U.S. electric power supply.
With its unprecedented wave of rules, the EPA is abusing traditional air-quality laws to force a large share of the coal-fired fleet to shut down. Amid these sacrifices on the anticarbon altar, Alaska Republican Lisa Murkowski and several House committees have been asking, well, what happens after as much as 8% of U.S. generating capacity is taken off the grid?
A special focus of their inquiry has been the Federal Energy Regulatory Commission, or FERC, which since 2005 has been charged with ensuring that the (compact fluorescent) lights stay on. That 8% figure comes from FERC itself in a confidential 2010 assessment of the EPA's regulatory bender—or about 81 gigawatts that FERC's Office of Electric Reliability estimated is "very likely" or "likely" to enter involuntary retirement over the next several years. FERC disclosed the estimate in August in response to Senator Murkowski's questions, along with a slew of memos and emails.
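As a rough check on how the two figures line up (the capacity number here is an assumption, not from the editorial): total U.S. nameplate generating capacity at the time was on the order of 1,000 gigawatts, so

$$\frac{81\ \mathrm{GW}}{\sim 1{,}000\ \mathrm{GW}} \approx 8\%.$$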
FERC Chairman Jon Wellinghoff, a Democrat, has since disavowed the study as nothing more than back-of-the-envelope scribblings that are now "irrelevant," as he told a recent House hearing. OK, but then could FERC come up with a relevant number? Since he made the study public, Mr. Wellinghoff has disowned responsibility for scrutinizing the EPA rules and now says that FERC will only protect electric reliability ex post facto once the rules are permanent, somehow.
This abdication is all the more striking because the documents show that EPA's blandishments about reliability can't be trusted. In its initial 2010 analysis—a rigorous document—FERC notes in a "next steps" section that the reliability office and industry must "assess the reliability and adequacy impacts of retirement of at risk units." In part, this was because the office believed the EPA analyses to be deficient. One undated memo specifies multiple weaknesses in EPA reliability modelling.
However much power is lost, whether 81 gigawatts or something else, the electric grid is highly local. Even subtracting a small plant could have much larger effects for regions, such as blackouts. The older and less efficient coal plants that are slated for closure are often the crucial nodes that connect the hubs and spokes of the grid. If these "sensitive" interconnections are taken out, as the memo puts it, the power system becomes less stable, harder to manage and may not be able to meet peak-load demand or withstand unexpected disturbances.
When large swaths of Arizona, New Mexico and parts of southern California including San Diego went dark this month, preliminary reports blamed it on a Homer Simpson who flipped the wrong switch. But the incident shows that even minor mistakes or degraded systems can ramify throughout the grid. The EPA scanted these technical, regional issues when writing the rules, even though another "summary of interagency working comments" within the Administration explicitly told the EPA that reliability needed "more discussion."
And according to the FERC minutes of a 2010 meeting between its reliability office and the EPA, EPA staffers waved off those concerns. "The EPA concluded the discussion by stating that it felt the Clean Air Transport Rule and Mercury MACT rule"—two of the most destructive new regulations—"were the highest priority given that these regulations were more finalized." In other words, the agency's green political goals are more important than the real-world outcomes, never mind the danger.
For our part, we've opposed this "highest priority" because the rules are written in a way that maximizes the economic costs, with terrible effects on growth, hiring, investment and consumer prices. And well, well: More than a few people in the Administration seem to agree.
The interagency memo explains that the EPA used its "discretion" to structure one rule so that it is more "stringent" than it needs to be. The agency could achieve the same environmental benefits with "substantial" cost-savings, which "would be far more preferable to the proposed approach," says the memo. It sensibly adds that, "The current economic climate dictates a balancing of economic and environmental interests."
Under pressure from Democrats and the EPA to disavow his own agency's analysis, Mr. Wellinghoff now says that FERC favors only a "safety valve" that would give it the authority to overrule the EPA on a case-by-case basis if its regulations might lead to blackouts. But even this is a tacit admission of EPA's overkill. You don't need a safety valve if there isn't a threat to safety.
The best option would be for the EPA to write less destructive rules that don't jeopardize reliability in the first place. Failing that, we should at least know the risks before it is too late. In a letter to Mr. Wellinghoff last week, Mrs. Murkowski simply asks that FERC undertake some kind of study of the EPA's agenda in line with its statutory obligations and the warnings of its own experts. If FERC won't do it, someone else should.
Thursday, September 22, 2011
Obama pads numbers of 'green jobs'
What are green jobs?
From FoxNews.com, Sept. 22:
In a series of tense exchanges, Republicans on a House oversight panel sharply questioned whether the Obama administration was looking to inflate the number of "green" jobs by using a broad definition—which, as it turns out, counts virtually anybody working in mass transit. . . .
Official data on green jobs are hard to come by. The Bureau of Labor Statistics currently is trying to come up with a workable definition and formula to track green-jobs employment. The bureau expects to have its first estimate out early next year.
Labor Secretary Hilda Solis pointed to a separate study claiming 2.7 million Americans are in "clean economy" positions.
But Republicans said the working term the government is using is far too broad, suggesting officials were trying to pad the figures.
"It's offensive," Rep. Connie Mack, R-Fla., said, raising his voice while questioning Solis.
Mack argued that just because a bus driver is driving a hybrid bus doesn't mean it's a green job.
"Yes it is," Solis countered.
"It's only a green job if it fits into your sales pitch," Mack shot back.
New paper finds solar energy at Earth's surface greatly increased between 1973 and 1998
A paper published today in the Journal of Geophysical Research finds that observed solar energy at the European earth surface increased significantly, by ~3.4 W/m2 per decade, during the period 1973-1998. That amounts to a total of 8.5 W/m2 over the 25-year period. By way of contrast, the IPCC claims a doubling of CO2 levels results in 3.7 W/m2 of additional forcing. CO2 increased from 330 to 366 ppm (11%) during that period, and 11% of 3.7 is 0.41 W/m2 of claimed CO2 forcing. Thus, the change in solar radiation reaching the Earth's surface during that 25-year period of global warming is about 21 times greater than the alleged effect of CO2. Alarmists who constantly say they can't find any other possible explanation for global warming between the 1970s and 1998 besides the evil trace gas CO2, please take note: It's the Sun, stupid.
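For comparison, a quick cross-check (an addition here, not part of the paper or the paragraph above) using the standard simplified expression for CO2 forcing, which is logarithmic rather than linear in concentration (Myhre et al. 1998), gives a similar picture:

$$\Delta F_{\mathrm{CO_2}} \approx 5.35\,\ln\!\left(\frac{366}{330}\right) \approx 0.55\ \mathrm{W/m^2}, \qquad \frac{3.4 \times 2.5\ \mathrm{W/m^2}}{0.55\ \mathrm{W/m^2}} \approx 15.$$

Whether one uses the linear scaling above (~21 times) or the logarithmic form (~15 times), the reported change in surface solar radiation over Europe is more than an order of magnitude larger than the CO2 forcing over the same interval.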
[along with amplification of variability in solar radiation due to clouds/aerosols]
JOURNAL OF GEOPHYSICAL RESEARCH, VOL. 116, D18205, 13 PP., 2011
doi:10.1029/2010JD015396
Key Points
- RCM-simulated clear-sky dimming/brightening in line with observations
- All-sky signal dominated by cloud forcing in contrast to observations
- Temperature trends could not be improved with transient aerosol emissions
E. M. Zubler et al
The present study applies a regional climate model with coupled aerosol microphysics and transport in order to simulate dimming and brightening in Europe from 1958 to 2001. Two simulations are performed, one with transient emissions and another with climatological mean emissions over the same period. Both simulations are driven at the lateral boundaries by the ERA-40 reanalysis and by large-scale aerosol concentrations stemming from a global simulation. We find distinct patterns of dimming and brightening in the aerosol optical depth and thus clear-sky downward surface shortwave radiation (SSR) in all analyzed subregions. The strongest brightening between 1973 and 1998 under clear-sky conditions is found in mid-Europe (+3.4 W m−2 per decade, in line with observations). However, the simulated all-sky SSR is dominated by the surface shortwave cloud radiative forcing (CRF). The correlation coefficient R between 5 year moving averages of the CRF and all-sky SSR equals 0.87 for all of Europe. Both model simulations show a similar evolution of cloud fraction and thus all-sky SSR due to the constrained circulation induced by the reanalysis at the lateral boundaries. For most subregions, the modeled differences in all-sky SSR due to transient versus climatological emissions are insignificant in comparison with estimates of the model's internal variability.
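As an illustration of the statistic quoted in the abstract (a generic sketch with made-up series, not the authors' data or code), the R = 0.87 figure is an ordinary correlation coefficient computed between 5-year running means of the two annual time series:

```python
import numpy as np

def moving_average(x, window=5):
    """Running mean over `window` consecutive values (valid region only)."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

# Hypothetical annual series for 1958-2001, standing in for the paper's
# cloud radiative forcing (CRF) and all-sky surface shortwave radiation (SSR).
rng = np.random.default_rng(1)
years = np.arange(1958, 2002)
crf = rng.normal(size=years.size).cumsum()                 # placeholder CRF anomaly
ssr = 0.9 * crf + rng.normal(scale=0.5, size=years.size)   # SSR loosely tracking CRF

r = np.corrcoef(moving_average(crf), moving_average(ssr))[0, 1]
print(f"R between 5-year moving averages: {r:.2f}")
```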
Yale says Mann's discredited hockey stick should continue to be used as a "particularly effective climate graphic"
Despite being one of the most thoroughly discredited papers of the modern age and dropped in shame by the IPCC, the Yale Forum on Climate Change and the Media says today that Michael Mann's hockey stick graph should still be used as a "particularly effective climate [hoax] graphic."
Interactive Graphics Illustrate Benefits Of Visualizations on Climate Change Issues
In real estate, it’s location, location, location. In climate change communications … it’s visualizations, visualizations, visualizations. Here we post some of the most iconic in the field and some having the most communications and information impact.
Strategic use of visualizations and graphics, particularly when they are designed to be interactive, can be key to presenting large amounts of climate information in an easily digestible form. With outstanding graphics, audiences can engage directly with the information being presented, helping them make sense of large data sets and helping them see connections in complex phenomena.
Here’s an initial listing of some particularly effective climate graphics. If you don’t see your favorites below, please add a comment at the end of this feature identifying them.
Climate scientists discover magical unlimited power source: The Greenhouse Effect
To hell with the 1st Law of Thermodynamics: 1 plus 0 equals 2. Climate scientists have made the remarkable discovery that the greenhouse effect is an unlimited source of free and perpetual energy, as shown in this powerpoint presentation The role of satellite data in estimating the impact of anthropogenic activity on climate change, by Jean-Louis Dufresne, Director of Research at CNRS (National Center of Scientific Research) in France.
email from Alan Siddons:
Speaking of “heat from nowhere,” here’s a charming energy budget that tries to show that there’s no funny business going on with the greenhouse effect.
See? Half goes out to space but half goes down to earth. Then half of that and half of that and so on. It’s 50/50 all the way. What could be wrong with that? Problem is, when you extrapolate this process and add up all the OUT values and the DOWN values you get this:
In other words, a DOUBLING of energy has occurred.
Incidentally, the depiction isn’t of the atmospheric greenhouse effect per se but of what happens when “A sheet of glass opaque to infrared radiation covers a surface exposed to sunlight.” Ho ho ho. Another ignoramus.
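For anyone who wants to check the arithmetic Siddons describes, here is a small sketch (not taken from the email, and the budget diagrams it refers to are not reproduced here) that splits one unit of absorbed energy 50/50 at every step, exactly as described, and totals the OUT and DOWN halves:

```python
# Illustrative only: one unit of energy is split 50/50 into OUT (to space) and
# DOWN (to the surface) at each step, and the returned half is split again.
total_out = 0.0
total_down = 0.0
flux = 1.0  # one unit absorbed by the glass/atmosphere layer

for _ in range(50):          # 50 halvings is effectively the infinite series
    total_out += flux / 2    # half escapes upward
    total_down += flux / 2   # half is sent back down
    flux /= 2                # the returned half is re-emitted and split again

print(f"{total_out:.4f} {total_down:.4f} {total_out + total_down:.4f}")
# -> 1.0000 1.0000 2.0000: summing every OUT and DOWN term yields twice the
#    original unit, which is the "doubling" the email objects to.
```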
Sleep disturbances from noisy wind turbines found in 2 of 3 studies
Paper published today in Environmental Research Letters:
Karl Bolin et al 2011 Environ. Res. Lett. 6 035103 doi:10.1088/1748-9326/6/3/035103
Infrasound and low frequency noise from wind turbines: exposure and health effects
FOCUS ON THE ENVIRONMENTAL IMPACT OF WIND ENERGY
Karl Bolin, Gösta Bluhm, Gabriella Eriksson and Mats E Nilsson
Wind turbines emit low frequency noise (LFN) and large turbines generally generate more LFN than small turbines. The dominant source of LFN is the interaction between incoming turbulence and the blades. Measurements suggest that indoor levels of LFN in dwellings typically are within recommended guideline values, provided that the outdoor level does not exceed corresponding guidelines for facade exposure. Three cross-sectional questionnaire studies show that annoyance from wind turbine noise is related to the immission level, but several explanations other than low frequency noise are probable. A statistically significant association between noise levels and self-reported sleep disturbance was found in two of the three studies. It has been suggested that LFN from wind turbines causes other, and more serious, health problems, but empirical support for these claims is lacking.
In the long term, the clinical consequences of untreated sleep disorders are large indeed. They are associated with numerous, serious medical illnesses, including:
High blood pressure
Heart attack
Heart failure
Stroke
Obesity
Psychiatric problems, including depression and other mood disorders
Attention Deficit Disorder (ADD)
Mental impairment
Fetal and childhood growth retardation
Injury from accidents
Disruption of bed partner's sleep quality
Poor quality of life
Wednesday, September 21, 2011
Alan Siddons on the negative-feedback cooling effect of clouds
email from Alan Siddons:
Someone asked me today to comment on the recent Richard P. Allan paper, which is discussed on WUWT [and originally posted on The Hockey Schtick]. Allan’s finding is that clouds have a negative influence on the earth’s temperature (duh). But here is what I wrote back. Hope you like it.
Alan
---------------------------------------------------------------------------------------
Anthony Watts offers an important statement about the Allan paper:
While Dessler and Trenberth (among others) claim clouds have an overall positive feedback warming effect upon climate due to the long-wave back-radiation, this new paper shows that clouds have a large net cooling effect by blocking incoming solar radiation and increasing radiative cooling outside the tropics.
Now, I always focus on the basic claim that back-radiating greenhouse gases make the earth’s surface warmer. The earth’s SURFACE. Liquid clouds are often included as greenhouse agents because at nighttime they’re thought to reflect heat rays back to the earth and at least retard surface cooling if not actually raise the temperature.
In other words, the same heating mechanism, that of back-radiation, is attributed to clouds because ‘radiative forcing’ theory assumes that opposing flows of infrared actively warm the surface (and always the surface alone, please note). Thus, for instance, Lindzen argues that 240 watts from the surface matched by 240 from the sky will make the surface radiate 480 watts per square meter, but it won’t make the sky radiate 480 in turn.
If you fix your attention on temperature, however, the impossibility of such a thing becomes apparent. Lindzen’s scenario has a 255 Kelvin sky facing a 255 Kelvin surface, yet two bodies at the same temperature cannot transfer heat to each other, nor, of course, can one of them raise the other’s temperature. A surface temperature of 303 Kelvin simply cannot result, then. Likewise, Trenberth’s sky is at 275 K while the surface (radiating 66 watts per square meter) is at 185 K. The surface therefore can’t possibly get warmer than 275 K, or about 1.8° Celsius, and at 1.8° the surface could radiate at most 324 W/m², not 390.
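Every temperature and flux quoted in this passage follows from the Stefan-Boltzmann relation, flux = σT⁴. A quick sketch for checking those conversions (assuming an ideal blackbody with emissivity 1, which is what the quoted figures imply):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m-2 K-4

def temperature(flux_w_m2):
    """Blackbody temperature (K) that radiates the given flux."""
    return (flux_w_m2 / SIGMA) ** 0.25

def flux(temp_k):
    """Flux (W m-2) radiated by a blackbody at the given temperature."""
    return SIGMA * temp_k ** 4

print(round(temperature(240)))  # ~255 K: the sky and surface in Lindzen's scenario
print(round(temperature(480)))  # ~303 K: what a surface emitting 480 W/m2 would have to be
print(round(temperature(66)))   # ~185 K: a surface radiating 66 W/m2
print(round(flux(275)))         # ~324 W/m2: the most a 275 K blackbody sky can radiate
```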
From this it’s obvious that opposing flows of radiation do NOT add, which exposes ‘radiative forcing’ as a fiction.
A focus on temperature is also revealing when it comes to water. As you know, water has a very high specific heat. This means that it takes a lot of thermal energy to raise its temperature. Heating water is a slow process because it stores more heat than almost any other substance. In a manner of speaking, water “hides” much of the heat you put into it, so we call this “latent heat.” Surprisingly too, when water cools off (also slowly) it releases more heat than almost any other substance. The hidden heat breaks out, now becoming overt. These are two sides of the same strange coin.
If water had only a high specific heat and nothing else it would be remarkable enough. Since our planet’s surface is 71% water, after all, and most of it pretty deep, water plays a dominant role in keeping us cool in daytime and warm at night. But water also evaporates when it’s heated, the resultant vapor carrying hidden and overt heat along with it. The direct impact of evaporation, then, is to reduce the surface temperature. As we know by the experience of sweating, evaporation takes thermal energy away.
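To put rough numbers on that point, compare the energy needed to warm a kilogram of water by ten degrees with the latent heat the same kilogram carries away when it evaporates. The sketch below uses approximate textbook values, not figures from the email:

```python
SPECIFIC_HEAT = 4186.0    # J per kg per K, liquid water (approximate)
LATENT_HEAT_VAP = 2.45e6  # J per kg to evaporate water near 20 C (approximate)

mass_kg = 1.0
warming_energy = SPECIFIC_HEAT * mass_kg * 10.0   # warm 1 kg of water by 10 K
evaporation_energy = LATENT_HEAT_VAP * mass_kg    # evaporate that same kilogram

print(warming_energy)                        # ~41,860 J
print(evaporation_energy)                    # ~2,450,000 J removed from the surface
print(evaporation_energy / warming_energy)   # evaporation takes roughly 60x as much energy
```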
In other words, the generation of water vapor makes the earth cooler. This should give one pause, for water vapor is considered the principal ‘greenhouse gas,’ i.e., our planet’s foremost warming agent. Yes, water vapor lets loose its hidden heat when it cools and condenses again, but where does this release generally occur?
Up there. High above our heads. Water vapor is light and buoyant, and as it steadily rises in the atmosphere it encounters steady cooling. When it meets the condensation point it turns into a cloud, a mist of very fine water droplets. Much of water’s latent heat then escapes. Does this heat head back to the earth’s surface, though? No, it can’t. Because the earth below is warmer – in fact surface heat is driving the evaporation process. So the heat released by a cloud can only migrate to ever colder regions, as heat always does. The surface doesn’t profit.
Add to this the simple fact that a cloud reflects sunlight and shades the surface below, and it should come as no surprise that Allan’s study reached the conclusion it did. Neither clouds nor the water vapor they form from can increase the earth’s surface temperature. Not just clouds themselves, then, but the thermodynamics of making clouds constitute a vast cooling process on the surface.
This doesn’t completely negate any heating effect, however. At night, what was warm before and what was cooler can reverse. That is, the surface temperature often drops below the sky temperature.
This inversion can also involve clouds that have moved in. In such cases it is perfectly permissible for a cloud to transfer heat to the surface, even radiatively. But as outlined above, ‘back-radiation’ onto the surface can’t perform such a task by itself; only a body at a higher temperature can. In other cases, where the nighttime surface is the warmer body, it remains that water has a high specific heat. Thus the mass of water that a cloud represents will tend to cool rather slowly, like a nighttime lake. Consequently, a surface that’s sending heat to that cloud will cool more slowly too.
Nevertheless, one mustn’t lose sight of the big picture. Vaporized water steals heat from the earth’s surface. Becoming a cloud, it steals even more. Only after it has lost heat to its colder surroundings is a cloud born in daytime able to give back some heat to a landscape at night. But what it gives back doesn’t come close to what it took away. So the net effect of clouds and of cloud formation is, reasonably enough, to cool. This should have been understood a long time ago.
Alan