Wednesday, April 24, 2013

New paper finds ocean microbes can influence cloud formation & climate

A new paper finds ocean microorganisms "alter the chemical composition of sea spray in ways that influence its ability to form clouds over the ocean. That's the conclusion of a team of scientists using a new approach to study tiny atmospheric particles called aerosols that can influence climate by absorbing or reflecting sunlight and seeding clouds." Prior studies have also shown that increased CO2 causes a significant increase in ocean microorganisms such as plankton. This raises the question: could increased CO2 produce more plankton and seed more clouds as a negative feedback to warming? The more we learn about clouds and the climate, the less we know.

Biological activity alters the ability of sea spray to seed clouds

by Staff Writers San Diego CA (SPX) Apr 23, 2013




Ocean biology alters the chemical composition of sea spray in ways that influence its ability to form clouds over the ocean. That's the conclusion of a team of scientists using a new approach to study tiny atmospheric particles called aerosols that can influence climate by absorbing or reflecting sunlight and seeding clouds.

"After many decades of attempting to understand how the ocean impacts the atmosphere and clouds above it, it became clear a new approach was needed to investigate the complex ocean-atmosphere system, so moving the chemical complexity of the ocean to the laboratory represented a major advance that will enable many new studies to be performed," said Kimberly Prather, director of the Center for Aerosol Impacts on Climate and the Environment, who led the team of more than 30 scientists involved in this project.

They report their findings in the early, online edition of the Proceedings of the National Academy of Sciences the week of April 22.

Tiny air bubbles form in the ocean when waves break and then rise to the surface and burst, releasing gases and aerosols into the atmosphere. This study demonstrates how sea spray aerosols come in a wide variety of sizes and shapes with chemical complexity ranging from simple salts to complex biological mixtures to bacterial cells.

For decades, scientists have been studying how their chemical make-up affects their ability to take up water, seed clouds, and react in the atmosphere. It has been difficult to isolate and study marine aerosols in the real world, however, because aerosols from other sources overwhelm field measurements.

"Once the ocean-atmosphere system was isolated, we can systematically probe how changes in the seawater due to biological activity affect the composition and climate properties of the sea spray aerosol," said Prather, a professor in the Department of Chemistry and Biochemistry who holds a joint appointment at Scripps Institution of Oceanography.

In their studies, seawater is pumped directly from the Pacific Ocean into a specially modified enclosed wave flume in the Hydraulics Laboratory at Scripps Oceanography. By stringently filtering the air within the wave chamber, the team eliminated contamination from other sources, allowing them for the first time to probe sea spray aerosol directly after it was produced by breaking waves.

Over five days, the team systematically altered biological communities within the flume by adding various combinations of cultures of marine bacteria and microscopic marine algae, or phytoplankton. Then, as a hydraulic paddle sent waves breaking over an artificial shoal, instruments positioned along the 33-meter-long flume measured the chemistry of the seawater, air, and aerosols.

As the seawater changed and bacteria levels increased, the resulting aerosols showed a major change in composition leading to a reduction in their ability to form clouds. In particular, a day after new cultures were added, bacteria levels rose fivefold and cloud-seeding potential fell by about a third.

These changes were happening even as the concentration of phytoplankton fell, along with levels of chlorophyll-a, the pigment essential to photosynthesis. This is an important finding because current estimates of biological activity in surface waters of the ocean rely on instruments aboard satellites that measure the color of the sea surface, which changes along with levels of chlorophyll-a.

The findings demonstrate the value of the Center's novel approach for sorting through the interdependent factors governing the effects of the ocean and sea spray on climate.

Airline passengers being used as political pawns by Obama Administration


Flying the Government Skies

The 4% FAA spending cut that somehow delays 40% of flights.

WSJ.COM  4/24/13: As travelers nationwide are learning, the White House has decided to express its dislike of the sequester—otherwise known as modestly smaller government—by choosing to cut basic air traffic control services. We wrote about this human-rights violation on Tuesday in "Flight Delays as Political Strategy," but the story gets worse the closer we look.
Start with the Federal Aviation Administration, better known as the Postal Service without the modern technology. Flyers directly fund two-thirds of the FAA's budget through 17 airline taxes and fees—about 20% of the cost of a $300 domestic ticket, up from 7% in the 1970s. Yet now the White House wants to make this agency that can't deliver what passengers are supposedly paying for even more dysfunctional.

Ponder this logic, if that's the right word: The sequester cuts about $637 million from the FAA, which is less than 4% of its $15.9 billion 2012 budget, and it limits the agency to what it spent in 2010. The White House decided to translate this 4% cut that it has the legal discretion to avoid into a 10% cut for air traffic controllers. Though controllers will be furloughed for one of every 10 working days, four of every 10 flights won't arrive on time.
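The percentages in that comparison can be checked directly. A minimal sketch, using only the figures cited in this editorial:

```python
# Figures cited in the editorial above (dollars).
faa_budget_2012 = 15.9e9   # FAA fiscal 2012 budget
sequester_cut = 637e6      # sequester cut to the FAA

# The cut as a share of the budget -- just under 4%.
cut_share = sequester_cut / faa_budget_2012
print(f"Sequester cut: {cut_share:.1%} of the FAA budget")  # 4.0%

# Furloughing controllers one of every ten working days is a 10% cut in
# controller hours -- more than double the overall budget reduction.
furlough_share = 1 / 10
print(f"Controller furlough: {furlough_share:.0%} of working days")  # 10%
```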
Airline passengers at San Francisco International Airport on Monday. (Reuters)

The FAA projects the delays will rob one out of every three travelers of up to four hours of their lives waiting at the major hubs. Congress passed a law in 2009 that makes such delays illegal, at least if they are the responsibility of an airline. Under President Obama's "passenger bill of rights," the carriers are fined millions of dollars per plane that sits on the tarmac for more than three hours. But sauce for the goose is apparently an open bar for the FAA gander.
The White House claims the sequester applies to the budget category known as "projects, programs and activities" and thus it lacks flexibility. Not so: This is a political pose to make the sequester more disruptive. Legally speaking, the sequester applies at a more general level known as "accounts." The air traffic account includes 15,000 controllers out of 31,000 employees. The White House could keep the controllers on duty simply by allocating more furlough days to these other non-essential workers.
Instead, the FAA is even imposing the controller furlough on every airport equally, not prioritizing among the largest and busiest airports. San Francisco's Napa Valley airport with no commercial service will absorb the same proportion of the cuts as the central New York radar terminal, which covers La Guardia, JFK and Newark International, as well as MacArthur, Teterboro, New Haven, Republic and other regional fields.

Anyone who has flown in or out of those terminals knows that they are hardly models of efficiency, and one reason is the pre-modern U.S. traffic control system. The FAA still uses ground radar and voice-based communications that were the best technology the 1950s had to offer. Many planes are now equipped with advanced avionics that enable more direct and precise flight paths, but they aren't allowed to fly these faster, safer routes because the FAA can't track their navigation methods.
For more than a decade the FAA has promised to modernize and make the civil aviation system more efficient and reliable, but the only things it has reliably generated are delays or cost overruns or usually both. The project, known as NextGen, is four years off schedule with no end in sight.

The FAA's troubles are the result of bad management and a lack of oversight, according to multiple Department of Transportation Inspector General audits. A 2011 investigation found that one part of NextGen ran $330 million over budget—or half of the FAA sequester—and then the FAA paid the contractor responsible $150 million in bonuses that were supposed to be an incentive for making the budget targets. The overruns are now approaching $500 million, and that's merely one item.

Meanwhile, ever since Al Gore launched a training initiative to increase the productivity of air traffic controllers in 1998, productivity has continued to fall. A larger workforce is now in charge of a smaller workload as the number of flights has dropped by 23%. As the Inspector General reports, "FAA data suggest that its overall staffing may not be optimal."
A rational government would use the sequester to improve on this sorry record. But instead this White House is responding to the FAA's failures by making the flying experience for millions of Americans even more unfriendly. It is actively creating even more delays, cancellations and missed connections in order to incite a public outcry on behalf of bigger government.
All of this deserves to backfire, and it will if Republicans break from their circular immigration firing squads and explain what Mr. Obama is doing. For all of its rough edges, the sequester is proving to be educational. It is showing Americans how broken so much of government is, and it is revealing how our politicians refuse to distinguish between essential services and needless waste.


WSJ.COM 4/24/13: The Federal Aviation Administration claims the sequester spending cuts are forcing it to delay some 6,700 flights a day, but rarely has a bureaucracy taken such joy in inconveniencing the public.
Though the FAA says it is strapped for cash, the air traffic control agency managed to find the dollars to update its interactive "command center" tool on its website so passengers can check if their airports are behind schedule due to what it calls sequester-related "staffing" problems. Oklahoma Senator Tom Coburn noticed this rare case of FAA technological entrepreneurship and fired off a letter Wednesday protesting what he called the agency's "full blown media rollout" to hype the flight delays.

That had zero impact on FAA bosses, who were on Capitol Hill rationalizing their dereliction. But after Mr. Coburn published his letter on his website, FAA regional employees wrote to blow the whistle on their bosses. As one email put it, "the FAA management has stated in meetings that they need to make the furloughs as hard as possible for the public so that they understand how serious it is."
Strategies include encouraging union workers to take the same furlough day to increase congestion. "I am disgusted with everything that I see since the sequester took place," another FAA employee wrote. "Whether in HQ or at the field level it is clear that our management has no intention of managing anything. The only effort that I see is geared towards generating fear and demonstrating failure." Just so.

US Taxpayers may wind up owning Gore's bankrupt Fisker Automotive

How the Wheels Came Off for Fisker

Untested Electric-Car Firm Was Ripe for the Times; U.S. Loans Saddled It With Factory Never Used



For a few months in 2012, Bruce Simon, the chief executive of gourmet food retailer Omaha Steaks International Inc., drove a $100,000 plug-in hybrid electric car known as the Fisker Karma. No longer.

Mr. Simon says his car broke down four times over the span of a few months. Each time, Fisker Automotive Inc. picked it up and sent it by trailer from his home in Omaha, Neb., to a dealer in Minneapolis.
The Karma was "so vulnerable to software errors, and the parts used were of such poor quality that eventually I insisted they take the car back and return my purchase price, which they did," he says. "It's a real shame, the car itself was beautiful."
The near collapse of the Anaheim, Calif., company—it missed a loan payment on Monday, earlier dismissed most of its staff and has hired bankruptcy advisors—comes as affluent buyers like Mr. Simon have turned away from the once promising startup and falling gasoline prices have chipped away at demand for electric cars.
Barring a last-minute rescue, Fisker is poised to become another DeLorean Motor Co. or Tucker Corp., a symbol of the difficulties of creating entirely new car companies. Unlike those others, it also represents one of the most prominent failures of the government's use of public funds to wean American industry from fossil fuels—and of how that government interest pushed Fisker to reach too far.
Originally, Fisker wanted to start small. But, says investor David Anderson, the U.S. asked it to think big. "We can't loan you money to make a low-volume car [in Finland]," he said the U.S. argued. "But if you wanted to bring forward in time your idea of the small car to be produced here in the U.S., then they'd say 'OK,'" Mr. Anderson said.
A spokesperson for the Department of Energy declined to comment.
At its peak, tiny Fisker was one of the largest U.S. venture capital backed companies ever. Its founders raised more than $1 billion from highly regarded Silicon Valley venture funds including Kleiner Perkins Caufield & Byers. It also recruited a roster of prominent backers including former Vice President Al Gore and former Oracle Corp. president Ray Lane.
Its biggest single investor was the U.S. In 2009, the Obama administration's interest in cultivating electric cars got the untested Fisker loans totaling $529 million, more than the company had initially requested, and an amount that encouraged private backers to chip in more funds. At one point, backers valued the company at $1.8 billion.
The Fisker Automotive electric Atlantic sedan logo is seen during its unveiling ahead of the 2012 International Auto Show in New York. (Reuters)
The company had applied in 2009 for a $169 million loan from a $25 billion program set up in the wake of the financial crisis to boost alternative-energy vehicles. Energy Department officials recommended that if Fisker was willing to build in the U.S., the agency would fund the development of the Karma and the company's proposed second, less expensive model, according to people familiar with the matter.
Fisker executives agreed to acquire a shuttered General Motors Co. assembly plant in Wilmington, Del., where it hoped to build a $60,000 sedan.
Today, Fisker looks headed toward a bankruptcy restructuring. The U.S. could wind up owning all or part of the company's assets because its loans were backed by Fisker assets. So precarious is the company that the U.S. seized $21 million this month from Fisker in anticipation of a default.
How did the wheels come off so quickly? Fisker got its start in 2007, a year before U.S. gasoline prices hit $4.11 a gallon and seemed headed to $5 a gallon. The fuel spike and global cash crunch helped put General Motors and Chrysler Group into government-led bankruptcies. Fisker's co-founder, Henrik Fisker, was a highly regarded designer for the luxury brands Aston Martin and BMW, armed with an idea ripe for the times.
He lured investors with a hand-sculpted clay model of his dream car and the promise of a high-tech answer to what was then seen as an inexorable rise in fuel prices. Fisker and its promising Karma luxury plug-in appeared to be the opposite of Detroit's plodding big car makers.

In September 2009, the Energy Department gave preliminary approval to $529 million in loans for the company. The amount was more than the $465 million it had earlier agreed to loan to rival electric-car startup Tesla Motors Inc. The DOE also awarded Ford Motor Co. and Nissan Motor Co. $5.9 billion and $1.4 billion, respectively, to fund their electric and hybrid vehicle programs. The DOE's decision to increase the Fisker loan will be a topic of a hearing scheduled on Wednesday before a House committee.
Even with its wealthy backers, Fisker had plenty of problems. Troubles with suppliers and regulatory requirements added months to the Karma's release. Its engineers expressed concerns that the software that ran the Karma's display screens and phone connections wasn't ready, people familiar with the situation say. Still, the Karma went out to customers. The company said that its problems were expected of any new model.
In May 2011, the Obama administration, under pressure from critics of its alternative energy spending and after the high-profile failure of U.S.-backed solar panel maker Solyndra LLC, froze disbursements to Fisker, citing delays in the Karma's rollout.
Nonetheless, Fisker kept ordering parts to build Karmas, piling up costs even as the company struggled to fix software and other problems that prompted complaints from early buyers, and led to critical reviews in auto publications.
In the fall of 2011, Fisker's battery supplier, A123 Systems Inc., was informed that Fisker had run out of cash and wouldn't be able to take more deliveries.
A123, which also received a government grant to finance U.S. factories, had shipped about 3,000 battery packs to the company. The Waltham, Mass., company was ready to ramp production to 15,000 packs annually for its top customer. Its own market miscalculations and quality problems led A123 to seek bankruptcy protection last fall.
Fisker stopped production of the Karma at a factory in Finland in July 2012 in an attempt to negotiate a cost-saving contract. The following month, Fisker recalled its cars for a second time to fix a cooling system flaw that was linked to battery fires.
It hasn't built a car since.

Monday, April 22, 2013

MET office now admits Arctic sea ice didn't cause unusually cold weather

In a new report entitled "Why was the start to spring 2013 so cold?," the chief of the UK MET Office now admits that decreased Arctic sea ice or "Arctic amplification" was not responsible for the unusually cold spring 2013 in Europe, finding "little evidence [of a difference] from the comparison between the cold spring of 1962 and this year." The report also finds little evidence that "Arctic amplification" is responsible for any "increased probability of extreme weather events that result from prolonged conditions, such as drought, flooding, cold spells, and heat waves."

According to the report, "Figure 13 shows the midtroposphere temperature anomalies for 1962 and 2013; over the Arctic they are almost identical and reflect the negative NAO [natural North Atlantic Oscillation] pattern. It is hard to argue that Arctic amplification had changed the equator to pole temperature in a systematic way to affect the circulation this spring."

Full report here

Excerpt:
There have been some suggestions that the rapid decline of Arctic sea ice, especially during summer, is responsible for this year’s cold spring. It is argued [8] that amplification of global warming over the Arctic is reducing the equator to pole temperature gradient, thereby weakening the strength of the mid-latitude jet streams. In turn this may lead to slower progression of upper-level waves and would cause associated weather patterns in midlatitudes to be more persistent, potentially leading to an increased probability of extreme weather events that result from prolonged conditions, such as drought, flooding, cold spells, and heat waves.  
This hypothesis remains contentious [9], however, and there is little evidence from the comparison between the cold spring of 1962 and this year that the Arctic has been a contributory factor in terms of the hypothesis proposed above. Figure 13 shows the midtroposphere temperature anomalies for 1962 and 2013; over the Arctic they are almost identical and reflect the negative NAO pattern. It is hard to argue that Arctic amplification had changed the equator to pole temperature in a systematic way to affect the circulation this spring. 

Shock: Scientific American publishes article on the failure of climate models

...while continuing to be faithful believers in the garbage output by said models.

Climate change models fail to accurately simulate droughts

By Ashutosh Jogalekar | April 18, 2013
Most of my day job involves simulating the behavior of molecules like drugs and proteins using computer models. The field is more an art than a science, partially because the systems that are being modeled are too complex and ill-understood to succumb to exact solutions. Success often depends on experience and intuition gained by working on similar systems. That does not mean there are no correct predictions, but it does mean that surprises are more common than we think and that many phenomena are impossible to model within a very precise window of accuracy. The failure of a model can sometimes be traced to a simple inability to simulate the behavior of an essential component of the system. In several cases this component is simply the water that surrounds a protein; water remains a substance that’s as enigmatic as any other. In other cases it could be the entropy of the system. The problem is that these factors are very hard to calculate even when we know that they are responsible for the limitations of our model.

A recent report on the failure of climate change models to predict the timing of major droughts in the Southwest made me think of some of the problems in my own field. Unfortunately the actual paper is not out yet so we will have to wait for the details, but the news piece in Nature has a good summary.
Sloan Coats of Columbia University’s Lamont-Doherty Earth Observatory in Palisades, New York, and his colleagues tested whether a state-of-the-art climate model could simulate the droughts known to have occurred in the southwest during the past millennium. The model incorporated realistic numbers for factors that affect temperature and rainfall, such as atmospheric carbon dioxide levels, changes in solar radiation and ash from volcanic eruptions. It also incorporated changes in the El Niño/Southern Oscillation (ENSO). 
The results were puzzling. Although the simulation produced a number of pronounced droughts lasting several decades each, these did not match the timing of known megadroughts. In fact, drought occurrences were no more in agreement when the model was fed realistic values for variables that influence rainfall than when it ran control simulations in which the values were unrealistically held constant. “The model seems to miss some of the dynamics that drive large droughts,” says study participant Jason Smerdon, a researcher at Lamont-Doherty who studies historical climate patterns. 
Other climate models tested by the team fared no better, he says. In particular, the models failed to reproduce a series of multi-decadal droughts that occurred in the southwest during the Medieval Climate Anomaly, a period between AD 900 and 1200 when global temperatures were about as high as they are today.

The team goes on to provide several possible explanations for the failure of the models, most likely related to their inability to account for details in the ENSO cycle. The researchers also note that the models may not capture some important features of the biosphere.

In addition to their failure to reproduce El Niño and La Niña, existing models do not fully capture other factors that influence rainfall, such as clouds and vegetation. But Smerdon adds that the atmospheric and oceanic dynamics that inhibit rainfall and favour prolonged drought may be essentially random and so almost unpredictable.


This is in fact a problem that has plagued computer models of climate since their very inception in the 1950s. The early general circulation models (GCMs) included the motion of the atmosphere and factors like wind speed, temperature and pressure. Over time these atmospheric circulation models became quite sophisticated, accounting for radiation transport and the opacity of various gases. The strengths and weaknesses of these models largely carried over into modern-day climate modeling.

In general the models are quite good at simulating the motions of the atmosphere but are still inadequate in accounting for the complex processes in the biosphere, including the behavior of the soil, forests, rivers, mountains and the various plants and animals that inhabit these environments. This discrepancy between accurate atmospheric simulation and lackluster biospheric simulation may be responsible for many of the defects in climate modeling. And as the researchers say, the models are still not great at capturing fine-grained details of clouds and their influence on water. It’s striking to me that both molecular models and climate models struggle in modeling that simplest and most ubiquitous of substances – water. No wonder they have a hard time predicting droughts and precipitation. Finally, the lack of difference in the results when the key factors are held constant and when they are allowed to vary points to an independent and possibly unknown set of factors that are influencing model results.

Nonetheless, as the article says, the major predictions about global precipitation seem to be clearer and are based on extensive field studies across the globe; climate change is much more than computer modeling. The problems, though, are in predicting local precipitation patterns, and unfortunately it's these kinds of predictions that drive public policy at local and state levels. The most important result from such modeling data, of course, is the knowledge it provides about the strengths and limitations of climate change models. And knowledge is always useful.

About the Author: Ashutosh (Ash) Jogalekar is a chemist interested in the history and philosophy of science.

Another green energy fiasco: Gore's Fisker automotive burned through $1.3 Billion in taxpayer & venture capital on way to bankruptcy

Report: Each Karma Hybrid Sedan Built Cost Fisker $660,000

Shane McGlaun (Blog) - April 22, 2013 10:51 AM

Fisker burned through $1.3B in private and government funding

The latest massive failure in the automotive industry to take a huge chunk of taxpayer money with it is Fisker. The automaker has been struggling and earlier this month laid off 75% of its workers. The company is also expected to file bankruptcy, seeking protection from its creditors.

Through all of its troubles, Fisker has only produced 2,500 of its Karma plug-in hybrid sedans (it hasn’t even begun production of the smaller Atlantic plug-in hybrid sedan). When you consider the amount of investor and U.S. taxpayer money given to Fisker in the form of government-backed loans, each of those 2,500 Karma electric vehicles cost $660,000 to produce compared to a retail price of $103,000.
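The per-car figure is simple division, and it is worth noting that the numbers quoted here do not quite line up: $1.3 billion spread over 2,500 cars works out to roughly $520,000 each, so PrivCo's $660,000 figure presumably rests on a somewhat different spending total or unit count. A sketch using only the figures cited in this article:

```python
# Per-unit cost = total outlays / units built, using the figures cited above.
total_spent = 1.3e9    # venture capital plus taxpayer money, dollars
karmas_built = 2500    # Karma sedans produced

per_car = total_spent / karmas_built
print(f"Implied cost per Karma: ${per_car:,.0f}")  # $520,000
```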

Fisker had planned to spend part of the $529 million loan from the U.S. government to reopen an old General Motors manufacturing factory in Delaware. Despite the fact that Fisker had violated the terms of its government-backed loans from the Energy Department, it was allowed to continue using the money, according to a report released last week by a company called PrivCo.



Fisker Karma

“They made a mistake” in awarding the loan, PrivCo Chief Executive Officer Sam Hamadeh said of the Energy Department in an interview yesterday with Bloomberg. “Should they have fought this sooner? Obviously -- as soon as it became evident that they had begun to default.”

However, the Energy Department takes issue with the PrivCo report stating that the report contained errors. The Energy Department says that it halted Fisker's funding in late June of 2011 after the company had used about $193 million from the government loan.

Overall, Fisker spent $1.3 billion in venture capital and taxpayer money according to the report. Fisker is supposed to make the first repayment of $20.2 million on the loan granted from the Energy Department today. It remains unclear whether or not that will happen.

Source: Bloomberg


The wasteful and incomprehensible "green" energy policies of the Obama Administration continue to be exposed as a rip-off of American taxpayers. The latest insane venture involves the hybrid auto start-up company Fisker. While the story of Fisker receiving a $529 million loan from the Department of Energy has been widely reported, less known is the fact that green energy charlatan Al Gore may have played a key role in obtaining the loan.

Saturday, April 20, 2013

New paper finds another potential solar amplification mechanism

A paper published today in Theoretical and Applied Climatology finds the 11-year solar cycle is correlated to the quasi-biennial oscillation (QBO), a wind reversal that "dominates" variability of the lower stratosphere and in turn "affects a variety of extratropical phenomena including the strength and stability of the winter polar vortex." The IPCC AR4 states that the IPCC climate models do not include the quasi-biennial oscillation, due to inadequate understanding of its causes and "due to the computational cost associated with the requirement of a well-resolved stratosphere." The paper adds to many others finding solar amplification mechanisms that are not included in the climate models the IPCC uses to dismiss the role of the Sun.

From the IPCC AR4


8.4.9 Quasi-Biennial Oscillation


The Quasi-Biennial Oscillation (QBO; see Chapter 3) is a quasi-periodic wave-driven zonal mean wind reversal that dominates the low-frequency variability of the lower equatorial stratosphere (3 to 100 hPa) and affects a variety of extratropical phenomena including the strength and stability of the winter polar vortex (e.g., Baldwin et al., 2001). Theory and observations indicate that a broad spectrum of vertically propagating waves in the equatorial atmosphere must be considered to explain the QBO. Realistic simulation of the QBO in GCMs therefore depends on three important conditions: (i) sufficient vertical resolution in the stratosphere to allow the representation of equatorial waves at the horizontally resolved scales of a GCM, (ii) a realistic excitation of resolved equatorial waves by simulated tropical weather and (iii) parametrization of the effects of unresolved gravity waves. Due to the computational cost associated with the requirement of a well-resolved stratosphere, the models employed for the current assessment do not generally include the QBO.
The inability of resolved wave driving to induce a spontaneous QBO in GCMs has been a long-standing issue (Boville and Randel, 1992). Only recently (Takahashi, 1996, 1999; Horinouchi and Yoden, 1998; Hamilton et al., 2001) have two necessary conditions been identified that allow resolved waves to induce a QBO: high vertical resolution in the lower stratosphere (roughly 0.5 km), and a parametrization of deep cumulus convection with sufficiently large temporal variability. However, recent analysis of satellite and radar observations of deep tropical convection (Horinouchi, 2002) indicates that the forcing of a QBO by resolved waves alone requires a parametrization of deep convection with an unrealistically large amount of temporal variability. Consequently, it is currently thought that a combination of resolved and parametrized waves is required to properly model the QBO. The utility of parametrized non-orographic gravity wave drag to force a QBO has now been demonstrated by a number of studies (Scaife et al., 2000; Giorgetta et al., 2002, 2006). Often an enhancement of input momentum flux in the tropics relative to that needed in the extratropics is required. Such an enhancement, however, depends implicitly on the amount of resolved waves and in turn, the spatial and temporal properties of parametrized deep convection employed in each model (Horinouchi et al., 2003; Scinocca and McFarlane, 2004).

From Wikipedia: 

The quasi-biennial oscillation (QBO) is a quasi-periodic oscillation of the equatorial zonal wind between easterlies and westerlies in the tropical stratosphere, with a mean period of 28 to 29 months. The alternating wind regimes develop at the top of the lower stratosphere and propagate downwards at about 1 km (0.6 mi) per month until they are dissipated at the tropical tropopause. Downward motion of the easterlies is usually more irregular than that of the westerlies. The amplitude of the easterly phase is about twice as strong as that of the westerly phase. At the top of the vertical QBO domain, easterlies dominate, while at the bottom, westerlies are more likely to be found.

The QBO was discovered in the 1950s, but its cause remained unclear for some time. Radiosonde soundings showed that its phase was not related to the annual cycle, as is the case for all other stratospheric circulation patterns. In the 1970s it was recognized by Richard Lindzen and James Holton that the periodic wind reversal was driven by atmospheric waves emanating from the tropical troposphere that travel upwards and are dissipated in the stratosphere by radiative cooling. The precise nature of the waves responsible for this effect was heavily debated; in recent years, however, gravity waves have come to be seen as a major contributor.


Effects of the QBO include mixing of stratospheric ozone by the secondary circulation caused by the QBO, modification of monsoon precipitation, and an influence on stratospheric circulation in northern hemisphere winter (the sudden stratospheric warmings).



Manifestation of reanalyzed QBO and SSC signals

Abstract


Global spatial distribution of oscillations in the period bands linked to the quasi-biennial oscillation (QBO) and to the 11-year sunspot cycle (SSC) was investigated using the pseudo-2D wavelet transform. The results were obtained for the ERA-40, NCEP-DOE 2, NCEP/NCAR, and Twentieth Century Reanalysis V2 datasets, whose time series of air temperature and zonal and meridional wind velocities were examined at all reanalyzed levels from 1,000 up to 10 hPa. Most of the datasets covered the second half of the twentieth century. The results are generally in agreement with other related studies, and they point to the presence of the QBO in the tropical stratosphere, along with regions of induced changes in residual circulation, temperature, or ozone amount across the extratropics. The SSC [11-year sunspot cycle] imprint is located mainly over similar locations, showing that the cycles' signals mutually interact there.
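The abstract's period-band analysis can be illustrated with a much simpler stand-in for the pseudo-2D wavelet transform: a plain Fourier spectrum of a synthetic monthly zonal-wind series. The data, amplitude, and noise level below are entirely hypothetical, chosen only to show how a ~28-month QBO-band peak would be identified.

```python
# Hypothetical sketch: find the dominant period in a synthetic monthly
# zonal-wind series containing a 28-month (QBO-like) oscillation plus noise.
import numpy as np

months = np.arange(480)                        # 40 years of monthly samples
qbo_period = 28.0                              # months, as for the QBO
rng = np.random.default_rng(0)
wind = 20 * np.sin(2 * np.pi * months / qbo_period) \
       + rng.normal(0, 3, months.size)         # m/s, illustrative values

# Remove the mean, take the one-sided FFT, and locate the spectral peak
# (skipping the DC bin).
spectrum = np.abs(np.fft.rfft(wind - wind.mean()))
freqs = np.fft.rfftfreq(months.size, d=1.0)    # cycles per month
peak_period = 1.0 / freqs[np.argmax(spectrum[1:]) + 1]
print(f"dominant period: {peak_period:.1f} months")
```

A wavelet analysis, as used in the paper, would additionally resolve how the oscillation's amplitude varies over time and pressure level; the FFT sketch above recovers only the mean period band.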

Analysis finds planetary harmonics control solar activity and subsequent climate change

A new post at ClimateMonitor.it by Carlo Tosti demonstrates that the global temperature record since 1880 is highly correlated with solar activity, and that solar activity is in turn highly correlated with the harmonics of planetary motion. These correlations, together with accumulating evidence of an amplified solar effect on Earth's climate, would tend to suggest a "unified theory" of climate change, whereby gravitational effects from planetary motions cause small changes in solar activity, which are then amplified via cosmic rays/clouds [Svensmark's theory of cosmoclimatology], ozone, and ocean oscillations into large changes in Earth's climate.

Global temperature anomaly [Blue] vs. signal of planetary modulation of solar activity [Red]

Sunspot Number [SSN, Red] vs. 22-yr Gaussian-filtered planetary harmonics [Blue]
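The Gaussian filtering named in the caption above is a standard low-pass smoothing step. A minimal sketch, using an entirely synthetic annual series (the 22-year cycle, noise level, and kernel width are illustrative assumptions, not values from the post):

```python
# Hypothetical sketch: smooth a noisy annual series with a Gaussian kernel
# so that only slow, multi-decadal variability remains.
import numpy as np

years = np.arange(1880, 2013)
rng = np.random.default_rng(1)
series = np.sin(2 * np.pi * (years - 1880) / 22.0) \
         + rng.normal(0, 0.5, years.size)     # 22-yr cycle + noise

sigma = 5.0                                    # kernel width in years (illustrative)
t = np.arange(-3 * sigma, 3 * sigma + 1)       # truncate at +/- 3 sigma
kernel = np.exp(-0.5 * (t / sigma) ** 2)
kernel /= kernel.sum()                         # normalize so the mean is preserved
smoothed = np.convolve(series, kernel, mode="same")
```

Because the Gaussian kernel is normalized and symmetric, the filter suppresses year-to-year noise without shifting the phase of the slow cycle, which is why it is a common choice for isolating a multi-decadal signal before comparing two curves.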

The Climate Circus Leaves Town
