- clouds,
- ocean eddies,
- convection,
- water cycle,
- thunderstorms,
- "crucial components of the oceans" such as "the Gulf Stream, and the Antarctic Circumpolar Current" [and ocean oscillations]
- etc
As the article mentions, typical climate models use a low resolution of about 100 km, but resolutions of 1 km or finer are required to skillfully model convection and clouds, far beyond the capability of current supercomputers. The author recommends that roughly a quarter of a billion dollars be spent to create international supercomputing centers for climate models before the world spends trillions on mitigation that may or may not be necessary, justified by the precautionary principle.
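To see why the computational leap is so large, here is a rough back-of-the-envelope sketch of my own (not from the article), assuming the cost scales with the number of horizontal grid cells and that the time step must shrink in proportion to the grid spacing; vertical resolution is held fixed:

```python
# Rough scaling estimate for refining a global climate model grid
# from 100 km to 1 km cells. Assumptions (simplifications, not from
# the article): cost scales with the number of horizontal cells times
# the number of time steps, and the time step shrinks in proportion
# to the grid spacing (a CFL-type constraint). Vertical levels fixed.

coarse_dx_km = 100.0   # current typical horizontal resolution
fine_dx_km = 1.0       # proposed cloud-resolving resolution

refinement = coarse_dx_km / fine_dx_km          # 100x finer in each direction
cell_factor = refinement ** 2                   # ~10,000x more horizontal cells
timestep_factor = refinement                    # ~100x more time steps
total_factor = cell_factor * timestep_factor    # ~1,000,000x more work

print(f"Horizontal cells increase by ~{cell_factor:,.0f}x")
print(f"Time steps increase by ~{timestep_factor:,.0f}x")
print(f"Total compute increases by roughly {total_factor:,.0f}x")
```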
As climate scientist Dr. Roger Pielke Sr. has pointed out, and contrary to popular belief, climate models are not based on "basic physics"; rather, they rely almost entirely on parameterizations (fudge factors) for most critical aspects of climate, including convection and clouds. As the article below notes,
"simulations of climate change are very sensitive to some of the parameters [fudge factors] associated with these approximate representations of convective cloud systems"However, even if supercomputers are developed over the next decade capable of handling such high resolution, substantial doubt remains of the benefits for climate prediction due to the inherent limitations of chaos theory, multiple flawed assumptions in the model code, and inadequate observations to initialize such numeric models. These are some of the reasons why two recent papers instead call for a new stochastic approach to climate modeling.
Climate forecasting: Build high-resolution global climate models
International supercomputing centres dedicated to climate prediction are needed to reduce uncertainties in global warming, says Tim Palmer.
Local effects such as thunderstorms, crucial for predicting global warming, could be simulated by fine-scale global climate models.
Excerpts:
The drive to decarbonize the global economy is usually justified by appealing to the precautionary principle: reducing emissions is warranted because the risk of doing nothing is unacceptably high. By emphasizing the idea of risk, this framing recognizes uncertainty in the magnitude and timing of global warming.
This uncertainty is substantial. If warming occurs at the upper end of the range projected in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report [1], then unmitigated climate change will probably prove disastrous worldwide, and rapid global decarbonization is paramount. If warming occurs at the lower end of this range, then decarbonization could proceed more slowly and some societies' resources may be better focused on local adaptation measures.
Reducing these uncertainties substantially will take a new generation of global climate simulators capable of resolving finer details, including cloud systems and ocean eddies. The technical challenges will be great, requiring dedicated supercomputers faster than the best today. Greater international collaboration will be needed to pool skills and funds.
Against the cost of mitigating climate change — conceivably trillions of dollars — investing, say, one quarter of the cost of the Large Hadron Collider (whose annual budget is just under US$1 billion) to reduce uncertainty in climate-change projections is surely warranted. Such an investment will also improve regional estimates of climate change — needed for adaptation strategies — and our ability to forecast extreme weather.
Grand challenges
The greatest uncertainty in climate projections is the role of the water cycle — cloud formation in particular — in amplifying or damping the warming effect of CO2 in the atmosphere [2]. Clouds are influenced strongly by two types of circulation in the atmosphere: mid-latitude, low-pressure weather systems that transport heat from the tropics to the poles; and convection, which conveys heat and moisture vertically.
Global climate simulators calculate the evolution of variables such as temperature, humidity, wind and ocean currents over a grid of cells. The horizontal size of cells in current global climate models is roughly 100 kilometres. This resolution is fine enough to simulate mid-latitude weather systems, which stretch for thousands of kilometres. But it is insufficiently fine to describe convective cloud systems that rarely extend beyond a few tens of kilometres.
Simplified formulae known as 'parameterizations' [i.e. fudge factors] are used to approximate the average effects of convective clouds or other small-scale processes within a cell. These approximations are the main source of errors and uncertainties in climate simulations [3]. Moreover, many of the parameters used in these formulae are impossible to determine precisely from observations of the real world. This matters, because simulations of climate change are very sensitive to some of the parameters [fudge factors] associated with these approximate representations of convective cloud systems [4].
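To illustrate why such parameter sensitivity matters for projections, here is a toy calculation of my own (not from the article), using the standard zero-dimensional relation that equilibrium warming equals forcing divided by the net feedback parameter; the feedback values below are illustrative, not measured:

```python
# Toy illustration of parameter sensitivity (not from the article).
# Equilibrium warming from a forcing F is often written dT = F / lam,
# where lam is the net climate feedback parameter (W m^-2 K^-1).
# Cloud parameterization choices effectively change lam, so a modest
# shift in a tuned parameter translates into a large shift in warming.

forcing_2xco2 = 3.7          # W m^-2, commonly cited forcing for doubled CO2

for feedback in (0.8, 1.2, 1.6):   # illustrative feedback values only
    warming = forcing_2xco2 / feedback
    print(f"feedback = {feedback:.1f} W m^-2 K^-1  ->  warming ~ {warming:.1f} K")
```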
Decreasing the size of grid cells to 1 kilometre or less would allow major convective cloud systems to be resolved. It would also allow crucial components of the oceans to be modelled more directly. For example, ocean eddies, which are important for maintaining the strength of larger-scale currents such as the Gulf Stream and the Antarctic Circumpolar Current, would be resolved.
Simulation of convective cloud systems in a limited-area high-resolution climate model.
The goal of creating a global simulator with kilometre resolution was mooted at a climate-modelling summit in 2009 [5]. But no institute has had the resources to pursue it. And, in any case, current computers are not up to the task. Modelling efforts have instead focused on developing better representations of ice sheets and biological and chemical processes (needed, for example, to represent the carbon cycle) as well as quantifying climate uncertainties by running simulators multiple times with a range of parameter values.
Running a climate simulator with 1-kilometre cells over a timescale of a century will require 'exascale' computers capable of handling more than 10^18 calculations per second. Such computers should become available within the present decade, but may not become affordable for individual institutes for another decade or more.
Climate facilities
The number of low-resolution climate simulators has grown: 22 global models contributed to the IPCC Fourth Assessment Report in 2007; 59 to the Fifth Assessment Report in 2014. European climate institutes alone contributed 19 different climate model integrations to the Fifth Assessment database (go.nature.com/3gu8co). Meanwhile, systematic biases and errors in climate models have been only modestly reduced in the past ten years [6]...
Even with 1-kilometre cells, unresolved cloud processes such as turbulence and the effects of droplets and ice crystals will have to be parameterized [fudge-factored] (using stochastic modelling to represent uncertainty in these parameterizations [9]). How, therefore, can one be certain that global-warming uncertainty can be reduced? The answer lies in the use of 'data assimilation' software — computationally demanding optimization algorithms that use meteorological observations to create accurate initial conditions for weather forecasts. Such software will allow detailed comparisons between cloud-scale variables in the high-resolution climate models and corresponding observations of real clouds, thus reducing uncertainty and error in the climate models [10].
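For readers unfamiliar with data assimilation, here is a deliberately over-simplified scalar sketch of my own (operational systems use variational or ensemble methods over millions of variables); it shows only the core idea of weighting a model background against an observation by their error variances:

```python
# Minimal scalar data-assimilation (Kalman-style) update, for
# illustration only. The principle of blending a model background
# with observations, weighted by their error variances, is the same
# one that full-scale assimilation systems apply at vastly larger scale.

def assimilate(background: float, bg_var: float,
               observation: float, obs_var: float):
    """Blend a model background with an observation, weighting by error variance."""
    gain = bg_var / (bg_var + obs_var)              # Kalman gain
    analysis = background + gain * (observation - background)
    analysis_var = (1.0 - gain) * bg_var            # analysis error variance
    return analysis, analysis_var

# Example (made-up numbers): model says 287.0 K (variance 1.0),
# a sounding says 288.2 K (variance 0.25).
analysis, var = assimilate(287.0, 1.0, 288.2, 0.25)
print(f"analysis = {analysis:.2f} K, variance = {var:.2f}")
```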
High-resolution climate simulations will have many benefits beyond guiding mitigation policy. They will help regional adaptation, improve forecasts of extreme weather, minimize the unforeseen consequences of climate geoengineering, and be key to attributing current weather events to climate change.
High-energy physicists and astronomers have long appreciated that international cooperation is crucial for realizing the infrastructure they need to do cutting-edge science. It is time to recognize that climate prediction is 'big science' of a similar league.
If they ever bother to model convection properly, they will have to include the thermal effect of adiabatic warming on descent, which is what really determines surface temperature.
The energy budget diagrams I have seen all omit that aspect and replace it with net warming from DWIR to make the budget balance.
In fact, changes in the amount of energy being returned to the surface in adiabatic descent always cancel out changes in the radiative capabilities of an atmosphere.
"The energy budget diagrams I have seen all omit that aspect and replace it with net warming from DWIR to make the budget balance."
Yes, you are right, and that's how the current models falsely "parameterize" convection, and no doubt this same GIGO error will be repeated with their $250M supercomputers.
I am gratified that you can see the point since I've had some pretty torrid responses to it on other sceptic sites.
The lack of understanding of adiabatic ascent and descent even amongst 'experts' is astonishing.
No one seems to understand that work done against or with gravity by moving mass up or down cannot, by definition of the term adiabatic, also involve work being done with surrounding molecules at the same time.
Once work is done against gravity on uplift, it is stored as gravitational potential energy, which is no longer able to do any work on the surroundings and which does not radiate. Furthermore, it returns as kinetic energy on the subsequent descent.
Some think it is 'magic' and a breach of the laws of thermodynamics, but it is they who lack the relevant knowledge and understanding of what was once the basic physics of gases moving within a gravitational field.
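For what it's worth, here is a simple back-of-the-envelope calculation (my own, illustrative only) of the adiabatic warming on descent being discussed, using the dry adiabatic lapse rate g/cp:

```python
# Back-of-the-envelope dry adiabatic calculation (illustrative only).
# A parcel descending adiabatically converts gravitational potential
# energy into internal energy, warming at roughly g / c_p per unit of
# descent (the dry adiabatic lapse rate).

g = 9.81        # m s^-2, gravitational acceleration
cp = 1004.0     # J kg^-1 K^-1, specific heat of dry air at constant pressure

lapse_rate = g / cp                 # K per metre, ~0.0098
descent_m = 2000.0                  # example: parcel descends 2 km

warming = lapse_rate * descent_m
print(f"Dry adiabatic lapse rate ~ {lapse_rate*1000:.1f} K/km")
print(f"Warming over a {descent_m/1000:.0f} km descent ~ {warming:.1f} K")
```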
Hockey Schtick,
Thanks for the comment over at Steve's Real Science. He seems to have the 'Greenhouse effect' turned upside down and backwards and is digging his heels in.
It is very frustrating since he doesn't even seem to know what the 'atmospheric window' is even though I just posted an image and also the link to Dr Happer's slides.
I wouldn't care so much except for the threatened ban on those who don't agree with his version of the 'greenhouse effect', which closes off any further attempts to correct the problem, or at least to let others know the science is not settled.
Dr. Happer just trashed the equations for the climate models BTW.
David Burton put up on his website an audio and slides of Dr Happer's presentation: link
SLIDES: link
Slides 22, 42, 43 and 44 are the critical slides.
The John Locke Foundation also has a video; someone said it was at:
http://jlf.streamhammer.com/speakers/williamhapper090814.mp4
(My computer is too old to support video)
Gail Combs
Thank you, Gail. I've been meaning to take the time to review Happer's presentation.
It's a shame what is going on over there. I have otherwise been a big supporter of Steve/Tony, but I hate to see claims that anyone who doesn't fully agree with an argument is therefore a "dummy" or "idiot".
Re the atmospheric window above, check out this circuit analogy, which finds that if the atmospheric window is between 0 and 50 W/m2 [currently shown by Kiehl-Trenberth to be 40 W/m2], addition of GHGs will cool the surface:
http://hockeyschtick.blogspot.com/2014/11/modeling-of-earths-planetary-heat.html
Curious if you've heard of the International Centre for Earth Simulation in Geneva... they are working to make this vision a reality.
;-)
http://www.icesfoundation.org
I haven't, but please read above why faster computers are unlikely to overcome the limits imposed by chaos theory.
Models, as opposed to complete climate simulations, have been generated that do adequately predict today's global temperatures. These models have nothing to do with CO2, and they do not require some special supercomputer.