The authors note that properly simulating these features with the current crop of numerical climate models requires "extremely high resolutions" of 16 kilometers or less, whereas state-of-the-art climate models use much coarser grids of 50 to 100 kilometers, i.e. grid cells covering up to ~10,000 square kilometers. For example, prior work has demonstrated that proper simulation of convection requires model resolutions of 1-2 km, up to 2 orders of magnitude [100X] finer than today's fastest supercomputer models can attain. The same is true for proper simulation of clouds, the Earth's sunshade.
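To put that resolution gap in perspective, here is a back-of-the-envelope sketch (assumptions: horizontal cost scales with the number of grid cells, the time step must shrink in proportion to the grid spacing per the CFL condition, and Earth's surface area is about 510 million km²; vertical levels and model physics are ignored):

```python
# Back-of-the-envelope comparison of global grid sizes.
# Assumptions: cost scales with the number of horizontal grid cells, and
# the time step must shrink in proportion to the grid spacing (CFL),
# so total cost ~ 1 / dx**3 for a fixed simulated period.

EARTH_SURFACE_KM2 = 510e6  # approximate surface area of the Earth

def n_cells(dx_km: float) -> float:
    """Approximate number of horizontal grid cells at spacing dx_km."""
    return EARTH_SURFACE_KM2 / dx_km**2

for dx_km in (100.0, 16.0, 1.0):
    rel_cost = (100.0 / dx_km) ** 3
    print(f"dx = {dx_km:6.1f} km -> ~{n_cells(dx_km):.1e} cells, "
          f"~{rel_cost:,.0f}x the cost of a 100 km grid")
```

Even before counting vertical levels, a 1 km global grid works out to roughly a million times the cost of a 100 km grid under these assumptions, which is why convection and clouds are parameterized rather than resolved.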
To try to get around this limitation, climate models consist almost entirely of "parameterizations," which is a fancy word for fudge factors. As climate scientist Dr. Roger Pielke Sr. pointed out in a recent comment, and contrary to popular belief, climate models are not based on "basic physics" and are almost entirely composed of parameterizations/fudge factors:
"there is a clear need for systematic stochastic approaches in weather and climate modelling. In this review we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspectives. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models."
"The last few decades have seen a considerable increase in computing power which allows the simulation of numerical weather and climate prediction models with ever higher resolution and the inclusion of ever more physical processes and climate components (e.g. cryosphere, chemistry). Despite this increase in computer power many important physical processes (e.g. tropical convection, gravity wave drag, clouds) are still not or only partially resolved in these numerical models. Despite the projected exponential increase in computer power these processes will not be explicitly resolved in numerical weather and climate models in the foreseeable future. For instance, Dawson et al. have demonstrated using the ECMWF integrated forecast system that extremely high resolutions (which corresponds to a grid spacing of about 16km) are required to accurately simulate the observed Northern hemispheric circulation regime structure. This resolution, typical for limited area weather and climate models used for short term prediction, remains unfeasible for the current generation of high resolution global climate models due to computational and data storage requirements. Hence, these missing processes need to be parameterized, i.e. they need to be represented in terms of resolved processes and scales 153. This representation is important because small-scale (unresolved) features can impact the larger (resolved) scales 84,162 and lead to error growth, uncertainty and biases.
At present, these parameterizations [fudge factors] are typically deterministic, relating the resolved state of the model to a unique tendency representing the integrated effect of the unresolved processes. These “bulk parameterizations” are based on the notion that the properties of the unresolved subgrid-scales are determined by the large-scales. However, studies have shown that resolved states are associated with many possible unresolved states [22, 144, 167]. This calls for stochastic methods for numerical weather and climate prediction which potentially allow a proper representation of the uncertainties, a reduction of systematic biases and improved representation of long-term climate variability. Furthermore, while current deterministic parameterization schemes are inconsistent with the observed power-law scaling of the energy spectrum [5, 142] new statistical dynamical approaches that are underpinned by exact stochastic model representations have emerged that overcome this limitation. The observed power spectrum structure is caused by cascade processes. Recent theoretical studies suggest that these cascade processes can be best represented by a stochastic non-Markovian Ansatz. Non-Markovian terms are necessary to model memory effects due to model reduction [19]. It means that in order to make skillful predictions the model has to take into account also past states and not only the current state (as for a Markov process)."
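As a heavily simplified illustration of the distinction the quote draws, the toy sketch below contrasts a deterministic "bulk" closure, which maps a resolved state to a single subgrid tendency, with a stochastically perturbed version loosely in the spirit of multiplicative-noise schemes such as SPPT, using temporally correlated AR(1) ("red") noise. The linear damping, the coefficients, and the class name are hypothetical choices for illustration only, not the schemes used in any operational model (and the paper's non-Markovian Ansatz goes further than this simple correlated noise):

```python
import numpy as np

rng = np.random.default_rng(0)

def bulk_tendency(x: np.ndarray) -> np.ndarray:
    """Deterministic 'bulk' closure: the resolved state x maps to a
    single subgrid tendency (illustrative linear damping only)."""
    return -0.5 * x

class SPPTLikeTendency:
    """Stochastic perturbation of the deterministic tendency with
    temporally correlated (AR(1), 'red') noise, so the same resolved
    state can yield many different subgrid tendencies."""
    def __init__(self, phi: float = 0.9, sigma: float = 0.3):
        self.phi, self.sigma = phi, sigma   # memory and amplitude of the noise
        self.e = 0.0                        # current value of the AR(1) process

    def __call__(self, x: np.ndarray) -> np.ndarray:
        self.e = self.phi * self.e + self.sigma * rng.standard_normal()
        return (1.0 + self.e) * bulk_tendency(x)

x = np.array([1.0, -2.0, 0.5])
stoch = SPPTLikeTendency()
print("deterministic:", bulk_tendency(x))
print("stochastic   :", stoch(x), "and again:", stoch(x))  # differs call to call
```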
"To some extent, [conventional climate model] numerical simulations come to the rescue, by allowing us to perform virtual experiments. However, the grid spacing in climate models is orders of magnitude larger than the smallest energized scales in the atmosphere and ocean, introducing biases.
This need [for stochastic modeling] arises since even state-of-the-art weather and climate models cannot resolve all necessary processes and scales. Stochastic parameterizations have been shown to provide more skillful weather forecasts than traditional ensemble prediction methods, at least on timescales where verification data exists. In addition, they have been shown to reduce longstanding climate biases, which play an important role especially for climate and climate change predictions.
The fact that according to the last two assessment reports (AR) of the IPCC (AR4 and AR5) the uncertainty in climate predictions and projections has not decreased may be a sign that we might be reaching the limit of climate predictability, which is the result of the intrinsically nonlinear character of the climate system (as first suggested by Lorenz [father of chaos theory]).
Our hope is that basing stochastic-dynamic prediction on sound mathematical and statistical physics concepts will lead to substantial improvements, not only in our ability to accurately simulate weather and climate, but even more importantly to give proper estimates on the uncertainty of these predictions."
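The reference to Lorenz and an intrinsic limit of predictability in the quote above is easy to demonstrate numerically: in the Lorenz (1963) system, two trajectories that start almost identically diverge until they are no more alike than two randomly chosen states on the attractor. A minimal sketch follows; forward Euler is used for brevity (a higher-order integrator would be used in practice), and the step size, perturbation size, and run length are arbitrary illustrative choices:

```python
import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) equations."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-8, 0.0, 0.0])   # almost identical initial condition

for step in range(1, 5001):
    a, b = lorenz63_step(a), lorenz63_step(b)
    if step % 1000 == 0:
        print(f"step {step:5d}: separation = {np.linalg.norm(a - b):.3e}")
```

The initial difference of one part in a hundred million grows until it saturates at the size of the attractor itself, which is the sense in which deterministic forecasts have a finite horizon regardless of model quality.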
Stochastic Climate Theory and Modelling
(Submitted on 1 Sep 2014)
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations as well as for model error representation, uncertainty quantification, data assimilation and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modelling. In this review we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspectives. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
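The classic reduced-order example behind much of this literature is Hasselmann-style stochastic climate modelling: a slow climate variable (say, an ocean temperature anomaly) is damped while being driven by fast "weather" treated as white noise, giving an Ornstein-Uhlenbeck process whose integrated response is red-noise, low-frequency climate variability. A minimal Euler-Maruyama sketch, with all parameter values chosen purely for illustration:

```python
import numpy as np

# Hasselmann-type reduced-order model: dT/dt = -lam * T + sigma * xi(t),
# where T is a slow climate anomaly and xi is white noise standing in
# for unresolved fast "weather". Parameter values are illustrative only.
rng = np.random.default_rng(1)

lam, sigma, dt, n_steps = 0.2, 1.0, 0.1, 10_000
T = np.zeros(n_steps)
for k in range(1, n_steps):
    T[k] = T[k - 1] + dt * (-lam * T[k - 1]) + sigma * np.sqrt(dt) * rng.standard_normal()

# The white forcing integrates into "red" low-frequency variability;
# the stationary variance should approach sigma**2 / (2 * lam).
print("sample variance      :", T[n_steps // 2:].var())
print("theory sigma^2/(2lam):", sigma**2 / (2 * lam))
```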
Related papers demonstrating that stochastic modeling outperforms conventional numerical climate models:
Simple climate model outperforms IPCC models, demonstrates climate effect of CO2 is miniscule
Paper: Global 'warming since 1850 is mainly the result of natural climate variations'
New paper seeks a grand "unification" of "quite different model physics" of convection
New paper finds sea surface temperatures were controlled by natural 60-year climate cycle during 20th century
How the journal Nature plays fast & loose with the facts about the "pause" in global warming
New paper finds models have a high rate of 'false alarms' in predicting drought
Natural Climate Change has been Hiding in Plain Sight
New paper finds the data do not support the theory of man-made global warming [AGW]
New paper finds a non-linear relationship between sunspots and global temperatures
What are climate models missing?
http://www.euclipse.eu/Publications/Stevens,%20Bony_What%20are%20climate%20models%20missing.pdf
http://www.lse.ac.uk/CATS/Publications/Publications%20PDFs/Smith-Petersen-Variations-on-reliability-2014.pdf
Yes, yes, yes. The physics have been wrong from the start.
Professor Emeritus of Atmospheric Science Dr Bill Gray explains why climate models cannot predict future climate
http://stevengoddard.wordpress.com/2014/09/16/guest-post-from-dr-bill-gray/
Physicist Dr. Steven Koonin explains many of the problems with climate models:
http://hockeyschtick.blogspot.com/2014/09/wsj-op-ed-climate-science-is-not.html
Must read related post by Dr. Robert Brown:
http://wattsupwiththat.com/2014/10/06/real-science-debates-are-not-rare
another semi-stochastic model based on ocean oscillations
http://onlinelibrary.wiley.com/doi/10.1002/grl.50185/abstract
Another paper published yesterday also calling for stochastic weather/climate models instead of conventional numerical models, due to the same problems as outlined by Dr. Brown
http://onlinelibrary.wiley.com/doi/10.1002/wcc.318/abstract
paper showing climate models unable to realistically simulate clouds
http://journals.ametsoc.org/doi/abs/10.1175/JCLI-D-12-00451.1?af=R