Sunday, November 2, 2014

A Physicist Ponders the "Pause" & explains the false assumptions behind IPCC climate models

Australian physicist John Reid has published a highly recommended essay, "A Physicist Ponders the Pause," at Quadrant Online, which discusses many of the points made in prior Hockey Schtick posts.
A Physicist Ponders the Pause

By JOHN REID   [emphasis and links added]

After surviving a storm-tossed voyage, King James I concluded that witches must have conjured tempests to do him ill because nothing ever happens by chance. In promoting the notion that climate trends are shaped by an industrialised world's CO2 emissions, warmists are in the same boat.

What bothers me, in the light of the continued denial by some of The Pause — the global climate’s prolonged refusal to grow warmer as the “settled science” predicted — is how this whole climate issue reflects a deeper malaise. It suggests a sort of Calvinistic determinism in which the future is cast in concrete and all that remains for us to do is to remove the form-work. This is in sharp contrast to Eastern philosophy, such as Taoism and the I Ching, which are based on the idea of continuous change. As Heraclitus noted quite some time ago, we may never step twice into the same river.

Determinism has long been there, underlying Western Christian thought, but it has recently come to dominate (or perhaps replace) scientific thinking. I believe that this is an unintended consequence of numerical modeling which is now widespread in science. Computers have, in general, been such a boon to science that no-one any longer questions the validity of some applications, particularly those numerical models which are based on differential equations. All such models rest on certain assumptions — assumptions which are very rarely questioned or even acknowledged. These include assumptions about the complete absence of discontinuities — cliffs and fronts and shocks — which are, in reality, widespread in nature.

However, by far the most subtle and far-reaching hidden assumption is that of determinism, the idea put forward by Pierre-Simon Laplace that if one intelligence knows the precise state of the universe at one instant it can predict the state of the universe at any future time. This idea underlies computer modeling and, in my view, is the root cause of much of the vitriol expressed by warmists. It goes hand-in-hand with ideas of omniscience and perfectibility.

The other edge of this deterministic sword is the idea of the Malevolent Force. Under this mentality nothing ever happens by chance and so, when things go wrong and our predictions fail, then there must be a reason. The reason is usually human. If a divinely appointed king is threatened by a storm at sea then it must be the fault of witches, as James I concluded after a pair of tempest-tossed voyages. If a Communist utopia fails, then it must be the fault of recidivists. If a climate model is called into question, it must be the mischief of deniers.

This is not science. This is not physics. Physicists have understood the underlying stochastic (i.e. random) propensities of nature for more than a century. To a physicist, deterministic, numerical models of natural processes may have their uses, but they are known to be limited in scope. Meteorological models cannot predict beyond about a week ahead. These models are typically time-domain models, and their underlying assumption of continuity is known to be wrong, no more than a useful approximation.
 [Father of chaos theory explains why models cannot predict weather or climate beyond 3 weeks]
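A minimal sketch of why that forecast horizon exists, using the classic Lorenz (1963) system as a stand-in (this example is mine, not Reid's, and the step counts and tolerances are illustrative only): two trajectories that start a billionth of a unit apart separate exponentially until the difference is as large as the attractor itself.

```python
# Illustrative only: sensitivity to initial conditions in the Lorenz system,
# the textbook reason deterministic weather forecasts lose skill in finite time.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one step with simple Euler integration."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

a = np.array([1.0, 1.0, 1.0])        # reference trajectory
b = a + np.array([1e-9, 0.0, 0.0])   # perturbed by one part in a billion

for step in range(1, 5001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 0:
        print(f"t = {step * 0.01:5.1f}  separation = {np.linalg.norm(a - b):.3e}")

# The separation grows roughly exponentially until it saturates at the size of
# the attractor; beyond that horizon the deterministic forecast carries no skill.
```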

On the other hand, stochastic models (i.e. models which contain some random elements) are usually frequency-domain models and are much more powerful. If the theory doesn’t fit the data, then the theory is wrong; there is no room for special pleading. Stochastic models frequently involve an examination of the distribution of energy or variance with frequency, known as a “power spectrum”. It was this sort of modeling which led to the invention of quantum theory at the turn of the twentieth century, one of the great triumphs of modern physics. [New paper explains why a new approach [stochastic models] to climate modeling is necessary]
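To make the frequency-domain idea concrete, here is a small sketch (again mine, not from the essay) that estimates the power spectrum of two synthetic series and reads off the spectral slope: roughly zero for white noise, roughly -2 for integrated (red) noise.

```python
# Hedged illustration: estimate power spectra with a periodogram and fit the
# log-log slope. White noise is flat; "red" noise has power rising at low frequency.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(0)
n = 4096
white = rng.standard_normal(n)
red = np.cumsum(white)               # integrated white noise has a red spectrum

for name, series in [("white", white), ("red", red)]:
    f, pxx = periodogram(series, detrend="linear")
    keep = f > 0                     # drop the zero-frequency bin before taking logs
    slope = np.polyfit(np.log(f[keep]), np.log(pxx[keep]), 1)[0]
    print(f"{name:5s} noise: fitted log-log spectral slope = {slope:+.2f}")
```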

Today the climate field is once again dominated by time-domain, deterministic modeling; computer programmers have replaced physicists. A deterministic modeler looks at the graph of global average temperature for the last century and sees that it is increasing. This small change in temperature must have a cause because everything has to have a cause, according to his or her world view. A good candidate must be increasing levels of atmospheric CO2 due to modern industry. In the laboratory CO2 absorbs radiant heat, so this must be more of the same on a global scale. The modeler ignores the simple physical facts that total man-made production of CO2 since the start of the Industrial Revolution only accounts for about one percent of the total CO2 in the ocean-atmosphere system and that convection completely dominates radiation in the transport of heat through the atmosphere. Never mind, they tell themselves, we can always plug in enough feedbacks and fudge factors to make the model work. 

At least in the short term.

A stochastic modeler (i.e. a physicist) looks at the same data and sees quasi-cyclic random fluctuations superimposed on a linear trend. It looks like red noise, which means that random variations are bigger at longer time scales than at shorter time scales. The apparent linear trend in recent global average temperature is quite possibly the outcome of noise components which are longer than the record length. Examination of much longer records of temperature data from ice-cores shows that this is indeed the case. The data does indeed have a red spectrum, and the observed temperature record is typical of what you get when you take a short sample from such a red noise time series. There is nothing unusual about the twentieth century climate.
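The point about apparent trends is easy to demonstrate. In the sketch below (an illustration of the argument, with parameters invented rather than fitted to any temperature record), an AR(1) process with a coefficient near one stands in for red noise; short windows cut from it routinely show sustained rises or falls even though the process has no trend at all.

```python
# Hedged illustration: short samples of a trendless red-noise process show
# chance "trends" because some noise components are longer than the window.
import numpy as np

rng = np.random.default_rng(42)

def ar1(n, phi=0.98):
    """Generate an AR(1) series x[t] = phi * x[t-1] + noise (red for phi near 1)."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

long_series = ar1(50_000)
sample_len = 120                     # say, a "century" of annual values
trends = []
for start in range(0, len(long_series) - sample_len, sample_len):
    window = long_series[start:start + sample_len]
    trends.append(np.polyfit(np.arange(sample_len), window, 1)[0])

print(f"mean |trend| over {len(trends)} windows: {np.mean(np.abs(trends)):.3f} per step")
# Many windows show a clear apparent trend even though the generating process
# has none: the "trend" is just a noise component longer than the record.
```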

The stochastic modeler then takes a longer look at the ice core time series over the last half million years or so. It is very interesting. There have indeed been large swings in climate. The last one ended 11,000 years ago. Climate at this longer time scale looks very much like a particular type of red spectrum known as a “random walk”. (A random walk is the sum that you get if you throw a coin over and over again and add one for heads and subtract one for tails after each throw.) There is a big difference though. Random walks tend to wander further and further away from zero (variance increases with time) but the temperature throughout the succession of ice ages remains within a narrow channel (between about -18 and +10 deg C). It is a “bounded random walk”.
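The distinction between the two kinds of walk can be seen in a few lines. In this sketch (mine; the -18/+10 channel simply echoes the essay's figures and everything else is illustrative), the ordinary coin-flip walk wanders hundreds of units from zero while the bounded walk stays in its channel indefinitely.

```python
# Hedged illustration: an unbounded random walk's variance grows with time,
# while a walk clipped at a floor and ceiling stays inside its channel forever.
import numpy as np

rng = np.random.default_rng(7)
steps = rng.choice([-1.0, 1.0], size=100_000)   # the coin flips

free = np.cumsum(steps)                          # ordinary (unbounded) random walk

lo, hi = -18.0, 10.0                             # channel echoing the essay's figures
bounded = np.empty_like(steps)
x = 0.0
for i, s in enumerate(steps):
    x = min(hi, max(lo, x + s))                  # clip at the floor and ceiling
    bounded[i] = x

print(f"free walk:    min {free.min():8.1f}, max {free.max():8.1f}")
print(f"bounded walk: min {bounded.min():8.1f}, max {bounded.max():8.1f}")
```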

Why should it be bounded?

Simple physics tells us that, even in the complete absence of greenhouse gases, the planet cannot get any colder than the Ice Age temperature of -18 C because, at that temperature, the earth’s surface radiates the same amount of heat as it receives from the sun. This is the Stefan-Boltzmann law, and it accounts for the lower boundary.
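The arithmetic behind that -18 C figure is the standard effective-temperature calculation, sketched below with textbook values for the constants (the albedo of 0.30 is the usual assumed round number):

```python
# Radiative balance with no greenhouse effect: absorbed sunlight = emitted heat.
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0           # solar constant at Earth, W m^-2
ALBEDO = 0.30         # fraction of sunlight reflected (assumed round value)

# Sunlight is intercepted on a disc but emitted from the whole sphere,
# hence the factor of 4.
absorbed = S0 * (1.0 - ALBEDO) / 4.0

# Set absorbed = SIGMA * T**4 and solve for T.
T = (absorbed / SIGMA) ** 0.25
print(f"effective temperature: {T:.0f} K = {T - 273.15:.0f} C")   # ~255 K, about -18 C
```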

It is an observed fact that, in the present epoch, the surface temperature of the sea under natural conditions in the tropics rarely rises above 28 deg C. Any extra heat causes no increase in temperature. Instead, adding heat causes more rapid evaporation, followed by more vigorous turbulent convection (a stochastic process) which carries the extra heat to the top of the atmosphere, where it radiates into space. This accounts for the upper boundary.
The stochastic modeler’s theory of climate as a bounded random walk is physically reasonable.


On the other hand, a deterministic modeler (e.g. the palaeoclimatologist Richard Alley in the video below), looking at the same Ice Age temperature time series, sees that there have been large, rapid fluctuations which he cannot explain because he does not understand stochastic processes. His response? Climate is obviously highly unstable, we don’t understand it, and we cannot be too careful; therefore we must de-industrialise the world immediately.

And the present pause? To a stochastic modeler it comes as no surprise. It could have been predicted 20 years ago on a desktop computer using a simple autoregressive (AR) model. However, such mundane predictions are rarely published or funded. Only alarmism works.
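A hedged sketch of that closing claim: superimpose a small warming trend on AR(1) red noise and count how often a fifteen-year "pause" appears anyway. Every parameter below is invented for illustration, not fitted to observations, but the qualitative result is the essay's point.

```python
# Hedged illustration: even a series with warming built in shows flat
# fifteen-year stretches as a matter of course under these assumed parameters.
import numpy as np

rng = np.random.default_rng(1)
years, phi, trend, sigma = 150, 0.8, 0.01, 0.1   # assumed, per-year units

temps = np.zeros(years)
for t in range(1, years):
    temps[t] = phi * temps[t - 1] + rng.standard_normal() * sigma
temps += trend * np.arange(years)                # add the underlying warming trend

# Count 15-year windows whose fitted trend is flat or negative: a "pause".
window = 15
pauses = sum(
    np.polyfit(np.arange(window), temps[i:i + window], 1)[0] <= 0
    for i in range(years - window)
)
print(f"{pauses} of {years - window} fifteen-year windows show no warming")
```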

References:

Richard Alley, "Global Warming" (video), YouTube: https://www.youtube.com/watch?v=T4GThA35s1s

Pelletier, J.D. (2002). "Natural variability of atmospheric temperatures and geomagnetic intensity over a wide range of time scales." PNAS 99, supp. 1, pp. 2546–2553.

John Reid is a retired physicist in Cygnet, Tasmania.
