Wednesday, September 3, 2014

New paper claims 99.999% certainty global warming over past 25 years is man-made

"There are three kinds of lies: lies, damned lies, and statistics." - Mark Twain


A new paper published in the journal Climate Risk Management claims a ridiculous 99.999% "certainty" that global warming over the past 25 years is man-made. The claim is based on climate models that have already been falsified at confidence levels of 98%+.

According to the authors,
"there is less than a one in one hundred thousand chance of observing an unbroken sequence of 304 months [25.3 years] (our analysis extends to June 2010) with mean surface temperature exceeding the 20th century average."
Fundamental problems with this claim [which is basically the falsified IPCC attribution claim of 95% certainty on steroids] include:


  • There is no statistical difference between the rate of warming over the 27 years from 1917-1944 and the 25 years from 1975/1976 to 2000:



Thus, this new paper is not even wrong, with 99.999% certainty.


Assumed climate model forcings for CO2, solar TSI, the Southern Oscillation Index [SOI], and volcanic activity.
The upper right graph uses the same falsified IPCC technique of comparing climate models that assume no change in CO2 [black] with models assuming increased CO2 [blue].




Abstract

December 2013 was the 346th consecutive month where global land and ocean average surface temperature exceeded the 20th century monthly average, with February 1985 the last time mean temperature fell below this value. Even given these and other extraordinary statistics, public acceptance of human induced climate change and confidence in the supporting science has declined since 2007. The degree of uncertainty as to whether observed climate changes are due to human activity or are part of natural systems fluctuations remains a major stumbling block to effective adaptation action and risk management. Previous approaches to attribute change include qualitative expert-assessment approaches such as used in IPCC reports and use of ‘fingerprinting’ methods based on global climate models. Here we develop an alternative approach which provides a rigorous probabilistic statistical assessment of the link between observed climate changes and human activities in a way that can inform formal climate risk assessment. We construct and validate a time series model of anomalous global temperatures to June 2010, using rates of greenhouse gas (GHG) emissions, as well as other causal factors including solar radiation, volcanic forcing and the El Niño Southern Oscillation. When the effect of GHGs is removed, bootstrap simulation of the model reveals that there is less than a one in one hundred thousand chance of observing an unbroken sequence of 304 months (our analysis extends to June 2010) with mean surface temperature exceeding the 20th century average. We also show that one would expect a far greater number of short periods of falling global temperatures (as observed since 1998) if climate change was not occurring. This approach to assessing probabilities of human influence on global temperature could be transferred to other climate variables and extremes allowing enhanced formal risk assessment of climate change.
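The headline number in the abstract comes from bootstrap simulation of a fitted time-series model with the greenhouse-gas term removed. A minimal sketch of the general idea follows; it is not the authors' actual model, and the AR(1) parameters and zero baseline below are purely illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters -- assumptions for this sketch, not values from the paper:
n_months = 304     # length of the observed unbroken above-average run
n_sims = 5000      # Monte Carlo replicates
phi = 0.9          # AR(1) autocorrelation of monthly anomalies
sigma = 0.1        # innovation standard deviation (deg C)

# Simulate AR(1) anomalies about a zero ("no-GHG") baseline, vectorized over sims.
x = np.empty((n_sims, n_months))
x[:, 0] = rng.normal(0.0, sigma / np.sqrt(1 - phi**2), size=n_sims)
for t in range(1, n_months):
    x[:, t] = phi * x[:, t - 1] + rng.normal(0.0, sigma, size=n_sims)

# Count replicates in which every single month exceeds the baseline mean.
hits = int(np.all(x > 0.0, axis=1).sum())
print(f"{hits} of {n_sims} runs stayed above baseline for all {n_months} months")
print(f"estimated probability: {hits / n_sims:.5f}")
```

With the warming signal removed, stationary noise essentially never produces 304 consecutive above-average months, which is the mechanical source of the paper's very small probability. The dispute in this post is therefore about what the "no-GHG" baseline assumes, not about the simulation arithmetic itself.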

5 comments:

  1. This is another successful attempt to show that if you assume that there would have been no background warming then you can conclude that there would have been no background warming — and with great confidence and a low p-value.

    ReplyDelete
  2. You misunderstood the paper - it does not use climate models.

    ReplyDelete
    Replies
    1. What part of "We construct and validate a time series model of anomalous global temperatures to June 2010, using rates of greenhouse gas (GHG) emissions, as well as other causal factors including solar radiation, volcanic forcing and the El Niño Southern Oscillation. When the effect of GHGs is removed, bootstrap simulation of the model reveals that there is less than a one in one hundred thousand chance of observing an unbroken sequence of 304 months (our analysis extends to June 2010) with mean surface temperature exceeding the 20th century average." do you not understand?

They do use a simple "bootstrap model".

      Delete
  3. from Climate, etc.

    Leonard Weinstein | October 10, 2014 at 9:13 am | Reply
    Judith,
    You start out implying the rise of 0.8C that already occurred since 1850 is essentially all due to AGW. However, most supporters of AGW admit that the rise to 1940 was likely mainly due to natural variation, and a recovery from the LIA. The rise from 1940 to present (the part claimed to be mostly human caused) is < 0.5C, and part of that may be natural variation. Thus the maximum human AGW contribution since 1850 is likely the order of 0.4C plus or minus a small amount. Based on this, the sensitivity has to be much lower than most claims, and likely is less than 1C per doubling of CO2, including possible negative feedback. The future trend will probably be dominated by natural variation, with a small AGW overlay. The continued use of the full 0.8C as AGW distorts the conversation.

    http://judithcurry.com/2014/10/09/my-op-ed-in-the-wall-street-journal-is-now-online/#comment-636622

    ReplyDelete
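Weinstein's back-of-envelope sensitivity figure can be checked in a few lines. The CO2 concentrations assumed here (~285 ppm in 1850, ~400 ppm in 2014) are not given in his comment and are illustrative round numbers; the calculation also treats the response as instantaneous, ignoring any lag:

```python
import math

# Back-of-envelope check of the comment's arithmetic.
delta_t_attributed = 0.4   # deg C plausibly human-caused since 1850, per the comment
c0, c1 = 285.0, 400.0      # assumed CO2 concentrations (ppm), 1850 and 2014

# Forcing scales with log(CO2), so express the rise as a fraction of one doubling.
fraction_of_doubling = math.log(c1 / c0) / math.log(2.0)
sensitivity = delta_t_attributed / fraction_of_doubling

print(f"CO2 rise so far is {fraction_of_doubling:.2f} of a doubling")
print(f"implied sensitivity: {sensitivity:.2f} C per doubling")
```

Under these assumptions the CO2 rise is about half a doubling, so 0.4 C of attributed warming implies roughly 0.8 C per doubling, consistent with the comment's "less than 1C per doubling" claim.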
  4. Analysis: Climate scientists have not used proper procedures to determine if global warming is attributable to CO2

    http://judithcurry.com/2014/10/23/root-cause-analysis-of-the-modern-warming

    ReplyDelete