Monday, August 3, 2015

Why the man-made global warming climate models are a "fudge," according to Hansen himself

Kyoji Kimoto, a Japanese chemist, scientist, and fuel-cell computer modeler & inventor, has submitted his latest work as a second guest post to The Hockey Schtick. It refutes multiple false physical assumptions underlying the allegedly "first physically sound" climate model described in "the most influential climate change paper of all time." These same erroneous physical assumptions continue to serve as the fundamental basis of James Hansen's NASA/GISS climate model and of many other models, including the 'state-of-the-art' IPCC climate models, and they underlie the wide range of modeled CO2 climate sensitivity estimates.

In his new work below (and in prior papers), Kimoto addresses the multiple unphysical assumptions made by Manabe & Wetherald, Hansen/GISS, the IPCC modelers, and others, including a "fudged," arbitrary, and fixed tropospheric lapse rate of 6.5 K/km that does not adjust to perturbations in the atmosphere. This false assumption artificially limits the negative lapse-rate feedback from convection. Using physically correct assumptions, Kimoto finds the climate sensitivity to doubled CO2 to be a negligible 0.1-0.2°C.
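The role the fixed lapse rate plays in such models can be seen in the standard textbook emission-height picture, where the surface temperature is simply the effective emission temperature plus the lapse rate times the effective emission altitude. A minimal sketch (the numbers are the conventional textbook values, not Kimoto's):

```python
# Toy illustration: with a fixed lapse rate, the modeled surface
# temperature is set directly by the assumed lapse-rate value.
T_EFF = 255.0   # K, Earth's effective emission temperature (textbook value)
Z_EMIT = 5.1    # km, approximate effective emission altitude (textbook value)

def surface_temp(lapse_rate_K_per_km):
    """Surface temperature in the simple emission-height picture."""
    return T_EFF + lapse_rate_K_per_km * Z_EMIT

print(surface_temp(6.5))  # the conventional 6.5 K/km gives ~288 K
```

This is why the choice of lapse rate matters: whatever value is fixed propagates directly into the modeled surface temperature, which is the point of Hansen's "fudge" remark quoted below.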

Kimoto quotes the father of CAGW, James Hansen, who admitted in a 2000 interview that the lapse rate is indeed an artificially fixed "fudge" in Hansen's 1-dimensional climate model:
“In the 1-D model, it’s [the lapse rate] just a fudge, and you choose different lapse rates and you get somewhat different answers. So you try to pick something that has some physical justification.” 
Kimoto concludes, 
Since the [1-dimensional radiative-convective model studies with the fixed lapse rate "fudge"] have failed as shown in Fig. 1, the canonical climate sensitivity of 3°C claimed by the IPCC is theoretically meaningless; it is also obtained by the [3-dimensional global circulation model] studies based on the [climate sensitivity at artificially fixed absolute humidity] of 1.2~1.3 K in Table 1. 
In conclusion, the cause of the anthropogenic global warming debate for the past 50 years is the lack of a parameter sensitivity analysis of the lapse rate for doubled CO2 in the [1-dimensional radiative-convective model] studies by Manabe & Wetherald (1967), Hansen et al. (1981) and Schlesinger (1986). Parameter sensitivity analysis is a standard scientific procedure necessary to check the validity of modeled results. 
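The parameter sensitivity analysis Kimoto calls for can be sketched in the same simple emission-height picture: sweep the assumed lapse rate and see how the doubled-CO2 surface response changes. Here the CO2 doubling is represented, purely for illustration, as an assumed rise in the effective emission altitude; the 0.15 km figure is a hypothetical placeholder, not a number from Kimoto's paper:

```python
# Hypothetical sensitivity sweep: how the doubled-CO2 surface response
# depends on the assumed lapse rate in the emission-height picture.
DELTA_Z = 0.15  # km, assumed rise in emission altitude for doubled CO2
                # (illustrative placeholder, not from Kimoto's paper)

def warming(lapse_rate_K_per_km):
    """Surface warming = lapse rate times the emission-height rise."""
    return lapse_rate_K_per_km * DELTA_Z

for gamma in (5.0, 6.5, 8.0, 9.8):  # K/km; 9.8 is the dry adiabat
    print(f"lapse rate {gamma:4.1f} K/km -> warming {warming(gamma):.2f} K")
```

Even in this toy form, the modeled warming scales linearly with the assumed lapse rate, which is why fixing that parameter without a sensitivity sweep leaves the result untested.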
Kimoto's prior papers & posts also discuss additional false assumptions of the climate models, including a mathematical error in the calculation of the Planck response parameter and limits on the potential greenhouse warming of the top ocean layer due to penetration depth, among others, showing that the climate models are overheated and far too sensitive to man-made CO2. 

Addendum: Hansen also admits in the same highly revealing interview linked above that he kept thinking "what is it in our model that makes it so damn sensitive [to CO2]?" and speculates it could be the "cloud scheme," but doesn't know. In the interview, Hansen also says his model cannot reproduce the paleoclimate data and blames the data for this, not his "damn sensitive" climate model! Hansen further says the US Dept of Energy concluded in 1983 that climate sensitivity was low, and offers his false assumptions as to why there was "an error in their [DOE] thinking"; a future HS post will show the error was instead in Hansen's thinking.
