Wednesday, August 19, 2015

New paper finds another solar amplification mechanism by which solar activity & cosmic rays control climate

A paper published today in the Journal of Atmospheric and Solar-Terrestrial Physics finds another potential solar amplification mechanism mediated by galactic cosmic rays [GCRs] (and distinct from Svensmark's cosmic ray theory of climate). The author demonstrates:

Solar modulation of GCR [Galactic Cosmic Rays] is translated down to the Earth's climate.
The mediators of solar influence are energetic particles.
GCR impacts the O3 [ozone] budget in the lower stratosphere.
O3 influences the temperature and humidity near the tropopause, and the greenhouse effect.
The effectiveness of this mechanism depends on geomagnetic field intensity.

"In this paper we show that bi-decadal variability of solar magnetic field, modulating the intensity of galactic cosmic ray (GCR) at the outer boundary of heliosphere, could be easily tracked down to the Earth's surface. The mediator of this influence is the lower stratospheric ozone, while the mechanism of signal translation consists of: (i) GCR impact on the lower stratospheric ozone balance; (ii) modulation of temperature and humidity near the tropopause by the ozone variations; (iii) increase or decrease of the greenhouse effect, depending on the sign of the humidity changes. The efficiency of such a mechanism depends critically on the level of maximum secondary ionisation created by GCR (i.e. the Pfotzer maximum) − determined in turn by heterogeneous Earth's magnetic field..."


The paper adds to over 100 potential solar amplification mechanisms described in the literature.

As to the false belief that solar activity does not correlate with global temperatures, the sunspot 'integral', the accumulated mean sunspot activity, and Fourier analysis all demonstrate this belief to be false:





Graphics from the paper and abstract below:














Abstract

The Sun's contribution to climate variations was highly questioned recently. In this paper we show that bi-decadal variability of solar magnetic field, modulating the intensity of galactic cosmic ray (GCR) at the outer boundary of heliosphere, could be easily tracked down to the Earth's surface. The mediator of this influence is the lower stratospheric ozone, while the mechanism of signal translation consists of: (i) GCR impact on the lower stratospheric ozone balance; (ii) modulation of temperature and humidity near the tropopause by the ozone variations; (iii) increase or decrease of the greenhouse effect, depending on the sign of the humidity changes. The efficiency of such a mechanism depends critically on the level of maximum secondary ionisation created by GCR (i.e. the Pfotzer maximum) − determined in turn by heterogeneous Earth's magnetic field. Thus, the positioning of the Pfotzer max in the driest lowermost stratosphere favours autocatalytic ozone production in the extra-tropical Northern Hemisphere (NH), while in the SH − no suitable conditions for activation of this mechanism exist. Consequently, the geomagnetic modulation of precipitating energetic particles – heterogeneously distributed over the globe – is imprinted on the relation between ozone and humidity in the lower stratosphere (LS). The applied test for causality reveals that during the examined period 1957–2012 there are two main centers of action in the winter NH, with tight and almost stationary ozone control on the near tropopause humidity. Being indirectly influenced by the solar protons, the variability of the SH lower stratospheric ozone, however, is much weaker. As a consequence, the causality test detects that the ozone dominates in the interplay with ULTS humidity only in the summer extra-tropics.

Sunday, August 16, 2015

Why climate modelers seeking funds & fame have distorted climate science

Kyoji Kimoto, a Japanese chemist, scientist, and fuel-cell computer modeler & inventor, has submitted his latest work as a third guest post to The Hockey Schtick and explains why false, unphysical assumptions and mathematical errors first made by Cess, Manabe, Hansen, et al., continue to be propagated today in the latest state-of-the-art IPCC climate models underlying the basis of climate alarm. 

Among other matters, Kimoto discusses the mathematical error which led to exaggeration of the Planck feedback parameter, and why the false assumption of a fixed lapse rate exaggerates potential warming from doubled CO2 by a factor of at least 10. 

Please also see Kimoto's prior posts here, and his previously published paper regarding the Planck feedback parameter miscalculation:

https://drive.google.com/file/d/0B74u5vgGLaWoQjdtVklDb0RrYmM/view?usp=sharing

Why are cloudy nights warmer? Not from greenhouse gas 'back-radiation'

Why are cloudy nights warmer? Is it due to a greenhouse effect from water vapor, inhibition of convection, higher relative humidity, the temperature of the cloud bottoms being higher than that of the clear sky, or some other cause?

Stephen Wilde, who has been a member of the Royal Meteorological Society since 1968, answers the question:

The answer depends on whether it is day or night.

Note that the question is only valid for individual clouds because clouds that form part of horizontally advecting air masses involve different thermal mechanisms.

The question is also only valid for stratiform or fair weather cumulus clouds which are not actively involved in upward convection because surfaces beneath active convective cells are affected by winds flowing in from around the convecting cell.

In daytime an individual cloud both reduces convection and increases humidity beneath it which allows indirect, even diffuse, insolation to continue warming the surface and air beneath the cloud. That is why a cloud passing over a water surface can cause a rise in temperature beneath it. The condensate in the cloud blocks cooling convection and the higher humidity reduces cooling evaporation from the water surface. AGW theory says that the cloud radiates IR downward to raise the surface temperature which I consider to be wrong.

At night time an individual cloud reduces radiative loss to space because the cloud is at (or sometimes above, due to advection) the temperature along the lapse rate slope which is attributable to its height. For maximum radiative loss one needs an atmosphere that is perfectly transparent to IR and the presence of any material reducing that transparency must reduce the rate of radiative loss.

An interesting factor at this point is that IR is susceptible to transparency being reduced by the pressure and density of non-radiative gases because collisional activity can divert IR from photon release to conduction and convection. That is less of an issue with more energetic wavelengths which are not so affected by pressure and mass density. One of the mistakes of AGW radiative theory is to fail to see that IR behaves differently to other wavelengths in the presence of mass compressed by gravity. See the Catling paper in support of that point.

The night time cloud cannot warm the ground but it can reduce radiative cooling which then becomes limited to the temperature of the lapse rate slope at cloud base.

The ground will cool to a lower temperature beneath higher clouds because they are at a higher and colder location along the lapse rate slope.
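The lapse-rate reasoning above can be put into numbers. A minimal sketch, assuming a representative mean environmental lapse rate of 6.5 K/km (the standard-atmosphere value; actual profiles vary) and hypothetical cloud heights:

```python
# Temperature at cloud base from surface temperature and height, assuming
# a mean environmental lapse rate of 6.5 K/km (standard-atmosphere value;
# real profiles vary with weather and season).

LAPSE_RATE = 6.5  # K per km, assumed mean environmental lapse rate

def cloud_base_temperature(surface_temp_c, cloud_height_km, lapse_rate=LAPSE_RATE):
    """Temperature (deg C) at the cloud base, following the lapse-rate slope."""
    return surface_temp_c - lapse_rate * cloud_height_km

# A low stratus deck at 1 km sits at a warmer point on the lapse-rate slope
# than high cirrus at 8 km, so it limits surface radiative cooling at a
# higher temperature.
low_cloud = cloud_base_temperature(15.0, 1.0)   # 8.5 C
high_cloud = cloud_base_temperature(15.0, 8.0)  # -37.0 C
```

With these assumed heights, the ground beneath the high cloud can radiatively cool much further before reaching the cloud-base temperature, consistent with the paragraph above.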

Clouds at the surface (fog) can stop surface radiative cooling altogether if the fog is deep enough but sometimes the fog layer is shallow and allows radiative cooling of the surface to continue until freezing fog forms.

Clouds high up, for example cirrus clouds, may inhibit radiative cooling of the surface very little.

Note though that the presence of clouds at night is generally indicative of advection of warmer air across the radiating surface so that the amount of radiative inhibition caused solely by the presence of the cloud is hard to separate out.

A surface rarely cools to the temperature of a high cloud because other factors are involved in the cooling process such as the length of day and night, the energy stored in the surface materials, the amount of local air mixing and the regional synoptic situation.


In no case is downward IR from the cloud doing any active warming. In all cases the cloud is simply reducing the capability of the surface to cool itself using different mechanisms day and night.

Why greenhouse gases accelerate convective cooling in the troposphere

Stephen Wilde has alerted us to a new chapter on planetary atmospheres published in Treatise on Geophysics, 2nd Ed, Volume 10, 2015, which underlies a number of points made in his prior HS post:

Erasing AGW: How Convection Responds To Greenhouse Gases To Maintain The Hydrostatic Equilibrium Of The Atmosphere.

Stephen writes,
"Amongst much else he says this:

“At high pressures, collisions generally occur before an excited atom or molecule undergoes radiative decay and emits a photon in a random direction, with the result that vibrational or rotational energy is converted into KE [Kinetic Energy] and heat, called thermalization or quenching.”

So, the higher the pressure at a surface beneath an atmosphere, the more likely is energy transfer by collisions in place of photon emission.

Thus the denser the air at the surface, the more readily energy will be passed from GHGs to non-GHGs, and the less photon emission there will be relative to temperature.

The 'ideal' adiabatic lapse rate slope set by mass and gravity marks the steady decline in the probability of photon emission relative to collisional activity as one descends through atmospheric mass. All the actual lapse rates have to average out to that 'ideal' lapse rate for an atmosphere to be retained.

I've been saying that for a while.

Just as I have been saying all along, atmospheric mass reduces photon emission in favour of collisional activity so that the surface temperature can rise without destabilising radiative equilibrium with space.

Since the surface is heated unevenly, convection then stores that additional surface energy in the form of potential energy within rising and falling columns of air. That potential energy is taken up and returned towards the surface in a never-ending cycle for so long as there is an atmosphere.

The descending column inhibits convection beneath it and so continuing insolation can raise surface temperature beneath the column above the Stefan-Boltzmann prediction.

Earth, as a whole, radiates at 255 K to space but 33 K is recycled within convective overturning, and it is that 33 K which provides the necessary energy for the motion involved in convective overturning."
The author also notes "convection dominates lower tropospheres" in the radiative-convective equilibrium of Earth's troposphere and other planets with thick atmospheres. In other words, if greenhouse gas "radiative forcing" [a term not used by the author one single time] increases, a compensatory increase in convection (and evaporation) will negate and erase any such warming at the surface from greenhouse gas "radiative forcing."
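The competition between radiative decay and collisional quenching described in the quoted passage can be sketched as a branching ratio. The rate constants below are order-of-magnitude assumptions for illustration only, not measured coefficients:

```python
# Branching ratio between radiative decay and collisional quenching for an
# excited molecule. The rates are assumed orders of magnitude, not measured
# values: a radiative decay rate of order 1 per second and a molecular
# collision rate of order 1e9 per second near surface pressure.

A_RADIATIVE = 1.0        # s^-1, assumed radiative decay rate (order of magnitude)
COLL_RATE_SURFACE = 1e9  # s^-1, assumed collision rate at surface pressure

def emission_probability(radiative_rate, collision_rate):
    """Probability that an excited state decays by photon emission
    rather than being collisionally quenched (thermalized)."""
    return radiative_rate / (radiative_rate + collision_rate)

# Near the surface, collisions occur vastly more often than radiative decay,
# so essentially all absorbed vibrational energy is thermalized.
p_surface = emission_probability(A_RADIATIVE, COLL_RATE_SURFACE)

# Higher up, collision rates fall roughly with density, so the probability
# of photon emission rises.
p_high_altitude = emission_probability(A_RADIATIVE, COLL_RATE_SURFACE * 1e-4)
```

Under these assumed rates, emission wins only about one time in a billion at the surface, which is the "thermalization or quenching" point made in the quoted chapter.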

Section 10.13.2.4 of the chapter discusses convective stability and provides derivations of the same mathematics underlying the HS 'greenhouse equation.'

https://drive.google.com/file/d/0B74u5vgGLaWoVm0wRGkzVFZHNTg/view?usp=sharing

Quantum Physics Predicts Doubling of CO2 Levels Can Only Warm Earth By < 0.4C

In relation to the recent posts falsifying CAGW on the basis of Planck's quantum theory, this essay by geophysicist Norm Kalmanovitch explains why The Effect of a Doubling of the Concentration of CO2 in the Atmosphere as Depicted by Quantum Physics 
"is only in the order of a few tenths of a degree C, and definitely less than 0.4°C.

At the current rate of increase of 2ppmv/year it will take 193 years to achieve this doubling. A 0.4°C temperature increase caused by this doubling of CO2 in 193 years is only a year to year temperature increase of just 0.002°C; i.e. 0.18°C by 2100."
CAGW once again falsified by quantum mechanics.
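The arithmetic in the quoted essay can be reproduced with a short script. The ~386 ppmv baseline concentration is inferred from the quoted 193-year figure and is not stated in the essay:

```python
# Check of the quoted arithmetic: doubling time at a fixed growth rate of
# 2 ppmv/year, and the implied warming rate if a full doubling produces
# 0.4 C. The 386 ppmv baseline is inferred (386 / 2 = 193 years), not
# stated in the essay.

GROWTH_RATE = 2.0      # ppmv per year
BASELINE_CO2 = 386.0   # ppmv, inferred baseline concentration
DOUBLING_EFFECT = 0.4  # C, the essay's upper bound for a doubling

years_to_double = BASELINE_CO2 / GROWTH_RATE          # 193 years
warming_per_year = DOUBLING_EFFECT / years_to_double  # ~0.002 C per year
warming_by_2100 = warming_per_year * (2100 - 2015)    # ~0.18 C
```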


Wednesday, August 12, 2015

WSJ: Obama’s Climate Plan Will Increase Poverty & Redistribution

Obama’s Climate Plan and Poverty

The EPA’s new anticarbon rule is full of redistribution to offset its harm to the poor.



President Obama says that critics of his plan to decarbonize the economy are “the special interests and their allies in Congress” repeating “the same stale arguments” about “killing jobs and businesses and freedom.” He adds that “even more cynical, we’ve got critics of this plan who are actually claiming that this will harm minority and low-income communities.”
Is he thinking of critics who work at the Environmental Protection Agency? Perhaps so, because multiple new antipoverty transfer programs are built into the EPA’s new Clean Power Plan. The fine print is there, for anyone who cares to look, 1,317 pages into the rule’s 1,560-page preamble.
The EPA authors are careful to reiterate that “its benefits will greatly exceed its costs.” (Sure.) But then they ever so gingerly observe that “it is also important to ensure that to the extent there are increases in electricity costs, that those do not fall disproportionately on those least able to afford them.”
In particular, the EPA is concerned about “low-income communities, communities of color, and indigenous communities.” The agency orders states “to evaluate the effects of their plans on vulnerable communities and to take the steps necessary to ensure that all communities benefit from the implementation of this rule.” These are the themes of “environmental justice,” the political grievance school that argues for income redistribution to offset the allegedly disproportionate damage to the poor and minorities from pollution.

It is more accurate to say that any economic disparities arise from the rule itself. Regulations that artificially raise energy prices are regressive. By definition the poor—er, low-income community members—spend a larger share of their incomes on fuel and utilities than the well-to-do climate activists of Marin County and Hyde Park.
As energy prices rise, they spill into other basic needs like food, via fertilizer and feed, and housing, via building materials like cement. Everyone ends up with less disposable income and a diminished standard of living, but low-income workers really are worst off.
The EPA thus requires states to set up “financial assistance programs” only for those living near or below the poverty line. As a model other than straight cash subsidies, the EPA cites a Maryland program that offers “free installation of energy conservation materials” such as window insulation and furnace retrofits. Another is New York, which hands out compact fluorescent lightbulbs and even new refrigerators. The resulting energy efficiency savings, the EPA helpfully notes, are “of particular value to low-income households who can least afford high energy bills.”
At the federal level, the EPA is creating a program that gives twice as large a subsidy for renewable and efficiency projects that are built in inner-city neighborhoods and disadvantaged rural areas. There will be job retraining for laid-off coal miners. The agency also plans to install more solar generation on top of or around public housing. So while it will raise their utility bills, at least the poor will get a complimentary photovoltaic panel.
Perhaps it is bad manners to suggest that the poor themselves might prefer higher incomes rather than the EPA’s form of carbon justice. U.S. economic growth is already much slower than it should be, and the new EPA climate-change rule will make it worse by subtracting billions of dollars every year from potential GDP by misallocating capital and undermining business confidence. This will result in fewer opportunities and smaller wage gains, with damage to the poorest Americans in particular.
For these reasons, a recent study commissioned by the National Black Chamber of Commerce estimates that the EPA plan will increase the black poverty rate to 32% in 2025 from 26% today. Hispanic poverty will rise to 29% from 23%. No fewer than 28 states raised such economic hardships in their comments to the EPA, to no avail.
The contradiction of modern climate liberals is that they promise lower energy bills and a wind-and-solar jobs boom, with zero trade-offs. But then they demand more redistribution to mitigate the economic and human damage that are the real outcome of their policies. Instead of offering to weatherize the homes of the least fortunate, how about trying to increase prosperity?

Tuesday, August 11, 2015

Planck's Law proves why radiation from a cold object cannot warm or increase the energy of a warmer object

The Arrhenius greenhouse effect falsely assumes that radiation from a cold "blackbody" emitter can warm or increase the energy content of a warmer "blackbody," in essence that a "blackbody" [CO2] emitting 15 micron radiation at a corresponding blackbody temperature of 193K [-80C] can warm a much warmer blackbody at 255K (-18C) by 33K up to 288K (15C), the temperature of the Earth surface. However, Planck’s Law and the theory of blackbody radiation prove why this does not spontaneously occur in nature, and why the Maxwell/Clausius/Carnot/Boltzmann/Feynman/US Std Atmosphere gravito-thermal greenhouse theory is the only physically correct greenhouse theory.

Planck’s Law and the theory of blackbody radiation prove why low frequency/energy photons (e.g. 15um photons from CO2) cannot transfer any quantum energy to a higher frequency/energy/temperature blackbody, because all of those lower frequency/energy microstates & orbitals are already completely filled or saturated in the hotter body. This fact alone from quantum mechanics falsifies CAGW.

As shown on these calculated Planck blackbody curves from 220K to 320K, CO2 (+H2O overlap) absorbs and emits in the LWIR [LongWave InfraRed] the same as a true blackbody would at an emitting temperature of ~217K over the LWIR band from ~12um to ~17um:


Even though CO2 has emissivity less than a true blackbody and line emissions centered around 15um, and observations also show CO2 emissivity decreases with temperature unlike a true blackbody, for purposes of this simple question, we’ll assume (like climate scientists incorrectly do) that CO2 emits and absorbs as a true blackbody.

Can a blackbody at 217K cause a blackbody at 255K to warm by 33K to 288K as the Arrhenius theory claims? No, this is absolutely forbidden by the second law of thermodynamics on both a macro and quantum/micro basis (see Chapter 13 linked here).
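The temperatures in play can be checked directly from Wien's displacement law and Planck's law; a short script using standard SI constants:

```python
import math

# Wien's displacement law and Planck's law applied to the 15 um CO2 band.
H = 6.62607015e-34       # Planck constant, J s
C = 2.99792458e8         # speed of light, m/s
KB = 1.380649e-23        # Boltzmann constant, J/K
WIEN_B = 2.897771955e-3  # Wien displacement constant, m K

def wien_peak_temperature(wavelength_m):
    """Blackbody temperature whose emission peaks at the given wavelength."""
    return WIEN_B / wavelength_m

def planck_radiance(wavelength_m, temp_k):
    """Planck spectral radiance B(lambda, T), W sr^-1 m^-3."""
    x = H * C / (wavelength_m * KB * temp_k)
    return (2.0 * H * C**2 / wavelength_m**5) / math.expm1(x)

# A blackbody peaking at 15 um corresponds to ~193 K, the figure quoted
# above for CO2's 15 micron band.
t_15um = wien_peak_temperature(15e-6)

# Planck radiance at a fixed wavelength rises monotonically with
# temperature, so a 288 K surface out-radiates a 217 K emitter at 15 um.
b_cold = planck_radiance(15e-6, 217.0)
b_warm = planck_radiance(15e-6, 288.0)
```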

The lower energy/temperature/frequency microstates of a blackbody at a given temperature are by definition “saturated” in a perfect blackbody absorber/emitter, and that explains why the classical-physics prediction shown by the dashed lines in fig. 4.8 does not occur in nature, and why a Planck curve of emission and absorption is found instead. If those lower energy/temperature/frequency microstates of a blackbody were not saturated, then photons of any energy level could be thermalized by a blackbody, and the frequency vs. blackbody energy intensity curve would go to infinity as shown by the (false) dashed lines in fig. 4.8. Also note that the dashed red & blue lines of the false blackbody in fig. 4.8 are at higher intensity levels for each given frequency than predicted by Planck's law. This also demonstrates that lower frequency/energy photons cannot warm a blackbody at higher frequency/energy/temperature, and that heat transfer is one-way only, from hot to cold blackbodies (despite the fact that radiation between them is bidirectional).
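The gap between the dashed classical curves and the solid Planck curves in fig. 4.8 can be reproduced numerically: since e^x - 1 > x for all x > 0, the Rayleigh-Jeans prediction exceeds the Planck radiance at every frequency, and the excess grows without bound toward the ultraviolet:

```python
import math

# Classical Rayleigh-Jeans prediction (the dashed lines in fig. 4.8)
# versus Planck's law at the same frequency and temperature.
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_intensity(freq_hz, temp_k):
    """Planck spectral radiance in frequency form, W sr^-1 m^-2 Hz^-1."""
    x = H * freq_hz / (KB * temp_k)
    return (2.0 * H * freq_hz**3 / C**2) / math.expm1(x)

def rayleigh_jeans_intensity(freq_hz, temp_k):
    """Classical prediction, which diverges at high frequency
    (the 'ultraviolet catastrophe')."""
    return 2.0 * freq_hz**2 * KB * temp_k / C**2

# At any mid-infrared frequency the classical curve already sits above
# the Planck curve, and the ratio grows rapidly with frequency.
freq = 2e13  # Hz, mid-infrared
classical = rayleigh_jeans_intensity(freq, 288.0)
quantum = planck_intensity(freq, 288.0)
```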


As explained in the prior post, this discrepancy between the predictions of classical physics vs. what was observed in nature led Planck to "invent" quantum theory to explain blackbody emission and absorption:
To falsely assume that a photon of any energy level can be thermalized by a blackbody at a given temperature would result in the dashed red and blue function in fig 4.8, which does not happen in nature. Therefore, Planck devised quantum theory to explain why blackbodies instead follow a Planck curve shown by the solid blue and red lines below (corresponding to different blackbody temperatures) and contain "cutoff frequencies for thermalization":

Sunday, August 9, 2015

Planck's Quantum Theory Explains Why Low-Energy Photons Cannot Warm a Warmer Blackbody

This post contains excerpts from "Principles of Modern Chemistry, 7th Edition," which we have previously used to illustrate why a low-quantum-energy 15um photon (e.g. from CO2) cannot be thermalized by the higher-quantum-energy molecular or atomic orbitals or vibrational microstates of a warmer blackbody, whose occupied states are at higher frequencies than that of the incoming photon. Reviewing the last post: 

"First up, a common misconception in the climate debate is that radiation from a cold body (e.g. the -18C atmosphere) can warm a hot body (e.g. the +15C Earth surface) just because the cold body does indeed send very-low-energy photons to the hot body. Heat transfer (not radiation) from cold to hot is forbidden by the 2nd Law of Thermodynamics on a macro basis, and by the Pauli Exclusion Principle of fundamental quantum theory on an atomic and molecular basis.  
If a lower-quantum-energy photon is "absorbed" by the completely saturated low-energy microstates (eg vibrational, translational, rotational, chemical bonds) & molecular or atomic orbitals of a higher-energy body, the hot body must simultaneously eject a photon of the exact same wavelength/frequency/energy as that absorbed, due to the Pauli Exclusion Principle of fundamental quantum theory. Thus there is no change whatsoever in the energy content/temperature of the hotter body due to "absorbing" a low-energy photon from the colder source with simultaneous emission of an identical photon of the exact same wavelength/frequency/energy (some instead refer to this as "reflection" of the lower-energy photon). This explains the 2nd Law of Thermodynamics on a quantum basis, thus why low frequency/energy photons from a cold emitter cannot warm a warmer blackbody at a higher frequency/temperature/energy.  
Since the emitting temperature of ~15um photons from atmospheric CO2 is -80C by Wien's & Planck's Laws (also explained in the reference below), these photons cannot possibly be thermalized or increase the energy or temperature of the much warmer Earth surface at +15C."
Many have asked why a photon with insufficient quantum energy cannot be thermalized or increase the heat energy of a warmer blackbody. This is the basis of blackbody radiation absorption and emission and Planck's Quantum Hypothesis, as is very well described beginning on page 146 of  "Principles of Modern Chemistry, 7th Edition"  excerpted below. 

To falsely assume that a photon of any energy level can be thermalized by a blackbody at a given temperature would result in the dashed red and blue function in fig 4.8 below, which does not happen in nature. Therefore, Planck devised quantum theory to explain why blackbodies instead follow a Planck curve shown by the solid blue and red lines below (corresponding to different blackbody temperatures) and contain "cutoff frequencies for thermalization":







Thursday, August 6, 2015

WSJ: The End of Doom: Despite an explosion in population greater than Malthus could have ever imagined, global living standards are higher than ever

Apocalypse Later

Despite an explosion in population greater than Malthus could have ever imagined, global living standards are higher than ever.


We live in an age of all-pervasive cultural pessimism. In one sense, this is understandable. The 18th century, the Age of Enlightenment, produced an explosion of scientific discovery as men’s minds escaped from the shackles of subservience to authority, both political and ecclesiastical. The 19th century was the great age of optimism, as technological development exploited the achievements of science, bringing inventions like the locomotive, the electric light and the telephone.
That optimism dissipated in the 20th century, when two disastrous world wars exposed the dark side of mankind. Far from recovering a sense of hopefulness during the relative peace of the 21st century, gloominess has become the default position of the intellectual classes in the Western world. As Pope Francis’ recent encyclical, “Laudato Si’,” puts it: “We may be leaving to coming generations debris, desolation and filth.”
Ronald Bailey begs to differ. As his book demonstrates, a careful examination of the evidence shows that, at least in material terms (which is not unimportant, particularly for the world’s poor), life is getting better. The overriding reason for this, according to Mr. Bailey, is continuing technological progress, facilitated—and this is crucial—by the global triumph of market capitalism.

THE END OF DOOM

By Ronald Bailey
Thomas Dunne, 345 pages, $27.99
Among the scares examined by Mr. Bailey in “The End of Doom: Environmental Renewal in the Twenty-First Century” are overpopulation, the exhaustion of natural resources (particularly oil), the perils of biotechnology and genetic modification, and global warming.
Mr. Bailey has little difficulty demonstrating that, despite an explosion in world population greater than Thomas Malthus could possibly have envisaged in the 18th century, global living standards are higher than ever. “Food,” he writes, citing statistics from the World Bank and other organizations, “is more abundant today than ever before in history.” In the past 50 years alone, global food production has more than tripled.
It is also more than likely, in the opinion of most demographers, that world population will peak in the relatively near future and then start to decline. Mr. Bailey attributes this to the related phenomena of growing personal wealth in the developing world and the advance of education, particularly for girls, in those countries. He underplays, I suspect, another factor: Perhaps the most striking aspect of global development is the dramatic migration of population from the country to the city. Of course, this population movement is excellent news for wildlife and biodiversity.
It is even easier for him to show that a fear of the world running out of natural resources, popularized by scaremongers like Stanford demographer Paul Ehrlich, is wholly without foundation, as an elementary understanding of markets clearly shows. No doubt the age of oil will one day come to an end. But as my old friend Saudi Arabia’s Sheikh Yamani used to point out, the Stone Age did not come to an end because we ran out of stone.
As for the alleged perils of biotechnology and genetic modification (which is simply an improved form of the age-old practice of the selective breeding of plants), if there was any substance to the fears of Frankenfoods, these practices would have stopped decades ago. What the green revolution has done is feed the world and reduce poverty on an unparalleled scale.
I part company to some extent with Mr. Bailey on global warming, where he claims that “the balance of scientific evidence suggests that man-made climate change could become a significant problem by the end of this century.” This is highly unlikely. The so-called greenhouse effect is certainly a scientific fact, but all the evidence suggests that its magnitude is modest, its progress is slow and we can readily adapt to it.
The estimated rise in mean global temperature since around 1880 is 1.4 degrees Fahrenheit. Yet the extent to which humans are responsible for this change is outweighed by the climate system’s own variability. Nations like Singapore, whose climate is 22.5 degrees hotter than the global average, suggest that urban planning can continue to outpace nature, as former global-warming alarmist James Lovelock rightly noted in his recent book, “A Rough Ride to the Future.”
And, as Mr. Bailey points out, the overriding case against the abandonment of fossil fuels is that any benefit that might occur would only help generations yet unborn at the expense of impoverishing those alive today, particularly those in the developing world. Which is why the world will not abandon fossil fuels.
If there is a connecting thread among all these irrational prophecies, and the profoundly harmful policies that the doomsters recommend, it is the precautionary principle, which this book rightly castigates. Based on a confusion between the sensible precept “be careful” and the nonsensical proposition that you can’t be too careful, it insists on taking the worst-case scenario as the outcome that should dictate policy. On that basis, one would never get in a car. And the massive technological advances that we have seen since the Industrial Revolution, and the reduction in global poverty that has followed, would never have occurred.
Another factor is the quasi-religious appeal of these prophecies, which may help to explain the papal encyclical to which I referred at the start. Even more recently, the Church of England at its latest synod called for all vicars to be trained in “eco-theology” as well as the Bible. It also called for churchgoers to do without lunch on the first day of each month, as a fast against climate change. Perhaps this should not be mocked: It might help combat obesity, which is probably more damaging than climate change.
“The End of Doom” is not quite in the same class as Matt Ridley’s classic, “The Rational Optimist,” but it is a good book and deserves to be widely read.
Lord Lawson served as Britain’s Secretary of State for Energy (1981-83) and Chancellor of the Exchequer (1983-89). He is chairman of the Global Warming Policy Foundation.

Why there is > 97% confidence that climate sensitivity to CO2 is not significantly different from zero

And, conversely, why there is > 97% statistical confidence that solar activity IS the major if not single "control knob" of climate:


Looking at all of history, it is apparent that climate sensitivity [to CO2] cannot be significantly different from zero.

If you understand the relation between mathematics and the physical world, you understand that, for a forcing to have an effect, it must exist for a period of time and the effect of the forcing is calculated by its duration. If the forcing varies, (or not) the effect is determined by the time-integral of the forcing (or the time-integral of a function thereof).
The CO2 level has been above about 150 ppmv for at least the entire Phanerozoic eon (the last 542 million or so years). If CO2 was a forcing, its effect on average global temperature (AGT) would be calculated according to its time-integral (or the time-integral of a function thereof) for about 542 million years. Because there is no possible way for that calculation to consistently result in the current average global temperature, CO2 cannot be a forcing.
Variations of this proof and identification of what does cause climate change (R^2 > 0.97 (greater than 97% statistical significance)) are at http://agwunveiled.blogspot.com

  • “If you understand the relation between mathematics and the physical world, you understand that, for a forcing to have an effect, it must exist for a period of time and the effect of the forcing is calculated by its duration. If the forcing varies, (or not) the effect is determined by the time-integral of the forcing (or the time-integral of a function thereof).”
    Absolutely, which as you (and others) have clearly demonstrated rules out CO2 as a forcing and shows the time-integral of solar activity IS the climate forcing consistent with available paleoclimate and instrumental data, (and modulated by ocean oscillations, which in themselves are lagged effects of the solar forcing shown in the second graph below):
    Climate Modeling: Ocean Oscillations + Solar Activity R²=.96
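
The "sunspot integral" referred to above is conventionally computed as the running sum of sunspot numbers minus their long-term mean. A minimal sketch of that computation (the sample annual counts below are fabricated for illustration; real analyses use observed monthly or yearly sunspot records):

```python
def sunspot_integral(sunspots):
    """Accumulated departure of sunspot number from its long-term mean."""
    mean = sum(sunspots) / len(sunspots)
    total, integral = 0.0, []
    for s in sunspots:
        total += s - mean  # above-average activity adds, below-average subtracts
        integral.append(total)
    return integral

# Illustrative (fabricated) annual sunspot numbers:
print(sunspot_integral([50, 120, 80, 30, 90]))
```

Subtracting the mean is what makes the integral a measure of accumulated above- or below-average solar activity rather than a sum that simply grows with record length.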


Tuesday, August 4, 2015

Why the Pauli Exclusion Principle of quantum mechanics forbids CO2 photons from warming the Earth surface

This post contains excerpts from "Principles of Modern Chemistry, 7th Edition" and will be a reference text for comments here and elsewhere regarding the physics and physical chemistry of the atmosphere, the gravito-thermal greenhouse effect, the laws of thermodynamics, entropy, enthalpy, and other related topics.

First up, a common misconception in the climate debate is that radiation from a cold body (e.g. the -18C atmosphere) can warm a hot body (e.g. the +15C Earth surface) just because the cold body does indeed send very-low-energy photons to the hot body. Heat transfer (not radiation) from cold to hot is forbidden by the 2nd Law of Thermodynamics on a macro basis, and by the Pauli Exclusion Principle of fundamental quantum theory on an atomic and molecular basis. 
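
The macroscopic 2nd Law statement in the paragraph above can be checked numerically with the Stefan-Boltzmann law: the net radiative exchange between two blackbodies is always from the warmer body to the cooler one. A minimal sketch using the two temperatures cited in the text (+15 C surface, -18 C atmosphere), treating both as ideal blackbodies:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_radiative_flux(t_hot_k, t_cold_k):
    """Net radiative flux (W/m^2) from the hotter to the colder blackbody."""
    return SIGMA * (t_hot_k**4 - t_cold_k**4)

surface, atmosphere = 288.15, 255.15  # +15 C and -18 C, in kelvin
net = net_radiative_flux(surface, atmosphere)
print(round(net, 1))  # positive: net heat flows from surface to atmosphere
```

The sign of the result is the point: because the flux difference scales with T^4, the net flow is positive (surface to atmosphere) whenever the surface is the warmer body.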

If a lower-quantum-energy photon is "absorbed" by the completely saturated low-energy microstates (e.g. vibrational, translational, rotational, chemical-bond) and molecular or atomic orbitals of a higher-energy body, the hot body must simultaneously eject a photon of the exact same wavelength/frequency/energy as the one absorbed, due to the Pauli Exclusion Principle of fundamental quantum theory. There is thus no change whatsoever in the energy content or temperature of the hotter body from "absorbing" a low-energy photon from the colder source while simultaneously emitting an identical photon of the same wavelength/frequency/energy (some instead refer to this as "reflection" of the lower-energy photon). This explains the 2nd Law of Thermodynamics on a quantum basis, and thus why low-frequency/energy photons from a cold emitter cannot warm a warmer blackbody at a higher frequency/temperature/energy. 

Since the emitting temperature of ~15um photons from atmospheric CO2 is -80C by Wien's & Planck's Laws (also explained in the reference below), these photons cannot possibly be thermalized or increase the energy or temperature of the much warmer Earth surface at +15C. 
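
The -80 C figure comes from Wien's displacement law, which gives the temperature of the blackbody whose emission spectrum peaks at a given wavelength. A quick check for the 15 um CO2 band:

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_emission_temp_c(wavelength_m):
    """Temperature (Celsius) of a blackbody whose emission peaks at wavelength_m."""
    return WIEN_B / wavelength_m - 273.15

print(round(peak_emission_temp_c(15e-6)))  # roughly -80 C
```

Inverting b/lambda = T for lambda = 15 um gives about 193 K, i.e. roughly -80 C, which is the source of the number quoted in the text.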

In the reference below, the Pauli exclusion principle is discussed on pages 215, 224, 230, 254, 268, and others. The Pauli exclusion principle prohibits more than 2 electrons from being in the same atomic or molecular orbital simultaneously. In the hot body, photons from a colder emitting source will not be able to join the higher-energy orbitals in the hot body both because they contain insufficient quantum energy, and those orbitals are already saturated with a maximum of 2 higher-quantum-energy electrons. 

In the (simplified) Bohr theory, the energy well is very steep and any incoming photon at a lower-quantum-energy level than that of the target will not "catch and stick" on the sides of the energy well as illustrated in the figures below (from an earlier 4th edition of the Principles of Modern Chemistry, p. 540), thus, such lower-quantum-energy photons are not thermalized and cannot increase the energy/heat content/temperature of the hotter body.  
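
The quantum-energy comparison in this section can be made concrete with the Planck relation E = hc/lambda: a 15 um photon carries roughly 0.08 eV, well below the ~1-10 eV spacing typical of electronic (orbital) transitions. A minimal sketch (the ~1 eV electronic gap used for comparison is an illustrative round number, not a value from the text):

```python
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

def photon_energy_ev(wavelength_m):
    """Photon energy in eV from the Planck relation E = h*c / wavelength."""
    return H * C / (wavelength_m * EV)

e_15um = photon_energy_ev(15e-6)
print(round(e_15um, 3))   # ~0.083 eV
print(e_15um < 1.0)       # True: below an illustrative ~1 eV electronic gap
```

This is the arithmetic behind the section's "insufficient quantum energy" comparison between a 15 um photon and an electronic energy-level spacing.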




Also in the reference below (beginning on page 236), microstates of molecules, for instance chemical bond energies and molecular orbitals, are likewise subject to discrete quantum energy levels. A molecular orbital or chemical bond can hold only one or two electron wave functions at a time, analogous to the Pauli Exclusion Principle for atoms. A photon must contain sufficient quantum energy to raise such electrons to the next higher quantum energy level in order to be thermalized and increase the energy of the hotter body. 

 
PDF available here

Monday, August 3, 2015

Why the man-made global warming climate models are a "fudge," according to Hansen himself

Kyoji Kimoto, a Japanese chemist, scientist, and fuel-cell computer modeler & inventor, has submitted his latest work as a second guest post to The Hockey Schtick. It refutes multiple false physical assumptions underlying the falsely alleged "first physically sound" climate model described in "the most influential climate change paper of all time." These same erroneous physical assumptions continue to serve as the fundamental basis of James Hansen's NASA/GISS climate model and many other models, including the 'state-of-the-art' IPCC climate models, and form the basis of the wide range of modeled CO2 climate sensitivity estimates.

In Kimoto's new work below (and in prior papers), he addresses the multiple unphysical assumptions made by Manabe & Wetherald, Hansen/GISS, and IPCC modelers et al, including a "fudged," arbitrary, and fixed tropospheric lapse rate of 6.5K/km, which does not adjust to perturbations in the atmosphere. This false assumption artificially limits negative lapse rate feedback convection. Using physically correct assumptions, Kimoto finds the climate sensitivity to doubled CO2 to be a negligible 0.1-0.2C. 

Kimoto quotes the father of CAGW James Hansen from a 2000 interview stating that the lapse rate is indeed an artificially-fixed “fudge” for Hansen's 1-Dimensional climate model, stating, 
“In the 1-D model, it’s [the lapse rate] just a fudge, and you choose different lapse rates and you get somewhat different answers. So you try to pick something that has some physical justification.” 
Kimoto concludes, 
Since the [1-Dimensional radiative-convective model] studies with the fixed lapse rate "fudge" have failed as shown in Fig. 1, the canonical climate sensitivity of 3C claimed by the IPCC is theoretically meaningless; it is likewise obtained by the [3-Dimensional Global Circulation Model] studies based on the [climate sensitivity at artificially-fixed absolute humidity] of 1.2~1.3K in Table 1. 
 In conclusion, the cause of the anthropogenic global warming debate of the past 50 years is the lack of a parameter sensitivity analysis of the lapse rate for doubled CO2 in the [1-Dimensional radiative-convective model] studies by Manabe & Wetherald (1967), Hansen et al. (1981) and Schlesinger (1986). Parameter sensitivity analysis is a standard scientific procedure necessary to check the validity of modeled results. 
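
The parameter-sensitivity point can be illustrated with the simplest lapse-rate relation: in a 1-D picture, the surface temperature is the effective emission temperature plus the lapse rate times the emission height, T_s = T_e + Gamma * h, so the chosen lapse rate directly sets the modeled surface temperature. A toy sketch of varying that one parameter (the 255 K emission temperature and 5 km emission height are illustrative round numbers, not values from Kimoto's paper):

```python
def surface_temp_k(t_emission_k, lapse_rate_k_per_km, emission_height_km):
    """Surface temperature in a toy 1-D model: T_s = T_e + lapse_rate * height."""
    return t_emission_k + lapse_rate_k_per_km * emission_height_km

# Sensitivity of modeled surface temperature to the assumed lapse rate,
# holding emission temperature (255 K) and emission height (5 km) fixed:
for gamma in (5.0, 6.5, 8.0):
    print(gamma, surface_temp_k(255.0, gamma, 5.0))
```

Sweeping the lapse rate like this, rather than fixing it at 6.5 K/km, is the kind of parameter sensitivity analysis the quoted conclusion says was missing from the 1-D model studies.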
Kimoto's prior papers & posts also discuss additional false assumptions of climate models including a mathematical error in calculation of the Planck response parameter, and limitations of the potential greenhouse warming of the top ocean layer due to penetration depth, and others, proving the climate models are overheated and far too sensitive to man-made CO2. 

Addendum: Hansen also admits in the same highly-revealing interview linked above that he kept thinking "what is it in our model that makes it so damn sensitive [to CO2]?" and thinks it could be the "cloud scheme," but doesn't know. In the interview, Hansen also says his model cannot reproduce the paleoclimate data and blames the data for this, not his "damn sensitive" climate model! Hansen also says the US Dept of Energy concluded in 1983 that climate sensitivity was low and provides his false assumptions why he thinks there was "an error in their [DOE] thinking," but which a future HS post will show was instead an error in Hansen's thinking. 


https://drive.google.com/file/d/0B74u5vgGLaWoWmt6aVlkSDFiaDQ/view?usp=sharing