Big temperature swings normal, says climatologist

Written by Dan Pelton

Staff Reporter, Orangeville Citizen

As this was being written Tuesday night, the area was being drenched by what looked remarkably like an April shower, with the temperature soaring into double digits. But when you step outside to pick up your paper this afternoon, the forecasters say you will be in a paradise … for polar bears.


So far, the winter of 2012-13 has featured temperatures both well above and well below the normal high of about –4ºC. There have been few days when they were actually normal. While some experts are pointing to climate change as a pivotal factor in weather fluctuations, Environment Canada's senior climatologist says temperature changes are par for the course.

“There’s a difference between climate and temperature,” explained David Phillips in an interview.

“The wild swings in temperature have had nothing to do with climate. It’s about where the winds come from.”

The recent warming trend, for example, was caused by breezes blowing in from the southern United States.


Outmoded Science Journals Fail Again on Peer Review


Times are moving fast, and Science and Nature magazines can't cope. Caught resting on their laurels, these eminent mainstream publications have again been exposed as inept or biased when it comes to peer-reviewing the papers submitted to them.

The latest blow concerns the questionable peer review policies of these “top” journals. These denizens of academic publishing are being pulverized in the blogosphere for an inflexibility and conservatism more in keeping with the bygone era of traditional paper and print publishing. Signaling the revolution this week are astute analysts from various quarters. Leading Aussie science blogger Jo Nova reflects the mood in the climate science community, lamenting:

“The peer review system has decayed to the point where the culture of the two “top” science journals virtually guarantees they will reject the most important research done today. It is the exact opposite of what we need to further human knowledge the fastest. Science and Nature are prestigious journals, yet they are now so conservative about ideas that challenge dominant assumptions, that they reject ground-breaking papers because those papers challenge the dominant meme, not because the evidence or the reasoning is suspect or weak.”


Real Global Warming

Written by Ben Wouters

 


Current climate science totally neglects the enormous amounts of heat available inside the Earth. This seems reasonable, since the measured heat flux through the oceanic crust is ~0.1 W/m2 and is absolutely dwarfed by the ~240 W/m2 average solar radiation that warms the Earth. Only the sun is assumed to warm the Earth, no other heat source is considered, and a Greenhouse Effect (GHE) is then the only way to explain why our temperatures are ~33K above the generally accepted Effective Temperature for Earth of 255K.

To expose the error in this reasoning and see some real global warming we have to go back in time. Some 125 million years ago, in what is now the Pacific Ocean, perhaps the largest seismic event of the last 300 million years started. A Mantle Plume burst through the ocean floor, and some 100 million km3 of glowing hot magma erupted into the Pacific. This is 1 km3 of magma for every 14 km3 of water in all the world's oceans, capable of warming those oceans some 15-20K. The magma cooled down and formed what is known as the Ontong Java–Manihiki–Hikurangi Plateau (OJMHP) (1).
 
Not surprisingly, we find very high temperatures following that period, as shown in the following reconstruction of deep ocean temperatures (2)(3). (Notice these are DEEP ocean temperatures, not surface temperatures.)
[Figure: reconstruction of deep ocean temperatures]
Unfortunately this reconstruction doesn't go all the way back to 125 million years ago, so we can't see whether temperatures were even higher before the peak at ~83 million years ago. Since the creation of the OJMHP several similar but smaller events have followed, explaining the occasionally rising deep ocean temperatures within a generally cooling trend.

 
According to this reconstruction it is obvious that between 80 and 90 million years ago the deep ocean temperatures were at least ~15K warmer than at present, and that the deep oceans have on average been cooling down by about 1K every 5 million years since then.
 
With a very much simplified model I’ll show how these major geological events in the distant past are still influencing our current temperatures.
 
Simple energy balance model
 
Assumptions:
– incoming solar energy is 240 W/m2 after reflection, warming only Earth's surface
– current average surface temperature is 290K (oceans, disregarding the continents)
 
Consider a small planet in outer space, no internal heat, fully covered with a floor heating system (FHS) to replace incoming solar energy. No heat from the FHS can escape to the inside of the planet.
 
And of course we can control the energy flowing to the FHS. With the FHS turned off the surface temperature of our little planet is ~2.7K due to the Cosmic Background Radiation.
1) Turning on the heat, we begin with 0.1 W/m2; after stabilisation the surface temp is ~30K.
2) Next 240 W/m2; temperature stabilises at 255K, Earth's generally accepted Effective Temperature, and 240 W/m2 is radiated to space.
3) More heat, 400 W/m2; temperature stabilising around 290K, and 400 W/m2 radiates to space.
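
As a rough check on these three steps, here is a minimal sketch (plain Python, inverting the Stefan-Boltzmann law with emissivity 1); the fluxes are the ones quoted above, and the constant and arithmetic are the only additions.

```python
# Stefan-Boltzmann check of the floor-heating-system (FHS) steps above.
# Assumes a blackbody surface (emissivity = 1) in radiative balance.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m2K4

def equilibrium_temperature(flux_w_m2):
    """Surface temperature (K) at which a blackbody radiates the given flux."""
    return (flux_w_m2 / SIGMA) ** 0.25

for flux in (0.1, 240.0, 400.0):
    print(f"{flux:6.1f} W/m2 -> {equilibrium_temperature(flux):6.1f} K")

# Approximate output:
#    0.1 W/m2 ->   36.4 K   (the article quotes ~30K for this step)
#  240.0 W/m2 ->  255.1 K   (the accepted Effective Temperature)
#  400.0 W/m2 ->  289.8 K   (close to the assumed 290K surface temperature)
```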
 
Now we cover our little planet with an insulation blanket with the same thermal resistance as our atmosphere: 290K on the inside results in a 240 W/m2 loss at the outside. With the surface temperature at 290K and the blanket covering the planet, we can turn the FHS down to just 240 W/m2 and maintain the 290K, for as long as we supply the same amount of energy as escapes from the outside of the insulation blanket.
 
Back to Earth. Note that only simple insulation (no back radiation heating or similar) is needed to explain the surface temperature, given the boost to 290K or higher by the creation of the OJMHP and that the energy budget is balanced. Recall that current climate science states that the atmosphere is even capable of WARMING the surface ~33K, so just maintaining current average surface temperatures should be an acceptable premise.
 
Since the peak temperature the deep oceans have been cooling down very slowly. Assuming all of the oceans had a temperature of 290K from surface to bottom at the time of the temperature peak, we see that most of that heat has already been lost to space, given the current average temperature of the deep oceans of ~275K. Every time the sun supplies less energy than escapes to space (e.g. during Milankovitch cycles), the ocean's surface cools down, the lost heat is re-supplied by the now warmer water from below, and the deep oceans cool down.
 
This mechanism also explains the exceptionally stable temperatures on Earth. Excess incoming energy warms the surface layer and increases outgoing radiation almost immediately. A shortage of incoming energy is buffered by the deep oceans' enormous thermal mass.

 
Recently (the last ~2.5 million years) we started having ice ages. Without a new Mantle Plume eruption Earth may well be heading towards a Snowball Earth situation. Interestingly, the small 0.1 W/m2 heat flux through the ocean's crust can warm all of the oceans by 1K every ~5,000 years when ice prevents heat loss at the surface. In this situation the small geothermal heat flux could actually be (part of) the explanation for the ending of an ice age.
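
A back-of-the-envelope check of the "1K every ~5,000 years" figure, using round numbers for ocean volume, sea-floor area and the specific heat of seawater (these values are assumptions, not figures given in the article):

```python
# Time for a 0.1 W/m2 geothermal flux to warm the whole ocean by 1 K,
# assuming no heat escapes at the surface (e.g. an ice-covered ocean).
# Round-number assumptions (not from the article):
ocean_volume_m3 = 1.37e18      # ~1.37 billion km3
water_density = 1000.0         # kg/m3
specific_heat = 4186.0         # J/(kg K), liquid water
sea_floor_area_m2 = 3.6e14     # ~3.6e8 km2
geothermal_flux = 0.1          # W/m2

energy_per_kelvin = ocean_volume_m3 * water_density * specific_heat  # J needed per 1 K
heating_power = geothermal_flux * sea_floor_area_m2                  # W delivered
seconds_per_year = 3.156e7

years_per_kelvin = energy_per_kelvin / heating_power / seconds_per_year
print(f"~{years_per_kelvin:,.0f} years per 1 K")   # roughly 5,000 years
```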
Extending this setup to Earth's early history, we can envision a situation where the effect of a Faint Young Sun is offset by a much thinner crust, allowing a substantial heat flux, plus a much more active Earth with many Mantle Plumes and other seismic events warming the already existing oceans.
 
Conclusion
 
With the inclusion of Geothermal Heat in the climate equation, the role of the atmosphere is simply that of an insulation blanket. The sun is barely able to prevent the cooling of planet Earth. With the diminishing amount of buffered heat in the deep oceans we are moving towards a colder period, unless a major re-heating by Geothermal Energy comes along. Obviously this is not a complete climate theory. 
 
Most of the classical meteorology from before the CO2 hype is still valid. Milankovitch, Svensmark and many other theories and effects can co-exist on top of this basic climate setup. All this has serious implications for the role of CO2 and climate sensitivity, which may very well be slightly negative. Instead of worrying about humanity warming the planet, we should prepare for the Ice Age that is coming sooner or later, unless of course the next major undersea flood basalt saves the day.
 
Ben Wouters, Zuid Scharwoude, Netherlands.
 
References
 


New: Handbook of Drought & Flood Prediction in South Africa

Written by Prof. WJR Alexander

 
Professor Alexander’s comprehensive and groundbreaking new handbook, ‘Analytical methods for water resource development and management’, is available as a free public resource, created thanks to a donation of R200 000 from South Africa’s Water Research Commission. It details analytical methods for the development and management of water supplies and provides guidance to policymakers, researchers and the general public.
 
 
Raphael Fresco
 
On the front of the handbook is an illustration that sets the scene. Professor Alexander explains:
 
This is part of Raphael’s famous fresco (wall painting) titled the School of Athens in the Vatican. I had the privilege of studying it during WWII when we had plenty of time to spare. The theme of the fresco is Philosophy and this part of the fresco shows Euclid teaching mathematics to a group of enthusiastic pupils. He has a pair of dividers symbolising measurement and is pointing to a visual image on a slate. His studies have enabled us to measure distances from a point on earth to a point on the moon with a high degree of accuracy. But we still cannot predict future rainfall and river flow other than in probabilistic terms. This is the difference between accurate mathematical descriptions and broad probabilistic methods that we have so much difficulty in mastering. 
 
Alexander continues:
The issues covered in the handbook are of extreme national and possibly international importance. The problem is that I discuss the climate change issue in passing and demonstrate with a high degree of confidence that the observed multi-year, widespread occurrences of floods and droughts occur synchronously with variations in the global receipt and poleward distribution of solar energy. There is not an atom of evidence that they are the consequence of climate change.
 
Water demands will exceed resources 
 
Here in South Africa, as well as in many other countries on the African continent with dry climates, demand will soon exceed the available resources. This will not happen suddenly. At first, rare major droughts will be the problem. But as demand increases, even the frequent minor droughts will result in the imposition of restrictions. We have already entered this period here in South Africa.
 
Looking into the future we will have to develop a greater understanding of the numerical properties of multi-year sequences of river flow, as well as of the isolated high flows that come to our rescue when they restore the water volume in the empty dams. As soon as the concept of multi-year river flows enters the analyses, it will be like opening a Pandora's box of issues that have to be addressed. These are detailed in the handbook.
 
On the demand side, the components are also becoming more complex. Until very recently the principal demands were agricultural (food production), urban and industrial. Now environmental concerns also have to be accommodated. How will they be accommodated in the numerical analyses?
 
On top of all this, the climate change issue has become a major interest. Briefly, the theory is that increasing discharges of carbon dioxide into the global atmosphere will cause atmospheric temperatures to rise. This in turn will result in increases in the frequency and magnitude of floods and droughts. The problem is that there is a strong debate among the proponents of climate change regarding the global temperature changes, but not a single example of increases in floods and droughts has been produced. There is no way whatsoever that climate change scientists will be able to provide information on the multi-year properties of the hydro-meteorological processes required for water resource analyses.
 
Instead of accepting my invitation that we get around the table to discuss these important issues, I have been subjected to personal vilification and refusal to accept my papers for publication. This includes the tactic of deliberately delaying the publication of my handbook, the original version of which has been gathering dust on the shelves of the Water Research Commission ever since February 2010. 
 
Recommendation 
 
As a recognized expert in his field, Professor Alexander has increasing concerns regarding the welfare of the poor and disadvantaged people of South Africa and elsewhere in the world. He insists his handbook addresses and resolves these issues in the public domain.
 
“My recommendation is that those institutions that appreciate the dangers that lie ahead and the urgent need to develop measures to accommodate them, should as a matter of great urgency organise a multi-day event at minimum cost to discuss these issues and provide measures to accommodate them,” says Alexander.
 


New Global Warming Report: Scientists Lied to Australian Parliament


In a damning new study, Aussie climatologists are shown to have made false and unsupported claims to stoke up alarm over man-made global warming.

Respected unpaid climate analyst Malcolm Roberts of Brisbane, Australia, compiled the ‘CSIROh! Report‘ at the invitation of ABC Radio’s Steve Austin. Across 29 pages Roberts details a litany of evidence proving that the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Australia’s national science agency, corruptly and unlawfully misrepresented science, climate and Nature.


Austin asked Roberts: “Please read through the Australian scientific paper and identify where you believe the CSIRO data has been falsified or is wrong.” To complete his task, Roberts engaged in detailed correspondence with CSIRO’s Chief Executive Dr. Megan Clark and CSIRO’s Group Executive-Environment Dr. Andrew Johnson; in extensive analysis and research of CSIRO reports; and in discussions with former CSIRO scientists, including former chief research scientist Professor Garth Paltridge.

With evidence presented by the above authorities, Roberts put the Australian government’s climate science under the microscope to expose how bias and propaganda misled the public into supporting the government’s tax on carbon dioxide (CO2). Even-handedly, Roberts concedes: “CSIRO has many fine people and a proud heritage. In areas outside climate it appears to have capability and credibility. That is threatened by CSIRO’s politicization.” But, critically, his findings reveal that CSIRO had no empirical scientific evidence whatsoever that human CO2 caused warming (see Appendix 2). Instead, the report shows a dearth of actual evidence, while the policies so far enacted are implicated in causing the needless deaths of more than 40 million people, mainly in Third World regions.

Four Failures to Find Fault

The key litmus test applied by the study was the requirement that CSIRO’s science should provide “yes” answers to these four key questions:

1. Is global ATMOSPHERIC temperature warming unusually in either amount or rate, and is it continuing to rise?

2. Does the level of carbon dioxide (CO2) in air control or determine Earth’s temperature?

3. Does human CO2 production determine the level of CO2 in air?

4. Is warming catastrophic or even damaging?


Roberts, who also provides research for the Galileo Movement, demonstrates that CSIRO failed to show any actual “causal relationships” to validate even one “yes” answer. On the contrary, Roberts identified evidence showing that CSIRO scientists used taxpayer funds to advocate for global governance at United Nations (UN) conferences rather than to evince empirical data supporting their position. Roberts says, “This is consistent with CSIRO’s actions supporting implementation of UN Agenda 21, the UN’s campaign pushing global governance. It bypassed Australia’s parliament and people and threatens Australia’s sovereignty and our personal freedoms.”

What the study shows in answer to those four key questions is a very different reality as follows:

1. Global atmospheric temperatures peaked in 1998. Temperatures have since been flat, with every year since colder than in 1998. Since the start of atmospheric temperature measurement in 1958, temperatures cooled slightly from 1958 to 1976. A sudden small …

2. Carbon dioxide (CO2) levels in air are a consequence of temperature, not a cause. This is the reverse of UN IPCC, CSIRO and government claims. It applies throughout Earth’s history and over every duration. It’s true seasonally and long-term;

3. Nature alone determines levels of CO2 in air. This is the reverse of UN IPCC, CSIRO and government claims. It means that cutting or increasing human CO2 production cannot affect CO2 levels in air. It’s useless to cut human CO2 production;

4. Warmer periods in Earth’s history are highly beneficial to people, humanity, civilization and the natural environment. This is the opposite of UN IPCC, CSIRO and government claims. Warmer periods are scientifically classified as optimums.

As a result, this damning analysis, says Roberts, shows that CSIRO scientists are deeply enmeshed in producing corrupt UN IPCC reports. The evidence shows the IPCC colluded with CSIRO to enlist contributing scientists of various ranks, to have papers referenced and, presumably, to act as reviewers. Without applying any safeguards, CSIRO endorsed UN IPCC reports despite those reports being demonstrably corrupt and pushing a political agenda. UN IPCC contributors and officials are shown to have bypassed and at times prevented scientific peer-review. “As a method of quality assurance, the process of peer-review is now worthless,” says the Roberts report.

Evidence reveals that all four UN IPCC reports to national governments and media (1990, 1995, 2001 and 2007) contradict empirical scientific evidence and provide no logical scientific reasoning for their core claim that human CO2 caused, causes or will cause global warming. “The corruption is pervasive, systemic and driven by a political agenda to achieve a political outcome,” says Roberts. Empirical scientific evidence and discussion in Appendix 4 reveal corruption of the ground-based temperature data and of the CO2 data used by the UN IPCC and CSIRO. The propaganda relied upon by alarmists is ostensibly that collated by former U.S. Vice President Al Gore, as Appendix 3 shows.

Major international banking firm Merrill Lynch is implicated in the climate shenanigans (Appendix 6). It and other international banks are shown to profit enormously from trading in CO2 credits, a relationship that raises questions about potential conflicts of interest.

Evidence Proves Natural Forces, not Humans Drive our Climate

This telling Australian report lays out in black and white that our atmosphere is not warming, much less unusually. “Fluctuations since 1958 reveal modest natural cyclic temperature variation,” while ground-based rural measurements reveal the same since 1890. Appendix 4 identifies the strongest natural factors proven by empirical scientific evidence to control global climate: El Nino, La Nina and other regional ocean-atmosphere decadal cycles. Scientists have identified many factors driving climate, including galactic, solar system, solar, planetary and lunar cycles ranging from 150 million years to 11 years. Strong drivers include:

• Solar: (1) variations in the sun’s output; (2) output of solar particles; (3) the sun’s magnetic field polarity and strength;

• Water vapour: (1) atmospheric water content; (2) cloud cover;

• Cyclic regional decadal circulation patterns, such as the North American Oscillation and the southern Pacific Ocean’s El Nino, together with their variation over time;

• Ocean: (1) temperature; (2) salinity; (3) currents; (4) sea surface temperatures;

• Volcanic activity.

The above natural drivers are either omitted from, or downplayed in, the erroneous, unvalidated computerized numerical models used by the UN. CSIRO has thereby used deception dressed up as science to cede sovereignty over Australian science to an unscientific and corrupt foreign political organisation pushing a global political agenda. “CSIRO is thus abetting systemic and pervasive documented corruption of science,” says Roberts.

Tellingly, the prestigious InterAcademy Council’s (IAC) August 2010 review of the UN IPCC found “crippling deficiencies” in UN IPCC processes and procedures, findings that should have sounded alarm bells over CSIRO’s support for implementation of UN Agenda 21, the greatest threat to Australian sovereignty.

Roberts invites readers to examine the evidence on offer in this new study and to verify for themselves that CSIRO has misled the media. He points to three key falsehoods exposed by any objective examination of the available science. They are:

1. Human CO2 controls and determines global temperature and climate. False;

2. There is an overwhelming consensus of scientists supporting that claim. False;

3. Catastrophic consequences will result at some unspecified future date from human disruption of global climate: sea level rise, extreme weather, floods, drought, snowfall, fires, ocean pH (alkalinity), disease, species extinction, … All false.

“Through the National Press Club and media, CSIRO misled the people and parliament of Australia. CSIRO has been actively engaged in UN IPCC corruption of climate and science,” concludes the Brisbane climate analyst. How Steve Austin, the listeners of ABC radio and other Australian citizens react to these damning findings remains to be seen.

 

 


Correcting GHG Theory: Black Body Assumption Changes GHE from 33C to Nothing

Written by Dr. Pierre R Latour

By Pierre R Latour, PhD ChE (and N. Kalmanovitch)

 

In 1827 the celebrated French mathematician Jean-Baptiste Joseph Fourier determined that the theoretical temperature of the Earth, based on just the thermal radiation from the sun, was cooler than the actual temperature due to atmospheric insulation. He named this insulation effect “un effet de verre” (an effect of glass) after work by the French scientist de Saussure, who had demonstrated such insulation using glass panes.

 

Later work by the physicists Planck, Stefan and Boltzmann provided an understanding of the relationship between the temperature of a body (blackbody) and the intensity of its radiation, allowing the Earth’s theoretical radiative temperature, exclusive of atmospheric insulation, to be calculated according to the formula Te = [So(1-A)/(4σ)]^(1/4), derived from Planck’s equations using the Stefan-Boltzmann constant σ.

 

We can measure a planet’s surface temperature from its radiative spectrum. We can also calculate Te by making some estimate of both the solar irradiance (So) and the planet’s albedo (A). Subtracting Te from the planetary temperature provides a metric for comparing the relative atmospheric insulation of the planets, essentially calibrating the insulation effect first noted by Fourier back in 1827. This theoretical temperature difference depicting atmospheric insulation was renamed “the greenhouse effect”, likening the insulation against thermal transmission by conduction provided by the glass in a greenhouse to the insulation against thermal radiation provided by the atmospheres of the planets.
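
As an illustration of the formula above, here is a minimal sketch evaluating Te with commonly quoted round values for solar irradiance and albedo (So = 1366 W/m2, A = 0.30); these inputs are assumptions for the example, although the same figures appear later in this article.

```python
# Effective (radiative) temperature: Te = [So(1-A)/(4*sigma)]^(1/4)
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m2K4
So = 1366.0       # solar irradiance at Earth, W/m2 (assumed round value)
A = 0.30          # planetary albedo (assumed round value)

Te = (So * (1 - A) / (4 * SIGMA)) ** 0.25
print(f"Te = {Te:.1f} K")   # ~254.8 K, i.e. about -18 C
```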

 

Preying on overall public ignorance of science, unscrupulous scientists rebranded the greenhouse effect as some sort of physical effect driven by CO2 emissions causing catastrophic global warming. This fraudulent use of the term greenhouse effect was introduced over 25 years ago and spawned a series of equally fraudulent terms such as “greenhouse warming” and “greenhouse gases”, providing the propaganda vocabulary that has created a climate change issue out of nothing to serve the political agenda of the climate change scam perpetrators.

 

While the term greenhouse effect is perfectly valid in its geophysical scientific context, it is completely ludicrous when used as a mechanism whereby increased CO2 creates energy out of nothing and causes the Earth to warm to catastrophic levels as a result of an enhanced greenhouse effect. This article takes the fraudsters to task, exposing the fraudulently rebranded version of the greenhouse effect for what it is.

 

In 1981, James E. Hansen assumed Earth radiates as a theoretical black body, with emissivity em = 1.0. (J. Hansen et al., “Climate Impact of Increasing Atmospheric Carbon Dioxide”, Science, Vol. 213, No. 4511, pp. 957-966, 28 Aug 1981)

Hansen converted the measured solar constant of 1366 W/m2 in the sunbeam intercepted by Earth’s circular disk to Earth’s spherical area of average emission plus reflection intensity back to space, 1366/4 = 341.5 W/m2 of the globe’s surface. Then he accounted for its reflectivity, mostly by clouds, assuming albedo = 0.30, to estimate Earth’s average radiation absorption and emission intensity to space to be I = 1366*0.7/4 = 239 W/m2 of its spherical surface.


Using Boltzmann’s equation for a radiating temperature Te of Earth’s surface corresponding to this estimate of its average radiating emission intensity:

I = 5.67*em*(Te/100)^4 = 239, Hansen calculated Te = 100*(239/5.67)^0.25 = 254.8K = -18.3C. (Subtract 273.1, since 0C = 273.1K.)

In doing so, Hansen made the simplifying assumption that Earth’s emissivity em = 1.0.

The average thermal temperature of Earth’s atmosphere is difficult to measure equator to poles, surface up to 100 km, night and day, over seasons, for a decade, but was estimated to be about Ta = 288K = 15C.

Hansen declared the difference, Ta – Te, to be the greenhouse gas effect of gases and clouds, GHE = 15 – (-18) = 33C. As a conjecture, he attributed much of this “anomaly” to the presence of increasing CO2 because he had no explanation verified by evidence.

This famous 33C global warming by CO2 conjecture has caused great concern, controversy and research since his declaration and subsequent Congressional testimony. Humanity, through the UN IPCC, has struggled since the 1997 Kyoto Protocol to fashion a thermostat to throttle fossil fuel combustion to reduce this GHE to hold Ta = 15C to save the planet from runaway global warming. (Said thermostat was proven by control system engineering mathematics to be unworkable in 1997.)

One difficulty is that GHE is the difference between two different types of temperature: thermal Ta, measured by thermometers, and radiant Te, measured by photometers and spectrometers. They are two different phenomena in nature. Everyone can sense the distinction between the two on a ski slope on a bright winter day. Facing the sun, Te = 25C; turning away, Ta = -5C. At night Te approaches -270C while Ta may drop to -10C. They are naturally different; CO2 is not the cause. GHE has been explained to represent a meaningless Whatchamacallit.

The black body assumption that Earth absorbs and emits all incident radiation, em = 1.0, is very poor for many reasons, including that it does not account for photosynthesis by land plants and ocean plankton. Flora consume solar power and store it in the hydrocarbon molecules they make: starch, sugar, cellulose, animal food. This means Earth’s forests, grasses and jungles do not emit as much as they absorb, so emissivity < absorptivity. A cooling effect. Besides, the globe is not black.

Emissivity of different materials varies between near zero (0.022) for polished silver and near 1.0 (0.98) for lamp black. Measuring or estimating the emissivity of the whole radiating globe (ocean, land, ice, desert, jungle, mountains and atmosphere) is not easy, so Hansen made his simplifying black body assumption, em = 1.0. But the globe surely is not black: em < 1.0.

We now know Earth’s emissivity is much less than 1.0, so its corresponding radiating temperature to emit at 239 w/m2 must be higher than -18C.

The “Global Climate Model” reference, under “Zero-dimensional models”, provides an estimate em = 0.612 without citation. It goes on to state: “Taking all this properly into account results in an effective earth emissivity of about 0.64 (earth average temperature 285 K (12 °C; 53 °F)).”

Using Earth’s emissivity em = 0.612 rather than 1.0 and the same Boltzmann equation Hansen used, Te = 288.08K = 14.98C.

So GHE = 15.0 – 14.98 = 0.02C, not 33C. 

Using the other reference em = 0.64 and T = 285K gives Te = 284.88K and GHE = 0.12C.
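
The arithmetic in the last few paragraphs can be reproduced in a few lines. This sketch simply inverts I = em*sigma*Te^4 for the emissivities quoted above; the 2010 case discussed further below is included as a fourth row, and the Ta values are the thermal temperatures used in the text.

```python
# Radiating temperature Te from I = em * sigma * Te^4, for the emissivities
# discussed above. Ta is the corresponding thermal temperature from the text.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m2K4

def radiating_temperature(intensity_w_m2, emissivity):
    return (intensity_w_m2 / (emissivity * SIGMA)) ** 0.25

cases = [
    ("1981, blackbody (Hansen)", 239.0, 1.0,   288.0),
    ("1981, em = 0.612",         239.0, 0.612, 288.0),
    ("1981, em = 0.64",          239.0, 0.64,  285.0),
    ("2010, em = 0.612",         233.0, 0.612, 287.7),
]
for label, intensity, em, Ta in cases:
    Te = radiating_temperature(intensity, em)
    print(f"{label:26s} Te = {Te:6.2f} K   GHE = Ta - Te = {Ta - Te:6.2f} K")

# Approximate output:
#   blackbody:  Te = 254.8 K, GHE ~ 33 K
#   em = 0.612: Te = 288.1 K, GHE ~ 0 K
#   em = 0.64:  Te = 284.9 K, GHE ~ 0.1 K (with Ta = 285 K)
#   2010 case:  Te = 286.3 K, GHE ~ 1.4 K
```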

The GHE collapses to zero when the black body assumption is abandoned for the colorful Earth, within any margin of error. It doesn’t exist! There is nothing to it! Much ado about nothing! The sky is not falling after all. Another inconvenient truth. Good news. Problem solved! CO2 is innocent! UN IPCC can close shop and go home. No more climate change research needed. No need to collect temperature data for a billion years to discern a correlation with CO2, which cannot prove causality anyway. Everyone can relax, return to normally exhaling CO2 (nonpolluting green plant food), and burn as much inexpensive, abundant fossil fuel as they can afford.

There is no new science here, just careful attention to assumptions, definitions, logic and accurate parameters.

CO2 and H2O are radiating gases because they are dipole (asymmetric) molecules; O2 and N2 are not. But they don’t trap heat beyond natural thermal heat capacity, they just absorb some IR in the sunbeam vector and emit it in all directions. This scattering does modify the EMR field in Earth’s atmosphere slightly, but doesn’t really affect average temperature very much. If anything it is a cooling effect below. The basic GHG theory notion that cold atmospheric CO2 does not absorb/emit incoming solar IR, only upwelling IR from Earth’s surface, and back radiating it down to warm the hotter surface below is false. If it were true that cold CO2 molecules high in the sky can transfer heat back down to the warmer surface, it would constitute creation of energy, a perpetual motion machine (just what AGW promoters need to drive AGW), because the Second Law of Thermodynamics does not allow energy transfer from cold to hot bodies. It is a one way street, hot to cold. Always and everywhere.

CO2 may increase Earth’s emissivity slightly, which would cause it to radiate at Te < 14.98C, another cooling effect. [CO2] has been steadily increasing recently at about (386 – 316)/(2009 – 1959) = 70 ppm per 50 years. Temperature has stabilized since 1998.

In 2010, Earth emitted at 233 W/m2, less than in 1981, because solar intensity dropped with sunspot activity on its normal 11 year cycle. Albedo may have changed also. You are now in a position to forecast this effect on Earth’s Te, assuming em = 0.612. Like Hansen, you can calculate Te = 286.3K = 13.2C. It turns out measured Ta = 287.7K = 14.6C, so GHE = 1.4C, again zero within the margin of error. This is a remarkable confirmation of your prediction! I would congratulate you for your model’s ability to predict global warming/cooling. All you need to do is estimate emissivity and predict the solar intensity disturbance on Earth’s radiation emission rate. Now you see J. Hansen’s monumental mistake.

Greenhouse gas theory is hereby refuted, three ways: Whatchamacallit, blackbody, perpetual motion machine. The political and financial ramifications of these discoveries are enormous. Consensus is irrelevant because correct science and engineering trump consensus. Skepticism is the foundation of the scientific method of Newton.

 


Who are the “peers” who review?

Written by Douglas Cotton

It is common to find proponents of the much publicised radiative greenhouse conjecture adopting smear tactics in an attempt to discredit us and the authors who have contributed papers and articles to Principia Scientific International (PSI).

On various climate blogs they monotonously ask questions like: “When are you going to have your paper published in a proper journal?” But to them, the only “proper” journals (or websites) are those which support their conjecture that carbon dioxide warms the world. Many of them have jobs to protect, research grants to obtain or perhaps valuable domain names and websites which may well crumble should the greenhouse tumble.

The Thinker

Why would a member of PSI wish to support a journal which helps to propagate the very conjecture which virtually all of nearly 200 members here know to be false? Why waste time, and in some cases pay for reviews, when there is no chance of any papers being published if counter views are expressed therein? Why, in any event, should we imagine their “peers” are any more suited, qualified or knowledgeable enough to review PSI papers than any others, such as these from among our fast growing membership?

The carbon dioxide related “greenhouse effect” will one day soon take its brief place in history as the greatest scientific mistake of all time. Many within the ranks have now become aware that it was, in fact, a huge error initially dreamt up by a handful of people and then very successfully marketed to the politicians and the general public, including young school children. This started over 30 years ago, around 1979, and so now a whole generation has been brainwashed with watered-down descriptions which the gullible lap up.

Websites such as Skeptical Science, WattsUpWithThat, Science of Doom and many more are set up by those with a vested interest in swaying public beliefs. School children and climatology undergraduates alike flock to these sites to arm themselves with arguments, which they then copy verbatim in order to rubbish any counter views expressed elsewhere. Frankly, it is amazing just how many people get involved in reading and writing the millions of comments on these climate blogs. It is little wonder that the owners of the above three in particular just simply delete comments with contrary views and ban those who post them, sometimes even blocking their internet access so that they can’t even read the pseudo science posts and comments.

The facts are that this is a science which, at its heart, requires a deep, advanced understanding of atmospheric physics, for it is all about heat transfer mechanisms and such are the domain of physics. But most climatology researchers have limited understanding of the physics involved. They have picked out a few equations from the first year textbooks, and then used such equations without understanding the very important limitations and prerequisites for these to be applicable.

They probably know that radiation has a dual particle and wave nature, but it suits them to imagine strings of identical photon particles crashing into the Earth’s surface like little hand grenades, always imparting more thermal energy to that surface. But radiation from a cooler atmosphere does not transfer heat to a warmer surface – not one little bit – ever.

The Sun’s radiation could never have heated Earth’s surface by the proverbial 33 degrees, nor the Venus surface by about 500 degrees, especially when we know that the Venus surface actually receives less than 10% as much direct solar radiation as does Earth’s surface. Mind you, the anonymous author of Science of Doom would like you to think it receives closer to 100% when he writes “The surface receives radiation from the sun, S. In the case of Venus this value would be (averaged across the surface), S = 158 W/m².” He could have found out that the Russians had actually estimated about 10 W/m^2 (averaged across the surface) using measurements made with probes dropped to that surface. But, as is usual with Science of Doom, it’s all about what you can “prove” with voluminous computations that show how at most 10 W/m^2 (or was it 158 W/m^2, SoD?) of radiation coming back out of the Venus surface then gets its energy multiplied somehow up to 16,000 W/m^2 due to a postulated “runaway greenhouse effect.”

Many climatologists don’t appear to understand the maximum entropy states required for thermodynamic equilibrium, nor the limited effect of radiation from molecules like carbon dioxide with few radiating frequencies, nor the gravitational effect on thermal gradients, nor exactly how the Stefan-Boltzmann Law should be applied, nor the diffusion process which is the only valid explanation for the high Venus temperatures. Then they ignore the consequences of non-radiative processes which transfer the energy from the surface into the ordinary nitrogen and oxygen molecules at the boundary. It is these molecules which then carry much of this energy from the surface into the atmosphere, until water vapour and carbon dioxide radiate it away to space. Nitrogen and oxygen are the real blanket: radiating molecules like carbon dioxide are holes in that blanket. Water vapour reduces the thermal gradient, and thus lowers the surface temperature for the new radiative equilibrium. But the IPCC will tell you that water vapour warms the surface, and so it supposedly has a positive feedback effect, multiplying the assumed warming by carbon dioxide.

The only thing it seems the climatologists don’t ignore is their marketing efforts in propagating what now smells much more like a fraudulent hoax than a mere scientific mistake. Do people get their information direct from peer-reviewed papers? Do school children? No. The children are brainwashed at school and the public is brainwashed by selective, biased media propaganda, carefully orchestrated by the establishment of the all-knowing IPCC rubber stamp mechanism.

If they really had valid counter arguments to the physics presented in PSI papers, then you would think that they would take advantage of the opportunity offered to anyone in the world to submit an attempted rebuttal for papers that are online for “Peer Review in Open Media” because that, we believe, is the way science ought to operate. Indeed, we await seeing their own papers subjected to open review by the “peers” who can so easily rebut them with science from the realm of valid atmospheric physics and related disciplines. We at PSI will take on any challenges in a spirit of open debate, such as on our forums, and we will investigate all official rebuttal attempts in the spirit of true science. For only then will truth prevail and the world be a better place.


Greenhouse Gas Confusion Magnified by Misuse of Infrared Thermometers

Written by Carl Brehmer

Instruments called radiometers are believed to measure both up-welling and down-welling longwave radiation, but what do radiometers actually sense and what do their readouts actually mean?



The core of a radiometer is a thermopile held within a vacuum under a dome, with one end attached to a case that holds a “reference temperature.” Depending upon the target, a positive or negative electrical signal will be induced in the thermopile and transmitted to a circuit board. If the target is warmer than the reference temperature of the case (the ground, for instance), radiative warming of the thermopile will occur and a positive current will be induced; if the target is cooler than the case (the open sky, for instance), radiative cooling of the thermopile will occur and a negative current will be induced. Along with the electrical signal generated by the thermopile, two other temperature signals are sent to the circuit board: 1) the case temperature and 2) the dome temperature. The voltages from these three signals are then mathematically converted into a readout in W/m2.

“The Eppley PIR has 3 output signals; the thermopile (mV), case temperature (V), and dome temperature (V). The 3 signals are combined in the Pyrgeometer Equation, which determines the thermal balance of the instrument and hence the contribution of down-welling longwave radiation (LW).” PIR – Precision Infrared Radiometer on Kilo Moana by Frank Bradley

“The mV output from the thermopiles is converted to W m-2, then corrected for the temperature effects on the PIR’s case.” Eppley PIR (Precision Infrared Radiometer) ® CAMPBELL SCIENTIFIC, INC. Copyright © 2001-2007 Campbell Scientific, Inc.

Here is an example of one such Eppley PIR calculation of DLWR, examined in the paper PIR – Precision Infrared Radiometer on Kilo Moana by Frank Bradley. When pointed upwards towards the sky, what the Eppley PIR (Precision Infrared Radiometer) actually sensed was radiative cooling of the thermopile. This radiative cooling induced a small negative voltage in its output wires, and this was mathematically converted into a negative W/m2 value: in this example -66.6 W/m2. The radiometer then, using S-B formulae, calculated the potential IR emission of the thermopile itself based on the reference temperature of the case and its presumed emissivity (the result of this calculation was 460.9 W/m2). These two numbers were then added together (-66.6 + 460.9 = 394.3). Finally the Eppley PIR estimated the effect of the dome temperature on this number, which was -10.9 W/m2, and added it to the previous sum:
(-10.9 + 394.3 = 383.4)
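
A minimal sketch of the arithmetic just described, using the article's example numbers; the variable names and the case-temperature back-calculation at the end are illustrative assumptions, not values taken from the Eppley manual.

```python
# Reconstruction of the readout arithmetic described above (illustrative names).
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m2K4

net_thermopile_signal = -66.6   # W/m2, negative: the thermopile is radiatively cooling
case_emission_term = 460.9      # W/m2, the S-B emission computed from the case temperature
dome_correction = -10.9         # W/m2, estimated effect of the dome temperature

dlwr_readout = net_thermopile_signal + case_emission_term + dome_correction
print(f"DLWR readout: {dlwr_readout:.1f} W/m2")   # 383.4 W/m2

# For an assumed emissivity of ~1, a 460.9 W/m2 case term would imply a case
# temperature of roughly:
t_case = (case_emission_term / SIGMA) ** 0.25
print(f"Implied case temperature: {t_case:.0f} K")  # ~300 K (a check, not article data)
```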

This is how a measured up-welling IR radiant energy flux of -66.6 W/m2 became a down-welling IR radiant energy flux of 383.4 W/m2. So, even though the thermopile was sensing a -66.6 W/m2 up-welling flux, the Eppley PIR readout said that there was a 383.4 W/m2 down-welling flux. In reality, the Eppley PIR readout is a calculation of what the downward radiant energy flux from the atmosphere would be if it were not being cancelled out, via destructive interference, by up-welling IR radiation. As such, the existence of an atmospheric down-welling radiant energy flux is a mathematical confabulation; it is a hypothetical reality, not a measured or sensed reality.

If you direct the Eppley PIR towards the ground, the thermopile experiences radiative warming and the same calculations are done to produce an up-welling IR radiation number. The readout is, again, a hypothetical rather than what is actually sensed. An honest radiometer would just tell you the actual radiant energy flux and in what direction it is flowing, in which case the “net” up-going IR radiation would be the total up-going IR radiant energy flux and the down-welling IR radiant energy flux would be nil.

I think that it would be valuable to review what the unit W/m2 actually means.

“A watt per square meter (W/m²) is a derived unit of heat flux density in the International System of Units SI. By definition, watt per square meter is the rate of heat energy of one watt transferred through the area of one square meter, which is normal to the direction of the heat flux.”

“Heat flux” is “The amount of heat transferred across a surface of unit area in a unit time. Also known as thermal flux.”

Flux (n) “the rate of flow of something, such as energy, particles, or fluid volume, across or onto a given area.” Encarta® World English Dictionary © 1999 Microsoft Corporation.

The unit W/m2 then is a measurement of the rate at which thermal energy is actually moving from one place to another. It is not a measurement of potential energy flow. Actual thermal energy flow is always unidirectional, down a temperature gradient. Think of the unit used to measure water flow in a river: gallons/min. This is a measurement of how much water is actually flowing past a particular point over a given time period. It is not a measurement of what the flow would be if the river bed were infinitely steep. Yet this is what the ULWR and DLWR numbers on the K-T Earth Energy Budget chart are; they are potential thermal energy flows and not actual energy flows. Radiometers, in reality, only sense the flow of radiant thermal energy in one direction, but the readout on radiometers, rather than being a readout of what is actually being sensed, is a calculation of what the heat flux would be if the radiating matter were in a vacuum radiating towards a perfect black body at 0 K.

Ergo, a radiometer takes something that is physically sensed (either the radiative cooling or the radiative warming that induces a small negative or positive current in the thermopile) and converts it into a hypothetical number. We can deduce from their design that the engineers of IR radiometers are adherents of the two-way radiant energy exchange paradigm and design these instruments to manifest that paradigm. For example, if you took one of these instruments down into a wine cellar, allowed it to assume room temperature and then measured the IR radiation flux coming from the room’s walls and ceiling, it would say that ~300 W/m2 is coming from each surface. In reality, the actual “heat flux density” within a wine cellar in W/m2 would be 0.00, since the entire room is in thermal equilibrium and there is no heat flowing from anywhere to anywhere. From where, then, does the ~300 W/m2 number on the radiometer readout come? It is the calculated amount of IR radiation that the thermopile would emit if it were in a vacuum opposite a blackbody at 0 K. In reality a radiometer in a wine cellar whose temperature has equilibrated with that environment should read 0.00 W/m2, because the amount of heat that is actually flowing from one wall to the other within a wine cellar is nil.

Let’s move then to the outside world. Again, neither the DLWR number of 333 W/m2 nor the ULWR number of 396 W/m2 seen on the K-T Earth Energy Budget chart is a measurement of an actual radiant energy flux. Rather, they are mathematical estimates of what the upward or downward flux would be if the other were absent. This is like measuring the wind speed to be 10 mph from the west, but calculating that there is actually a 20 mph wind coming from the east that is being opposed by a 30 mph wind coming from the west to yield a net 10 mph wind from the west! Just as wind only flows in one direction, so too does thermal energy. Again, the empirical evidence that DLWR is completely extinguished by ULWR is the very radiometer that presumes to measure its presence. These radiometers detect 0.00 W/m2 of downward heat flux; what they do sense is an upward radiant energy flux, and they then calculate what the DLWR would have been had it not been extinguished.

Here is the kicker. The IPCC, in its soon to be released AR5 report*, will again affirm that it considers DLWR to be substantively identical to insolation, in that it simply adds the hypothetical number seen on the readouts of IR radiometers to the actual measured shortwave radiant energy flux coming from the sun.

“The instantaneous RF (radiative forcing) refers to an instantaneous change in net (down minus up) radiative flux (solar plus longwave; in W m–2) due to an imposed change.” AR5 draft chapter 8

This sets up the perspective that the atmosphere is actually the primary heat source for the earth’s surface, since the readouts on radiometers assert that the down-welling longwave radiant energy flux coming from the atmosphere is twice that of direct sunlight! In reality it is nil.

*PSI has a fully-searchable copy of the recently leaked AR5 draft report. PSI members may enjoy use of this facility by entering the back end of the site (enter ‘LOGIN’ details on the right of this page).

Carl Brehmer

End Note: In one of my career paths I obtained a degree in Electronics Technology, and it was from that knowledge base that I did the above analysis of the internal operation of the Eppley PIR, using Eppley’s description of their own instrument found in the Owner’s Manual. I challenge anyone to find within the Owner’s Manual of any radiometer the claim that it actually senses down-welling IR radiation. What you will find on close inspection is that these Owner’s Manuals reveal that DLWR readouts are always a calculated hypothetical based on the two-way radiant heat transfer paradigm, rather than an actual direct measurement of DLWR.


The Climate Models are Failing

Written by Dr. Klaus L.E. Kaiser

By Heinz Hug

(Nachrichten aus der Chemie, 61: 132 [2013])

Translated by Klaus L.E. Kaiser, 12 Feb. 2013

[sub-title] Heinz Hug questions the importance of CO2 for climate change

“According to our calculations, in the coming years it should get warmer by leaps. But we do not trust that prognosis, because the simulations should also have been able to predict the current standstill of the temperature increase – which did not happen.” That is according to the climate researcher Jochen Marotzke of MPI-M in Hamburg, as reported in Der Spiegel [magazine] 9/2012. The reasons why climate models fail are obvious.

It is to be emphasized that the [discussion about the] greenhouse gas effect does not concern the absorption by IR-trace gases (CO2, CH4, H2O, and similar) but their emission, which warms the earth’s surface via “back-radiation” [Rückstrahlung] [ref. 1]. In fact, satellite spectra at 667 cm-1 show an impressive “funnel within the Planck curve”, which is based on the impediment of the heat radiation from the earth’s body by the ν2 [nu-2] band of CO2 [ref. 2].


Wrong and Twisted Again


World-renowned sea level expert Nils-Axel Mörner has come out to trash the latest alarmist media claims that Bangladesh is facing serious flood risks due to man-made global warming. In particular, Dr. Mörner has denounced environmentalist Bill McKibben, who speaks of “30 million refugees,” and the Sunday edition of Canada’s thestar.com for publishing a misleading article, ‘Bangladesh faces mass migration, loss of land from climate change.’

[Image: mangroves in Bangladesh]

The story claims villages in much of rural southwest Bangladesh are suffering the ravages of climate change. The author of the piece, Raveena Aulakh, relies heavily on the junk science of Atiq Rahman of the Intergovernmental Panel on Climate Change (IPCC) report. Rahman blames Bangladesh’s woes on human emissions of carbon dioxide and insists raising taxes will stem the rising tides. “All this could happen faster because of lack of reduction of greenhouse gases,” says Rahman. “And even if we stopped now, it would take a lot of time for things to get better.”

But Dr. Mörner, famed for his scientific rebuttals of such claims, is having none of it. He has issued a press statement (February 10, 2013) to denounce as bogus the article’s shabby “climate change” link to Cyclone Aila, which wrought devastation on the subcontinent in 2009.

Mörner retorted that Cyclone Aila “had nothing to do with any sea level rise. It was just the destruction of one of those events hitting this coast so badly. Unfortunately, this is normal for this part of the world, and has always been so.” The real concern, says the sea level expert, should be the incessant chopping down of mangrove trees to clear space for shrimp farms, which fuels soil erosion and increases the risk of flooding. As Mörner pointed out during the ‘Sealevelgate‘ scandal in 2001, politicized and cherry-picked science “leads to confusion over cases such as Bangladesh, whose plight is the exact opposite of the one claimed by environmental lobbyists and the IPCC [Intergovernmental Panel on Climate Change].”

Nonetheless, thestar.com writer Raveena Aulakh sticks rigidly to the IPCC’s doomsaying narrative and cynically promotes the usual tired old claims about melting ice sheets, diminishing glaciers and man-made global warming.

If only Aulakh and other sensationalist publications would introduce a little more of the Mörner balance into their articles, for he has already exposed the fraud and misunderstanding about the subcontinent in a key 2011 article in the Spectator. ‘Rising credulity’ describes how Mörner visited and studied the Sundarban delta area in Bangladesh (pictured) and was able to observe clear evidence of coastal erosion, not sea level rise.

The truth about sea levels is that they have always fluctuated and always will, no matter what governments think they might do to control them. Indeed, there are well-documented and huge variations in sea levels, by as much as two meters, because, as is the way with Nature, they stubbornly refuse to maintain a constant level. Dr. Mörner describes the world’s oceans and seas as more akin to an “agitated bath where the water is slopping back and forth. This is a dynamic process.”

By contrast, independent scientists know full well that Bangladesh is cursed because of rain over the Himalayas, which is unconnected with the sea. “It is also cursed because of the cyclones which push water inland. Again, this has nothing to do with the sea,” adds Mörner.

Sensationalist authors such as Atiq Rahman should, says Mörner, first check their facts with the world’s true experts on sea level. They can be found at the INQUA (International Union for Quaternary Research) commission on Sea Level Changes and Coastal Evolution (of which Mörner is a former president), not with the discredited IPCC.


Moon’s Hidden Message

Written by Ben Wouters

Effective temperatures
 
 
The concept of Effective Temperature (Te) originates in astronomy, and is a rule-of-thumb calculation to estimate the radiative temperatures of planetary bodies in a solar system. For planets with no atmosphere, no internal heat source and a surface emissivity equal to 1, the Te is the whole story. All of the mentioned factors increase the “base” Te. See e.g. this site for more details.
 
Surprisingly, the Te for the moon using an albedo of 0.11 is ~270K, but the actual average temperature is much LOWER (~197K). This means that either the moon's albedo is massively higher (~0.75 instead of 0.11) or there is a serious flaw in the way the Te's are calculated. Let me show you that the Te for the moon is ~161K instead of ~270K, and for Earth ~151K instead of the well-known 255K.
 
Basically Te is arrived at by taking the surface temperature of a sun and calculating the remaining radiation (Total Solar Irradiance (TSI)) at the distance of the body under consideration. The amount of energy intercepted by a body is equal to the TSI times the cross-sectional area of the body. Since a sphere has 4 times the area of a circle, the TSI is divided by 4 to arrive at the average RADIATION/m2 on the body. A reduction for reflected radiation is also applied. So far so good. To arrive at the Te, the Stefan-Boltzmann formula (SB) is used on this number, assuming “black body” (BB) behaviour of the body. This last step is where things go wrong: using the average radiation to arrive at a temperature, instead of calculating different temperatures and then averaging them. You cannot average the input to a non-linear equation like the SB formula (fourth power) and expect a meaningful result. The following spreadsheet demonstrates this nicely. It shows the radiation heating two identical BB plates, the resulting temperature using SB, and the average temperature. Notice that in all cases the AVERAGE radiation is 240 W/m2.
 
[Table: radiation split over two identical blackbody plates, the resulting SB temperatures and their average; in every case the average radiation is 240 W/m2]
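
Since the original spreadsheet is not reproduced here, the following sketch makes the same point with illustrative splits; the 240/240 and 480/0 rows correspond to cases discussed in the text, while the middle split is only an example. Averaging the fluxes first and averaging the temperatures afterwards give very different answers.

```python
# Two identical blackbody plates receiving different radiation splits.
# In every case the AVERAGE radiation is 240 W/m2, but the average
# Stefan-Boltzmann temperature is not the same.
SIGMA = 5.67e-8  # W/m2K4

def sb_temperature(flux):
    return (flux / SIGMA) ** 0.25 if flux > 0 else 0.0

for split in [(240, 240), (360, 120), (480, 0)]:
    temps = [sb_temperature(f) for f in split]
    avg_t = sum(temps) / 2
    print(f"split {split}: temps = {temps[0]:.1f} K / {temps[1]:.1f} K, "
          f"average = {avg_t:.1f} K")

# split (240, 240): average ~255 K
# split (360, 120): average ~248 K
# split (480, 0):   average ~152 K
```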
 
The last example (480/0) represents the situation for heavenly bodies with only one sun, like our own planet Earth. A better way to calculate Te is to calculate the average radiation only for the sunny side of a body, calculate the average temperature for that side and THEN average with the dark side, which has a radiative temperature of 0K; so dividing the result for the sunny side by 2 will do. Since the original method didn't use it, I also ignore the ~2.77K resulting from the cosmic background radiation.

 
Here are the Te calculations for the moon and earth:
 
I assume a TSI for both of 1364 W/m2; the moon reflects 11% and Earth 30%.
 
Moon: (1364 W/m2 x 0.89)/2 = 607 W/m2; SB => 322K => Te = (322K + 0K)/2 = 161K
Earth: (1364 W/m2 x 0.70)/2 = 477 W/m2; SB => 303K => Te = (303K + 0K)/2 = 151K
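
The two lines above can be checked with a few lines of code; the TSI and albedo values are the ones stated in the article, and the procedure is the sunny-side averaging just described.

```python
# Author's alternative Te: average the SUNLIT-side radiation, convert it to a
# temperature with Stefan-Boltzmann, then average with a 0 K dark side.
SIGMA = 5.67e-8  # W/m2K4
TSI = 1364.0     # W/m2

def alternative_te(albedo):
    sunny_side_flux = TSI * (1 - albedo) / 2           # average over the lit hemisphere
    sunny_side_temp = (sunny_side_flux / SIGMA) ** 0.25
    return (sunny_side_temp + 0.0) / 2                 # average with a 0 K dark side

print(f"Moon  (albedo 0.11): Te ~ {alternative_te(0.11):.0f} K")  # ~161 K
print(f"Earth (albedo 0.30): Te ~ {alternative_te(0.30):.0f} K")  # ~151 K
```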
 
All this means that a non-rotating grey body at our distance from the sun, with no atmosphere, no internal heat and in radiative balance, will have this average surface temperature. A change in any of the mentioned factors will give a higher ACTUAL temperature.
 
I’ll show that this reasoning is correct by looking at some temperature plots of the moon.
 
Wouters Diviner project
 
Diviner Project: link
 
Some observations:
– the temperature nicely follows the radiation values during daytime, but doesn’t drop to 0K at night
– apparently some heat storage takes place during the day that re-radiates during the night, given enough time most probably converging to the same temperature as Latitude 89° Winter
– the temperature at Latitude 89° Winter seems to converge to 30-40K given enough time without sunshine
 
My conclusion is that for some reason the moon has a “base” temperature of ~30K, most probably caused by internal heat. On deep crater floors near the poles temperatures as low as 25K are found. A heat flow of ~100mW/m2 would be enough to explain this temperature. At the poles Earthshine is an unlikely candidate.
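As a rough sanity check on that last figure, the equilibrium temperature a surface can sustain from ~100 mW/m2 alone is a few tens of kelvin, which sits within the 25-40K range quoted above. A minimal sketch, using the same Stefan-Boltzmann relation as before:

```python
# Equilibrium black-body temperature sustained by a given internal heat flow alone.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

heat_flow = 0.100  # ~100 mW/m2, the internal heat flow suggested in the text
temp = (heat_flow / SIGMA) ** 0.25
print(f"Equilibrium temperature for {heat_flow * 1000:.0f} mW/m2: {temp:.0f} K")  # ~36 K
```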
 
So the moon behaves reasonably well like a BB, except:
– it reflects some of the radiation (=> making it a “grey body”)
– it rotates and some heat storage is taking place
– its “no radiation” temperature is not 0K, but ~25-40K
With its “base” temperature of ~25K, some leftover heat from the previous “day”, and the sun adding the rest, the average measured moon temperature of ~197K can be easily explained.
 
Back to Earth: the correct Te for Earth of 151K, and consequently a Greenhouse Effect of ~139K instead of ~35K, leaves us with the problem of explaining Earth’s average surface temperature of ~290K. Is CO2 even more powerful than previously thought? Or is there a much more plausible explanation?
 
 
Ben Wouters, Zuid Scharwoude, Netherlands.

 
 

Continue Reading No Comments

ARMSTRONG: Climate seers as blind guides

Written by Prof. J. Scott Armstrong

 
Forecasters often use unscientific computer models
 
by Prof. J. Scott Armstrong
 
The science of forecasting is complex. After 50 years spent studying the issue, I have found there is plenty of experimental evidence that in complex, uncertain situations, experts cannot forecast better than those with little expertise. In 1980, MIT Technology Review published my “Seer-sucker Theory”: “No matter how much evidence exists that seers do not exist, suckers will pay for the existence of seers.” Since 1980, research has provided more evidence for this surprising theory, especially Philip Tetlock’s 2005 book, “Expert Political Judgment.”
 
threelegged climate stool
Forecasts of dangerous man-made global warming rely heavily on expert judgments. Is the global warming alarm movement another example of the seer-sucker phenomenon? If so, what is the scientific approach to climate forecasting?
 
In the 1990s, I organized an international group of 39 scientists from various disciplines to summarize principles for a scientific approach to forecasting. The principles are based mostly on experimental studies on what works best in given situations. Some, such as the principle of full disclosure, are based on commonly accepted standards. The findings were translated into a list of 139 scientific principles and published in the book “Principles of Forecasting” in 2001. The principles are available at forecastingprinciples.com, and they are revised as new evidence becomes available. This site includes a freeware package that allows anyone to audit forecasting procedures.
 
In 2007, I, along with Kesten Green of the University of South Australia, published an audit of the procedures used by the U.N. Intergovernmental Panel on Climate Change (IPCC) to produce “projections” of global warming. The IPCC authors used computer projections derived from some scientists’ expert judgments. They call the projections “scenarios” (i.e., stories). As the authors admit, they are not forecasts, yet they are used as such. The audit showed that when the IPCC procedures are assessed as if they were forecasting procedures, they violated 72 out of 89 relevant scientific forecasting principles.
 
What does scientific forecasting tell us about global temperatures over the next century?
 
In 2009, Mr. Green, Willie Soon of the Harvard-Smithsonian Center for Astrophysics and I conducted a forecasting validation study using data from 1850 through 2007. We showed that a simple model of no trend in global mean temperatures, for horizons of one to 100 years ahead, provided forecasts that were substantially more accurate than the IPCC’s 0.03 degrees Celsius per year projections. For horizons of 91 to 100 years, the IPCC’s warming projection had errors 12 times larger than those from our simple model. Our own forecasting procedures violated only minor evidence-based principles of forecasting, and they did not rely on expert judgment about the trend. Scientific forecasts made since that 2009 paper, described in our latest working paper, assess those minor deviations from the principles, and the results support our earlier findings.
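For readers unfamiliar with this kind of validation exercise, the sketch below illustrates the general idea only; the anomaly series is invented for the example and the numbers are not those of the 2009 study. It compares a naive no-change benchmark, which simply carries the value at the forecast origin forward, against a fixed +0.03 degrees Celsius per year trend forecast.

```python
# Illustrative comparison of a naive "no change" forecast against a fixed
# +0.03 C/year trend forecast.  The anomaly series below is invented for the
# example; it is NOT the data used in the study described above.
anomalies = [0.10, 0.14, 0.08, 0.12, 0.15, 0.11, 0.16, 0.13, 0.12, 0.17]  # hypothetical, deg C

origin = anomalies[0]            # both forecasts are issued from the first year
horizons = range(1, len(anomalies))

naive_errors = [abs(anomalies[h] - origin) for h in horizons]
trend_errors = [abs(anomalies[h] - (origin + 0.03 * h)) for h in horizons]

print(f"mean abs error, no-change forecast : {sum(naive_errors) / len(naive_errors):.3f} C")
print(f"mean abs error, +0.03 C/yr forecast: {sum(trend_errors) / len(trend_errors):.3f} C")
```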
 
Have there been similar cases in the past where leading scientists and politicians have concluded that the environment faces grave perils? In an ongoing study, we have identified 26 alarmist movements that were similar to the current man-made global warming alarm (e.g., population growth and famine in the 1960s, and global cooling in the 1970s). In all cases, human activity was predicted to cause environmental catastrophe and harm to people. Despite strong support from leading scientists, none of the alarmist movements relied on scientific forecasting methods. Governments imposed regulations in 23 of the 25 alarms that involved calls for government intervention. None of the alarming forecasts turned out to be correct. Of the 23 cases involving government interventions, none were effective, and 20 caused net harm.
 
Policy on climate change rests on a three-legged stool of forecasts. First, it is necessary to have valid and reliable scientific forecasts of a strong, persistent trend in temperatures. Second, scientific forecasts need to show that the net effects of the trend in temperatures will be harmful. Third, scientific forecasts need to show that each proposed policy (e.g., a policy that polar bears require special protection because of global warming) would provide a net benefit relative to taking no action. A failure of any leg invalidates policy action.
 
Since 2007, we have searched for scientific forecasts that would support the three-legged stool of climate policy. We have been unable to find a single scientific forecast for any of the three legs — the stool currently has no support.
 
Two ways to encourage unity on the climate change issue would be to insist that forecasts be provided for all costs and benefits, and that all forecasting procedures abide by scientific principles. If validated principles are not included in the current forecasts, they should be added. Until we have scientific forecasts, there is no basis for unified action to prevent global warming — or cooling. Rational climate policies cannot rely on seers, no matter how many of them, how smart they are or how much expertise they possess.
 
J. Scott Armstrong is a professor at the University of Pennsylvania and author of “Long-Range Forecasting” (Wiley-Interscience, 1985).

Continue Reading No Comments

Government Scientist Gets Fired for Telling the Truth

Written by David Spady

By David Spady (Townhall.com)

Something’s amiss at the Department of the Interior. Eight government scientists were recently fired or reassigned after voicing concerns to their superiors about faulty environmental science used for policy decisions. That raises the question: are some government agencies manipulating science to advance political agendas?

science gatekeeping

Fictional book authors operate in a convenient world, unconstrained by the facts and experience of the real world. The antithesis of a work of fiction is a scientific finding based solely on provable facts and experience. For agenda-driven environmental science, facts can sometimes prove inconvenient. It’s far easier to advance an agenda with agreeable science, even if that means creating science fiction or fictional science. Fictional science thus becomes the pseudo-reality of environmentalists’ absolutism, and any science that disagrees with their predetermined conclusions of man-made harm to the environment is ignored or distorted. Now we learn that in some government agencies, scientists who question the veracity and validity of the scientific evidence used to formulate environmental regulations and policies are shunned, kept quiet, and purged.

The purpose of fictional environmental science is to sway public opinion through what amounts to propaganda. Intransigent purveyors of “green” propaganda know their greatest enemy is truth. One of the most famous propaganda experts was Germany’s Joseph Goebbels, who taught that if a lie is repeated often enough it will eventually be accepted as truth. Goebbels also knew that truth has to be suppressed if it contradicts the objectives of the propaganda. Goebbels wrote, “It thus becomes vitally important for the State to use all of its powers to repress dissent, for the truth is the mortal enemy of the lie, and thus by extension, the truth is the greatest enemy of the State.”

Over the past three decades, government has unleashed an unprecedented wave of environmental rules and regulations that affect nearly every aspect of American life, and for the most part the public has tolerated it. Public embrace of environmental propaganda and fear mongering about the apocalyptic consequences of mankind’s abuse of the planet have elevated environmentalism to a status above national security. The public is now more likely to give up rights and freedoms for the cause of saving the planet than for security reasons.

Read more here.

Continue Reading No Comments

The Tragic Tautology of the Greenhouse Gas Effect

Written by

By John O’Sullivan (and N. Kalmanovitch*)

Carl Brehmer reminds us of a crucial internal conflict within the “greenhouse effect” hypothesis. So vague and self-contradictory are the myriad explanations given by climatologists of this “theory” that anyone who critically examines it soon understands that it is best explained as a tautology.

In rhetoric ‘tautology’ is defined as using different words to say the same thing, or a series of self-reinforcing statements that cannot be disproved because they depend on the assumption that they are already correct. We never have and never will get a detailed scientific explanation of the “greenhouse gas” effect (GHE) because for climatologists to seek one would require them to dissect it, thus exposing the truth; it hangs on nothing of any substance.

We are never given the “how” and yet science is all about how things work. When Principia Scientific International (PSI), comprised of 200 experts in science and engineering, sought clarification from the supporters of the GHE they were either ridiculed or ignored. So with no answers as to the “how” inquiring minds turned to the “why” for the rise of this climate chimera.

In a series of articles we saw that the idea of a GHE driven by carbon dioxide was re-invented in the late 1970’s after being widely accepted in science as having been refuted before 1950.

After decades, this re-invented “theory” finally gained acceptance during the 1980s as the field of government-funded climatology grew. We were given no reason why mainstream science had got it wrong in dismissing it so unequivocally for more than a generation. There were certainly no new discoveries in the 1970s suddenly proving that carbon dioxide did “trap” heat after all. Moreover, despite all the inward investment in climate research, no time, let alone any rigor, was applied to providing any standard definition of what this newly re-born “greenhouse effect” actually was.

“Analogously but Different”

Incredibly, despite forty years and a multi-billion dollar taxpayer-funded “carbon reduction” industry avidly pursuing control of this alleged climate thermostat, there are no agreed equations and no agreed descriptors of how this “thermostat” actually works. The Intergovernmental Panel on Climate Change (IPCC) adds to the confusion in this tragic-comedy by glibly declaring our atmosphere is analogous to a greenhouse “but different.”

These handwaving proponents of the hypothesis will always start out by admitting the only meaningful source of heat to the surface of the earth is the sun. But then they will often declare that certain gases then serve to drive “down-welling radiation” (or “back radiation”) from the atmosphere as a secondary heat source.

Please take no one’s word on this. Just do your own Google search; most definitions of the “greenhouse effect” either overtly assert or at least imply that downwelling IR radiation from the atmosphere adds additional heat to the ground/ocean.

But nowhere will you be told where the extra heat generated by the atmosphere goes, because all outward longwave radiation (OLR) at the top of the atmosphere (TOA) is equal to, and in balance with, all the absorbed sunlight.  So, within the “greenhouse effect” hypothesis all that “additional” thermal energy that the atmosphere generates disappears as mysteriously as it appeared in the first place (see diagram).

GHE tautologous Energy Budget

Image source: National Academy Press.

Astrophysicist Joseph E. Postma speaks for most critics of this shape-shifting GHE. Postma points out that the duty of modern empirical science is to seek to identify the physical principles that underlie observed phenomena. He writes:

“By identifying and understanding the underlying principle, we thus understand reality.  If we can mathematize the principle and justify it on a-priori mathematical absolutivity, then the phenomenon becomes a scientific Law, such as the Laws of Thermodynamics or Kepler’s Law of Universal Gravitation, or the Laws of Least Action or Least Time.  We can also engineer the physical principle and use it to our benefit, to produce products, services, and generally, to create wealth and increase the standard of living of people, etc.

The obvious question:  is the underlying principle of the atmospheric greenhouse effect actually defined, anywhere? All I have to tell you, is that “No, it is not.””

A healthy skepticism demands of us that we look again at the above diagram, sold to us as the basic model of the greenhouse gas effect. Imagine what difference the addition or removal of that cyclical flow of phantom internal energy would make to the system as a whole. It makes no difference scientifically at all, and we could easily discard it if we wished by applying the accepted principle of ‘Occam’s Razor‘ (“plurality should not be posited without necessity”). But to a charlatan looking to pick your pockets for more tax dollars, it is very necessary, being the cleverest and most powerful tautology ever sold.

————————–

*In 1827 the celebrated French mathematician Jean-Baptiste Joseph Fourier determined that the theoretical temperature of the Earth, based on just the thermal radiation from the sun, was cooler than the actual temperature, and attributed the difference to atmospheric insulation. He named this insulation effect “un effet de verre” (an effect of glass) after work by the Genevan scientist de Saussure, who had demonstrated such insulation using glass panes.

Later work by the physicists Planck, Stefan, and Boltzmann provided an understanding of the relationship between the temperature of a body (blackbody) and the intensity of its radiation, allowing the Earth’s theoretical radiative temperature, exclusive of atmospheric insulation, to be calculated according to the formula Te = [So(1−A)/4σ]^(1/4), derived from Planck’s equations using the Stefan-Boltzmann constant “σ”.

We can measure a planet’s surface temperature from its radiative spectrum. We can also calculate Te by making some estimate of both the solar irradiance (So) and the planet’s albedo (A). Subtracting Te from the planetary temperature provides a metric for comparing the relative atmospheric insulation of the planets, essentially calibrating the insulation effect first noted by Fourier back in 1827. This theoretical temperature difference, representing atmospheric insulation, was renamed “the greenhouse effect”, relating the insulation against thermal transmission by conduction provided by the glass in a greenhouse to the insulation against thermal radiation provided by the atmospheres of the planets.
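For illustration, evaluating that formula for Earth with commonly quoted round figures (So ≈ 1364 W/m2, A ≈ 0.3) gives the familiar value of about 255K; a minimal sketch:

```python
# Evaluate Te = [So(1 - A) / (4*sigma)]^(1/4) for Earth.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
SO = 1364.0        # solar irradiance at Earth's distance, W/m^2 (rounded)
ALBEDO = 0.30      # commonly quoted planetary albedo for Earth

te = (SO * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(f"Effective temperature Te ~ {te:.0f} K")  # ~255 K
```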

Preying on overall public ignorance of science, unscrupulous scientists rebranded the greenhouse effect as some sort of physical effect driven by CO2 emissions causing catastrophic global warming. This fraudulent use of the term “greenhouse effect” was introduced over 25 years ago and spawned a series of equally fraudulent terms such as “greenhouse warming” and “greenhouse gases”, providing the propaganda vocabulary that has created a climate change issue out of nothing to serve the political agenda of the climate change scam perpetrators.

While the term greenhouse effect is perfectly valid in its geophysical scientific context, it is completely ludicrous when used as a mechanism whereby increased CO2 creates energy out of nothing and causes the Earth to warm to catastrophic levels as a result of an enhanced greenhouse effect.

This article takes the fraudsters to task, exposing the fraudulently rebranded version of the greenhouse effect for what it is.

Continue Reading No Comments

Cold Heat

Written by Dr. Klaus L.E. Kaiser

By Dr. Klaus L.E. Kaiser

Cold Heat

Have Cold, need Heat. Global mean temperatures haven’t risen for 16 years now, but some politicians still think the globe is experiencing runaway overheating. At the -50 C/F in mid-latitude Canada right now, even the polar bears are not frolicking on the beaches. Elsewhere, though, the heat is being turned on; namely on the climate prognostications previously given by the proponents of global treaties to limit carbon dioxide (CO2) emissions, which have been claimed to be the mother of all evil.

North Pole Submarines 1987

Nomenclature

The climate debate suffers from many misconceptions, some of which arise from misleading terminology. For example, the US Environmental Protection Agency (EPA) calls carbon dioxide (CO2) a “pollutant.” Nothing could be further from the truth. CO2 is an invisible, odorless and non-toxic gas. In some environments, such as the air in submarines, its concentration is typically several times that of the earth’s atmosphere, where its level is approximately 400 ppm (parts per million).

Just to juxtapose CO2 with a real (air) pollutant, take sulfur dioxide (SO2): SO2 is an entirely different kettle of fish. It has nothing to do with CO2. Sulfur dioxide burns your nostrils and lungs and causes respiratory illness, among other things.

Vital Necessity

The fact is that CO2 is absolutely vital for all life on earth. All of the carbon in our food, and in fact in our own bodies, is derived entirely from atmospheric CO2 through the process of photosynthesis (PS). Apart from CO2, the PS process also requires some other nutrients (phosphorus, nitrogen, minerals and trace elements) and, of course, sunlight to function. Plant growers make use of that by artificially boosting the CO2 level in their greenhouses to 1,000 ppm and more, as plants simply thrive under high CO2 levels, from pine and citrus fruit seedlings to cassava, corn and wheat. But, you say, the summer sea-ice in the Arctic is shrinking, supposedly due to increasing CO2 in our atmosphere…

Ice at the North Pole

The North Pole is a geographical point on the earth. It has no other redeeming features. In fact, it is not even on land, but covered by the water of the Arctic Ocean. The extent of Arctic sea-ice varies tremendously with the seasons, namely from an annual maximum of ~15 million square kilometers at the height of the Arctic winter to a mere ~4 million square kilometers at the end of the Arctic summer. In other words, its annual variation of ~11 million square kilometers is much greater than its average minimum extent. That fact alone should give you some food for thought.

Contrary also to widely-held beliefs, the North Pole has often been open water during the summer, and not just in recent years. That is on record from a variety of submarines that have surfaced there over the last 50 years. If you don’t believe me, just check the newsreels and logs of the USS Skate, which encountered much open water on its voyages there in 1958 and 1959.

Old Charts

Old charts of the Arctic typically show the extent of the Perennial Sea-Ice (PSI) in the Arctic Ocean. I have one of those, published in 1973 by the Department of Energy, Mines and Resources, Canada. It shows the southern/eastern limit of the PSI lying entirely west and north of the Canadian Arctic island archipelago, as you can see in Fig. 1, below.

Arctic sea ice extent 1973

Fig. 1. Perennial Arctic sea-ice extent (light colored area), chart published in 1973.

Obviously, that ice cover is in stark contrast to the ice-coverage graphs published in recent years by National Geographic (NG) magazine and other publications. The latter typically show the maximum and minimum extent of ice cover at the height of the Arctic winter and summer periods, but never the PSI extent. To show you how misleading some articles can be, let’s look at an ice-chart published by NG in 2011 (Fig. 2).

Arctic sea ice minimum July 2011

Fig. 2. Minimum extent (white area) of Arctic summer sea-ice, as per National Geographic magazine, July 2011.

As you can see, the minimum summer sea-ice extents vary substantially between the two graphs. NG’s graph (Fig. 2) has it extending through much of the Canadian Arctic archipelago islands, while the older map (Fig. 1) shows it entirely outside of the archipelago. You may wonder, why is there such a difference? Once again, it’s in the definition, this time of what is “ice.” According to the answer I received from NG, the PSI ice [Fig 1] is “more than 2 years old and tends to be thicker and more resilient than younger ice” [Fig. 2].

Are other claims about the shrinking Arctic any more trustworthy than those about the ice cover? What about the polar bears?

Polar Bear Range

Pictures of cuddly polar bear cubs have for many years been a mainstay of organisations wanting you to donate to their cause of “saving the polar bears.” Polar bears live in the Arctic and are well adapted to that environment. The NG article “On Thin Ice” mentioned above also contains photographs of polar bears in their natural habitat. That’s where things really get interesting: most of these pictures were taken at Svalbard, which is OUTSIDE even NG’s exaggerated minimum summer sea-ice extent; the others were taken at unspecified locations, perhaps the local zoo.

The Gist

The gist of the message here is that the “climate change” agenda has been hyped in many reports, even in some outlets of high reputation previously deemed to be reliable. No wonder much of the public falls for such stories; after all, very few people have ever had to cope with chills of -50 C or less.

In short, there is a difference between a temperature of +50 F in Houston, TX and -50 F in International Falls, MN or further north in the “great white north” of Canada.

Continue Reading No Comments

Have you been involved in scientific fraud? Retraction Watch wants to hear from you

Written by

Ivan Oransky over at Retraction Watch is giving readers a heads-up on a new initiative by Grant Steen, who has published a number of important papers on retractions. Steen is now looking to gather stories from anyone involved in science fraud.

Grant Steen

Here is Oransky’s piece setting out the details:

Why is there fraud in science?

Scientists believe—or at least profess to believe—that science is a process of iteratively approaching Truth.  Failed experiments are supposed to serve as fodder for successful experiments, so that clouded thinking can be clarified.  Observations that are fundamentally true are thought to find support, while observations that are flawed in some way are supplanted by better observations.

Why then would anyone think that scientific fraud can succeed?  Fraud would seem to be intellectual pyrotechnics; a dazzling light that leaves us in darkness.  If science truly is self-correcting, then why would people risk perpetrating fraud?  The notion of self-correction suggests that fraud is certain to be found out. Why risk it? Or are most scientists wrong?  Does science often fail to self-correct?  Is the literature full of misinformation, left behind like landmines in an abandoned battlefield?

What is the rationale for data fabrication and data falsification?  We invite anyone who has been involved in a scientific retraction due to fraud, or otherwise implicated in scientific misconduct, to write an essay for inclusion in a projected book about scientific fraud.  Essays are solicited from people who were involved as either a perpetrator or a co-author.  It is vital that this account be written from a personal perspective.  Please limit speculation and stick to verifiable facts insofar as possible, so that future historians can learn what actually happened.  Please do not discuss retractions that resulted from an honest scientific mistake, and do not dwell on transgressions such as plagiarism, duplicate publication, or co-author squabbles.  Discussion should focus primarily on data fabrication and data falsification.  We are especially interested in first-person accounts that relate to any (or all) of the following questions:

What actually happened?

What is the scientific story behind the transgression?

How did you (or a colleague) fabricate or falsify data?

What was the short- or long-term goal of the deception?

Did you perceive any significant obstacles to fabrication or falsification?

Did the research infrastructure fail in any way?

How was the fraud discovered?

Do you believe that the scientific enterprise was damaged?

What was the aftermath for you and for your collaborators?

What are your thoughts and perceptions now? 

Please limit your essays to no more than 3,000 words and send them to [email protected]

Be prepared to prove that you are who you claim to be; we will try hard not to be taken in by a scam.  However, it may be possible to publish the piece anonymously, though this would greatly lessen the impact.  If accepted for publication, your work will be edited for clarity only; there will be no censorship, no editorial intrusion, and no correction of what are claimed as facts.  However, these essays will become part of a multi-author dialogue about scientific fraud.  If a book contract can be secured, each essay will form a chapter in the book.  No profits are anticipated, so no financial gain can accrue from the project.  However, this is a chance to tell your story on a national stage.

 

Continue Reading No Comments