New Research Paper Predicts 15 Years of Global Cooling

Written by The Hockey Schtick

The Hockey Schtick blog highlights a new paper published in Geophysical Research Letters. It finds the natural North Atlantic Oscillation (NAO) controls temperatures of the Northern Hemisphere 15 to 20 years in advance, a lagged effect due to the large thermal inertia of the oceans. The authors find the NAO index can be used to predict Northern Hemisphere mean temperature multidecadal variability and the natural oceanic Atlantic Multidecadal Oscillation (AMO) 15–20 years in advance. A simple linear model based upon this theory predicted the ‘pause’ of global warming since about 2000 that IPCC models failed to predict, and projects Northern Hemisphere temperatures will “fall slightly” over the 15 years from 2012 to 2027.
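The "simple linear model" described above (Northern Hemisphere temperature tracking the NAO index 15–20 years later) can be sketched with synthetic data. Everything below – the series, the 16-year lag, and the 0.5 coefficient – is an illustrative assumption, not taken from the paper:

```python
import math
import random

random.seed(0)

LAG = 16  # assumed lag in years, within the 15-20 year range described

# Synthetic stand-in for an annual NAO index (the real index is observational data).
nao = [math.sin(0.1 * t) + random.gauss(0, 0.2) for t in range(200)]

# Synthetic NH temperature anomaly that follows the NAO index LAG years later.
temp = [0.5 * nao[t - LAG] for t in range(LAG, 200)]

# Fit temp(t) = a * nao(t - LAG) by least squares (slope through the origin).
x = nao[:200 - LAG]
a = sum(xi * yi for xi, yi in zip(x, temp)) / sum(xi * xi for xi in x)

# With the lag known, the most recent LAG years of the index "predict"
# the next LAG years of temperature.
forecast = [a * xi for xi in nao[-LAG:]]
print(round(a, 3), len(forecast))  # recovers the assumed coefficient 0.5
```

The point of the lag structure is that today's index value is already, on this hypothesis, a forecast of temperature 16 years ahead.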

The NAO, in turn, has been linked to solar activity.

NAO Winter Index

Continue Reading No Comments

Physicist: There was no Fukushima nuclear disaster

Written by Kelvin Kemm, nuclear physicist

The terrible toll from Japan’s tsunami came from the wave, not radiation

I have watched a TV programme called ‘Fear Factor.’ In the series, contestants have to confront their worst fears to see who bails out and who can fight the fear and get through.

People who are afraid of heights are made to bungee-jump off a high bridge, and people who are scared of spiders or insects are made to get in a bath full of spiders.

In virtually all cases the contestants later say that the fearful experience was not actually as bad as they feared. So the fear of the fear was greater than the fear itself ‘when the chips were down.’

This is often the case in life, that the fear of some factor turns out to be worse than the experience itself. The human mind builds a very scary image in the imagination. The imagination then feeds the fear.

If the picture in the imagination is not very specific or clear it is worse, because the fear factor feeds on the unknown.

Fukushima nuclear plant

This is what has happened in the public mind concerning nuclear power over the last half century. Concepts concerning nuclear reactions and nuclear radiation are in themselves complicated and mysterious.

Over the last couple of decades, advances in fields such as quantum mechanics, which is linked to nuclear processes, have compounded matters for the public. The image of strong and mysterious forces and effects is now well entrenched. There are Hollywood movies and TV programmes about space travellers or alien invaders who use time travel and quantum forces, and then battle to evade the dangerous intergalactic nuclear zones.

A consequence of all this is that internationally the public is now really ‘spooked’ when it comes to the topic of nuclear power. A real ‘fear factor’ looms over the mere word ‘nuclear.’ Newspapers love this, and really push imagery like ‘nuclear leak’ or ‘radiation exposure.’

As a nuclear physicist, I look upon such public reaction half with amusement and half with dismay. The amusement comes from the fact that so many people can be scared so easily by so little. It is like shouting: “Ghost in the bedroom,” and everyone runs and hides in the hills.

The dismay comes from the fact that there is a body of anti-nuclear activists who do not want the public to know the truth; the anti-nukes enjoy stoking the fear factor and maintaining public ignorance.

Let us now ponder the Fukushima nuclear incident which has been in the news again lately.

Firstly let us get something clear. There was no Fukushima nuclear disaster. Total number of people killed by nuclear radiation at Fukushima was zero. Total injured by radiation was zero. Total private property damaged by radiation….zero. There was no nuclear disaster. What there was, was a major media feeding frenzy fuelled by the rather remote possibility that there may have been a major radiation leak.

At the time, there was media frenzy that “reactors at Fukushima may suffer a core meltdown.” Dire warnings were issued. Well the reactors did suffer a core meltdown. What happened? Nothing.

Continue Reading 13 Comments

Record-breaking Blizzard Kills 75,000 Cattle: Ignored by Biased ‘Global Warming’ MSM

Written by Liz Klimas, The Blaze

Ranchers are still digging out thousands of their cattle that became buried in a record-setting snowstorm in South Dakota late last week and over the weekend.

One would think the death of 75,000 cows by upwards of five feet of snow might get some national attention, but as one blogger observed, it has taken some time for the news of the precipitation massacre to reach outside of local media.

Dakota Blizzard

“I searched the national news for more information. Nothing. Not a single report on any of major news sources that I found. Not CNN, not the NY Times, not MSNBC,” Dawn Wink wrote Tuesday. “I thought, ‘Well, it is early and the state remains without power and encased in snow, perhaps tomorrow.’ So I checked again the next day. Nothing. It has now been four days and no national news coverage.”

Wink dubbed it “The Blizzard that Never Was.”

Nationally syndicated photo services also yield only a few results documenting the storm. The Weather Channel, taking photo submissions from locals, seems to have the most dramatic pictures of the scene.

At least four deaths were attributed to the weather, including a South Dakota man who collapsed while cleaning snow off his roof.

Gary Cammack, who ranches on the prairie near Union Center about 40 miles northeast of the Black Hills, said he lost about 70 cows and some calves, about 15 percent of his herd. A calf would normally sell for $1,000, while a mature cow would bring $1,500 or more, he said.

“It’s bad. It’s really bad. I’m the eternal optimist and this is really bad,” Cammack said. “The livestock loss is just catastrophic. … It’s pretty unbelievable.”

Continue Reading No Comments

“Power In” is NOT Equal to “Power Out”

Written by Joseph E Postma

I keep seeing, from alarmists, warmists, and luke-warmists, the initiating assumption that, in order to conserve energy, you set the power input equal to the power output.  In other words:

Power In = Power Out

Haven’t these people heard of entropy?  The fact that for essentially NOTHING in the universe power in = power out is learned in high school or even well before that.

entropy

So who are the people that claim that power in = power out, in direct violation of the most basic thermodynamics?  Can you actually really be a physicist while claiming that power in = power out, at 100% efficiency?  Nothing is 100% efficient, because of our friend entropy – no matter how efficiently you try to get work out of a system, you can never get as much power out as you put in – there are always losses.

So there’s that, and of course, why else is power in NOT EQUAL TO power out?  Power in is not equal to power out because the energy which constitutes those powers does not come from the same surface area.  For Earth, ‘power out’ does not equal ‘power in’ because the power gets put in on only half the planet, while the ‘power out’ comes from the whole planet.

There’s twice as much surface area from which power can come out than to which power comes in, and so, if the power out equalled the power in, there would be twice as much energy coming out as comes in.  Equating flux will in general lead to a basic violation of conservation of energy.  Equating flux, in general, is not the correct way to conserve energy.
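The arithmetic behind this area argument can be shown with placeholder numbers (the 1000 W input is illustrative only, not a measured value):

```python
# Same total power spread over twice the area gives half the flux density.
power_in = 1000.0      # W, illustrative input power
area_in = 1.0          # m^2, area over which the power comes in
area_out = 2.0         # m^2, twice the area, as for hemisphere vs. whole globe

flux_in = power_in / area_in     # 1000 W/m^2 coming in
flux_out = power_in / area_out   # 500 W/m^2 going out: energy conserved, flux halved

# Forcing the fluxes to be equal would instead require emitting
# flux_in * area_out = 2000 W, i.e. twice the power that came in.
print(flux_in, flux_out)
```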

I mean this is all very basic stuff, which I’ve written on extensively already.  The Earth is not flat, Sunshine is not cold, conserving flux is not the same thing as conserving energy, etc.

And that latter seems to be the source of all the climate confusion, among all participants of the debate.  Only I and other people at PSI (Principia Scientific International, i.e. “the Slayers”) seem to be stating the factual, traditional-science case: that power is not the same thing as energy, that flux can’t be averaged, that real-time differential heat-flow equations are the only true solution for heat flow and temperature, etc.

Continue Reading 54 Comments

UN Climate Scientists Plead for Immunity from Criminal Prosecution: AR5 in Crisis

Written by

The UN’s Intergovernmental Panel on Climate Change (IPCC) published its Fifth Assessment Report (AR5) this week, and already it is in crisis with accusations of malfeasance and pleas from climatologists for immunity from prosecution.

A critical backlash against AR5’s “junk science” is now in full swing, and policymakers in Britain and Australia are already in full retreat from the travesty. The ongoing collapse in the UN climate cabal’s credibility puts a fresh light on why climatologists got in early with their formal request for immunity from prosecution at the Rio de Janeiro UN climate summit of 2012.

Today, prominent statistician Steve McIntyre, one of the analysts often credited with exposing past IPCC ‘errors’, points to why this fiasco may rise to the level of criminality. McIntyre shows how UN officials systematically hid adverse data that was contained in the final draft sent for review but omitted from the subsequent report now issued to the public. The world hasn’t seen this kind of orchestrated institutionalized deceit since the world banking crisis of 2008.

The astonishing plea by the world’s climatologists for immunity from prosecution was first reported last year when it surfaced embarrassingly during the Rio summit. At the time, John Bolton, a former U.S. Ambassador to the UN, was quick to question the motives: “The creeping expansion of claims for privileges and immunities protection for UN activities is symptomatic of a larger problem.”

This week, in ‘IPCC: Fixing the Facts,’ McIntyre identifies the evidence that proves how UN authors cynically removed from their final report facts that contradicted the propaganda set out in the Summary for Policymakers (SPM) issued only last Friday. Such stark evidence reveals that climatologists failed to predict the flatlining of temperatures in recent decades.

McIntyre observes:

“Figure 1.4 of the Second Order Draft clearly showed the discrepancy between models and observations, though IPCC’s covering text reported otherwise. I discussed this in a post leading up to the IPCC Report, citing Ross McKitrick’s article in National Post and Reiner Grundmann’s post at Klimazwiebel. Needless to say, this diagram did not survive. Instead, IPCC replaced the damning (but accurate) diagram with a new diagram in which the inconsistency has been disappeared.”

Continue Reading 4 Comments

National Geographic – Fooled to disgrace itself

Written by Nils-Axel Mörner

The September issue of National Geographic was devoted to the idea that we are facing disastrous flooding in the near future. They had the bad taste to illustrate this with a picture of the Statue of Liberty with the sea reaching up to her waist, some 70 m above the present sea level. This is a complete misconception of the physical possibilities in nature itself.

statue of liberty Nat Geo

The firm scientific facts needed to fully dismiss all such flooding ideas were presented by Professor Don J. Easterbrook last week on WUWT, and I don’t need to add further facts.

There is another side of this tragedy, and that is the question of how, and on what grounds, a top magazine can be fooled into disgracing itself so very much. The IPCC and its supporting boy-scouts seem to have totally lost contact with reality in their claims of sea level rise and disastrous flooding of low-lying islands and coastal areas.

Claims of a sea level rise by 2100 in the order of 1-2 m or more are simply impossible because it would upset all knowledge and all observational facts we have achieved over the entire time of scientific investigations.

In the article in National Geographic references were given to three scientists who were said to be responsible for the “facts” presented. Those persons are:

Philippe Huybrechts (Vrije Universiteit Brussel, Belgium)

Richard S. Williams Jr (Woods Hole Research Centre, US)

James C. Zachos (University of California, US)

They should all know better than to allow the falsification of facts and the discarding of all accumulated knowledge in geology and physics.

Continue Reading 3 Comments

What is Energy?

Written by Joseph E. Postma

Not “Watt is energy”.  In physics, and in whatever else calls itself science, what is the unit of energy?  The unit of energy has a name: it is called the Joule, after English physicist James Prescott Joule.  The Joule is the unit of energy in science.  There are other equivalent metrics for energy, such as “ergs” or “electron volts”, but they are all equivalent to a certain number of Joules.

Watts, on the other hand, are a unit of flux.  In particular, the temporal flux of Joules, meaning the number of Joules being “used” or “passing by” in one second.  The fundamental definition and unit of a Watt is a Joule per second, so, W = J/s where the letters abbreviate the relevant quantities.  So, one Watt is one Joule of energy used in one particular second.  We call this flux.

When we get to radiation or light, and the measure of its strength, these are measured in Watts per square meter, which means Joules per second per square meter, and this is called flux density.  It is a number of Joules, being used each second, over the area of a square meter: W/m² = J/s/m².  These are the units for the Stefan-Boltzmann equation, which is the single equation that exists for converting radiation, or light, into temperature.  What I mean by that is that the equation tells you the temperature of the light given its intensity, or conversely, the intensity of the light given its temperature.  The equation tells you that light has a direct equivalence to temperature, just like mass has a direct equivalence to energy.  The latter equation is Einstein’s E = mc², which shows that mass has an equivalence to energy. Likewise, radiation has an equivalence to temperature via the Stefan-Boltzmann equation, which is F = σT⁴, where F is the flux density of the radiation, σ is a constant, and T is the temperature.
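This equivalence can be written down directly; a minimal sketch, with σ taken as 5.67 × 10⁻⁸ W/m²/K⁴ and 288 K used purely as a familiar example temperature:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def temperature_to_flux(temp_k):
    """Apply F = sigma * T^4: emitted flux density (W/m^2) of a blackbody at temp_k."""
    return SIGMA * temp_k ** 4

def flux_to_temperature(flux_w_m2):
    """Invert F = sigma * T^4: equivalent blackbody temperature (K) of a flux density."""
    return (flux_w_m2 / SIGMA) ** 0.25

# A surface at 288 K (about +15 C) emits roughly 390 W/m^2, and inverting
# 390 W/m^2 returns roughly 288 K.
print(round(temperature_to_flux(288.0)))  # ~390
print(round(flux_to_temperature(390.0)))  # ~288
```

The two functions are exact inverses of each other, which is the sense in which a flux density has "a temperature".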

So then what’s wrong with the IPCC energy budget?  Let’s have a look at it again (see diagram):

IPCC energy budget with backradiation

What they’re doing to get this thing to “work” is adding together the flux densities of light.  Given the Stefan-Boltzmann equation, which shows us that light flux density has an equivalence to temperature, then what this diagram is doing is adding temperatures together to make it work.  When it adds 168 J/s/m² from sunlight with 324 J/s/m² from the atmosphere, it is saying that sunlight is -40°C and that the atmosphere is +1.8°C (because those are the equivalent temperatures of those light flux densities), and that if you add together something that is -40°C to something that’s +1.8°C, you get +15°C. Not just that – the diagram tries to say that air is hotter than sunlight!
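As a check on the equivalent temperatures quoted above, the Stefan-Boltzmann equation can be inverted for each flux: 168 and 324 W/m² are the budget figures under discussion, and 390 W/m² is the surface-emission figure such budgets pair with +15°C:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def equivalent_celsius(flux_w_m2):
    """Blackbody temperature (Celsius) whose emission equals the given flux density."""
    return (flux_w_m2 / SIGMA) ** 0.25 - 273.15

for flux in (168.0, 324.0, 390.0):
    print(flux, round(equivalent_celsius(flux), 1))
# 168 W/m^2 -> about -40 C, 324 W/m^2 -> about +1.8 C, 390 W/m^2 -> about +15 C
```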

Continue Reading 8 Comments

EPA Rebuttal: Man-made CO2 Global Warming is a Fraud

Written by Robert Ashworth PE

CFC Destruction of Ozone was the Real Cause

Introduction

Here is an excerpt [1] from a paper written by a National Oceanic and Atmospheric Administration (NOAA) meteorologist: “Climate models used for estimating effects of increases in greenhouse gases show substantial increases in water vapor as the globe warms and this increased moisture would further increase the warming.” However, this meteorologist, along with the Intergovernmental Panel on Climate Change (IPCC) crowd, got it backwards about water vapor and CO2 — they cool the earth like all other gases in our atmosphere!

Although moisture in the atmosphere does increase with warming, this is because the higher temperature causes more water to evaporate. With every pound of water evaporated, about 1,000 Btu is absorbed, and that causes cooling. Further, increased water in the atmosphere causes further cooling (not warming) by reflecting more of the radiant energy from the Sun that hits the water vapor molecules back to outer space.

Al Gore presented the climate change fraud as well in his “Inconvenient Truth” – actually a “Convenient Lie” – presentation of the Vostok ice core data; see below.

Gore’s “Inconvenient Truth” Documentary — Cause and Effect Reversed

In this documentary, Al Gore fudged the Vostok ice core temperature and CO2 line graphs so they would show a CO2 spike coming first in time, but the real graph showed just the opposite. See the data in a shorter time frame (240,000 years before present rather than the 420,000 years before present presented by Gore). This makes it easier to see which came first; see Figure 1.

Antarctic Ice Core Data

Figure 1. Vostok, Antarctica Ice Core Data [2].

It is clearly seen that a global warming spike (blue line) always comes first. The spike warms the oceans, which slowly reduces the solubility of CO2 in water and results in the liberation of CO2 from the oceans around 800 years later (see Figure 2). Gore gave no explanation of what would cause a CO2 spike to occur in the first place, but then again he is a politician with an agenda to make him wealthy. See the most recent period of warming, between the 500-year Medieval Warm Period and the start of the increase in CO2 in the atmosphere. One can see that CO2 started increasing during a cooling period, showing it was not controlled by the warming, which started some 80 years later; the rise comes about 800 years after the end of the Medieval Warm Period. This is historically what happens. Dr. Michael Mann of Penn State eliminated the Medieval Warm Period in his hockey-stick graph – clearly a fabricated graph produced by “cherry picking” temperature data.
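The ~800-year lead of temperature over CO2 described here is the kind of quantity usually estimated by lagged cross-correlation. A sketch with purely synthetic series (invented for illustration; not the Vostok data):

```python
import math
import random

random.seed(1)

STEP = 100      # years per sample in the synthetic record
TRUE_LAG = 8    # 8 samples = 800 years, built into the synthetic data
N = 500

# One underlying synthetic signal; "CO2" is the same signal TRUE_LAG samples later.
driver = [math.sin(0.7 * i) + random.gauss(0, 0.1) for i in range(N + 20)]
temp = driver[20:20 + N]                        # synthetic temperature proxy
co2 = driver[20 - TRUE_LAG:20 - TRUE_LAG + N]   # synthetic CO2 proxy, trailing temp

def score(lag):
    """Mean product of temperature and the CO2 series shifted by `lag` samples."""
    pairs = zip(temp[:N - lag], co2[lag:])
    return sum(x * y for x, y in pairs) / (N - lag)

best_lag = max(range(16), key=score)
print(best_lag * STEP)   # recovers the built-in 800-year lag
```

The shift that maximizes the score is the estimated lag; with real ice-core data the same scan would be run over the measured temperature and CO2 proxies.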

Continue Reading 3 Comments

“Most Severe Winter Start In 200 Years!”

Written by P. Gosselin, No Trick Zone

+ Euro Municipalities Now Ignoring Foolish Predictions Of Warm Winters

Last Thursday evening and yesterday winter made its debut in Southern Germany and Austria  – and how! Read more here.

German RTL television last night here (starting at 4:30) called it the “most severe start of winter in 200 years!“, saying many meteorologists were caught by surprise. Up to half a meter of snow fell at some locations.

Road Salt Pollinator

Gone are the mild winters of the sort Europe saw in the 1990s and early 2000s. Indeed, for central Europe the last five consecutive winters have all been colder than normal – a record!

These days are blockbuster times for German road salt manufacturers. In Europe municipalities have learned their lesson: ignore foolish predictions of warm winters, order huge quantities of salt, and do it early!

Municipalities and road commissioners were once led astray by climatologists’ predictions of increasingly warmer winters, and came to think that these had become a thing of the past due to global warming (recall the famous words of David Viner and Mojib Latif). One major daily even proclaimed that spring would arrive in January!

As a consequence of these false global warming predictions, municipalities unwisely kept much smaller stocks of road salt for expected shorter and milder winters. Road commissioners saw little reason to keep thousands of tons of road salt in stock.

Then, beginning in 2009, came one harsh winter after another. Road maintenance crews and commissioners were caught red-faced. Suddenly municipalities were running out of salt by January and were no longer able to keep the most vital traffic arteries cleared. Traffic chaos ensued and motorists were left to fend for themselves. Municipalities were stunned and left scratching their heads. Weather-wise, the exact opposite of what climatologists had predicted had taken place. They learned the hard way. Now they are no longer heeding the foolish forecasts of warm winters.

Continue Reading No Comments

Incompetent Climatologists: Dr. Nir Shaviv Nails It

Written by PSI Staff

This week all hell is breaking loose in the crazy world of climate science blogging! Dr. Roy Spencer is rattled, Willis Eschenbach is wilting and Anthony Watts has no clue. But as this latest spat among ‘climate experts’ hots up, it takes an outsider from the ‘hard’ sciences to call it right. Climate research is being done by incompetents, says award-winning astrophysicist Nir Shaviv (pictured).

Nir Shaviv Calling out Climatologists

Let’s take a snapshot of what Dr. Shaviv refers to. Leading skeptic climatologist Dr Roy Spencer (University of Alabama, Huntsville) is currently trouncing Willis ‘Citizen Scientist’ Eschenbach’s pet theory that localized emergent phenomena (e.g. thunderstorms) regulate temperature and that ‘forcing’ has little to do with it. Spencer counters that feedbacks only make sense over entire atmospheric circulation systems.

In his tirade, Dr Spencer appears to be calling Willis’s ideas either unoriginal or a plagiarism of Ramanathan and Collins (1991). Also, Spencer appears to be saying that ‘citizen scientists’ should first study for a PhD, and only when fully accredited as experts should they speak up (a.k.a. the ‘argument from authority’). What a palaver!

Continue Reading 1 Comment

The Anti-Science IPCC Global Warming Report 5

Written by Dr Charles Anderson

Fundamentally, the IPCC has never had any solid evidence of measurable man-made global warming caused by man’s emissions of carbon dioxide.  The newest report just issued does not change this.  Yet, the Summary Report issued to the press and politicians claims that catastrophic man-made global warming is now known to be more certain than ever. 

This claim is made on the basis of general circulation model (GCM) computer simulations interpreted with an embarrassing flight of fancy. In light of that claim, let us examine the predictions of the GCMs used in the prior reports, when we were informed that the science was already settled and well-known.  The draft report that was sent out to actual scientists for review had the following graph in it.

The various shaded areas show the range of certainty of the average global temperature according to the body of computer models.

Temp anomaly IPCC Fifth Report

The FAR was the first report of 1990, the SAR was the second report of 1995, the TAR was the third report of 2001, and the AR4 is the fourth report of 2007.  Over this period the U.S. government alone spent about $150 billion funding climate change related phenomena.  Each IPCC report claimed a higher level of confidence in catastrophic man-made global warming. 

So we should expect to be able to look at the range of expected temperatures from each report for 2015 and see that the range of each successive report falls within the range of the previous report, but is narrower.  This is both because the claim is that the science is better known and because the prediction time is becoming shorter. Because the colored ranges overlap, it is easiest to quickly see how the certainty of the predictions of the settled science actually changed from report to report by looking at the color-coded brackets on the right side of the graph.  These represent the range of the prediction for 2015 for each report.

So what actually has happened is that the settled science did claim a smaller temperature range in the second report than in the first report, but its prediction range did not lie entirely within the range claimed in the first report.  No, it admitted that the temperature increase might be smaller.  In the third report the 2015 temperature range was much wider than in the second report.  The 2015 temperature might be much higher than that predicted in the second report, or a bit lower. 

This represented a large increase in the scientific uncertainty being claimed.  The fourth report claimed that knowledge had improved, and the range shrank compared to that of the third report, but apparently the knowledge was not as good as that of the second report, whose range was narrower.  While the range of the fourth report prediction does lie entirely within that of the third report prediction and that of the first report, it excludes the lower part of the range of the second report on the settled science. Unfortunately for the IPCC, the fourth report prediction of the temperature did not allow for such low temperatures as have been measured in the meantime.

The black and red dots of recent years are well below the predicted range of the fourth report, and even slightly below the ranges of all of the reports.  The only conclusion a rational person can make is that the settled science incorporated into the many computer models was wrong.

Continue Reading No Comments

IPCC Climate Models Fail to Reproduce Decadal & Multidecadal Patterns Since 1850

Written by Nicola Scafetta, Earth Science Reviews

 

Nicola Scafetta (2013) Discussion on climate oscillations: CMIP5 general circulation models versus a semi-empirical harmonic model based on astronomical cycles, Earth-Science Reviews 126 (2013) 321–357
 
Abstract: Power spectra of global surface temperature (GST) records (available since 1850) reveal major periodicities at about 9.1, 10–11, 19–22 and 59–62 years. Equivalent oscillations are found in numerous multisecular paleoclimatic records. The Coupled Model Intercomparison Project 5 (CMIP5) general circulation models (GCMs), to be used in the IPCC Fifth Assessment Report (AR5, 2013), are analyzed and found not able to reconstruct this variability. In particular, from 2000 to 2013.5 a GST plateau is observed while the GCMs predicted a warming rate of about 2 °C/century. In contrast, the hypothesis that the climate is regulated by specific natural oscillations more accurately fits the GST records at multiple time scales. For example, a quasi 60-year natural oscillation simultaneously explains the 1850–1880, 1910–1940 and 1970–2000 warming periods, the 1880–1910 and 1940–1970 cooling periods and the post 2000 GST plateau.
 
This hypothesis implies that about 50% of the ~0.5 °C global surface warming observed from 1970 to 2000 was due to natural oscillations of the climate system, not to anthropogenic forcing as modeled by the CMIP3 and CMIP5 GCMs. Consequently, the climate sensitivity to CO2 doubling should be reduced by half, for example from the 2.0–4.5 °C range (as claimed by the IPCC, 2007) to 1.0–2.3 °C with a likely median of ~1.5 °C instead of ~3.0 °C. Also modern paleoclimatic temperature reconstructions showing a larger preindustrial variability than the hockey-stick shaped temperature reconstructions developed in early 2000 imply a weaker anthropogenic effect and a stronger solar contribution to climatic changes. The observed natural oscillations could be driven by astronomical forcings. The ~9.1 year oscillation appears to be a combination of long soli–lunar tidal oscillations, while quasi 10–11, 20 and 60 year oscillations are typically found among major solar and heliospheric oscillations driven mostly by Jupiter and Saturn movements.
 
Solar models based on heliospheric oscillations also predict quasi secular (e.g. ~115 years) and millennial (e.g. ~983 years) solar oscillations, which hindcast observed climatic oscillations during the Holocene. Herein I propose a semi-empirical climate model made of six specific astronomical oscillations as constructors of the natural climate variability spanning from the decadal to the millennial scales plus a 50% attenuated radiative warming component deduced from the GCM mean simulation as a measure of the anthropogenic and volcano contributions to climatic changes. The semi-empirical model reconstructs the 1850–2013 GST patterns significantly better than any CMIP5 GCM simulation. Under the same CMIP5 anthropogenic emission scenarios, the model projects a possible 2000–2100 average warming ranging from about 0.3 °C to 1.8 °C. This range is significantly below the original CMIP5 GCM ensemble mean projections spanning from about 1 °C to 4 °C.
 
Future research should investigate space-climate coupling mechanisms in order to develop more advanced analytical and semi-empirical climate models. The HadCRUT3 and HadCRUT4, UAH MSU, RSS MSU, GISS and NCDC GST reconstructions and 162 CMIP5 GCM GST simulations from 48 alternative models are analyzed.
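The role the abstract assigns to the quasi-60-year oscillation can be illustrated with a toy harmonic term; the amplitude and the phase (peaks near 1940 and 2000) are assumptions chosen to match the intervals listed, not Scafetta's fitted values:

```python
import math

A = 0.15          # amplitude in degrees C, illustrative only
PERIOD = 60.0     # years, the quasi 60-year cycle discussed in the abstract

def oscillation(year):
    """Toy 60-year harmonic with maxima near 1940 and 2000 (assumed phase)."""
    return A * math.cos(2 * math.pi * (year - 2000.0) / PERIOD)

# Net change over each 30-year interval named in the abstract:
for start, end in ((1910, 1940), (1940, 1970), (1970, 2000)):
    print(start, end, round(oscillation(end) - oscillation(start), 2))
# 1910-1940 and 1970-2000 come out as warming phases (+2A),
# 1940-1970 as a cooling phase (-2A).
```

A single fixed-period term like this is one building block of the six-oscillation semi-empirical model the abstract describes; the full model adds further periods and an attenuated anthropogenic component.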
Continue Reading No Comments

Equating Flux

Written by Joseph E Postma

In the last post was an explanation of the difference between energy and energy flux.  Energy is generally a simple static scalar quantity, while flux refers to an instantaneous expenditure of energy.

Physics, i.e. the real world and real-world reactions, occurs in real time.  Reality doesn’t wait around for an average of something to build up and then decide to act – reality acts as time flows by, each infinitesimal moment to the next.  Reality reacts to instantaneous flux, not the average flux, because there is no “average” that reality waits around to react to.

The standard procedure for “conserving energy”, and then creating an energy budget and subsequent greenhouse effect, is to numerically equate the terrestrial flux output with the solar flux input. This numerical procedure is done with the justification that “on average, the input and output must be equal if the system is in equilibrium”.  But this is done numerically on paper, not physically in reality, because the physics of reality reacts instantaneously to forces, and doesn’t wait around for averages.

Equating Energy

So what’s the basic thing that we’re actually trying to conserve in regards to solar input and terrestrial output?  The real physical quantity we want to conserve is energy, not flux.  Energy is a fundamental unit of physics, while flux always depends upon the particular, real-time, local situation.  So if we assume that, on average, the input and output energies are equal, which they should be, then we can consider such energies for any particular second.  Considering any particular second is convenient since this allows us to directly convert the energy into flux later on.

In any given second, the Earth absorbs 1.22 × 10¹⁷ Joules of light energy from the Sun.  This is calculated with the Stefan-Boltzmann equation for the Sun, factored for the distance to the Earth and the Earth’s cross-sectional area, and its albedo.

In any given second, this energy, 1.22 × 10¹⁷ Joules, falls on one side of the planet – the day-side hemisphere. So, now that we know the total energy falling on the Earth in one second, and we also know where the energy falls in one second, we can convert the energy value into the units of the Stefan-Boltzmann equation, which are Joules per second per square meter. Therefore, if we take the total energy and divide it by the surface area of a hemisphere of the Earth, we get a (linear) average of 480 Joules per second per square meter, or 480 W/m². Using the Stefan-Boltzmann equation, which equates flux to temperature, this is a temperature of +30 degrees C, which is very nice and warm and will melt ice into water on the day-side, etc. It is a reasonable number.
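The arithmetic in this paragraph can be checked directly. The absorbed-energy figure is the article's; the Earth radius (~6371 km) and σ = 5.67 × 10⁻⁸ W/m²/K⁴ are standard values assumed here:

```python
import math

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/m^2/K^4
R_EARTH = 6.371e6        # mean Earth radius, m (assumed)
E_PER_SECOND = 1.22e17   # solar energy absorbed each second, J (article's figure)

hemisphere_area = 2 * math.pi * R_EARTH ** 2        # day-side hemisphere, m^2
avg_flux = E_PER_SECOND / hemisphere_area           # linear average over the lit side
avg_temp_c = (avg_flux / SIGMA) ** 0.25 - 273.15    # equivalent blackbody temperature

print(round(avg_flux))    # ~478 W/m^2, i.e. roughly the 480 W/m^2 quoted
print(round(avg_temp_c))  # ~30 C, matching the +30 degrees C in the text
```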

However, we must again recall that reality reacts to forces instantaneously, and not to averages of those forces after-the-fact.  The light energy falling on the day-side hemisphere in one second is not evenly (linearly) distributed, because the Earth is round, not flat. That means that there is a locality dependence on the true, real-time value of the flux density.  That is, when the Sun is overhead it is strongest, when it is near sunrise or sunset it is weakest, and in between it smoothly ranges.  When the Sun is directly overhead, and even barely so, the flux density of the energy falling isn’t just strong enough to melt ice into water – it is also strong enough to evaporate water into vapour.  What basically creates everything we recognize as the climate is water vapour rising into the atmosphere under the strength of the Sun, and this occurs in real time.  The greenhouse effect models do not show this, and they actually even contradict it, because they incorrectly average the power of the Sun to where it doesn’t physically exist, and thereby make the solar power far too cold (on paper) to be able to create that water cycle and climate.

This diagram is a representation of real-time reality and the physics that drives the climate on the Earth:

[Figure: Postma Earth Energy Budget]

Back to Equating Flux

With an energy input of 1.22 × 10¹⁷ Joules over a hemisphere in one second from the Sun, and an energy output from the Earth of 1.22 × 10¹⁷ Joules from the entire globe, i.e. both hemispheres, it is not physically correct to equate these values in terms of flux. These values are true and totally correct in terms of energy.  They cannot be made to be equal in terms of flux.

For example, if we say that the Earth is in numerical flux equilibrium with the Sun, and mistake this for conserving energy, then that would mean the Earth must emit the same flux of energy as it receives from the Sun.  Therefore the Earth must emit 480 W/m² on average, since that is what it receives from the Sun on an instantaneous basis.

Well, the Earth does not emit this flux of energy.  That is far too high a value.  If you converted that flux into total energy emitted per second over the entire globe, it would be more energy than actually comes in.  The known and measured value for the flux output from the Earth is 240 W/m².
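The figures in this passage can be checked directly: spreading the same total absorbed power over the full sphere (area 4πR², twice the hemisphere) gives the roughly 240 W/m² output flux, half the hemispheric input flux. A short check with the same assumed standard constants:

```python
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S0 = 1361.0        # solar constant, W/m^2
ALBEDO = 0.30      # Earth's Bond albedo
R_EARTH = 6.371e6  # mean Earth radius, m

# Total absorbed power, as before
absorbed_watts = S0 * (1 - ALBEDO) * math.pi * R_EARTH**2

# The same energy per second, emitted from the whole spherical surface
sphere_area = 4 * math.pi * R_EARTH**2
flux_out = absorbed_watts / sphere_area
print(f"Global-average output flux: {flux_out:.0f} W/m^2")        # ~240

# The hemispheric input flux is exactly twice the global output flux
flux_in_hemisphere = absorbed_watts / (2 * math.pi * R_EARTH**2)
print(f"Hemispheric input flux: {flux_in_hemisphere:.0f} W/m^2")  # ~480
```

The total energy in each case is identical; only the area over which it is averaged differs, which is the distinction the article is drawing between equating energies and equating fluxes.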


IPCC in Disarray: Time for a Review of Greenhouse Gas ‘Science’

Written by

As the UN’s Intergovernmental Panel on Climate Change (IPCC) flops with the release of its Fifth Report, global policymakers are being left in no doubt why. Skepticism about man-made global warming and doubts about the validity of the ‘science’ of the greenhouse gas ‘theory’ are at all-time highs.

[Image: IPCC sinking]

The reason? Despite carbon dioxide (CO2) levels rising by 40 percent, global temperatures have flatlined since 1998. None of the IPCC’s climate models foresaw this. In fact, the greenhouse gas ‘theory,’ the scientific cornerstone of 30 years of climate alarm, unequivocally states that increased carbon dioxide in our atmosphere must cause more warming. But reality is disproving the theory.

The latest IPCC report is now reduced to conceding “natural variability” does play a part. This admission contradicts another cornerstone of their main thesis, that natural causes are of little or no consequence. But as the ‘Slayers’ of the theory have long shown, it was always flawed because it made many dubious assumptions including the following:

  • The earth is flat.
  • The earth does not rotate.
  • The sun shines all day and all night with equal intensity.
  • Energy interchange in the climate is entirely by radiation. 
  • Conduction, convection and latent heat transfer do not happen.
  • Energy flow parameters are constants with no variability.
  • Energy flow is “balanced” with input equal to output.
  • Air movements, wind, rain, hurricanes are ignored.
  • Chaos has been abolished.
  • Change in this system is entirely caused by increasing human-induced trace gases in the atmosphere.
  • The earth is dead: there are no living organisms, no trees, animals, birds or people.

At this point honest scientists would admit the ‘theory’ seems discredited. Rational minds would admit that a fresh look is needed at the counterclaims of dissenting scientists. Such scientists have found a rallying point at Principia Scientific International (PSI).


Do we really have a “33 °C Greenhouse effect”?

Written by Jan Zeman, Czech Technical University, Prague

The Wikipedia entry for the Greenhouse Gas Effect states:

If an ideal thermally conductive black-body was the same distance from the Sun as the Earth is, it would have a temperature of about 5.3 °C. However, since the Earth reflects about 30% of the incoming sunlight, this idealized planet’s effective temperature (the temperature of a black-body that would emit the same amount of radiation) would be about −18 °C. The surface temperature of this hypothetical planet is 33 °C below Earth’s actual surface temperature of approximately 14 °C. The mechanism that produces this difference between the actual surface temperature and the effective temperature is due to the atmosphere and is known as the greenhouse effect.

The statement is almost completely untrue. For instance, not even the math adds up: the difference between the two quoted temperatures, +14 °C and −18 °C, is 32 °C, not 33 °C. But that is a minor point; what matters here is that there is not a difference of 33 °C, nor of 32 °C, between the hypothetical and the real Earth surface temperature. In short, there is clearly confusion about what is meant scientifically when describing the “surface” of Earth.
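The two Wikipedia temperatures quoted above come from applying the Stefan-Boltzmann law to the whole-sphere average of the incoming sunlight, with and without albedo. A short sketch reproducing them (standard constants assumed for illustration; the exact decimals depend on which value of the solar constant is used, and the quoted 5.3 °C and −18 °C are rounded):

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S0 = 1361.0        # solar constant at 1 AU, W/m^2

# Ideal black body at Earth's distance absorbing all sunlight,
# radiating from the full sphere (hence the factor of 4)
t_no_albedo = (S0 / (4 * SIGMA)) ** 0.25          # ~278 K, i.e. about +5 C

# Same body, but reflecting 30% of the sunlight (Earth's albedo)
t_effective = (0.70 * S0 / (4 * SIGMA)) ** 0.25   # ~255 K, i.e. about -18 C

print(round(t_no_albedo - 273.15, 1))   # no-albedo black-body temperature, C
print(round(t_effective - 273.15, 1))   # effective temperature, C
```

Comparing the unrounded effective temperature against the ~+14 °C observed mean shows where the 32 °C versus 33 °C discrepancy in the rounded Wikipedia figures arises.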

I don’t want to rewrite astronomic customs, but for such purposes as a black-body radiation flux equation to and from the planet using the Stefan-Boltzmann law, the surface of the Earth should arguably be considered to be “the atmosphere,” not the surfaces of the sea and land. The reason is that it is only the uppermost layer of the planet’s mass that is capable of radiating – in the sense defined by the Stefan-Boltzmann equation – across the boundary into the vacuum of space beyond.

This confusion is a result of our human perspective. In the case of big gas planets like Jupiter, which we observe from the outside, hardly anybody would suggest that the exterior of its small, uncertain core is the “surface.”

Indeed, there is an even stranger boundary convention to consider, whereby we could separate the “surface” from the atmosphere arbitrarily at the point where Jupiter’s immense atmospheric pressure crosses 10 bar. Nonetheless, when talking about the Stefan-Boltzmann law (i.e. about black-body radiation) as applied to Earth and its radiation budget, we should consider the gaseous atmosphere as being the Earth’s surface, not the actual surfaces of sea and land below.
