Time To Stop The Insanity Of Wasting Time and Money On More Climate Models?

Nearly every climate model prediction, projection, or whatever else they want to call it has been wrong. Weather forecasts beyond 72 hours typically deteriorate into their error bands, and the UK Met Office summer forecast was wrong yet again.

I have lost track of the number of times they were wrong. Apparently, the British Broadcasting Corporation had enough, because it stopped using the Met Office's services. The forecasts are not just marginally wrong; invariably, the weather is the inverse of what was forecast. Short, medium, and long-term climate forecasts are wrong more than 50 percent of the time, so a correct one is no better than a random event.

Global and regional forecasts are often equally incorrect. If there were a climate model that made even 60 percent accurate forecasts, everybody would use it. Since no single climate model produces accurate forecasts, the IPCC resorts to averaging its model forecasts, as if, somehow, the errors would cancel each other out and the average would be representative.
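To see why averaging does not rescue biased models, consider a minimal sketch (all numbers are invented for illustration): if every model errs in the same direction, the ensemble mean inherits that shared error instead of cancelling it. Averaging only removes errors that are random and centred on zero.

```python
# Minimal sketch: averaging cancels only zero-centred random errors.
# If models share a systematic warm bias, the ensemble mean keeps it.
# All numbers are invented for illustration.
import random

random.seed(1)
observed_trend = 0.10   # hypothetical "true" trend, deg C per decade
shared_bias = 0.15      # systematic bias common to every model

# Ten hypothetical models: shared bias plus small random scatter
forecasts = [observed_trend + shared_bias + random.gauss(0, 0.05)
             for _ in range(10)]

ensemble_mean = sum(forecasts) / len(forecasts)
print(f"ensemble mean:  {ensemble_mean:.3f}")
print(f"observed trend: {observed_trend:.3f}")
print(f"residual error: {ensemble_mean - observed_trend:.3f}  # ~= shared_bias")
```

The random scatter averages away; the shared bias does not.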

Short-term climate forecasts no better than the Old Farmer’s Almanac

Climate models and their forecasts have been unmitigated failures that would cause an automatic cessation in any other enterprise. Unless, of course, it was another government-funded fiasco. Daily weather forecasts have improved since modern forecasting began in World War I. However, even short-term climate forecasts appear no better than those of the Old Farmer’s Almanac, which first appeared in 1792 and uses moon, sun, and other astronomical and terrestrial indicators.

I have written and often spoken about the key role of the models in creating and perpetuating the catastrophic AGW mythology. People were shocked by the leaked emails from the Climatic Research Unit (CRU), but most don’t know that the actual instructions to “hide the decline” in the tree-ring portion of the hockey stick graph were in the computer code. It is one reason people translate the Garbage In, Garbage Out (GIGO) acronym as Gospel In, Gospel Out when speaking of climate models.

I am tired of the continued pretense that climate models can produce accurate forecasts in a chaotic system. Sadly, the pretense occurs on both sides of the scientific debate. The reality is the models don’t work and can’t work for many reasons, including the most fundamental: lack of data, lack of knowledge of major mechanisms, lack of knowledge of basic physical processes, inability to represent physical mechanisms like turbulence in mathematical form, and lack of computer capacity.

Bob Tisdale summarized the problems in his 2013 book Climate Models Fail. It is time to stop wasting time and money and to put people and computers to more important uses. The only thing that keeps people working on the models is government funding, either at weather offices or in academia. Without this funding, computer modelers would not dominate the study of climate.

Without the funding, the Intergovernmental Panel on Climate Change (IPCC) could not exist. Many of the people involved in climate modeling were not familiar with, or had no training in, climatology or climate science. They were graduates of computer modeling programs looking for a challenging opportunity with large amounts of funding available and access to large computers.

The atmosphere, and later the oceans, fit the bill. Now they put the two together to continue the fiasco, all at massive expense to society. Those expenses include the computers and the modeling time but, worse, the cost of applying the failed results to global energy and environmental issues.

Let’s stop pretending and wasting money and time. Remove that funding and nobody would spend private money to work on climate forecast models.

I used to argue there was some small value in playing with climate models in a laboratory setting, provided a scientific responsibility for accuracy, feasibility, and applicability was maintained. Now I realize that position was wrong, because it is clear the models do not fulfill those responsibilities. When model results are used as the sole basis for government policy, there is no value.

It is a massive cost and detriment to society, which is what the IPCC was specifically designed to do. The IPCC has one small value: it illustrates all the problems identified in the previous comments. Laboratory-generated climate models are manipulated outside even basic scientific rigor in government weather offices or academia, and then become the basis of public policy through the Summary for Policymakers (SPM).

Another value of the IPCC Physical Science Basis reports is that they provide a detailed listing of why the models can’t and don’t work. Too bad so few read or understand them. If they did, they would realize the limitations are such that they preclude any chance of success. Even a partial examination illustrates the point.

Data

The IPCC people knew of the data limitations from the start, but it didn’t stop them building models. In 1993, Stephen Schneider, a primary player in the anthropogenic global warming hypothesis and the use of models, went beyond doubt to certainty when he said,

“Uncertainty about important feedback mechanisms is one reason why the ultimate goal of climate modeling – forecasting reliably the future of key variables such as temperature and rainfall patterns – is not realizable.”

A February 3, 1999, US National Research Council report said,

“Deficiencies in the accuracy, quality and continuity of the records place serious limitations on the confidence that can be placed in the research results.”

To which Kevin Trenberth responded,

“It’s very clear we do not have a climate observing system…. This may come as a shock to many people who assume that we do know adequately what’s going on with the climate, but we don’t.”

Two directors of the CRU, Tom Wigley and Phil Jones, said,

“Many of the uncertainties surrounding the causes of climate change will never be resolved because the necessary data are lacking.”

70% of the world is oceans and there are virtually no stations

The Poles are critical in the dynamics of driving the atmosphere and creating climate, yet there are virtually no stations in the 15 million km² of the Arctic Ocean or the 14 million km² of Antarctica. Approximately 85% of the Earth's surface has no weather data.

The IPCC acknowledges these limitations by claiming that a single station's data are representative of conditions within a 1200 km radius. Is that a valid assumption? I don't think it is.
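The arithmetic behind that skepticism is easy to check. A minimal sketch (only the 1200 km figure comes from the IPCC claim; the Earth's surface area is a standard value):

```python
# Back-of-envelope check on the 1200 km representativeness claim.
import math

radius_km = 1200.0                            # IPCC representativeness radius
area_per_station = math.pi * radius_km ** 2   # ~4.52 million km^2
earth_surface_km2 = 510e6                     # total surface area of the Earth

print(f"area one station 'represents': {area_per_station / 1e6:.2f} million km^2")
print(f"stations needed to 'cover' Earth: {earth_surface_km2 / area_per_station:.0f}")
```

On that assumption, roughly 113 ideally placed stations would suffice to characterise the climate of the entire planet, which shows how generous the assumption is.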

But it isn’t just a lack of data at the surface. Strictly speaking, the readings are not surface data at all: they are taken at heights between 1.25 and 2 m above the surface, and as researchers from Geiger (Climate Near the Ground) onward have shown, those readings differ markedly from actual surface temperatures as measured at the few microclimate stations that exist.

Arguably, US surface stations are the best, but Anthony Watts’ diligent study shows that only 7.9 percent of them are accurate to better than 1°C (Figure 1). To put that in perspective, in the 2001 IPCC Report, Jones claimed a 0.6°C increase over 120 years was beyond any natural increase. That also underscores the fact that most of the instrumental record temperatures were measured to a precision of only 0.5°C.

[Figure 1]

Other basic data, including precipitation, barometric pressure, and wind speed and direction, are even sparser than the temperature data. For example, Africa has only 1152 weather watch stations, one-eighth of the World Meteorological Organization (WMO) recommended minimum density.
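Taking the one-eighth figure at face value, the implied WMO minimum for Africa follows directly (a sketch; only the 1152 stations and the one-eighth ratio come from the text):

```python
# Implied WMO recommended minimum from the figures quoted above.
actual_stations = 1152        # weather watch stations in Africa (from text)
fraction_of_minimum = 1 / 8   # stated density relative to the WMO minimum

recommended_minimum = actual_stations / fraction_of_minimum
print(f"implied WMO minimum: {recommended_minimum:.0f} stations")
print(f"shortfall: {recommended_minimum - actual_stations:.0f} stations")
```

That is a shortfall of more than 8000 stations on a single continent.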

As I noted in an earlier paper, the lack of data for all phases of water alone guarantees the failure of IPCC projections. The models attempt to simulate a three-dimensional atmosphere, but there is virtually no data above the surface.

The modelers think we are foolish enough to believe that more layers in the model will solve the problem, but more layers don’t matter if you have no data to put in them.

Major Mechanisms

During my career as a climatologist, several mechanisms of weather and climate were either discovered or measured, supposedly with sufficient accuracy for use in a model. These include El Niño/La Niña (ENSO), the Pacific Decadal Oscillation (PDO), the Atlantic Multidecadal Oscillation (AMO), the Antarctic Oscillation (AAO), the North Atlantic Oscillation (NAO), the Dansgaard-Oeschger Oscillation (D-O), the Madden-Julian Oscillation (MJO), and the Indian Ocean Dipole (IOD), among others.

Milankovitch Effect not included in IPCC models

Despite this, we are still unclear about the mechanisms associated with the Hadley Cell and the Inter-tropical Convergence Zone (ITCZ), which together constitute essentially the entire tropical climate mechanism. The Milankovitch Effect remains controversial and is not included in IPCC models.

The Cosmic Theory appears to provide an answer to the relationship between sunspots, global temperature, and precipitation but is similarly ignored by the IPCC.

They do not deal with the monsoon mechanism well, as they note:

“In short, most AOGCMs do not simulate the spatial or intra-seasonal variation of monsoon precipitation accurately.”

There is very limited knowledge of the major oceanic circulations, at the surface and in the depths. There are virtually no measures of the volumes of heat transferred or how they change over time, including measures of geothermal heat.

Physical Mechanisms

The IPCC acknowledges that,

“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.”

That comment alone is sufficient to argue for cessation of the waste of time and money. Add the second and related problem identified by Essex and McKitrick in Taken By Storm, and the case is confirmed.

“Climate research is anything but a routine application of classical theories like fluid mechanics, even though some may be tempted to think it is. It has to be regarded in the ‘exotic’ category of scientific problems in part because we are trying to look for scientifically meaningful structure that no one can see or has ever seen, and may not even exist.

“In this regard it is crucial to bear in mind that there is no experimental set up for global climate, so all we really have are those first principles. You can take all the measurements you want today, fill terabytes of disk space if you want, but that does not serve as an experimental apparatus. Engineering apparatus can be controlled, and those running them can make measurements of known variables over a range of controlled physically relevant conditions.

“In contrast, we have only today’s climate to sample directly, provided we are clever enough to even know how to average middle realm data in a physically meaningful way to represent climate. In short, global climate is not treatable by any conventional means.”

Computer capacity

Modelers claim computers are getting better and that all they need are bigger, faster computers. It can’t make any difference, but they continue to waste money. In 2012, Cray introduced the promotionally named Gaea supercomputer (Figure 2).

It has a capacity of 1.1 petaflops. FLOPS means floating-point operations per second, and peta denotes 10¹⁵, so the machine performs about 1.1 quadrillion floating-point operations per second. Jagadish Shukla says the challenge is that,

“We must be able to run climate models at the same resolution as weather prediction models, which may have horizontal resolutions of 3-5 km within the next 5 years. This will require computers with peak capability of about 100 petaflops.”

Regardless of the computer capacity, it is meaningless without data for the model.
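Still, a rough scaling argument shows why Shukla's target is so demanding. This is a sketch, assuming compute cost grows with the number of horizontal grid columns times the number of time steps, roughly the cube of the inverse grid spacing via the CFL limit; the 25 km baseline resolution is an illustrative assumption, not a figure from the text:

```python
# Naive cost scaling for refining a global model's horizontal grid.
# Assumes cost ~ (grid columns) x (time steps) ~ 1 / dx^3 (CFL limit).
# The 25 km baseline is an illustrative assumption.
earth_surface_km2 = 510e6

def columns(dx_km):
    """Approximate number of horizontal grid columns at spacing dx_km."""
    return earth_surface_km2 / dx_km ** 2

baseline_dx, target_dx = 25.0, 5.0
cost_factor = (baseline_dx / target_dx) ** 3

print(f"columns at {baseline_dx:.0f} km: {columns(baseline_dx):.2e}")
print(f"columns at {target_dx:.0f} km:  {columns(target_dx):.2e}")
print(f"naive cost multiplier, 25 km -> 5 km: {cost_factor:.0f}x")
```

Multiplying Gaea's 1.1 petaflops by that factor of 125 lands in the same range as Shukla's 100 petaflop figure, and that is before adding vertical resolution or new physics.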

[Figure 2]

Failed Forecasts (Predictions, Projections)

Figure 3 shows the failed IPCC forecasts. They call them projections, but the public believes they are forecasts. Either way, they are consistently wrong. Notice the labels added to Hayden’s graph, taken from the Summary for Policymakers: as the error range in the actual data increases, the Summary claims the forecasts are improving. One of the computer models used for the IPCC forecast belongs to Environment Canada. Its forecasts are the worst of all those averaged in the IPCC results (Figure 4).

[Figure 3]

[Figure 4]

The Canadian disaster is not surprising, as their own one-year forecast assessment indicates. They make a one-year forecast and provide a map indicating the percentage accuracy against the average for the period 1981-2010 (Figure 5).

[Figure 5]

The Canadian average accuracy is shown in the bottom left as 41.5 percent. That is the best they can achieve after some thirty years of developing the models. Other countries’ results are no better.
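The 41.5 percent figure means little without a chance baseline. Seasonal outlooks of this kind are typically verified against three categories (above, near, and below the 1981-2010 normal), so random guessing scores about 33 percent. A minimal sketch, assuming three equally likely categories (an assumption about the verification scheme, not stated in the text):

```python
# How much better than chance is 41.5 percent accuracy?
# Assumes a three-category (above / near / below normal) forecast with
# equally likely categories -- an assumption, not stated in the source.
forecast_accuracy = 0.415
chance_accuracy = 1 / 3

margin = forecast_accuracy - chance_accuracy
skill = margin / (1 - chance_accuracy)   # Heidke-style skill score
print(f"margin over chance: {margin:.1%}")
print(f"skill score: {skill:.3f}  # 0 = no skill, 1 = perfect")
```

On those assumptions, the forecasts beat random guessing by about eight percentage points, a skill score of roughly 0.12.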

In a New Scientist report, Tim Palmer, a leading climate modeller at the European Centre for Medium-Range Weather Forecasts in Reading, England, said:

“I don’t want to undermine the IPCC, but the forecasts, especially for regional climate change, are immensely uncertain.”

The Cost – “The sum expended must be well over $100 billion”

Joanne Nova has done the most research on the cost of climate research to the US government:

“In total, over the last 20 years, by the end of fiscal year 2009, the US government will have poured in $32 billion for climate research—and another $36 billion for development of climate-related technologies. These are actual dollars, obtained from government reports, and not adjusted for inflation. It does not include funding from other governments. The real total can only grow.”

There is no doubt that number grew, and the world total is likely double the US amount, as this commentator claims:

“However, at least I can add a reliable half-billion pounds to Joanne Nova’s $79 billion – plus we know already that the EU Framework 7 programme includes €1.9 billion on direct climate change research. Framework 6 runs to €769 million. If we take all the Annex 1 countries, the sum expended must be well over $100 billion.”

These are just the computer modeling costs.
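A quick tally of the figures quoted above shows how the total climbs toward $100 billion (a sketch; the line items come from the text, while the exchange rates are illustrative assumptions):

```python
# Rough tally of the quoted spending figures, in billions of USD.
# Exchange rates are illustrative assumptions, not from the source.
GBP_TO_USD = 1.6
EUR_TO_USD = 1.3

us_research   = 32.0                  # US climate research to FY2009 (Nova)
us_technology = 36.0                  # US climate-related technology (Nova)
uk_addition   = 0.5 * GBP_TO_USD      # "a reliable half-billion pounds"
eu_framework7 = 1.9 * EUR_TO_USD      # EU Framework 7 climate research
eu_framework6 = 0.769 * EUR_TO_USD    # EU Framework 6

total = us_research + us_technology + uk_addition + eu_framework7 + eu_framework6
print(f"tally of quoted items: ~${total:.0f} billion (US, UK, EU only)")
# Doubling the US share for the rest of the world, as the commentator
# suggests for all Annex 1 countries, pushes the total well past $100 billion.
```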

The economic and social costs are much higher and virtually impossible to calculate.

As Paul Driessen explains,

“As with its polar counterparts, 90% of the titanic climate funding iceberg is invisible to most citizens, businessmen and politicians.”

It’s no wonder Larry Bell can say,

“The U.S. Government Accounting Office (GAO) can’t figure out what benefits taxpayers are getting from the many billions of dollars spent each year on policies that are purportedly aimed at addressing climate change.”

If it is impossible for a supposedly sophisticated agency like the US GAO to determine the costs, then there is no hope for a global assessment. There is little doubt the direct cost is measured in trillions of dollars. And that does not include the lost opportunities for development and the lives continuing in poverty.

All this because of the falsified results of completely failed computer model predictions, projections, or whatever they want to call them. Is it time to stop the insanity, which in climate science is the repetition of creating computer models that don’t and can’t work? I think so.

“Those who have knowledge don’t predict. Those who do predict don’t have knowledge.” – Lao Tzu (6th century BC)

See Dr Ball’s website here: drtimball.com

Dr Tim F Ball:

B.A., (Honours), Gold Medal Winner, University of Winnipeg, 1970

M.A., University of Manitoba, 1971

Ph.D. (Doctor of Science), Queen Mary College, University of London (England), 1982

Career:

1996 to present – Environmentalist, Public Speaker, Consultant, Author, Columnist

1988-96 Professor, University of Winnipeg

1984-88 Associate Professor, University of Winnipeg

1982-84 Assistant Professor, University of Winnipeg

1977-78 Acting Dean of Students

1972-82 Lecturer, Department of Geography, University of Winnipeg

1971-72 Instructor, Geography Department, University of Winnipeg

Dr Ball has authored more than 80 significant publications.

See list here:

http://drtimball.com/_files/dr-tim-ball-CV.pdf

 
