Just How Accurate Are Weather And Climate Measurements?

The accuracy and integrity of weather and climate measurements have always been a concern. However, errors and omissions were not as consequential in the past as they are now.

A hundred, or even fifty, years ago, our major weather concerns were mostly local. Today, a passenger flying from NYC to LAX needs more detailed and reliable weather information: is it snowing in St. Louis, where the flight has a layover?

Or consider the farmer in Nebraska who watches the spring wheat production forecast for Ukraine. In today’s global markets, he needs the best possible information to estimate how many acres of winter wheat he should plant.

Above all, we need better and more reliable information to decide what actions we should take to prepare for climate change.

While scientists, engineers, and software programmers know how important data accuracy is, the general public is largely unaware of how challenging achieving it can be.

When looking at long-term climate data, we may have to use multiple proxies (indirect measures that we hope vary directly with weather), which adds an extra layer of complexity, cost, and sources of error.

One of the most commonly used proxies is the record of ancient temperature and CO2 levels inferred from ice core samples. For the last few hundred years, tree-ring data has also been a primary source of annual temperature estimates.
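
To make the proxy idea concrete, here is a minimal sketch of how a tree-ring series might be calibrated against thermometer readings over their period of overlap and then used to reconstruct earlier temperatures. All names and numbers are invented placeholders, not real data, and actual reconstructions are far more elaborate.

```python
import numpy as np

# Hypothetical overlap period: years with BOTH tree-ring widths and thermometer readings.
overlap_ring_width = np.array([0.82, 0.91, 0.88, 1.02, 0.95, 1.10])  # placeholder widths
overlap_temp_c = np.array([8.9, 9.4, 9.2, 9.9, 9.5, 10.2])           # placeholder deg C

# Fit a simple linear calibration: temperature ~ a * ring_width + b.
a, b = np.polyfit(overlap_ring_width, overlap_temp_c, 1)

# Apply that calibration to pre-instrumental ring widths to "reconstruct" temperature.
old_ring_width = np.array([0.78, 0.85, 0.90])  # placeholder pre-thermometer widths
print(a * old_ring_width + b)
```

Every step of this embeds an assumption: that the width-to-temperature relationship is linear, stable across centuries, and driven by temperature alone rather than rainfall, soil, or CO2 fertilization.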

But for the past half-century, direct atmospheric readings have been used, and they are very accurate and reliable.

Figure 1

When we look at figure 1, we see CO2 rise dramatically in the mid-1950s. That is exactly when we stopped using proxies to measure atmospheric CO2 content and started using direct readings from the Mauna Loa Observatory in Hawaii.

So we need to ask ourselves: was this dramatic increase in CO2 real, or could it be partially skewed by the change in the measurement process? Temperature records underwent a similar switch: in the early 1960s, we stopped using tree-ring data wherever we could.

Certain discrepancies had been found during the period when we had both tree-ring and thermometer records. We cannot make such changes in measurement methods without leaving room for doubt.
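
A simple sanity check on any such splice is to examine the junction itself: where the two methods overlap in time, they should agree, and a systematic step at the changeover date is a warning sign. Below is a minimal sketch of that check; the numbers are invented purely to illustrate the computation.

```python
import numpy as np

# Hypothetical CO2 values (ppm) on either side of a proxy-to-direct changeover.
proxy_tail = np.array([279.0, 280.1, 279.6, 280.4, 281.0])   # last proxy-based years
direct_head = np.array([313.0, 313.8, 314.5, 315.2, 316.1])  # first direct-reading years

# A large mean offset at the junction deserves scrutiny: is it climate, or method?
step = direct_head.mean() - proxy_tail.mean()
print(f"Mean offset across the changeover: {step:.1f} ppm")
```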

For example, figure 1 shows the CO2 content of Antarctic ice sheets that are thousands of years old. Before the mid-1950s, the CO2 estimates were calculated from the gas trapped in an ice core.

The CO2 levels measured in this fashion never seemed to get much over 280 ppm over a period of tens of thousands of years.

Now note that, starting about 6,000 years ago, we see a small but steady increase as we read the record from left to right.

And the growth seems to proceed at a reasonably constant rate until the mid-1950s. Here the classical assumption is that CO2 and temperatures were both going up.

Recently, scientists have looked at the same slope in the other direction: standing in the 1950s and reading the record from right to left, the CO2 level falls steadily with the age of the ice.

Doing so raises the question: is CO2 being squeezed out of the glacier by its enormous weight as the ice ages, perhaps in combination with CO2 being chemically sequestered, and in what proportions?
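
The scale of such a process can at least be bounded with a back-of-the-envelope diffusion estimate: random molecular motion with diffusivity D spreads a gas over a characteristic length of roughly sqrt(2Dt) in time t. The sketch below applies that textbook formula; the diffusivity is a pure placeholder, since the real value for CO2 in ice depends strongly on temperature and ice structure.

```python
import math

# Characteristic 1-D diffusion length: L ~ sqrt(2 * D * t).
def diffusion_length_m(D_m2_per_s: float, t_years: float) -> float:
    seconds = t_years * 365.25 * 24 * 3600
    return math.sqrt(2 * D_m2_per_s * seconds)

D_PLACEHOLDER = 1e-15  # m^2/s; illustrative only, NOT a measured diffusivity of CO2 in ice
for t in (1_000, 10_000, 100_000):
    print(f"{t:>7} years -> ~{diffusion_length_m(D_PLACEHOLDER, t) * 100:.1f} cm")
```

If that smoothing length approaches the layer spacing that separates centuries in a core, short-lived CO2 spikes would be blurred away.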

Starting in the mid-1950s, we see a very substantial and rapid rise in CO2 levels, producing the now-familiar CO2 “hockey stick.” Did CO2 really shoot up that fast, or was the jump partly an anomaly caused by the change in measurement methods? We think the latter.

How would the average person know? Was this dramatic change ever explained in an exact, understandable way? For now, let’s refer to the subject by the more general term “data integrity.”

Here is another simple example. If we wanted to measure the Boston area’s temperature 200 years ago, we might have taken, say, twenty thermometers to twenty different locations.

We would have made some general decisions, putting a few along the coast and the rest at various spots in the city and countryside, mostly on farms.

We might have put only one or two in the mountains or forests, because these stations had to be manned and the data recorded several times a day.

Then, at some interval (once or twice a day, a week, or a month), the readings would have been consolidated, yielding an average “Boston temperature” for October 1820.
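
In code, that kind of consolidation is nothing more than an average over stations. A minimal sketch, with station names and readings invented for illustration:

```python
# Hypothetical October 1820 monthly means (deg F) from a handful of the twenty stations.
october_1820_means_f = {
    "Boston Harbor": 52.1,
    "Boston Common": 53.4,
    "Dorchester farm": 50.8,
    "Concord farm": 49.9,
    "Blue Hills": 47.6,
}

boston_october_1820 = sum(october_1820_means_f.values()) / len(october_1820_means_f)
print(f"Average 'Boston temperature', October 1820: {boston_october_1820:.1f} F")
```

Note that this simple mean silently weights coast, city, and countryside by however many thermometers happened to be placed in each; that is the first of many judgment calls.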

How would we compare that with the Boston weather of October 1920 or 2020, to see whether the temperature has been increasing? It poses quite a challenge:

  • Over the last one hundred years, some trees might have grown around the thermometer, while in 1920, the thermometers might have been in full sun all day long. Now what?
  • Some instruments were moved for some reason, such as major highway construction; how did the move affect the temperature readings?
  • Some instruments might have gradually gone out of calibration for months or even years before they were repaired or replaced. What do we do with the suspect data during the questionable period? Ignore it?
  • When the instruments were replaced, how were they replaced? Was the new sensor at the same height from the ground, in the same protective box? Was mercury replaced by an alcohol thermometer or a thermocouple?
  • A weather station sat near a dirt road that was paved with cement in 1926 and with asphalt in 1963, then reconfigured back to a dirt road when the area became a nature park in 2004. How do we correct for all that?
  • How would we compare, contrast, and integrate those temperatures with the temperatures leading up to 2020? The settings are very different and very challenging:
    • Instruments that were once in a pasture are now near airport runways and jet exhausts!
    • Another one was near a shady, sandy road that’s now an asphalt parking lot.
    • Thermocouples have replaced many thermometers; how were the readings “stitched together”? (A toy sketch of such stitching follows this list.)
    • Other weather stations were just abandoned because of the high costs of maintaining them or were replaced by a remote thermocouple or telemetry.
    • How do we reconcile the effect of the polluted skies of the 1960s through the 1990s with the pristine skies of the 1800s, when clouds play such an important role?
    • And the cloud cover of 1820 was probably quite different from today’s as a result of increasing levels of “aerosols,” which play a vital role in cloud formation and in the “greenhouse” and albedo effects.
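
Here is a minimal sketch of the kind of “stitching” alluded to above. When a station changes instruments, one common approach is to estimate the offset from a period when old and new readings overlap and then shift one segment by that amount. Everything below is hypothetical and far simpler than the procedures agencies actually use.

```python
import numpy as np

# Hypothetical annual means (deg C): an old mercury thermometer, then a thermocouple.
mercury = np.array([9.1, 9.3, 9.0, 9.4, 9.2])       # years 1-5 (old instrument)
thermocouple = np.array([9.6, 9.8, 9.5, 9.9, 9.7])  # years 4-8 (new instrument)

# The instruments overlap in years 4-5: estimate the systematic offset there.
offset = thermocouple[:2].mean() - mercury[-2:].mean()

# Shift the new segment so the stitched series is continuous with the old one.
stitched = np.concatenate([mercury, thermocouple[2:] - offset])
print(f"Estimated instrument offset: {offset:.2f} C")
print(stitched)
```

The choice of which segment to shift, and over which overlap window, is exactly where room for doubt creeps in.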

In recent decades, and mostly since the satellite era began, hundreds of Earth-based weather stations have been abandoned for a variety of reasons, including cost and data reliability.

Over the past several decades, NASA and NOAA have been trying to “normalize” current and historical land-based and sea-based weather records.

Figure 2

Note that figure 2 shows two sets of precisely the same data! The blue line represents the actual land-based temperatures from 1,218 stations in the US as the readings were taken.

Compare that to the red line, which shows these very same temperature records after they were “normalized” by NOAA. [1]

“Normalization” has a practical basis. It is a little like calculating a total amount of fruit when adding apples to oranges. However, the process is susceptible to erroneous assumptions and faulty execution.
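
One common form of such normalization is converting each station’s absolute temperatures into anomalies, meaning departures from that station’s own average over a base period, so that a warm valley and a cold mountaintop can be combined. A minimal sketch, with invented numbers:

```python
import numpy as np

# Hypothetical annual means (deg C) for two very different stations over the same decade.
valley = np.array([12.0, 12.2, 11.9, 12.4, 12.1, 12.3, 12.6, 12.5, 12.8, 12.7])
mountain = np.array([2.1, 2.0, 2.3, 2.2, 2.5, 2.4, 2.6, 2.8, 2.7, 3.0])

# Anomalies: each station measured against its own decade-long mean.
def anomalies(series: np.ndarray) -> np.ndarray:
    return series - series.mean()

# Expressed as anomalies, the two stations can be meaningfully averaged.
regional = (anomalies(valley) + anomalies(mountain)) / 2
print(np.round(regional, 2))
```

The result now depends on the choice of base period and on which stations are included; those choices are the “assumptions and execution” the text warns about.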

Donald Easterbrook, a prominent geologist, claims that the previous historical records were purposely manipulated, as shown in figure 2. The accusation is that these temperatures were skewed to fit the current narrative of CO2-induced global climate change.

The historical blue-line data has been changed at least four times over the past few decades; in its latest form, the red line shows a more dramatic, steeper temperature rise since the 1980s, achieved partly by lowering the temperatures of previous decades!
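
The size of the effect being described can be quantified by fitting an ordinary least-squares trend to each version of a record. The sketch below compares a “raw” and an “adjusted” series; both are invented here purely to show the computation, not to reproduce the actual figures.

```python
import numpy as np

years = np.arange(1920, 2020)

# Invented illustration: a raw series with little trend, and an "adjusted"
# version in which earlier decades have been cooled by up to 0.5 C.
rng = np.random.default_rng(0)
raw = 12.0 + 0.1 * np.sin(years / 8.0) + rng.normal(0, 0.15, years.size)
adjusted = raw - 0.005 * (2019 - years)

for name, series in (("raw", raw), ("adjusted", adjusted)):
    slope_per_decade = np.polyfit(years, series, 1)[0] * 10
    print(f"{name:>8}: {slope_per_decade:+.3f} C per decade")
```

Cooling the past steepens the fitted trend even though not a single recent value was touched.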

Today, when we are asked to make multi-trillion-dollar decisions based on our temperature history over the last century, these questions of data integrity become severe and consequential.

For more information, we recommend our book A Hitchhiker’s Journey Through Climate Change, coming soon to the CFACT store at CFACT.org.

[1] Real Climate Science article: “The problem with the NOAA graph is that it is fake data. NOAA creates the warming trend by altering the data. The NOAA raw data shows no warming over the past century.”

Read more at CFACT


Comments (5)

  • RT:

    Got to keep the CC going. Have to love the accurate data records of NOAA. Bet if we moved the stations away from UHI (urban heat island) sources, the numbers would fall to no heating observed. Now if we could remove the politics.

  • JaKo:

    I bet that the temperature record “normalization” is “$cientifically justified” to about the same extent that today’s “face covering” studies show how effective masks are at reducing the spread of covid-1984.
    The conclusion: $cience for hire is the driving force behind many of the ills in our world; here the exposed climate “untruths” are obvious, while many others would require much more effort to unmask…
    Cheers, JaKo

  • tom0mason:

    “One of the most commonly used proxies is the record of ancient temperature and CO2 levels inferred from ice core samples. For the last few hundred years, tree-ring data has also been a primary source of annual temperature estimates.”

    Says it all. These samples are naturally averaged by the CO2 gas migrating through them. Some investigators have looked into this; see https://www.pnas.org/content/94/16/8343 and take note of the assumptions, such as:

    “Air trapped in glacial ice offers a means of reconstructing variations in the concentrations of atmospheric gases over time scales ranging from anthropogenic (last 200 yr) to glacial/interglacial (hundreds of thousands of years). In this paper, we review the glaciological processes by which air is trapped in the ice and discuss processes that fractionate gases in ice cores relative to the contemporaneous atmosphere. We then summarize concentration–time records for CO2 and CH4 over the last 200 yr. Finally, we summarize concentration–time records for CO2 and CH4 during the last two glacial–interglacial cycles, and their relation to records of global climate change.”

    Has anybody investigated how fast CO2 disperses through ice? Surely this could be verified in the lab?
    Adding today’s measurements to naturally averaged data is a scientific failure of vast consequence.
    So much sophistry, so little science.

    • Jerry Krause (in reply to tom0mason):

      Hi TomO and Jerr,

      I address this comment to both of you because two natural phenomena are commonly ignored (overlooked): diffusion and the scattering of radiation by tiny particles.

      The physical structure of ice is an open, rigid cage of water molecules maintained by an intermolecular attraction between water molecules known as hydrogen bonding. The evidence (proof) of this open structure is the observed fact that ice (solid water) floats on liquid water at temperatures of 273 K (absolute temperature, to avoid negative temperatures) and below. It is the same hydrogen bonding that maintains the genetic information of DNA. As I have recently written, I once studied the diffusion of cadmium and lead ions in solid common salt. And the carbon dioxide molecule is linear, which means it is longer than it is broad. Hence, I can easily imagine it slipping from the open space of one ‘cage’ to the open space of the adjacent cage, and so on. And thousands of years is a long time for such random motion (diffusion) to occur.

      As I ponder carbon-14 dating, the clear difference is that carbon dioxide is a 3-atom molecule, while the carbon-14 which is commonly dated is part of a macromolecule of thousands of atoms.

      I only recently learned of the existence of inexpensive infrared thermometers, which are manufactured to measure the temperatures of surfaces (liquid or solid), but I have commonly pointed my IRT up at visually cloudless skies and at clouds. First, the temperature of a reasonably thick overcast is only slightly less than the surface temperature of the ground and other surfaces I measure at the same time. The only explanation of this that I can imagine is that the cloud droplets are scattering the IR photons being emitted by the earth’s surface back toward the surface.

      I have quoted R.C. Sutcliffe (Weather & Climate) many times here at PSI, and I do so again: “the natural atmosphere, however clean it may appear to be, is always supplied with a sufficient number of minute particles of salts, acids, or other substances … .” These minute particles are, even at their minimum size, far larger than atmospheric molecules, yet far smaller than the ordinary cloud droplets that scatter almost all the IR photons emitted by the surface back toward it. So, given a measured surface temperature of about 25°F, I have measured a sky temperature (directly upward) of minus 35°F, which I consider isn’t any lower because the minimum-sized minute particles scatter a little of the IR being emitted from the earth’s surface back toward it.

      Have a good day, Jerry
