Originally published in Issues and Insights
Ever since the beginning of the global warming debate, now relabeled “climate change,” there has been one immutable yet little-known fact: all of the temperature stations used to determine the state of Earth’s temperature are controlled by governments. And all of the global temperature data reported in the media come from government reports.
In June 1988, when Dr. James Hansen, then-director of NASA’s Goddard Institute for Space Studies in Manhattan, went before the Senate Energy and Natural Resources Committee to say that “global warming has begun,” he was using temperature data collected by governments worldwide from a weather station network that was never intended to detect a “global warming signal.”
In fact, Dr. Hansen had to develop novel statistical techniques to tease a global warming signal out of the data, because the weather station networks were never designed to detect one. They were designed for weather forecast verification: to determine whether forecasts issued by agencies such as the U.S. Weather Bureau (now the National Weather Service) were accurate. If you make temperature and precipitation forecasts for a location and get no feedback on the actual temperatures reached and the rainfall recorded, it is impossible to improve forecasting skill.
The original network of weather stations, the Cooperative Observer Program (COOP), was established in 1891 to formalize an ad hoc weather observation network operated by the U.S. Army Signal Service since 1873. Only later was the COOP network pressed into service for climate monitoring, because at least 30 years of data from a weather station are required before a baseline “normal climate” can be established for a location. Once the Cooperative Observer Program was established in the United States, other countries soon followed, duplicating the U.S. network design on a global scale.
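As a rough illustration of what establishing a “normal” involves, here is a minimal Python sketch computing a 30-year baseline from hypothetical annual mean temperatures; the station values below are invented for demonstration only.

```python
import random

random.seed(0)

# Hypothetical 30 years of annual mean temperatures for one station (deg F).
# A real normal would be computed from quality-controlled observations.
annual_means_f = [52.0 + random.uniform(-1.5, 1.5) for _ in range(30)]

# A "climate normal" is conventionally defined as the 30-year average.
normal_f = sum(annual_means_f) / len(annual_means_f)
print(f"30-year climate normal for this station: {normal_f:.1f} F")
```

For reference, NOAA’s current official U.S. normals cover the 30-year span 1991–2020.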
However, the COOP network has several serious problems when used to detect climate change on a national or global scale. Temperature readings made by volunteer COOP observers in the United States are rounded to the nearest whole degree Fahrenheit when recorded on a paper form called a B-91. When such coarsely recorded data are compared against claimed global warming of about 1.8°F (1.0°C) since the late 1800s, obvious questions arise about the accuracy and precision of the COOP record.
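To put the rounding issue in perspective, here is a minimal Python sketch, using invented readings, that compares the whole-degree quantization error on a B-91 form to the roughly 1.8°F warming figure cited above.

```python
# Hypothetical daily temperature readings (deg F); values are invented.
readings_f = [67.4, 71.6, 55.2, 63.8, 70.1]

for t in readings_f:
    recorded = round(t)   # B-91 forms record whole degrees Fahrenheit
    error = recorded - t  # quantization error, up to +/- 0.5 F per reading
    print(f"actual {t:5.1f} F -> recorded {recorded:3d} F (error {error:+.2f} F)")

# A single reading can be off by as much as 0.5 F, a substantial fraction
# of the ~1.8 F warming claimed since the late 1800s.
print(f"max rounding error as a share of the signal: {0.5 / 1.8:.0%}")
```

Rounding errors do partially average out across many stations and years; the sketch shows only that each individual observation carries a quantization error that is a large fraction of the signal being sought.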
Even more concerning, more than 90% of the COOP stations in the United States used to record climate data have been found to be corrupted by local urbanization or contamination effects over time. Hundreds of COOP stations have been severely compromised by placement next to air conditioner exhausts, jet exhaust at airports, and the concrete, asphalt, and buildings that have sprung up around them. All of these heat sources and heat sinks do one thing and one thing only: bias the recorded temperatures upward.
The crux of the problem is this: the NWS publication “Requirements and Standards for Climate Observations” instructs that temperature instruments must be sited “over level terrain (earth or sod) typical of the area around the station and at least 100 feet from any extensive concrete or paved surface,” and that “all attempts will be made to avoid areas where rough terrain or air drainage are proven to result in non-representative temperature data.” However, as detailed in this report, these instructions are regularly violated, not just in the U.S. network but also in the Global Historical Climatology Network.
This isn’t just a U.S. problem; it is a global problem. Examples of similarly compromised stations exist throughout the world, including in Italy, the United Kingdom, China, Australia, and across Africa.
With such broad corruption of the measurement environment, “The temperature records cannot be relied on as indicators of global change,” said John Christy, professor of atmospheric science at the University of Alabama in Huntsville and a former lead author for the Intergovernmental Panel on Climate Change.
The fact is that all global temperature data are recorded and compiled by government agencies, and the data are questionable because of corruption issues, rounding, and the other adjustments applied to them. In essence, the reported global surface temperatures are a mishmash of rounded, adjusted, and compromised readings rather than an accurate representation of Earth’s temperature. While scholars may claim the data are accurate, any layman can surmise that, with all the problems that have been pointed out, the record cannot possibly be accurate; it is only an estimate with high uncertainty.
Only one global temperature dataset is independent of government compilation and reporting methods: the satellite-derived global temperature record from the University of Alabama in Huntsville (UAH), curated by Dr. John Christy and Dr. Roy Spencer.
But even the UAH satellite dataset doesn’t give a full and accurate picture of global surface temperature, because of limitations of the satellite system: at present, it measures the atmospheric temperature of the lower troposphere, at about 26,000 feet (8 kilometers) altitude, rather than conditions at the surface.
To date, there is only one network of climate-capable weather stations accurate enough to fully detect a climate change signal: the U.S. Climate Reference Network (USCRN), a state-of-the-art automated system designed specifically to measure climate trends at the surface accurately. Since going into operation in 2005, it has not found any significant warming trend in the United States that can be attributed to climate change.
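For readers who want to test such a claim themselves, here is a minimal Python sketch of an ordinary least-squares trend test of the kind one could run on USCRN annual anomalies. The anomaly values below are randomly generated stand-ins, not actual USCRN data.

```python
import random

random.seed(1)

# Stand-in annual temperature anomalies (deg C), 2005-2024.
# Replace with actual USCRN anomalies to run the test for real.
years = list(range(2005, 2025))
anoms = [random.gauss(0.0, 0.5) for _ in years]

n = len(years)
mean_x = sum(years) / n
mean_y = sum(anoms) / n
sxx = sum((x - mean_x) ** 2 for x in years)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, anoms))

slope = sxy / sxx  # deg C per year (ordinary least squares)

# Standard error of the slope and a t-statistic for "is the trend nonzero?"
residuals = [y - (mean_y + slope * (x - mean_x)) for x, y in zip(years, anoms)]
s2 = sum(r ** 2 for r in residuals) / (n - 2)
se_slope = (s2 / sxx) ** 0.5
t_stat = slope / se_slope

print(f"OLS trend: {slope * 10:+.3f} C per decade")
print(f"t-statistic: {t_stat:+.2f} (|t| > ~2.1 suggests significance at 5%, df={n - 2})")
```

Swapping in the real USCRN annual anomalies would let anyone check whether the trend clears the significance threshold.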
Unfortunately, the data from the USCRN network are buried by the U.S. government and are not publicly reported in monthly or yearly global climate reports. Nor has the network been deployed worldwide.
Given the government monopoly on corrupted temperature data, its questionable accuracy, and a clear reluctance to make the highly accurate USCRN data available to the public, it is time for a truly independent global temperature record to be produced.
This isn’t “rocket science.” Given that governments are spending billions of taxpayer dollars on climate mitigation programs, doesn’t it make sense to get the most important thing, the actual temperature, as accurate as possible? Given today’s technology, shouldn’t we confirm and validate the need for that spending with a network and data collection system that does not rely on a 100-year-old design? After all, if we can send a man to the moon, surely we can measure the temperature of our own planet accurately.
Who is “we,” and who would pay for this?
You must assume someone in the U.S. should lead the way, but you appear to prefer that someone to be outside of the government. Does that mean NOAA will lose the authority to report a U.S.-produced global average temperature to NASA-GISS and other government organizations? Or would the new private weather station network replace the NOAA government averages?
You claim the U.S. has a well-sited USCRN network, but it is not used for the global average temperature. So having a good weather station network does not mean the global average temperature will change. You imply the U.S. average used in the global average temperature is not accurate when, in fact, it is almost identical to the USCRN average that you claim is accurate. You have contradicted yourself.
Everything you say makes sense to me. Government control of climate data leads to politicization, and politicians are not known for their honesty.
If I had gotten a degree in climate science, I would sue the school, the department, and the teacher.
90% of the COOP readings are corrupt. The school, department, and teacher have been using false numbers. I would make sure I got every penny back.
Can’t wait to meet the next lemming; I’m just going to laugh. All their data is wrong.
In light of Dr. Jennifer Marohasy’s success in getting the Australian Bureau of Meteorology to release their data comparing platinum probe thermometer readings to mercury thermometer readings since 1996, I wonder whether there is a way to map these probes elsewhere in the world to see what they are doing. Marohasy has found an upward statistical bias in the probe temperature readings, at times reaching 0.7°C. Just as the shift from proxy temperature data to thermometer data caused the “hockey stick” illusion, the new probe thermometers may be doing something similar to temperature estimates since the 1990s.