
The Data That Came in from the Cold

1. Preliminary

Radiosonde balloon measurements are in remarkable agreement with satellite measurements of temperature during that period when both were in use.  Here is the balloon data compared to both Univ. of Alabama-Huntsville (UAH) and Remote Sensing Systems (RSS), both of which are satellite-based. 

This argues that either a) both the balloon and satellite measures are wrong by the same amount in the same direction, year after year; or b) they both accurately reflect real temperatures. That being so, radiosonde measurements from before the satellite era can be used with some confidence.

2. In an Age Before the Age

That is, in the age before science became political science.   This chart of northern hemisphere temperatures appeared in National Geographic in 1976.  The zero-line was the then-present day temperature because the world is always supposed to be the way it is right now.  (Or else it is always supposed to be the way it was when we were growing up.)  There are undoubtedly psychological reasons for this. 

Matthews called attention to the fact that temperatures had been declining steadily since around 1938.  During the torrid 1940s there was open water at the North Pole, and the Soviet Union was saved from the Nazis in part because the Archangel route was ice-free.  Articles appeared in the New York Times worrying about the warming.

But by 1976 the opposite worry had set in.  See John Gribbin's Forecasts, Famines, and Freezes for details.  Colder was worse: it shortens growing seasons, inhibits plant growth, and the like.  In 1974, the US National Science Board declared, "During the last 20 to 30 years, world temperature has fallen, irregularly at first but more sharply over the last decade."  What to do?


3. Saved from Global Cooling!

Obviously, if a cooling trend is bad, we must eliminate the cooling trend.  This was done posthumously, as it were, by retrospective adjustments to the old data.  Here are successive graphs from GISS:

[Successive GISS graphs, not reproduced]
GISS uses a temperature adjustment algorithm which adjusts data that are decades old.  In this manner the 1938-1978 decline, observed around the world, has been successively diminished and finally eliminated.  We have been saved from the cooling!

But there is a slight cheat in the above display: the vertical and horizontal scales differ, which exaggerates the decline in the earlier graph and hides it in the third.  It is only a slight cheat, however, because the numerical values were:

Matthews graph, 1976: 1955–1965 about 0.30 °C warmer than the 1970s
Hansen/GISS 1980: 1955–1965 about 0.10 °C warmer than the 1970s
Hansen/GISS 1987: 1955–1965 about 0.05 °C warmer than the 1970s
Hansen/GISS 2007: 1955–1965 about 0.03 °C cooler than the 1970s

So instead of a 0.30 °C drop in temperature from the earlier period to the later, we now have a 0.03 °C warming.  But "hardly a man is now alive who remembers that famous day and year," and the pravda now is that the decline from the 1940s to the 1970s was no big deal.  (See?  Here's the [adjusted] data!)  And there never was a big global cooling scare.  Nothing to see here.  Move along.

Larger versions of the graphs can be found at wattsupwiththat.  The clientele in the commbox over there are more intemperate than I, but some Name scientists also stop by occasionally.



( 6 comments — Leave a comment )
Mar. 19th, 2010 03:49 am (UTC)
Hmm. But look at:

[HadCRUT3 graph, not reproduced]
The HadCRUT3 series shows about a 0.5 °C drop between the 1940s peak and 1976 - pretty much the same as the graph above, which shows 0.6 °C.  But it then goes on to show (which of course the 1976 graph cannot) a rise to about 0.8 °C above the 1976 value by 1999.  (Not that far from the 'warmer' line on the 1976 graph.)  All the series show the 1990s and 2000s as warmer than the 'torrid' 1940s.
Mar. 19th, 2010 04:33 am (UTC)
Re: temperatures
Well, the adjusted values do; and everyone is working off the same numbers. If there is a systematic error in the adjustments that lowers past temperatures and raises recent ones, that might be sufficient to account for the difference. Personally, I am not certain that a global average temperature fine to a tenth of a degree is realistic, given the calibration status of the land stations and descriptions by ex-Naval personnel of how the sea temperature data were collected.

But it is in the nature of the Multi-Decadal Oscillation (MDO) that it rises to a peak, cools to a trough, and then rises to another peak. It seems, from Danish work, to be related to the length of the solar cycle: when the cycles are long and low, the earth gets cooler; when they are short and peaked, the earth gets hotter.

And successive peaks have been getting warmer even before the current hullabaloo because of the 400-year long rebound from the Little Ice Age. (Picture a sine wave riding along an increasing trend.)

But all that depends on the sun remaining in a Grand Maximum - the burner on high, so to speak - and there is mounting evidence that it no longer is doing so. The current solar cycle is coming off the longest one since the previous cooling spell. Solar Cycles 22 and 23 match closely with Solar Cycles 3 and 4. If the pattern repeats, our brand-new, halting and stumbling Solar Cycle 24 could shape up like Solar Cycle 5: the Dalton Minimum. Buy furs. A Finnish scientist reports that the Arctic Oscillation may have reversed polarity. That means high pressure over the Arctic and low pressure to the south, which will bring Arctic air down.

The problem computer modelers have is twofold. With seven factors you can force-fit any set of past data by judicious choice of coefficients. This not only gives a false sense of confidence that you understand the process being modeled, but it virtually guarantees that you will never spot the factors you did not include in the model. (Much better, IMHO, to start with an empirical black box and study how the outputs respond to the inputs.)
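The "seven factors" point can be sketched numerically. This is an illustrative toy, not actual temperature data: give a polynomial as many free coefficients as there are data points and it reproduces the past record essentially exactly, while saying nothing trustworthy beyond it.

```python
# Force-fitting the past: eight synthetic observations, eight coefficients.
import numpy as np

rng = np.random.default_rng(42)
x = np.arange(8, dtype=float)                # eight "past" observations
y = 0.1 * x + rng.normal(0.0, 0.3, size=8)   # a mild trend plus noise

# Degree-7 polynomial: as many parameters as data points.
coeffs = np.polyfit(x, y, deg=7)
fitted = np.polyval(coeffs, x)

in_sample_error = float(np.max(np.abs(fitted - y)))  # essentially zero
extrapolated = float(np.polyval(coeffs, 10.0))       # two steps past the data
trend_value = 0.1 * 10.0                             # what the real trend gives

print(f"max in-sample error: {in_sample_error:.2e}")
print(f"at x=10 the model says {extrapolated:+.2f}; the trend says {trend_value:+.2f}")
```

The past is "explained" perfectly, yet the extrapolated value is wildly off the underlying trend, and no amount of in-sample agreement would have warned you.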
Mar. 19th, 2010 09:18 am (UTC)
Just what was the algorithm for? And aren't the scientists supposed to present both the raw data and the tampered, I mean, adjusted results for comparison?

Mar. 19th, 2010 05:06 pm (UTC)
Re: algorithm
Certain kinds of adjustments are legitimate. For example, in a pharmaceutical fill operation I once saw a sequence of hourly data that looked like this: 65, 69, 62, 63, 65, 96, 63, 61, 68, ... The "96" was clearly intended as a "69". Likewise, on paired 1-oz. and 2-oz. tube-filler lines, I found a weight of 2.2 oz. on the 1-oz. line's process sheet and, for the same hour, a weight of 1.2 oz. on the 2-oz. line's. Clearly, the two entries had been written on the wrong sheets. Another time, on a packaging line measuring "print-to-perforation" registration on a tablet blister pack, I saw a nice stationary series suddenly shift up to a higher mean value (around which it continued as a stationary series). The assignable cause was traced to the ruler used to make the measurement: it had broken, and a new ruler had been used in its place. The difference between the two instruments produced the difference in the measured values.
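The kind of screening behind that first anecdote can be sketched mechanically. The fill weights below are the ones quoted above; the MAD-based modified z-score is a standard robust heuristic for flagging lone wild values, not the commenter's actual procedure.

```python
# Flag values wildly out of line with their neighbors, using a
# median/MAD yardstick so the outlier cannot hide by inflating it.
import statistics

def flag_suspects(xs, threshold=3.5):
    """Indices whose modified z-score exceeds the threshold."""
    med = statistics.median(xs)
    mad = statistics.median([abs(x - med) for x in xs])
    if mad == 0:
        return []
    return [i for i, x in enumerate(xs)
            if 0.6745 * abs(x - med) / mad > threshold]

weights = [65, 69, 62, 63, 65, 96, 63, 61, 68]  # hourly fills from the story
print(flag_suspects(weights))  # flags only index 5, the transposed "96"
```

Flagging is the easy part; deciding what the "96" should have been still requires the investigator to find the assignable cause, which is the point being made here.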

Similar sorts of things can afflict station temperature data. Even the balloon and satellite data must be adjusted; the latter, say, for altitude, inclination, orbital period relative to earth's rotation, etc. The satellites do not measure =temperature= itself but microwave radiances, which are converted to temperature.

The problem, as I understand it, is that any such adjustment must be individually justified and tailored to the specific assignable cause. But scientists dealing with reams of data (which to them are simply numbers) have resorted to blanket "algorithms" to apply a universal correction regardless of the particular circumstances.

Mar. 20th, 2010 03:28 am (UTC)
Re: algorithm
It is true, of course, that the instruments may be at fault, or the method of data-gathering. Nothing surprising there. (This is especially poignant in the behavioral sciences, where often the instrument of measurement is the measurer himself.)

On the other hand, it seems incredibly irresponsible for a scientist to just apply algorithms without investigating the data first.

Mar. 20th, 2010 09:01 pm (UTC)
Re: algorithm
The major difficulty is that the data collection methodology is itself data, and neither the measurements nor the record of how they were collected should be discarded if authentication is at issue.

Realistically, there's never enough storage space. However, when retention is legally mandated, funded, and reasonably subject to audit, managers had better keep the data or expect to lose their jobs if it goes missing.

Bent or missing data is the hallmark of fraud in science. Either have it on hand for inspection or do not cite it in one's publications.

Keep in mind, though, that strange data is not the same as bent.

Part of the problem is that many of the scientists most in the public eye today don't seem to follow the rule I was taught, which requires one to ask, "What is the best way to disprove what I want to prove?"

I think of this as the "Brick Man" rule.



