This post reminded me of an earlier post of mine, which upon rereading seems to need more detail.
Again, suppose you have temperature data T(x,t) over space x and time t, where t ranges over the period [1860, 2000]. Suppose you have come up with a correction component g(x) = (1/30) ∫ T(x,t) dt, where the integral is over the sub-period [1960, 1990]. You create "anomaly data" T1(x,t) = T(x,t) - g(x), so that ∫ T1(x,t) dt = 0 for every x, where again the integral is over [1960, 1990]. Finally, you perform a regression analysis to estimate the trend in the spatial average Tavg,1(t) = ∫ T1(x,t) dx, where this integral is over the surface of the Earth, which we assume has total area 1.0 in some system of units. Suppose the slope of your regression line indicates an average rate of increase of 0.6 degrees per 100 years. This, I believe, is a simplified version of what climatologists have done.
Now Tavg,1(t) = ∫ T1(x,t) dx = ∫ (T(x,t) - g(x)) dx = Tavg(t) - gavg, where gavg = ∫ g(x) dx is the spatial average of the correction. So correcting with g(x) just subtracts a constant gavg (independent of time and space) from the spatial average of your original data. Clearly, this will not affect the slope of your regression line. Apparently, the "corrections" you have done do nothing at all to the estimated trend.
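The argument above can be checked numerically. Here is a minimal sketch using synthetic data (the site offsets, trend size, and noise level are made up for illustration, not taken from any climate record): subtracting each location's 1960–1990 baseline shifts the spatial average by a constant, so the fitted slope is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1860, 2001)          # t over [1860, 2000]
n_sites = 50                           # discretized "x" locations

# Synthetic temperatures: site-specific offsets plus a common trend and noise.
offsets = rng.normal(10.0, 3.0, size=(n_sites, 1))
trend = 0.006 * (years - 1860)         # roughly 0.6 degrees per 100 years
T = offsets + trend + rng.normal(0.0, 0.2, size=(n_sites, years.size))

# Correction component g(x): per-site mean over the sub-period [1960, 1990].
base = (years >= 1960) & (years <= 1990)
g = T[:, base].mean(axis=1, keepdims=True)
T1 = T - g                             # anomaly data T1(x,t) = T(x,t) - g(x)

Tavg = T.mean(axis=0)                  # spatial average of raw data
Tavg1 = T1.mean(axis=0)                # spatial average of anomalies

slope_raw = np.polyfit(years, Tavg, 1)[0]
slope_anom = np.polyfit(years, Tavg1, 1)[0]
print(slope_raw, slope_anom)           # equal up to floating-point rounding
```

The anomaly average's mean over the baseline period is zero by construction, yet the two regression slopes agree, which is exactly the point made algebraically above.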
So what's the point?