Well, if only one or two locations are needed to show a trend, then consider how accurate our temperature readings would be if only two independent sites worldwide were used as the reference points for the whole global climate. Say Antarctica for one, since it's far removed from human influences, and perhaps the Sahara desert for the other, since it too is unaffected by human influences.
If those two points were used as the worldwide temperature reference, then we should be able to assume that -20 F is a normal, typical nighttime temperature and +120 F is a normal daytime temperature.
Sorry, but if you want to show a trend in something on a planetary scale, a few sensors is not even close to being scientifically valid. It's like saying a handful of transistors is the same as a quad-core 3.6 GHz processor, and that the other billion or so transistors it uses are just unnecessary.
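To put a rough number on the sensor-count point, here's a toy simulation (made-up temperatures, not real station data): estimate a "global mean" from 2 randomly placed sensors versus 2,000, and see how far off each estimate typically lands.

```python
import random
random.seed(0)

# Toy "planet": 100,000 sites with invented temperatures in F
# (mean 57, spread 25) -- illustrative values only, not real data.
planet = [random.gauss(57, 25) for _ in range(100_000)]
true_mean = sum(planet) / len(planet)

def avg_error(n_sensors, trials=1000):
    """Typical error in the estimated global mean using n random sites."""
    errs = []
    for _ in range(trials):
        sample = random.sample(planet, n_sensors)
        est = sum(sample) / n_sensors
        errs.append(abs(est - true_mean))
    return sum(errs) / trials

print(f"typical error with 2 sensors:     {avg_error(2):.1f} F")
print(f"typical error with 2,000 sensors: {avg_error(2000):.1f} F")
```

With only 2 sensors the estimate is routinely off by double-digit degrees, while 2,000 sensors bring the typical error under a degree; the sampling error shrinks roughly with the square root of the sensor count.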
Right now the worldwide weather and meteorological prediction and data-gathering networks have tens of thousands of sensor systems in place, all working together in real time, and most of the time they can only tell you that tomorrow has an 80% chance of being the same as or different from today. And next week's forecast is guaranteed to be even less accurate than that!
If you told them to predict what the weather is going to be like for the next 100 years using only a few dozen sensors spread out worldwide, they would look at you like you're crazy!
But then again, the climatologists don't use the full complement of meteorological data and information for their sources, since it always has that problem of the annoying little '*' somewhere that says the whole estimate has a +/- percentage of error in it. And that '*' is often listed as being 10 to 100 times greater than the change they are going on about.
Saying something changed by +0.3 with an error margin of +/- 3 to 30 rather invalidates the whole thing.
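The arithmetic behind that point is simple enough to spell out (using the illustrative numbers above, not any particular dataset): when the error bar is wider than the reported change, the interval still contains zero.

```python
change = 0.3   # reported change (illustrative value from the text)
error = 3.0    # quoted +/- error margin, 10x the change itself

low, high = change - error, change + error
print(f"reported: +{change} +/- {error}")
print(f"true value could be anywhere from {low} to {high}")

# The interval spans zero, so the measurement is consistent with
# no change at all -- or even a change in the opposite direction.
print("interval includes zero change:", low <= 0 <= high)
```

Only when the reported change is comfortably larger than its own error margin can you claim the trend is distinguishable from measurement noise.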