Hacker News
Climate models go cold (financialpost.com)
11 points by olalonde on May 20, 2011 | hide | past | favorite | 8 comments



As noted in the comments of the article:

"This article is not new but was first published in 'The Australian' in July 2008. [Evans] has never published a peer-reviewed paper on climate change and, up until 2008, has only published one paper in 1987 on a totally unrelated subject. From 1999 to 2006 Evans consulted to the Australian Greenhouse Office designing a carbon accounting system that is used by the Australian Government to calculate its land-use carbon accounts for the Kyoto Protocol. While Evans says that he "knows a heck of a lot about modeling and computers," he states clearly that he is "not a climate modeler." Evans has published an article for the Alabama-based Ludwig von Mises Institute, a libertarian think tank: Evans also published a background briefing document for the Australian chapter of the Lavoisier Group, a global warming "skeptic" organization with close ties to the mining industry."

So take that however you like.


This is a nonsensical criticism IMHO.

If someone is giving pure opinion, then you have to evaluate the source. But if someone is presenting facts, you have to evaluate the facts, and there's no real point in evaluating the source (a source can give incorrect facts, but that's exactly why you evaluate the facts themselves). So when someone is stating facts, the source is irrelevant: they are just drawing your attention to claims which you should then independently verify yourself.

In the article he makes several factual claims that can be verified to be true or false...

"They keep lowering the temperature increases they expect, from 0.30C per decade in 1990, to 0.20C per decade in 2001, and now 0.15C per decade"

"In the United States, nearly 90% of official thermometers surveyed by volunteers violate official siting requirements that they not be too close to an artificial heating source."

"satellites say the hottest recent year was 1998, and that since 2001 the global temperature has levelled off."

"The Earth has been in a warming trend since the depth of the Little Ice Age around 1680. Human emissions of carbon dioxide were negligible before 1850 and have nearly all come after the Second World War "

Those facts are what's relevant, and it's those that should be evaluated (to the best of my knowledge, all but the claim that 1998 was the hottest year are true). Then you can draw your own conclusions based on those facts plus the ones you already knew. Who the author is or what he's published is beside the point.

Edit: on the 1998 claim I found this with a quick Google search: http://www.theregister.co.uk/2008/05/02/a_tale_of_two_thermo...


"They keep lowering the temperature increases they expect, from 0.30C per decade in 1990"

Take a look at model B of Hansen et al. 1988 and compare it to the actual temperatures and you'll see that it's not a bad fit. http://www.realclimate.org/index.php/archives/2009/12/update... . The 1988 estimate was "0.26+/-0.05 °C", so "0.25C per decade in 1990" would be a more accurate statement than the overstated "0.30" you quoted here.

That sequence of 0.30C, 0.20C, 0.15C could very well be 0.26, 0.18, 0.17 if the rounding rules were chosen with a view towards exaggerating for the sake of truthiness. But even if it isn't: without knowing the reason for the changes, and especially without knowing the error bars, it's hard to comment on the apparent decline. The 1988 paper is one of the first global climate models, so perhaps its parameters needed refinements, which later researchers have made.
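
For what it's worth, here's a toy sketch (Python; the 0.26/0.18/0.17 figures are just the guesses above, not published numbers) of how the choice of rounding alone can deepen an apparent decline:

    import math

    # Hypothetical per-decade estimates (deg C); purely illustrative values.
    estimates = [0.26, 0.18, 0.17]

    # Consistent rounding to one decimal place:
    consistent = [round(x, 1) for x in estimates]
    print(consistent)            # [0.3, 0.2, 0.2] -- an apparent drop of 0.1

    # Rounding chosen to exaggerate: first value up, last value down.
    exaggerated = [math.ceil(estimates[0] * 10) / 10,    # 0.3
                   round(estimates[1], 1),               # 0.2
                   math.floor(estimates[-1] * 20) / 20]  # 0.15
    print(exaggerated)           # [0.3, 0.2, 0.15] -- an apparent drop of 0.15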

"In the United States, nearly 90% of official thermometers surveyed by volunteers violate official siting requirements that they not be too close to an artificial heating source."

That has been researched a lot. The phrase "artificial heat source" is a deliberate distraction. The complaint is that the thermometers are too close to parking lots or other things which might contribute an "urban heat island" effect. That is quite different from an "artificial heat source", which sounds like the thermometer is next to a boiler, an engine, or the exhaust from a building.

However, throw out those 90% and look at the 10% and the signal is still there. Look at the sea-based thermometers, and the satellite measurements, and the signal is still there. Thus, the putative "artificial heating source" has no significant effect on the measurements. See for example http://www2.sunysuffolk.edu/mandias/global_warming/global_wa... .
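
A minimal sketch of that kind of robustness check, with entirely made-up station data (the real analyses use the actual USHCN records, of course): even if 90% of the stations carry a warm offset and extra noise, dropping them barely changes the estimated trend, because a constant bias shifts the level, not the slope.

    import random
    random.seed(0)
    years = list(range(1980, 2011))

    # Fake stations sharing the same underlying 0.02 C/yr trend; the poorly
    # sited ones get a warm offset and more noise, but no extra trend.
    def station_series(bias, noise):
        return [0.02 * (y - 1980) + bias + random.gauss(0, noise) for y in years]

    well_sited = [station_series(0.0, 0.1) for _ in range(10)]
    poorly_sited = [station_series(0.5, 0.3) for _ in range(90)]

    def trend(stations):
        # Least-squares slope of the station-average series, in C/yr.
        avg = [sum(vals) / len(vals) for vals in zip(*stations)]
        ybar = sum(years) / len(years)
        abar = sum(avg) / len(avg)
        num = sum((y - ybar) * (a - abar) for y, a in zip(years, avg))
        den = sum((y - ybar) ** 2 for y in years)
        return num / den

    print(trend(well_sited + poorly_sited))  # ~0.02
    print(trend(well_sited))                 # ~0.02 as well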

"since 2001 the global temperature has levelled off."

That's a biased-sample error. 1998/1999 was a hot spike. See http://en.wikipedia.org/wiki/File:Satellite_Temperatures.png . Extend the averaging window over a longer period and the upward trend is obvious. http://en.wikipedia.org/wiki/File:Global_Temperature_Anomaly...
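
A tiny illustration of the start-point problem, with an invented anomaly series (a steady 0.02 C/yr rise plus a one-off 1998 spike), not real data:

    # Invented anomalies: steady 0.02 C/yr rise, plus an El Nino-like 1998 spike.
    anomalies = {y: 0.02 * (y - 1979) for y in range(1979, 2011)}
    anomalies[1998] += 0.4

    def change_per_year(y0, y1):
        return (anomalies[y1] - anomalies[y0]) / (y1 - y0)

    print(change_per_year(1998, 2010))  # slightly negative -- looks "levelled off"
    print(change_per_year(1979, 2010))  # 0.02 -- the underlying trend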

"The Earth has been in a warming trend ..."

That's a meaningless statement. No one contests that it's warmer now than during the Little Ice Age. The question is: what impact do humans have on the warming trend? Pulling numbers out of thin air: if the breakdown is 0.01C "naturally" per decade and human activity during the last century has raised it by 0.16C per decade, then the natural influences are minor and it's best to be concerned about the human contribution.
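
Spelling that thin-air arithmetic out:

    # The made-up numbers from above, in C per decade.
    natural = 0.01
    human = 0.16
    total = natural + human   # 0.17
    print(human / total)      # ~0.94: humans would account for ~94% of the trend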


> The complaint is that the thermometers are too close to parking lots or other things which might contribute an "urban heat island" effect. That is quite different from an "artificial heat source", which sounds like the thermometer is next to a boiler, an engine, or the exhaust from a building.

Quite a few actually are near an air conditioner exhaust. There's been a shift over time from thermometers somebody has to go outside and read manually to sensors that are essentially on a computer network. The electronic ones are generally connected to a building via a power/data cable, which makes it more trouble to site them well: the cable might not be long enough, might get in the way, or might require digging up a street to lay, so the new sensors tend to be closer to buildings than the old ones. Not to mention that air conditioners are more common now than they used to be. The upshot is that lots of modern thermometers are near buildings that both reflect heat off their surfaces and have air conditioners blowing heat into the air. That made it at least plausible that the better-sited sensors might show a different trend than the worse ones.

The exact definitions of the categories are:

=====

Class 1 (CRN1)- Flat and horizontal ground surrounded by a clear surface with a slope below 1/3 (<19deg). Grass/low vegetation ground cover <10 centimeters high. Sensors located at least 100 meters from artificial heating or reflecting surfaces, such as buildings, concrete surfaces, and parking lots. Far from large bodies of water, except if it is representative of the area, and then located at least 100 meters away. No shading when the sun elevation >3 degrees.

Class 2 (CRN2) - Same as Class 1 with the following differences. Surrounding Vegetation <25 centimeters. No artificial heating sources within 30m. No shading for a sun elevation >5deg.

Class 3 (CRN3) (error >=1C) - Same as Class 2, except no artificial heating sources within 10 meters.

Class 4 (CRN4) (error >= 2C) - Artificial heating sources <10 meters.

Class 5 (CRN5) (error >= 5C) - Temperature sensor located next to/above an artificial heating source, such as a building, roof top, parking lot, or concrete surface.

====

Class 1 and 2 combined only add up to about 8% of the sensors, according to surfacestations.org. So the article's claim that "nearly 90% surveyed violate official siting requirements that they not be too close to an artificial heating source" could today be updated to "roughly 92% surveyed...".


The question isn't whether the sites are poorly situated; the question is "what impact does that have on overall average temperature readings?"

Here's the most relevant paper: http://pielkeclimatesci.files.wordpress.com/2011/05/r-3671.p... . It "surveyed 82.5% of the U.S. Historical Climatology Network (USHCN)":

"""Temperature trend estimates vary according to site classification, with poor siting leading to an overestimate of minimum temperature trends and an underestimate of maximum temperature trends, resulting in particular in a substantial difference in estimates of the diurnal temperature range trends. The opposite-signed differences of maximum and minimum temperature trends are similar in magnitude, so that the overall mean temperature trends are nearly identical across site classifications. Homogeneity adjustments tend to reduce trend differences, but statistically significant differences remain for all but average temperature trends."""

In other words, for purposes of determining average temperature trends - which is what we are talking about - the siting does not play a statistically significant role.
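
To make that cancellation concrete, a toy example with invented trends (not the paper's actual values): if poor siting inflates the minimum-temperature trend and deflates the maximum-temperature trend by similar amounts, the mean trend, (Tmax + Tmin)/2, barely moves, while the diurnal-range trend changes a lot:

    # Invented trends in C/decade, purely for illustration.
    good_site = {"tmin": 0.15, "tmax": 0.20}
    poor_site = {"tmin": 0.15 + 0.05,    # minimum trend overestimated
                 "tmax": 0.20 - 0.05}    # maximum trend underestimated

    def mean_trend(s):
        return (s["tmin"] + s["tmax"]) / 2

    def dtr_trend(s):
        # Diurnal temperature range trend = tmax trend - tmin trend.
        return s["tmax"] - s["tmin"]

    print(mean_trend(good_site), mean_trend(poor_site))  # ~0.175 vs ~0.175
    print(dtr_trend(good_site), dtr_trend(poor_site))    # ~0.05 vs ~-0.05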

Note also that the numbers you gave ("Class 5 (CRN5) (error >= 5C)") come from NOAA's Site Information Handbook, where it clearly says "The errors for the different classes are estimated values." The results of that paper will no doubt help refine those estimates.

Finally, I point to the difference in how "artificial heat source" is defined for purposes of that classification vs. how it's understood in general reading. Without knowing the technical definition, most people will assume that it's near an active heat source, like "an air conditioner exhaust". But "artificial heat source" also includes "parking lot, or concrete surface" which are passive heat sources. And as the paper shows, the result is that they moderate the temperature but do not significantly affect the overall average.


I'm not taking sides on these matters, because I don't know enough, but if these claims about the scientific community are somewhat true, that would also explain why there isn't any peer-reviewed material on the skeptic side. All these other arguments are flawed as well. Science based on peer review of papers that rest on closed source code and data is also flawed.


What makes you think there isn't any peer-reviewed material on the skeptic side? There's some. Not as much as there could be, perhaps...

For instance, there's this paper explaining the weakness of tree rings in temperature reconstructions: http://icecap.us/images/uploads/Loehle_Divergence_CC.pdf




