Only about 320 million out of 7 billion people still use Fahrenheit.
Also, there are definitely some quality-control issues with the RTMA data, so you can run into the garbage-in, garbage-out problem with it. But overall, this is a very nice start to something that could ultimately be quite useful.
Instead, I think you'd need to find temperature measurements that are completely independent and use them for verification. Along this line, I'm not sure how refitting the data to ground stations would produce a better match anywhere except at those ground stations (overfitting). Or are you using ground stations that are truly independent?
(The problem with finding completely independent measurements is that we'd want to use them as an input!)
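One common way out of that bind is hold-out cross-validation: withhold a subset of stations from the interpolation entirely and score the map against them. A minimal sketch of the idea, where the station list and the simple inverse-distance interpolator are illustrative stand-ins, not the project's actual method:

```python
import math
import random

def idw_estimate(train, lat, lon, power=2.0):
    """Inverse-distance-weighted estimate at (lat, lon) from training stations."""
    num = den = 0.0
    for slat, slon, temp in train:
        d = math.hypot(lat - slat, lon - slon)
        if d < 1e-9:
            return temp  # query point coincides with a station
        w = 1.0 / d ** power
        num += w * temp
        den += w
    return num / den

def holdout_rmse(stations, holdout_frac=0.2, seed=0):
    """Withhold a random fraction of (lat, lon, temp) stations and report
    the RMSE of the interpolation evaluated at those held-out locations."""
    rng = random.Random(seed)
    shuffled = stations[:]
    rng.shuffle(shuffled)
    k = max(1, int(len(shuffled) * holdout_frac))
    test, train = shuffled[:k], shuffled[k:]
    errs = [(idw_estimate(train, la, lo) - t) ** 2 for la, lo, t in test]
    return math.sqrt(sum(errs) / len(errs))
```

The held-out stations never touch the fit, so a low RMSE there actually means something, unlike agreement at the stations you refit to.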
I really like the interface; even the zoom buttons could be removed or hidden, since you can scroll-zoom.
Please make the temperature Celsius!
EDIT: There seems to be a problem with the algorithm: the largest zoom level displays colder temperatures (look at the temperature map).
Normal zoom, picture was resized by me
This teaches you nothing whatsoever except that your model has pretty colors.
Next thing someone is going to take these results, use them as input data for a new model, then send the results of that new model back as data for the first.
What would a better map be? Is your point that they are doing interpolation on something that is already interpolated? Or are you implying that there is no way to create a map of temperature using only point measurements? I would like to know how well this data matches the raw station measurements (and verification measurements) but I think it's a decent visualization of likely real time temperature across a region.
> but I think it's a decent visualization of likely real time temperature across a region.
Maybe I'm being pedantic, but to me this is a 2D visualization of a simulation. But it is not a map.
This is a blended product (i.e., multi-instrument, and gaps filled) with 1km resolution. There is also a 1km MODIS land surface temperature data product:
Neither of these is real-time (more like daily).
And while we actually use MODIS data as an input to our temperature correction model, it is, as you mentioned, land surface temperature, whereas our map represents near-surface air temperature (i.e., what you'd get in a normal weather report).
(Based on a 3 km / 750 m WRF model run twice a day, using NCEP data.)
There seems to be a typo in the Stats (my emphasis):
>Pixels: 16-bit unsigned ints, representing "deci-kelvin" (i.e., divide by 100 to get the temperature in Kelvin).
It should be either centikelvin (divide by 100) or decikelvin (divide by 10).
On another note: It would be cool if you could hover over a certain area to see its temperature.
This leaves me wondering: how does one go about designing such a system? Are you planning on doing a technical write-up soon?
>>It regenerates every hour, providing a constantly updating snapshot of air temperature around the globe.
Once per hour is not "real-time." It's "once per hour."
But historically, "real-time" means "not batch mode" - where batch jobs are executed whenever there are available resources, in an unpredictable manner.
Real-time is responding to real world input on some periodic schedule that is appropriate. In this case, one hour sounds fine for a world temperature map. Are you expecting to make minute-to-minute decisions based on the global temperature distribution?
Local temperature I'd expect to be updated faster, but really it's a matter of your use-case.
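For what it's worth, "respond on an appropriate periodic schedule" is easy to sketch. A toy regeneration loop (the function names are hypothetical) that compensates for the job's own runtime so the cadence doesn't drift:

```python
import time

def run_periodically(job, interval_s=3600.0, iterations=None):
    """Run `job` on a fixed cadence, subtracting the job's runtime from the
    sleep so the period stays stable. `iterations=None` runs forever; a
    finite count is handy for testing."""
    n = 0
    while iterations is None or n < iterations:
        started = time.monotonic()
        job()  # e.g. regenerate the temperature map
        n += 1
        elapsed = time.monotonic() - started
        time.sleep(max(0.0, interval_s - elapsed))
```

Whether one hour counts as "real-time" is then purely a question of whether the interval matches the use case, as the parent says.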