While true, to an extent, it's also the case that resolutions other than native will require some kind of supersampling (eg. AA), unless they divide cleanly into native resolution - a problem CRTs never faced.
This results in both blur (negligible with a high enough resolution, but that doesn't seem to be the case here, judging by the Safari vs Chrome comparison) and a performance hit.
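To make the "divides cleanly" point concrete, here's a minimal sketch (the 2880x1800 panel size is an assumption for illustration, based on the 15" Retina MacBook Pro): a target resolution only maps onto the native pixel grid without resampling when both axes divide it by the same integer factor.

```python
# A non-native resolution avoids resampling blur only when it divides
# the panel's native resolution by an integer factor on both axes.
NATIVE = (2880, 1800)  # assumed panel: 15" Retina MacBook Pro

def scale_factor(target, native=NATIVE):
    """Return the integer scale factor if `target` maps cleanly onto
    the native grid, else None (meaning the GPU must resample)."""
    fx = native[0] / target[0]
    fy = native[1] / target[1]
    if fx == fy and fx.is_integer():
        return int(fx)
    return None

for res in [(2880, 1800), (1440, 900), (1680, 1050), (1920, 1200)]:
    f = scale_factor(res)
    print(res, "clean x%d" % f if f else "needs resampling")
```

So 1440x900 lands cleanly (each logical pixel is exactly a 2x2 block), while 1680x1050 and 1920x1200 fall between integer factors and have to be filtered, which is where the blur comes from.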
True, and even CRTs had the issue that if you set a resolution that created interference in the shadow mask it would look like crap. There was always a 'best' resolution for a CRT, even if it was forgiving of different settings.
Given the 'free' GPU we get with machines these days (and by free I mean it's always included), it seems reasonable to use its oversampling anti-aliasing feature for desktop display. I'm not sure why you wouldn't do that unless it was some sort of power issue.
The power issue is hugely significant. Using the GPU kills battery life, which is why laptops now have two GPUs -- a low-power basic GPU and a switchable high-power GPU.
I get the impression that at every "resolution", we're really at the native retina resolution, but with UI widgets scaled to different levels, and different present-day resolutions reported to applications. Can anyone with one of these beauties confirm?
Yes, but this just amounts to scaling via the compositor and graphics stack (hardware-accelerated, of course), as opposed to being left to the 'screen' (post-GPU-hardware/firmware) as would be the case when actually driving a screen at a non-native resolution.
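A rough sketch of how that scaled-mode pipeline works, as I understand it (not from Apple documentation): the compositor renders the UI at 2x the "looks like" resolution into a backing store, then the GPU downsamples that whole framebuffer to the native panel.

```python
# Sketch of compositor scaling on a Retina panel (my understanding,
# not an official description). Panel size is assumed: 2880x1800.
NATIVE = (2880, 1800)

def backing_store(looks_like):
    """The compositor renders at 2x the reported resolution per axis."""
    return (looks_like[0] * 2, looks_like[1] * 2)

def downsample_ratio(looks_like, native=NATIVE):
    """Ratio by which the backing store is scaled to fit the panel.
    1.0 means a straight 1:1 blit; anything else means filtering."""
    return native[0] / backing_store(looks_like)[0]

for mode in [(1440, 900), (1680, 1050), (1920, 1200)]:
    bs = backing_store(mode)
    print(mode, "-> renders at", bs, "-> panel ratio %.3f" % downsample_ratio(mode))
```

Note the two costs this makes visible: at "looks like" 1920x1200 the machine is pushing a 3840x2400 backing store (the performance hit), and the 0.75 downsample ratio means every frame gets filtered on the way to the panel (the blur).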
See the Safari/Chrome comparison in the article to see what this means in real terms...