What's happening here is that freetype2 (and, to a greater extent, Windows) is grid-fitting the fonts. It anchors the font to the nearest pixel via a rather complex algorithm, and then anti-aliases that. You can see exactly what's happening on Windows by turning off ClearType.
This results in a font that is sharper on screen, but it's more jagged and doesn't resemble its print counterpart as closely.
MacOS, on the other hand, renders the font as it would for print - ignoring pixel boundaries - then anti-aliases that. On low-DPI screens this results in a heavier-feeling font, and sometimes artefacts where parts of the font get blurred out, but there's less difference between screen and print.
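A rough sketch of the two approaches in terms of the FreeType API (the font path and the glyph are just placeholder choices, and error checking is omitted): the default load flags run the hinter and grid-fit the outline, while FT_LOAD_NO_HINTING leaves the outline where the designer put it, which is roughly what OS X does before anti-aliasing.

    #include <ft2build.h>
    #include FT_FREETYPE_H

    int main(void) {
        FT_Library lib;
        FT_Face face;

        FT_Init_FreeType(&lib);
        /* placeholder font path - use any TrueType file you have around */
        FT_New_Face(lib, "/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf", 0, &face);
        FT_Set_Pixel_Sizes(face, 0, 13);  /* small sizes show the difference most */

        /* Windows/freetype style: hint (grid-fit) the outline, then anti-alias it */
        FT_Load_Char(face, 'g', FT_LOAD_RENDER);

        /* OS X style, roughly: ignore hints and pixel boundaries, anti-alias as-is */
        FT_Load_Char(face, 'g', FT_LOAD_RENDER | FT_LOAD_NO_HINTING);

        /* after each load, face->glyph->bitmap holds the 8-bit coverage bitmap;
           dump the two and compare them to see the grid-fitting effect */

        FT_Done_Face(face);
        FT_Done_FreeType(lib);
        return 0;
    }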
This was more useful when Macs were used heavily in pre-press: you'd want your display to match the print so you could avoid making proofs.
A few fonts include pixel-alignment hints which Windows (but not freetype, unless you compile it yourself - Apple has patents on this) and MacOS can use to make a nicer alignment on screen, but since the libraries got better at auto-fitting these have dropped away. Take a newish font and put it on Windows XP with ClearType off to see how bad XP's auto-fitting was. Vista was better, and 7 is pretty similar to MacOS when there are no hints.
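For reference, "compile it yourself" historically meant rebuilding freetype with its TrueType bytecode interpreter switched on at build time - roughly this one-line change in the source tree:

    /* include/freetype/config/ftoption.h - uncomment before building freetype */
    #define TT_CONFIG_OPTION_BYTECODE_INTERPRETER

Most distributions shipped it commented out while the patents were still in force.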
It's not so relevant these days, but I don't imagine Apple changing. I'd expect Apple to move to high DPI screens over the next few years where this isn't an issue.
>This was more useful when Macs were used heavily in pre-press: you'd want your display to match the print so you could avoid making proofs.
It's also useful if you just prefer it that way. Strong hinting discards most of the original character of the font, but a lot of people find it easier to read. I think strong hinting looks like garbage, but this is a matter of opinion.
Absolutely agree. I prefer the OS X rendering in all of these samples.
On the Gmail screenshot, the capitals look vertically squashed and 'i' and 'l' are almost impossible to tell apart. The letter shapes improve at title size, but the letter-spacing is either too wide or too tight.
Clearly it's a matter of opinion. But I don't think your explanation makes any sense -- surely the presence or absence of a clear edge to the outlines is part of the "character" of the font too, no? The font designers didn't intend for it to be blurry...
What I mean is that hinting reduces the distinctions between fonts far more than mere antialiasing does. Or, equivalently, hinting discards more information.
It's also my preference but I tend to sit further from the screen than most people I know who use Windows. I find it easier to read from my preferred distance.
I believe you're correct: OS X doesn't employ hinting at all, whereas Ubuntu does by default. Indeed, if you set hinting to "None" on Ubuntu (this is how my computers are configured), you get very similar results to OS X.
I think it's largely a matter of preference: I prefer anti-aliased letters of the right shape, and find the distorted grid-fitted letterforms of Windows almost unbearably ugly, yet others complain about the "fuzzy" fonts on OS X.
IIRC, the TrueType hinting patents have all expired, so FreeType enables the bytecode interpreter by default now. Whether that looks better than the auto-hinter depends a lot on the specific font file.
I think this still depends on the distribution. Fedora, for example, still contains a freetype version with the hinting patch disabled. You need to install freetype-freeworld from a third party repository to enable it.
You're right, though. The patents expired in May 2010.
Mac renders fonts optimized for "correctness", which can appear fuzzy at low resolution. Their new hi-res (~200dpi) screens solve the problem: you get correctness and legibility.
Windows and Linux tend to optimize for screen legibility at low resolution, resulting in increased sharpness. At hi-res, they'll likely not look as correct as the Mac rendering.
> At hi-res, they'll likely not look as correct as the Mac rendering.
The higher res you go, the less differences there'll be between the two rendering methods. Fitting to the pixel-grid doesn't alter very much if the pixel grid is small enough.
Consider: Rendering a small font size at high-res is equivalent to rendering a larger font size at lower-res. If you look at the larger fonts on TFA's screenshot, e.g. "Node JS Design Patterns 101", there's proportionally much less difference (between Ubuntu and OSX) than at small sizes.
Freetype2 (the renderer used in linux) is of course capable of doing both (grid-fitting or not), and desktop environments like gnome make it very easy to choose whichever method you prefer.
I know, and I'm using it for this, but the previous dialog was just way better and more user-friendly. It had large previews and descriptions, while the new tweak tool only has some unexplained options in a drop-down menu.
You can still edit ~/.fonts.conf by hand. If you need an intermediate level, then you're advanced enough to use vim and write the config yourself. The GUI shouldn't have to pander to fickle power users like you; it's supposed to make the PC EASY.
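For anyone who does want the hand-edited route, a minimal ~/.fonts.conf along these lines turns hinting off while keeping anti-aliasing, which gives roughly the OS X-like rendering discussed above (just one possible set of values, adjust to taste):

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <match target="font">
        <!-- keep anti-aliasing on -->
        <edit name="antialias" mode="assign"><bool>true</bool></edit>
        <!-- turn grid-fitting/hinting off, OS X-style -->
        <edit name="hinting" mode="assign"><bool>false</bool></edit>
        <edit name="hintstyle" mode="assign"><const>hintnone</const></edit>
      </match>
    </fontconfig>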
No, that's wrong. As resolution increases, the relative effect of hinting and fitting drops to zero. On high resolution displays they will look increasingly indistinguishable.
Yes, it is a well-known issue with Linux and Windows :-)
Seriously, you have to pick one: either you deform the letter shapes so that they fit the grid better, with the disadvantage that your line breaks change and/or your letter and word spacing look awful; or you ignore the grid until the last moment, compute what percentage of each pixel is covered by each 'infinite resolution' glyph, and color them gray accordingly, with the disadvantage that text looks a bit more blurry.
(technically, there is a third way: lay out each character of each font at some set of fixed point sizes by hand, so that it fits the grid perfectly. The original Mac used that method; it became infeasible when the LaserWriter shipped)
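To make the second option concrete, here's a toy sketch of the coverage idea (not FreeType code - the edge test and the 4x4 sample grid are made up purely for illustration): supersample each pixel against the ideal outline and map the covered fraction to a gray level.

    #include <stdio.h>

    /* hypothetical "infinite resolution" outline test: is the point inside the shape?
       here just a slanted edge standing in for the side of a glyph stem */
    static int inside(double x, double y) {
        return x < y / 3.0 + 2.0;
    }

    /* coverage of pixel (px, py): sample a 4x4 grid of points inside the pixel,
       count how many land inside the outline, and scale to a 0..255 ink level */
    static unsigned char pixel_gray(int px, int py) {
        int hits = 0;
        for (int sy = 0; sy < 4; sy++)
            for (int sx = 0; sx < 4; sx++)
                if (inside(px + (sx + 0.5) / 4.0, py + (sy + 0.5) / 4.0))
                    hits++;
        return (unsigned char)(hits * 255 / 16);
    }

    int main(void) {
        /* print a small patch of pixels: full ink, then gray edge pixels, then none */
        for (int y = 0; y < 4; y++) {
            for (int x = 0; x < 8; x++)
                printf("%4d", pixel_gray(x, y));
            printf("\n");
        }
        return 0;
    }

Pixels straddling the edge come out gray rather than snapping to black or white, which is exactly the "a bit more blurry" trade-off.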
There was a post on Coding Horror (Jeff Atwood's blog) about this in 2007. I'm not meaning to be snarky, but certainly for a certain class of people, this has been well-known for a while.
Also, as I alluded to in my previous comment, as resolutions increase in the future, the Mac OS approach is more correct in the long run.
Though it has come at the cost of fuzzy fonts and squinting for the last decade.
It's probably the vision that they had planned for years. Isn't that where we all want to go? I'm hoping that "Retina" becomes the new standard everywhere within the next few years.
No discussion of font rendering in Linux is complete without a hat tip to infinality, which aims to provide much better font rendering. infinality started out as a patch to freetype, and was recently merged in, which is quite exciting.
Ironically, the website is quite hard to read, thanks to a kind of simulated macular degeneration effect of drop-shadow glow around each letter. It amused me.
I can't find many examples of the effect of the infinality patches, though. Is it just a single-axis hinter?
Last time I configured Infinality, I remember the example config file explaining the possible options very well. Try installing it and reading your /etc/profile.d/infinality-settings.sh.
BTW, everything looks so big on Ubuntu 12.04 compared to Windows XP on my laptop at 1280 x 800 resolution. Is that normal? Even reducing the font size on Ubuntu to 10px doesn't change much. I can't fit as much on the screen on Ubuntu as on XP; it looks playschool-like. Not very convenient for surfing or coding.