

The raster tragedy at low resolution (1997) - anon1385
http://www.microsoft.com/typography/tools/trtalr.aspx

======
a1k0n
I really enjoyed this. I had a project once where I had to pre-render a font
to be displayed at various (low) resolutions, and the Microsoft foundry fonts
stood out as working exceptionally well at very small sizes. Almost nothing
else was even acceptable at e.g. 6 or 7pt. I knew it was because of superior
hinting, but I didn't realize how hinting worked, exactly, until now.
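
(As a rough, hedged sketch of what that difference looks like in practice: with
the freetype-py bindings you can render the same glyph with and without hinting
at a tiny pixel size and compare the bitmaps. The font path below is just a
placeholder, and the flags chosen are one reasonable way to do the comparison,
not the only one.)

    import freetype

    def dump_glyph(face, char, extra_flags):
        # Render the glyph and print it as ASCII art so the pixel coverage
        # of the two modes is easy to compare side by side.
        face.load_char(char, freetype.FT_LOAD_RENDER | extra_flags)
        bmp = face.glyph.bitmap
        for row in range(bmp.rows):
            start = row * bmp.pitch
            line = bmp.buffer[start:start + bmp.width]
            print("".join("#" if v > 127 else "+" if v > 0 else "." for v in line))

    face = freetype.Face("verdana.ttf")  # placeholder font path
    face.set_pixel_sizes(0, 9)           # roughly 7pt at 96dpi

    print("hinted (grid-fitted outline):")
    dump_glyph(face, "e", 0)
    print("unhinted (raw scaled outline):")
    dump_glyph(face, "e", freetype.FT_LOAD_NO_HINTING)

At that size the hinted bitmap tends to snap stems onto whole pixels, while the
unhinted one smears them across two.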

------
huhtenberg
It's an interesting read, but I just can't help but think that the tragedy
lies with Microsoft and not the low resolution. It's 2011 and Apple has been
doing font auto-hinting for ages, and so has Adobe. Microsoft, though, still
prefers to live in tragedy and defiantly clings to the TrueType format and
its horrendously labor-intensive manual hinting. I really don't know what's
wrong with them, but for a fraction of Vista's 1bn budget they could've
written a dozen TTF auto-hinters.

~~~
benhoyt
And in 2011, "live in tragedy" Microsoft can render fonts sharply, Apple
can't.

I'm a Windows guy temporarily using a Mac to write an iPad app, and the main
thing that bugs me about OS X is how fuzzy the fonts look. In Windows, my
fonts look ultra sharp on-screen; on Mac, fonts look fuzzy. Apparently it's
because of this: [http://www.codinghorror.com/blog/2007/06/font-rendering-
resp...](http://www.codinghorror.com/blog/2007/06/font-rendering-respecting-
the-pixel-grid.html)

But I say to Apple: unless one's monitor has the dot pitch of the iPhone
"retina display", your font rendering looks really fuzzy. <rant>What good is
"preserving the typeface design" if it makes your eyes water?</rant>

~~~
baddox
Apparently, this is subjective. I agree with you that OS X fonts look awful,
but it's clearly a deliberate design choice, and knowing Apple, I suspect
there is at least one person who thought it over and genuinely thinks their
font rendering is the better choice. I can't fathom why anyone would prefer OS
X fonts, but apparently people do.

~~~
notJim
The (IMO, terrible) font rendering on Windows is actually the second biggest
reason I strongly prefer OS X (the biggest reason is not relevant to this
discussion.)

So yes, this is quite subjective.

~~~
baddox
Can you explain why you prefer the font rendering in OS X, or do you think
it's just a matter of what you've grown used to? I think fonts in OS X look
notably fuzzy, and the strokes all tend to look too bold.

~~~
sarenji
Anecdotally, there's definitely some truth to it being a matter of what you've
grown used to. I moved from an old Windows desktop to a Mac OS X laptop. I
found the fonts on Mac OS X to be exactly as you described -- fuzzy, bold,
fairly annoying. After a month of heavy usage, I stopped noticing it. When I
visited my old Windows computer a year later, I realized I perceived all the
fonts as incredibly jaggy, thin, and hard to read. But after a while (1 week)
I stopped noticing as much.

~~~
pbz
I have a similar experience, but while I can get used to the blurriness, and
after a while I don't notice it as much, going back is always a breath of
fresh air. I imagine it like putting your glasses on - if I had to wear any -
everything just aligns right. In fairness this happens on Windows as well if I
have ClearType turned on, but not quite as much. Mind you the MBP has a higher
dpi than my desktop monitor, and the blurriness is still noticeable. On a
different laptop that has Linux running, slightly higher dpi looks good; same
with my Android phone.

No matter how long I use font anti-aliasing, without some very high dpi my eyes
just cannot adjust, cannot totally ignore the blurriness. Seems like I'm in the
minority though.

------
micheljansen
Fascinating piece of history. I wonder if this will all become relevant again
someday for some new kind of display technology that trades pixel density and
color depth for some other desirable property (such as e-ink).

------
WalterBright
What I find interesting is that the subtitles on DVDs, like "Deadwood" circa
2007, use a blocky font that looks like it was taken from a 1983 PC CGA card.

Closed Captioning fonts are hardly better.

~~~
__david__
That's probably because your DVD player is rendering them. When I watch a
movie with subtitles in VLC I get really nice smooth Helvetica.

~~~
danvet
IIRC DVD subtitles are actually a pre-rendered overlay stream. YUV (with UV
subsampling) works great for pictures, not so much for text, and the overlay
only ever exists at the original DVD resolution, so it fails at upscaling. VLC
does indeed grab subtitles from text and renders them crisply at the screen
resolution.
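
(Ignoring the YUV part, here's a hedged little demo with Pillow of just the
fixed-resolution/upscaling problem; the font path, sizes, and output names are
placeholders made up for illustration:)

    from PIL import Image, ImageDraw, ImageFont

    FONT = "DejaVuSans.ttf"  # placeholder font path
    TEXT = "Hello"

    # "DVD-style": the text is rendered once at the disc's resolution and the
    # resulting bitmap is scaled up to the display resolution afterwards.
    small = Image.new("L", (120, 30), 0)
    ImageDraw.Draw(small).text((2, 2), TEXT, font=ImageFont.truetype(FONT, 20), fill=255)
    upscaled = small.resize((480, 120), Image.BILINEAR)  # soft, smeared edges

    # "VLC-style": the text is rendered fresh at the display resolution.
    big = Image.new("L", (480, 120), 0)
    ImageDraw.Draw(big).text((8, 8), TEXT, font=ImageFont.truetype(FONT, 80), fill=255)

    upscaled.save("upscaled.png")    # blurry
    big.save("rerendered.png")       # crisp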

~~~
ars
The pre-rendered ones are called subtitles, and the text is called closed
captions.

Not all DVDs have both, although a lot do. Some have just one or the other.

~~~
__david__
It was my understanding that the difference between closed captions and
subtitles was a semantic one--closed captions are for the hearing impaired and
subtitles are for people that can hear but don't know the language.

So closed captions have stuff like "door opens, a radio is playing an old tune
in the background. Sheila: _Hello_ " but with subtitles you'd just see "Hello"
since if you can hear then you can infer everything else about the scene.

~~~
jbri
There are sort of two different definitions at work here - the semantic
definition which is as you describe, and then there's the technical definition
which has to do with the implementation.

"Subtitles" on a DVD are basically a 4-colour (3 + transparent) bitmap which
is overlaid on the video stream. But DVDs also support the closed-captioning
that was originally used for TV broadcasts, which is text embedded in the
video signal that's decoded and displayed by the television set itself.

Media playing software can extract subtitles from the latter system and
display them in whatever font you like - but you can't do the same with the
DVD-overlay subtitles, because they're just images.
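
(Very roughly, the overlay form is just an indexed bitmap plus a tiny palette.
A hedged numpy sketch of the compositing step; the palette values and array
shapes here are made up for illustration:)

    import numpy as np

    # A pretend RGB video frame and a subtitle plane of palette indices 0-3,
    # where index 0 means "transparent" and 1-3 map to real colours.
    frame = np.zeros((480, 720, 3), dtype=np.uint8)
    sub = np.zeros((480, 720), dtype=np.uint8)
    sub[400:430, 100:300] = 1  # stand-in for the rasterized subtitle text

    palette = np.array([
        [0, 0, 0],        # 0: transparent (value unused)
        [255, 255, 255],  # 1: text fill
        [0, 0, 0],        # 2: outline
        [128, 128, 128],  # 3: anti-alias / shadow
    ], dtype=np.uint8)

    opaque = sub != 0
    frame[opaque] = palette[sub[opaque]]  # overlay wins wherever it isn't transparent

There is no font anywhere in that stream, which is why a player can't swap in a
nicer one.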

------
hmottestad
A fun trick I like to do is tilt my head 90 degrees and look at the font on
the screen.

This way you realize how terrible the font and resolution really are.

------
aidenn0
Anyone else feel like your kids, growing up only with fine dot-pitch screens,
just aren't going to believe that this sort of thing had to be done?

~~~
mbell
I doubt the need for font hinting will go away any time soon. Most monitors
are still roughly 96 dpi (same as mentioned in this article). Apple's retina
display might be in the area where hinting doesn't matter as much. However,
reaching 300 DPI for a largish computer monitor is really outside the realm of
possibility today for anything but a completely custom design. For a 30" 16:10
screen you'd need a resolution of ~7575 x 4750. You'd be seriously hard
pressed to come up with a graphics card / cabling solution to handle that
resolution.
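
(A quick back-of-the-envelope check of those numbers in Python, assuming a 30"
16:10 panel at 300 DPI and 24 bits per pixel:)

    import math

    diag_in, aspect_w, aspect_h, dpi = 30, 16, 10, 300

    # Split the diagonal into width and height via the aspect ratio,
    # then multiply by the target DPI.
    diag_units = math.hypot(aspect_w, aspect_h)
    width_px = round(diag_in * aspect_w / diag_units * dpi)
    height_px = round(diag_in * aspect_h / diag_units * dpi)
    print(width_px, height_px)  # ~7630 x 4770, close to the figure above

    # Framebuffer such a panel would need at 24 bits per pixel:
    print(width_px * height_px * 3 / 2**20, "MiB")  # roughly 104 MiB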

~~~
to3m
Indeed. Also, 7575 x 4750 x 24bpp = ~103MB RGB framebuffer.

Additionally:

DPI of my 2011 laptop: 115dpi.

DPI of my 2011 monitor: 100dpi.

DPI of my 1997 monitor: 90dpi (100dpi possible at a rather ugly 60Hz)

DPI of my 1993 monitor: 91dpi

With this progress (or lack thereof) in mind, I think the pixel will be around for
a while yet. Indeed, (thanks to LCDs) outside a few small-scale high-DPI
places, pixels are more obviously visible than ever before! So why people are
so sure that the pixel grid can just be ignored, I'll never know...

~~~
masklinn
> With this progress (or lack of) in mind

IBM once produced 204dpi desktop screens (T220/T221, 3840×2400 in 22.2")

------
TwoBit
I disagree that "correct math looks wrong". You simply used the wrong correct
math. Lots of other "correct maths" could produce even worse output. And some
could produce better.

~~~
ansgri
Indeed, «correct math» seems like a strange phrase here, since interpreting the
hints is just as mathematically correct as interpreting the beziers.

