> mobile devices don’t use subpixel rendering due to having to support both vertical and horizontal screen orientations
makes no sense... (well, except that doing it properly would be harder to implement)
I recognize that fitting that into a scanline rasterizer in a device that doesn’t really have samples arranged in scanlines might be a bit tricky and that there might be less urgency to squeeze the last dregs of resolution out of a ≥300 dpi display than out of a 96 dpi one, but I disagree that this can’t be done.
Are you saying that many fonts are specifically designed for "3-column (and specific RGB order?)" subpixel rendering?
Re your second point, I originally wanted to say something like “fonts aren’t, renderers are”, but in the meantime I read the Raster Tragedy and it seems like the answer to your question is quite literally yes: hinting bytecode is in fact designed for a subpixel grid with a 1:3 aspect ratio (mostly by virtue of the font designer writing and checking it by feel using one). Shoehorning outline fonts and WYSIWYG layout onto low-resolution (≤ 150 dpi or so at 10 pt) displays is that much of a hack, and now I really, really want it to die, all the complaints about elitist designers notwithstanding. (That I find 200- and 300-dpi tablets and phones much more pleasant to work on is a mere coincidence, I assure you.)
The reason this doesn't work for PenTile displays is that the samples for the different colour channels are not offset from one another.
That's clearly the future for desktops too. At some point I would imagine subpixel hinting gets disabled entirely in desktop OSes but we just haven't got there yet.
I’m pretty sure it’s gone on macOS.
I suspect that 8K displays will be “good enough” and 16K will finally make this hack obsolete in a few years, but we’re not there yet.
Then again, I use a non-AA bitmap font in my editor, so yeah. Get off my lawn!
You can also directly edit the registry to change the gamma, greyscale/colour, and subpixel order.
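If you'd rather script it than click through a tuner, here's a minimal sketch of the idea. The key path and value names (GammaLevel, PixelStructure, ClearTypeLevel) are my recollection of what the ClearType tuner writes, and they mainly affect WPF-style rendering, so treat them as assumptions and check your own registry before writing anything:

```python
# Sketch only: verify the key path and value names on your own machine first,
# and export the key so you can restore it.
import winreg

KEY_PATH = r"Software\Microsoft\Avalon.Graphics\DISPLAY1"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    winreg.SetValueEx(key, "GammaLevel", 0, winreg.REG_DWORD, 1800)     # roughly gamma 1.8
    winreg.SetValueEx(key, "PixelStructure", 0, winreg.REG_DWORD, 1)    # 0 = flat/greyscale, 1 = RGB, 2 = BGR
    winreg.SetValueEx(key, "ClearTypeLevel", 0, winreg.REG_DWORD, 100)  # 0 = greyscale ... 100 = full colour
```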
I ended up solving this with a 5k display at 1:1.6. The smoothing artifacts are below my threshold of perception and all looks nice and smooth, while still having a 3200x1800 usable resolution. Downside: cost.
I just wish there was a 16:10 aspect ratio option.
It seems HP has re-released the Z27q... but in name only: the current model is called the "Z27q G3", but it's now just a mid-tier 27-inch 2560x1440 IPS display with no special features beyond DisplayPort 1.4 in and out ( https://www.hp.com/us-en/shop/pdp/hp-z27q-g3-qhd-display ). Originally the "q" in Z27q stood for "quad" because the 5K (5120x2880) resolution really was 4x 1280x720 in each dimension (the original "high def" before 1920x1080 took over), but now the "q" stands for "QHD", which doesn't mean quad-HD dimensions, just 4x the pixel count, which isn't impressive at all since pixel count grows with the square of the linear resolution. Grumble.
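For anyone double-checking the arithmetic behind the naming:

```python
# Pixel counts behind the "quad" naming.
hd     = 1280 * 720    #    921,600 px
qhd    = 2560 * 1440   #  3,686,400 px:  4x the pixels of 720p (2x per axis)
five_k = 5120 * 2880   # 14,745,600 px: 16x the pixels of 720p (4x per axis), i.e. 4x QHD
```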
Apple still has their own LG-made 5K monitors, but they don't play nice with Windows, and it's even worse with Apple's horrendously priced 6K monitor. Other than that, there are still very few options left for people wanting or needing a large desktop workspace beyond getting a 40-inch 4K display and running it at 96 dpi.
And then you get the exact opposite problem of macOS people trying to force macOS rendering onto other platforms because they prefer it. If you're going to advise people not to mess with these settings, don't make it apply to just your platform. Just don't touch these things unless you're going for some kind of retro feel or are foolish enough to try to do your own font rendering.
Apple's weird "font smoothing" probably has more to do with the way glyphs are artificially altered for aesthetics, anyway. Stick to the platform defaults; it's what your customers are used to. Font rendering involves lots of subjective choices and personal preferences because there's no technically "correct" way to render a font. If one platform does it differently, let it be different; don't force your own preference on everyone just because you use an iPhone/Windows tablet/Linux computer.
Funnily enough, Apple disagreed with this author and changed the way their subpixel algorithm worked years ago, so the author's preference clearly wasn't what the majority were expecting.
Apple only changed it due to the prevalence of “retina” displays. When the pixel density of your screen is high enough, sub-pixel rendering stops making sense (your normal pixels are already small enough). You can just apply normal anti-aliasing and get the same result.
This also has the advantage of removing all the super special font rendering code from your code base: no need to percolate font data through your entire rendering pipeline just so you can do sub-pixel rendering in your final compositing and scaling step. Just push normal pixels, and treat everything the same.
All of this doesn’t change the fact that there might be situations where sub-pixel rendering still makes sense. It’s just that OS X no longer supports it.
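A rough sketch of that simplification (the names here are made up for illustration, and gamma handling is omitted): once a glyph is rasterized with plain greyscale anti-aliasing, it's just an ordinary alpha mask, so the same generic blend path that composites any translucent image also composites text, and the compositor needs no text-specific knowledge:

```python
def blend_over(dst_px, src_rgb, alpha):
    """Generic 'source over' blend for one pixel; knows nothing about text."""
    return tuple(round(alpha * s + (1.0 - alpha) * d)
                 for s, d in zip(src_rgb, dst_px))

def draw_greyscale_glyph(dst, mask, fg):
    """mask: one coverage value in [0, 1] per pixel, like any other alpha image."""
    for i, a in enumerate(mask):
        dst[i] = blend_over(dst[i], fg, a)
```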
Is that why fonts all look bad when I plug an external monitor into my Mac?
I wonder if the prevalence of Retina displays across the Apple product line coupled with the prevalence of rotatable displays from iOS devices made them go towards something consistent across the board:
because 90-degree rotations really don't play well with subpixel rendering, compounded by having to handle four different orientations with a single font and have it display consistently on a single device, it makes sense to go with simplicity and take the plain anti-aliasing route, and since you have Retina basically everywhere, well, just flick the switch everywhere as well.
I seem to recall (years ago, so memory may fail me) some blog post explaining how dropping subpixel rendering also allowed for some major cleanup in the Cocoa/CoreSomething rendering code.
This is not a matter of preference but of measurable quality. Fonts on macOS look super smooth, and most Windows laptops these days still ship with 1920x1080 displays, which even with ClearType enabled just look like shit.
Linux is even worse; compared to a Mac, a Linux desktop looks like it came straight from the 90s (which is when most stuff regarding rendering seems to have frozen in time).
People using Macs generally value aesthetics, optical polish, UI consistency and elegance, qualities that are sadly lacking in the Windows world (there are at least five different ways of creating UI applications, six different applications to look in for basic system configuration, ...) and completely absent in the Linux world (as a result of everything relating to desktop Linux being underfunded and done by volunteer developers without UX design people).
What are you smoking? Most modern Linux distros out of the box have the best font rendering of all OSes. Sure, if you install some bare-bones Openbox DE it will look bad, but GNOME, and to a lesser extent KDE, look night-and-day better than Windows and macOS.
I'm one of those people, and I'm not even a Windows user. What I like about them is that on crappy low resolution displays they are sharp. This seems less the case with Win10 though.
Yes, I know about the whole font shape debate in Windows vs Mac rendering. And I don't care one bit.
When I'm using the computer as a tool to read text all day, I want sharpness above all else. I don't care if the letters aren't exactly as their designer wanted them to be.
Macs? Sorry; Apple says one size fits all these days, and you will likely have to shell out money to get it looking 'right' on an external display (i.e. higher dpi). Normally, this means just embracing the full Apple ecosystem of peripherals, which is only complicated further given Apple appears to have stopped producing external displays years ago.
You can use any 4k screen with a Mac, and it will work well. Before I switched to Linux full-time, I used to use a 24", 4K Dell on my MBP, and it was glorious. The screen wasn't even all that expensive, around €300 3 years ago, if memory serves.
The issue is that you cannot obtain rendering on a low-resolution grid that is both sharp and close to the shape in print. As such, given this constraint, I prefer sharpness over fidelity.
In my personal case, I actually use Linux on a high dpi display, so I can get both.
But when I have to use a low-resolution screen, like at work, I would actually prefer a font that is designed with these constraints in mind, such that it looks both good and sharp. For this I like bitmap fonts, but they seem to be few and far between these days. Overly fancy fonts need to be beaten into shape with hinting, but then the flow looks all broken. Or else, they're a blurry mess.
Terminus works great for my monospaced needs and Calibri (from Windows) seems to have bitmaps for low size fonts that look fairly pleasant.
I'm talking mostly about small-sized text, like for interface widgets. For larger font sizes, it seems easier to get a decent result even with fancier fonts.
Stem darkening is basically necessary to some extent for most renderers, for multiple different reasons depending on the implementation; otherwise glyphs come out far too thin. There's no "right" answer here.
Wait, that's not supposed to happen. Most likely they're setting subpixel values without taking display gamma into account. This should be fixed properly in the rendering pipeline.
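A minimal sketch of what taking gamma into account means, for a single channel (assuming a plain 2.2 power curve rather than the exact sRGB transfer function):

```python
def blend_linear_light(fg, bg, coverage, gamma=2.2):
    """Blend an 8-bit foreground over a background using the rasterizer's
    coverage, interpolating in linear light rather than in gamma-encoded values."""
    fg_lin = (fg / 255.0) ** gamma
    bg_lin = (bg / 255.0) ** gamma
    out_lin = coverage * fg_lin + (1.0 - coverage) * bg_lin
    return round(255.0 * out_lin ** (1.0 / gamma))

# Naively, black text on white at 50% coverage blends to 128, which emits only
# about 22% of white's light and makes edges look too heavy; in linear light
# the same pixel comes out around 186.
print(blend_linear_light(0, 255, 0.5))  # -> 186
```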
I wouldn't hold your breath on a rendering fix.
Apple 'fixed' it in 10.14 by disabling subpixel anti-aliasing altogether (what this post argues against).
I can't speak for you, but for my work and current eyeglass prescription, 27 inches is my starting point for comfort.
Even if I dislike it, Apple seems to have removed the global option to turn font smoothing off in Big Sur.
Sadly, the only thing that ever did this correctly was Firefox, and it no longer does it. The proper method needs alpha values maintained for each subpixel in an image, not just each pixel, and it needs a specialized blit routine to composite with the background. This was agreed by most to be too complex, and, as other comments point out, DPI is increasing, so the easiest thing is to just disable subpixel rendering.
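Roughly what that specialized blit looks like, sketched for an RGB-stripe panel with a rasterizer that hands back three coverage values per pixel (gamma correction left out to keep it short):

```python
def blit_subpixel_glyph(dst, coverage, fg):
    """dst: mutable list of [r, g, b] pixels; coverage: one (cr, cg, cb) triple
    per pixel from the subpixel rasterizer; fg: glyph colour.
    Each channel carries its own alpha, which is why the destination must be
    known at text-drawing time and a single per-pixel alpha can't represent it."""
    for px, cov in zip(dst, coverage):
        for i in range(3):
            px[i] = round(cov[i] * fg[i] + (1.0 - cov[i]) * px[i])
```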
I wonder how much these "please stop" blog posts help: they seem intended to raise awareness among developers, maybe to point the issue out to those who hadn't thought of it, but IME much of the awkwardness, especially in the UI, comes from non-tech-savvy client/user/management requirements: things must look good enough in a presentation, on their machines, and/or on common machines and systems at the time (though an argument can be made that that's the right priority in many cases).
Sure about that? I would've thought the wildly varying subpixel layouts for the various display technologies and even generations of a particular technology would have a bigger impact. IPS subpixel layout looks nothing like an early PenTile layout which is different from a modern AMOLED layout.
A citation for that statement from the OP would be appreciated.
In the past ten years, Macs (which this writer is mostly concerned with) have all adopted ultra high density “retina” displays, which… well I’m not going to say they render the arguments moot, but the changes are so drastic that they force every statement to be reevaluated.
Also, I could be mistaken, but I believe subpixel rendering was dropped from Macs altogether as of about four years ago.
And I agree, a lot has changed since the article was written. On HiDPI displays, personally I don't like subpixel antialiasing.
On Windows you can tune this for ClearType (I don't know the exact command; I usually just type "ClearType" into the Start menu) by doing a test similar to what you'd get at the optometrist's: you're shown a set of text samples and pick which one looks best, and whether things get better or worse.
If you have issues with blurry text, it could also be because of filters your monitor applies; some use a kind of sharpening algorithm. Trying to tune the blurry monitor using some of the available test websites might not be a bad idea in that case.
This isn’t relevant anymore. This flag only affects older versions of macOS.
Either way, 100% agree. Stop trying to change an OS’s text rendering, if you still are.
I work mostly in the daylight and thus mostly in light color schemes for coding (doubly so when I’m forced to use glossy screens) but I get the appeal of not having a full white spotlight in your face when coding something in bed at 3AM. But: I’d rather most of my screen be dark (eg with terminals) but still have my editor viewport to be light-background, even if it means shrinking it.
FWIW: I can see aliasing on subtitles on my 70" 4k TV, sitting a few feet away. I have excellent eyes.
That is not generally true. It depends on font hinting and grid fitting. With good grid fitting, font rendering with subpixel support disabled is sharper, as vertical lines are exactly aligned with pixel boundaries. That is why I disable it on my desktop.
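To illustrate with a toy example (purely illustrative, not how any particular hinter works): grid fitting snaps a stem's edges to whole pixels, so a vertical stem covers full pixel columns instead of two half-covered, grey ones.

```python
def grid_fit_stem(x_left, width_px):
    """Snap a vertical stem's left edge and width to whole pixels so it
    rasterizes as fully opaque columns rather than blurry partial ones."""
    left = round(x_left)
    w = max(1, round(width_px))
    return left, w

# A 1 px stem starting at x = 3.5 would otherwise cover columns 3 and 4 at
# 50% each; after fitting it covers a single column at 100%.
```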