Configuring presentable fonts for the nth time was what finally tipped me over the edge years ago: I promised myself that I would never again do another round of Linux desktop configuration until I see one that _out of the box_ provides a not-obviously-buggy presentation for every app that I use daily.
Since then I have yet to see an environment meet that bar, and have wasted no more time configuring X, fonts, compositors, themes, key binds, etc. I look forward to this changing one day, but I'm not expecting it soon. The ecosystem moves between technical targets too often to reach sufficient polish to meet the bar.
You’re asking for the impossible. When it comes to fonts, no single configuration would ever make everyone happy. Look at the screenshots for the out of the box appearance on Manjaro. I don’t see anything wrong with it compared to macOS. Yet, here we are.
Making a font change to an open source project is a good way to attract angry replies in issue trackers if you’re not careful. The Perfect Configuration (TM) simply doesn’t exist.
Oh I agree that font and weight selection are entirely preferential.
In both of these images you can clearly see bad color bleed in the title bar font, and in the content font in the after case. I'm guessing the user's main concern is that they want higher text density, which the after shot demonstrates. IMO both of these cases can be improved substantially, removing the color faults, by using grayscale anti-aliasing rather than this colorful subpixel mess.
RGB subpixel rendering hasn't looked good on any screen I've used in, what, about 10 years? The subpixel layouts of modern displays no longer produce good results with these techniques.
Why? Take a picture of your screen from close up with a modern phone and zoom in. You may see a non-RGB layout, but even if you don't, you'll almost certainly observe a pattern where the empty space is significantly taller around one boundary (often red/blue); this is related to the brightness targets monitors have been chasing for years. Since the pixels no longer align with the theoretical shape, if you can see the colors at all, they often look super trashy.
When was this? Did you try Ubuntu 18.04 or later lts? Curious what was wrong with Wayland/Gnome and fonts.
I generally replace the monospace/console font - but I do that on macOS and Windows too.
Tbh I find the lack of a decent (tiling) WM (as well as broken display on "low-res" (1440p/QHD) external displays) on macOS much more frustrating. I make do with yabai, but the hoops one needs to jump through and the resulting fragile setup aren't great.
Hm, now I came across this - so maybe there's a decent fix for that too?
The register link goes into more detail - basically Apple dropped (or never had?) fractional scaling. So for an m2 Mac the external display and internal display will look different (unlike, say with Linux/Wayland - or with an apple hidpi external display).
To be more precise, fractional scaling in macOS is implemented by rendering the entire display at 2x dpi at a resolution greater than the actual display resolution, then scaling the entire output down to the display resolution.
In some ways, this is smart - it avoids the need for applications to handle anything other than 1x and 2x modes, and it avoids the complications of handling fractional pixel sizes (how does an application draw a 1 logical pixel wide line at a 1.5x scale).
In other ways it is monumentally stupid. The macOS technique wastes resources rendering at a higher resolution than you actually need. Worse, even though the downscaling is excellent in macOS, it still has ringing artifacts and it still is blurrier than native fractional scaling.
This is personally one of the things that I hate most about macOS. Many Mac laptops ship with internal displays that default to a resolution that is scaled in this way (though fortunately not the 14/16 inch Apple Silicon MacBook Pro). That's annoying, but it's not nearly as bad as the situation with external displays. I use a UHD (3840x2160) 32" display. On Windows it's perfect at 1.25x scale. On macOS, my laptop ends up rendering at 6144x3456 and then scaling the output down. This looks very noticeably blurry, and it's a huge waste of resources.
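The arithmetic behind those numbers is easy to sketch (a toy model of the behavior described above, not Apple's actual code):

```python
# Toy model of macOS "scaled" display modes: pick a logical resolution,
# render the whole desktop at 2x into a backing store, then downsample
# the backing store to the panel's native resolution.
def macos_scaled_mode(panel_w, panel_h, scale):
    logical_w, logical_h = int(panel_w / scale), int(panel_h / scale)
    backing_w, backing_h = logical_w * 2, logical_h * 2
    return (logical_w, logical_h), (backing_w, backing_h)

# The UHD-at-1.25x case from above: macOS renders at 6144x3456 and then
# scales that down to the panel's 3840x2160.
print(macos_scaled_mode(3840, 2160, 1.25))

# At exactly 2x the backing store matches the panel, so nothing gets
# downscaled - which is why integer "Retina" scaling stays sharp.
print(macos_scaled_mode(3840, 2160, 2.0))
```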
Only Windows really gets this right among desktop/laptop operating systems.
GNOME uses the same behavior as macOS, except that the "render at higher resolution and scale down" options are hidden by default, and the scaling itself is a lot blurrier.
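For anyone who wants to surface those hidden options anyway, GNOME gates them behind an experimental mutter flag (a config tweak for Wayland sessions; availability depends on the GNOME version, so check yours):

```shell
# Expose the extra (fractional) scale choices in Settings > Displays.
# Wayland sessions only; log out and back in afterwards.
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
```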
KDE does support native fractional scaling, but there are a ton of bugs related to rounding errors causing 1px gaps and random cases where applications end up being blurry, especially on Wayland. Worse, display scaling changes often don't apply until you log out or restart applications, which is particularly annoying if you have a different scale factor on an internal and external display.
Windows on the other hand handles this relatively well, although there are still some annoying bugs on Windows 10 related to the taskbar and notifications (mostly fixed in Windows 11). There are also still a lot of Windows applications that do not handle scaling at all, which end up in blur city.
> Worse, display scaling changes often don't apply until you log out or restart applications, which is particularly annoying if you have a different scale factor on an internal and external display.
This doesn't sound right - I had GNOME/Wayland running on two external displays with different fractional scaling, and a window spanning the gap or being moved from one screen to the other looked fine?
This (fractional scaling) only worked under Wayland - but there it worked fine.
I have to wonder why you're expecting the same level of polish from an operating system (a bunch of them, actually) that has to support thousands of different configurations made by third-party vendors, as from a proprietary operating system from the same vendor that makes the hardware for it (with an extremely limited set of configurations). The reason you seem to prefer macOS and Apple is the very same reason I tend to stay as far away from it as possible.
Welcome to HN: where NixOS is evangelized with the thunderous battlecry of "People who hate configuring love configuring NixOS! If you hate configuring too, you should stop using that system you've already finished configuring and then learn to configure this new one!"
Nice guide, I hadn't seen the trick with matching and replacing specific fonts (e.g. -apple-system).
One thing I'd tweak for my own usage though is to use Inter[0] in place of TeX Gyre Heros for the sans-serif font. Inter, as its name suggests, is designed specifically for usage as a screen/UI font and has metrics extremely similar to those of San Francisco, which is the font that Apple has been using for its UIs for several years now. Inter also has broad language support with a wider variety of weights, so there's practically no situation in which it comes up lacking.
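For anyone wanting to try that swap, here's a minimal fontconfig sketch (it assumes Inter is installed; save it under ~/.config/fontconfig/conf.d/ with any name ending in .conf):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Map the macOS system-font aliases that websites request to Inter. -->
  <match target="pattern">
    <test name="family"><string>-apple-system</string></test>
    <edit name="family" mode="assign" binding="same"><string>Inter</string></edit>
  </match>
  <match target="pattern">
    <test name="family"><string>BlinkMacSystemFont</string></test>
    <edit name="family" mode="assign" binding="same"><string>Inter</string></edit>
  </match>
</fontconfig>
```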
I thought macOS did very little font hinting? I personally turn off hinting in my fonts.conf. Also, TeX Gyre Heros isn't identical to Helvetica Neue, which itself is not hard to find on the Internet.
Yes, this post is nonsense. The biggest issue is that macOS doesn't do sub-pixel anti-aliasing and hasn't for many years: the "after" picture looks nothing at all like macOS!
You'd achieve macOS style rendering on Linux by:
* disabling hinting completely (don't forget the auto hinter)
* disabling subpixel anti-aliasing (using only grayscale full-pixel anti-aliasing)
Then you need to find a way to enable sub-pixel positioning (which uses anti-aliasing to offset glyphs by fractions of a pixel). If your applications are all QT5/6 or GTK4, you're done, because this is enabled by default. Applications that use other text drawing methods (including GTK2/3 but probably others) may need additional adjustment. I think the only sure-fire way of doing it would be to patch Pango / Cairo.
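The first two bullets translate to a small user fontconfig file - a sketch, to be placed at ~/.config/fontconfig/fonts.conf (sub-pixel positioning, as noted, is up to the toolkit and can't be set here):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <!-- keep grayscale anti-aliasing on -->
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <!-- disable hinting, including the auto hinter -->
    <edit name="hinting" mode="assign"><bool>false</bool></edit>
    <edit name="autohint" mode="assign"><bool>false</bool></edit>
    <!-- no sub-pixel (RGB) anti-aliasing -->
    <edit name="rgba" mode="assign"><const>none</const></edit>
  </match>
</fontconfig>
```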
Personally, I do most of the above but leave sub-pixel anti-aliasing enabled, because it works better on my low-DPI monitor.
The advice given on the page is outright wrong for a few reasons:
* it doesn't mention sub-pixel anti-aliasing at all (and leaves it enabled!)
* the before picture appears to have less hinting than the after picture, and is therefore more macOS-like (??)
The only difference in the smaller screenshots is the change from either full or medium to slight hinting, which can be done in the font settings of any desktop environment.
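On a recent GNOME, for instance, these are plain gsettings keys (the key names below assume GNOME 41 or newer; older versions kept them under the settings daemon's xsettings plugin):

```shell
# Switch to slight hinting and grayscale anti-aliasing system-wide
gsettings set org.gnome.desktop.interface font-hinting 'slight'
gsettings set org.gnome.desktop.interface font-antialiasing 'grayscale'
```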
Something else to note: Chrome/Chromium does its own font rasterization using Skia, and it tends to not abide by font settings.
If you have glasses you're probably experiencing chromatic aberration. Not much can be done about that other than doing what you're doing and using a high-dpi display.
It's worse with extended color gamut monitors. sRGB is necessary for the color weights to be properly balanced.
> font rendering on your Linux look just as awesome as that on macOS, else read on to find out what beauty you have been missing.
It's probably me, but font rendering sucks on macOS. The out-of-the-box experience on external screens that are not HiDPI is worse than Windows XP. I needed all kinds of small tools and the command line to make macOS not do a terrible job.
The macOS forums have loads of complaints about this; of course almost all are answered with: "well, you got used to hi-dpi on your mac, that's how bad it looks on low-dpi screens, you should buy a Thunderbolt/5K screen".
Btw, this used to be fine when I used Mac OS X ca. 2012-2015. Subpixel anti-aliasing has been explicitly disabled since macOS Mojave [1]
Whenever I use macOS with a non-HiDPI display I miss the old subpixel rendering.
And I also wish Terminal (not to mention XQuartz) rendered bitmap fonts properly (sharply) with antialiasing disabled. iTerm2 does it correctly, but it's sluggish compared to Terminal.
Fortunately I use HiDPI 99.9% of the time.
But I still wish they'd fix bitmap font rendering.
It's so bad that I gave up on using an old Macbook Pro with an external monitor. I use a cheap old Windows PC instead, which suffices for this particular use.
Why, oh why, are the before/after images cropped differently? The wiki page has a completely different scroll position! I want to actually use the images to compare the font rendering. I would also like to see a screenshot of macOS to show the target look the author is going for.
It's at least partly a matter of taste. I personally never liked the font rendering on OS X (well before retina became a thing); it just never really looked right to me; too thick and too much blurriness.
Looking at the screenshots in the article, I personally prefer the "before" version.
On my Linux machine I use Deja Vu fonts for pretty much anything. I very intentionally don't install "Windows compatibility fonts" and the like. In my opinion it looks great, but someone else might have a different opinion.
I also used the classic "Xterm fixed" fonts for a long time and thought that looked great too (I switched to DejaVu Sans Mono a long time ago, mostly for better scaling) which still has a group of dedicated "fans" today, but many would probably consider it horrendous.
One thing I will say that's objectively bad is fontconfig configuration, which is quite arcane.
I'm right here with ya. I read other comments about how "bad" the Linux desktop experience is, but my little xfce workstation has a consistent dark theme across all apps and perfectly legible fonts with very minimal effort -- and it was all done in the first day of setting up my OS. I don't discount anyone's preferences; I just personally don't understand it.
There are no default Linux fonts, unless you count console fonts drawn on a framebuffer. Text rendering on Ubuntu looked quite different to that on Fedora last time I looked at both.
When I said "default Linux fonts", I meant font rendering. Whatever you get without a default /etc/fonts/fonts.conf or user fonts.conf file is what I consider "default".
It's a meaningless statement because the Linux kernel does not render fonts or include anything like fonts.conf. Linux distributions are very different in how they configure and patch things like fontconfig/harfbuzz and what version of libraries they ship.
I remember using Mac OS X well before Retina, and fonts were better than on both Windows and Linux at the time.
Then Windows improved a lot (and I assume Linux did too, but honestly I stopped using Linux on the desktop a long time ago, so I don't remember), while macOS support for low-resolution text stagnated. I believe you if you say that fonts may render better at low DPI on Windows and/or Linux than on a Mac.
But I have to nitpick your phrasing that macOS hasn't been designed to run on non-HiDPI: that can't be right, since a design is not something you just change due to neglect; a design is something that stays forever and you see vestiges of
> a design is something that stays forever and you see vestiges of
No, that’s not how it works - this is engineering, not Deepak Chopra. Apple is not big on backward compat and legacy, and when a compromise has to be made to favor either their current lineup or legacy/3rd-party hardware, they always choose the former.
For instance new versions of macOS simply don’t support subpixel AA anymore because no standard Apple config involves non hidpi displays. Additionally, macOS renders via framebuffer scaling, not flexible UI scaling (with some good reasons for this compromise), but it’s a compromise and non retina font rendering suffers for it.
Correctly configured Windows and Linux offer superior font rendering on non HiDPI and Linux is easily tunable to be subjectively better on any config.
There was a distinct change to the font rendering sometime around macOS 10.14 (Mojave). I use the same 1440p monitor for computers running 10.13 (High Sierra), 11 (Big Sur), and Windows 10. I far and away prefer High Sierra to the others, with Windows 10 slightly better than Big Sur.
I don't have a citation handy, but I believe this was a conscious design choice to favor HiDPI displays.
Yes, they disabled RGB subpixel antialiasing because it's not necessary with 2x UI scale on high PPI displays. It's also helpful for using iPads as secondary displays with Sidecar, because the user might have the iPad oriented in a way that runs counter to typical RGB subpixel layouts which would cause text fringing artifacts (which is also why iOS/iPadOS have never had subpixel antialiasing).
Windows used to have hinted fonts that respected the pixel grid. OS X used antialiasing. As a result, Windows looked much sharper, but chunkier. For creating media that was going to be rendered to print, the Mac approach was better, because the exact size of text was the same on the screen as it was printed. Personally I preferred the Windows approach. With HiDPI it doesn’t make much difference either way.
I find the cutoff to sit around where 2560x1440 on 27" is. Anything lower has always looked less than great under macOS and has looked bad ever since they disabled RGB subpixel AA by default.
> A 27" 5K is 218 PPI, which is the minimum pixel density for desk work, IMO.
I completely disagree with that. 110 dpi is totally fine for desk work. Always has been and will probably continue to be for a really long time.
I'd take my 38" doing 3840x1600 at 110 DPI over a 27" 5K (or even two) doing 218 DPI any day.
Now if you do photography or work on designing bills (as in money), then you'll want higher DPI. But for "desk work", there's zero need for anything more than 110 DPI. It may be "nice" to have retina fonts, but it's by no means a requirement.
I'd even say the real world is powered by something like 95%+ of people doing "desk work" on 110 DPI (or less) screens.
Also, if 5K / 218 PPI is the "minimum pixel density" for desk work, what's the minimum density for creatives?
You're allowed to disagree with that, I stated it as my opinion. It's what works best for me for the type of work I do (programming, design, photography, video).
By "desk work" I mean the type of work where the physical distance between your eyes and the display is ~50cm. Mobile devices like phones have higher pixel density because they are held closer to the eyes (~25cm) than a computer monitor on a desk. Angular pixel density is the important factor here.
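To put numbers on that (back-of-the-envelope math; display sizes taken from this thread, viewing distances as above):

```python
import math

def ppi(w_px, h_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diagonal_in

def pixels_per_degree(ppi_val, distance_cm):
    """Pixels covered by one degree of visual angle at a viewing distance."""
    distance_in = distance_cm / 2.54
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi_val

print(round(ppi(5120, 2880, 27)))        # 27" 5K: ~218 PPI
print(round(ppi(2560, 1440, 27)))        # 27" QHD: ~109 PPI
print(round(pixels_per_degree(218, 50))) # 218 PPI desktop at ~50 cm
print(round(pixels_per_degree(326, 25))) # 326 PPI phone at ~25 cm
```

By this measure a 218 PPI desktop viewed at 50 cm actually has a higher angular density than a 326 PPI phone at 25 cm, which is exactly the point about angular pixel density.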
As kitsunesoba mentioned, I find >200 PPI much easier on the eyes when reading and writing text, my eyes don't get nearly as fatigued as they do on low PPI displays.
I think it's one of those things that once you get used to high PPI displays, it's really hard to go back to low PPI. Have you looked at a pre-Retina iPhone or iPad recently? Would you take that over the high PPI display common on every mobile device today?
Around 218 PPI is what I prefer too, but desktop displays with that pixel density are unfortunately expensive and uncommon. Only nearly a decade after Apple released 5k iMacs are similar monitors starting to become more available.
The default fonts with a base Linux install are at best terrible; not Comic Sans terrible, but websites look off compared to other operating systems.
Fortunately, you can remedy this by installing better fonts. Some people will even go so far as to install all the fonts from Windows and macOS to get the right look wherever they go.
Yeah? As someone who has been trained in typography, and who even learned how to literally cut letters for casting, I must say I have no complaints about the fonts on my KDE desktop.
Lots of people have non-Retina external monitors. I use a Mac Mini with a 46" LG 4K for most development work, and there the poor font rendering really shows.
Fortunately, I have found that you can enable subpixel AA in browsers and in CSS stylesheets, and that solves the problem. Just weird that subpixel AA is not on by default on non-retina displays.
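For anyone hunting for the same knob: the CSS property involved is presumably the non-standard WebKit/Blink one below (a guess at what the parent comment used; support and behavior vary by browser and platform):

```css
/* Non-standard; WebKit/Blink only, and a no-op where the
   platform does no subpixel anti-aliasing at all. */
body {
  -webkit-font-smoothing: subpixel-antialiased;
}
```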