The sad state of font rendering on Linux (pandasauce.org)
181 points by iBelieve 15 days ago | 167 comments



I prefer Mac to Windows. This is a subjective thing. Mac and Windows are optimized towards different use cases. Mac is optimized towards WYSIWYG display for desktop publishing. OS X displays fonts exactly as they'd appear on a page. Windows makes adjustments to character positions to align them to the pixel grid. This results in clearer text, but at the expense of things being slightly off as they'd appear in print.

Ultimately, different strokes for different folks. While you're not wrong that ClearType is empirically better for most people for text content, it is objectively worse, by design, for those for whom positional precision matters.

> This occurs because Apple didn’t dare go near any ClearType patents that Microsoft got for their rendering techniques. For decades OS X remained a very ugly baby, until in 2015 they just gave us courage HiDPI in form of Retina. This was a bid to make all hinting technology obsolete and put everyone else to shame.

This is just wrong. Apple intentionally chooses to display fonts the way it does. This is quite the extraordinary claim -- that a company that is otherwise known for design would not focus on its fonts.

With regard to the FreeType font size rendering: you claim that FreeType doesn't scale fonts smoothly, but then you also claim Windows does it correctly. Just FYI, Windows most certainly does not scale fonts linearly. OS X does, of course, given its focus on publishing. Windows ClearType actually changes the font size in order to make text readable. If linearity of font scaling is a metric by which you measure font engines, then ClearType is a failure.

In case anyone doesn't believe the claim above, this article (https://damieng.com/blog/2007/06/13/font-rendering-philosoph...) goes into it in depth.


> I prefer Mac to Windows. This is a subjective thing. Mac and Windows are optimized towards different use cases. Mac is optimized towards WYSIWYG display for desktop publishing. OS X displays fonts exactly as they'd appear on a page. Windows makes adjustments to character positions to align them to the pixel grid. This results in clearer text, but at the expense of things being slightly off as they'd appear in print.

I strongly prefer Windows to Mac for font rendering. As you say, it's subjective. However, in today's world, the overwhelming majority of my computer use is screen-oriented. Printing accounts for a vanishingly small set of use cases for me, easily lower than 1%. And if the fonts were a few pixels off when printed, would I even notice? No. I'm not doing typesetting when I print; I'm merely creating a hardcopy of something so that I can read it when I'm not at a computer or printing a legal document to sign.

In 2019, it seems odd to prioritize print layout at the expense of screen legibility. When I use a Mac (on a normal ~100 dpi screen), the text looks softened with bad anti-aliasing and I feel like I've stepped through a time machine to the amazing year 2000.

On high-DPI displays the difference is less noticeable. But even there I think Windows does a better job at legibility; I find text more readable on-screen on a modern Surface than on a comparable Mac laptop.


It's not about whether you're actually printing it; the print version is just an accurate reference for the correct shape of how the font is designed.

That shape doesn't necessarily line up on the pixel grid, so you can smush it around to get sharper edges on a computer screen at the expense of changing the shape/size/spacing of the font. Or you can follow the designed shape more correctly at the expense of smearing some stuff across multiple pixels.

There are pros and cons to each, and which one you prefer is where it gets subjective.
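A toy sketch of that tradeoff (my own illustration, not any real rasterizer): a glyph stem that doesn't land on the pixel grid is either smeared across pixels by grayscale anti-aliasing, or snapped to integer coordinates by hinting, which changes its width and position.

```python
# Toy sketch: a vertical stem of a glyph, rendered as per-pixel
# coverage along a 1-D row of pixels.

def coverage(left, width, n_pixels=6):
    """Fractional area of each pixel covered by a stem spanning
    [left, left + width) in pixel units (grayscale anti-aliasing)."""
    row = []
    for px in range(n_pixels):
        overlap = max(0.0, min(px + 1, left + width) - max(px, left))
        row.append(round(overlap, 3))
    return row

# The designed shape: a 1.2 px stem starting at x = 2.4.
unhinted = coverage(2.4, 1.2)

# "Hinted": snap position and width to the pixel grid first --
# sharp edges, but the stem is now in a different place with a
# different width.
hinted = coverage(round(2.4), round(1.2))

print(unhinted)  # coverage split across two pixels at 60% gray
print(hinted)    # exactly one fully black pixel
```

The unhinted row is the "blurry but faithful" rendering; the hinted row is the "sharp but distorted" one.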

Some comparisons and discussion over here: https://damieng.com/blog/2007/06/13/font-rendering-philosoph...

The Windows text has sharper edges but it looks funny to me in comparison.


That comparison is very enlightening. The Windows version is very sharp, but the letter forms are completely distorted and it looks like it’s switching typeface as the point size changes, or even within the same sample. Look at the horizontal lines in the “z” in the larger sizes, they are very clearly too thin and don’t harmonize with the surrounding letters. Look at the relationship between the top of the “u” and “a” in “quartz”. The straight lines in the “u” serifs end at the exact same height as the curved top of the “a”, making the “u” appear, visually, to extend higher. You can clearly see in the Mac sample that there is some antialiasing on the “a” to fix this issue.

I can understand how years of Windows will make the Mac sample look fuzzy, but I’m convinced the correct and harmonious word shapes on the Mac are much more important for reading flow.


Back when Apple had Safari on Windows, I tried it out. I remember noticing the blurrier text right away, and wondered what happened. No, I wasn't imagining things, as Firefox had much sharper text. I couldn't stand it, and I stopped using it after 5 minutes.

Years later, I stumbled across Jeff Atwood's series on the topic.

https://blog.codinghorror.com/whats-wrong-with-apples-font-r...


Yes, I remember Safari for Windows also, and FWIW I preferred it. Of course in those days, I had CRT monitors so none of this really mattered, everything was slightly blurry anyway.


> In 2019, it seems odd to prioritize print layout at the expense of screen legibility. When I use a Mac (on a normal ~100 dpi screen), the text looks softened with bad anti-aliasing and I feel like I've stepped through a time machine to the amazing year 2000.

In 2019, what seems most odd is insisting that there is 'one best way'. The fact is that I fall into that vanishingly small base of users that notices the colors in windows fonts. I find Mac and Linux easier. I like the Retina displays. I'm not sure how you can argue against my preference for them. I did not say you should not have fonts you can read; merely that there is no one way that ought to be said to be better.


Yes, my recollection was that Apple was the first to market with vector fonts, hinting, and sub-pixel-rendering, and not by a small margin. I had to interact with their lowish-level C API that drew this stuff in the early 2000s -- it was definitely doing sub pixel rendering at a time when the competition was, uh, Windows XP.

Retina officially happened in 2013, not 2015, and I distinctly remember playing with scale factors in 2008 and being frustrated by a search for competitive panels on non-Apple laptops in 2009, so they've been pushing high resolution panels far earlier than 2015, which is exactly what you would expect from a company in the premium niche.

So yeah, OP is doing some sketchy revisionist history to dunk on Apple, but if Apple thought it was OK to release dark themes without updating SPR, hinting, etc to work on non-white backgrounds (which I suspect is the real issue here) then they do need a wakeup call.


Windows XP supported ClearType for sub-pixel rendering. It was off by default, I think because there was a slight performance impact and it didn't give good results on some older displays available at the time.


Then I'm guilty of the same thing as OP: I saw in one particular instance that XP wasn't doing SPR and assumed it wasn't capable of it. OSX was definitely doing it at the time, though, because I had to deal with a visual issue that revolved around it, and it was everywhere by default.

EDIT: Did some googling. This is a bigger, older, juicier drama than I knew. It even looks like Nathan Myhrvold and Woz were involved!

https://archive.nytimes.com/www.nytimes.com/library/tech/98/...

> Cleartype was suspiciously similar to approaches developed many years earlier by researchers at Apple Computer, IBM, Xerox Corp. and elsewhere.

> Gibson decided to stake his own claim to history by creating a Web page to document what he calls the original pioneering work in sub-pixel technology by Steven Wozniak, designer of the Apple II computer.

I believe this is the site: https://www.grc.com/ctwho.htm


There's a blast from the past.

I remember downloading that GRC demo to prove to a friend it should work on any laptop, not just the Mac.

I have a vague memory that SPR on Windows XP was very oddly integrated when it did arrive. You didn't download an update or utility, you went to some Microsoft page that turned it on, and tuned it for your monitor. So for most people it may as well have not existed.


Too strange to believe: combinations of ultra-aggressive management, occasional engineering brilliance, and team-sports engineering efforts. Before the Internet, paper print was one side of mass consumer applications, a.k.a. the big money. People here know that super-smart father-figure founder John Warnock literally burst into tears on stage as bitter rivals (MSFT, AAPL) turned buddy-buddy behind closed doors to knock Adobe off their type perch, right?


I believe it. I haven't heard the story told from that angle yet, though.


>> Retina officially happened in 2013, not 2015,

2012, actually.


> This is quite the extraordinary claim -- that a company that is otherwise known for design would not focus on their fonts.

It shouldn't be _that_ extraordinary. Much of my life experience has taught me that people/groups known for a 'thing' reinforce their prestige via a sort of circular reasoning. This can cause people to gloss over frayed edges. Sometimes dubious decisions are even taken as the "right way" moving forward! If the paragon of cheese says acrid, fishy cheese is great cheese, there's probably something to it, right?

OSX, depending on the version, has an uneven margin here and there, or a weird aliased image during setup. It's pretty clean all things considered, but Apple's reputation is a force multiplier and a floor on how their designs are appraised.

Hot take: The touch bar is goofballs. Going from a magnetic connector to USB-C is a usability regression. Removing the headphone jack is user hostile.

"Courage" is an interesting way to describe those decisions. It's either that, or they're taking their reputation defense shield for granted. Eh, if cracks start to appear, they have enough money to do things right for a while.


> Going from a magnetic connector to USB-C is a usability regression.

It's a trade-off that I'll take for having an industry standard power supply and a replaceable cable. Used to be if your power cord frayed at either the MagSafe end or the brick end you were out $80 for a replacement.

Now you can buy a 60W USB-C power delivery brick with 3 bonus USB-A charging ports for $25 and if your cable ever goes bad it's a cheap replacement from Anker or similar. And I doubt you'll ever need to replace that cable because Anker is apparently better at making them than Apple ever was.

I have one of these under my desk with cords for charging my laptop and phone/tablet: https://www.monoprice.com/product?p_id=24425 Monoprice has frequent sales for even cheaper.


> Hot take: The touch bar is goofballs. Going from a magnetic connector to USB-C is a usability regression. Removing the headphone jack is user hostile.

These are takes a lot of Apple users would, to varying degrees, agree with, so I'm not sure how "hot" they are. :) You can make a case for all of those choices, too, but despite the reputation us Apple fans seem to have, most of us just aren't going to say "I hate this, but because Apple did this, I must actually love it." In fact, we tend to be very, very vocal about things Apple does that piss us off, and most of us have a list of such things that goes back decades.

(Personally, I think removing the headphone jack is user hostile on the iPad Pro but not the phone; I think MagSafe is superior from a UX standpoint to USB-C, but USB-C has some advantages when you have USB-C ports on opposite sides of the laptop, a test several of Apple's USB-C-only gadgets fail; and the Touch Bar is a solution in search of a problem.)


Right - Apple even made a selling point of the Quartz display rendering engine using PDF-compatible models internally (in much the same way that NeXT used PostScript-compatible models). It was designed specifically to enable true display-to-paper printing. No other platform comes remotely close to that yet.


But who cares about that when you’re using a computer screen and it all looks like a blurry mess?


Let’s try an analogy: most small loudspeakers are unable to reach good volume without distorting the sound. You have a choice between playing music that sounds as intended but at a very soft volume, or cranking up the volume and making it sound like a noisy buzzsaw. This is analogous to the choice between type that looks like it is supposed to but is a blurry mess, or type that looks sharp but has crazy-pants distorted letter shapes.

Arguably there’s a place for each of these approaches. If you have an extremely low resolution screen, or underpowered speaker, it makes sense to accept the distortion to be able to use the screen/speaker at all. But if you have fairly good speakers or a high resolution screen it just doesn’t, because the volume is loud enough as it is, and the blur is actually pretty sharp.


I totally agree. Just the other day I was chatting with a friend about how I was considering getting a higher-resolution display for my PC because text rendering looks so terrible next to my iMac. He told me not to bother, because he has a higher-res PC display and it still looks bad on his desk next to his Mac.


I've been using 4k displays on my laptop and desktop for years, and they look fantastic.

It can be annoying to tweak scaling in Windows at first, but once you get the right setting, you never have to change it again. It takes about 10 min of setup (if it takes any setup at all).


Fonts are _never_ scaled "linearly." A 5 point font magnified to twice its size will look different than the same font at 10 points. See https://alistapart.com/column/font-hinting-and-the-future-of... for a discussion. (And look in Knuth's "Metafont")

Windows handles fonts correctly. And ClearType is an option you can turn on and off. It makes fonts look "more correct" on many types of displays. Are you saying it's better not to have that option available at all?


> Fonts are _never_ scaled "linearly."

Correct, but when printed on paper (using movable type, for example), the width of left-aligned text set in the same font at a continuously increasing point size typically has a nice 'linear' slope on the ragged right edge. ClearType does not have this: it changes the text width drastically between point sizes close in magnitude.

The author criticized FreeType for having a non-linear apparent font weight as the font point size is increased linearly. The author said this was bad without much justification, while simultaneously claiming ClearType is better, despite it having the same issue of font metrics being discontinuous over a continuous point size.
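A toy illustration of that ragged-edge effect (the advance widths below are made up, not real font metrics): rounding each glyph advance to whole pixels, as grid-fitted metrics do, makes the total line width jump unevenly between adjacent point sizes, while fractional advances grow by the same amount every step.

```python
# Hypothetical per-glyph advance widths (in pixels at 1 pt) for a
# short word -- toy numbers, not taken from any real font.
ADVANCES_PER_PT = [0.55, 0.52, 0.48, 0.60, 0.50]

def line_width(point_size, snap):
    """Total width of the word at a given size. With snap=True each
    glyph advance is rounded to whole pixels (grid-fitted metrics)."""
    widths = [a * point_size for a in ADVANCES_PER_PT]
    if snap:
        widths = [round(w) for w in widths]
    return sum(widths)

for pt in range(9, 14):
    print(pt, line_width(pt, snap=False), line_width(pt, snap=True))
# Fractional widths grow by the same amount at every step (linear);
# snapped widths jump by a different amount between adjacent sizes.
```

That uneven growth is exactly what shows up as a non-linear ragged right edge in a size-ramp specimen.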

> It makes fonts look "more correct" on many types of displays. Are you saying it's better not to have that option available at all?

Not at all. Options are great. However, saying something is objectively better when in fact there are multiple use cases for which it is better and many for which it is worse is disingenuous, and that is what the author is doing.


I'm not sure what "more correct" means in this context. Coming from a MacOS background, it is my opinion that ClearType makes fonts look worse.

Regardless of my personal opinion, it's clear that other people share this opinion. For instance, Word doesn't use ClearType; they had problems with text on different colored backgrounds.[0][1]

[0]: https://en.wikipedia.org/wiki/ClearType#cite_note-Word-3

[1]: https://blogs.msdn.microsoft.com/murrays/2014/05/31/crisp-te...


Mac on the left, Windows with ClearType on the right (Adobe InDesign):

https://imgoat.com/uploads/74ad4786c3/203900.png


Can you disable ClearType and still get antialiased font rendering? From what I remember it's either ClearType messing the fonts up or essentially getting no smoothing at all, and a quick googling seems to confirm this.


Yes. I just turned it on and off, and I still get antialiased font rendering, but without taking advantage of sub-pixel rendering. You have to tune it based on your LCD screen geometry.



To me, all fonts on OS X are different shades of bold. Cannot stand it.


“What they didn’t account for is that an overwhelming majority of legacy software had 96 DPI hardcoded into it, as well as the problem of mixed use of HiDPI and non-HiDPI displays where you need to somehow scale the entire UI back-and-forth when moved between displays. Since Windows never had the font rendering problems that Apple did, majority of the computer market share didn’t back the ultra-expensive 4K screens even to this day (2018). Software maintainers too didn’t buy into the idea of massive code rewrites and UI tinkering to support DPIs other than 96. As of today, 4K experience is still a mixed bag that solves one simple problem for Apple, but introduces multiple complex problems for everyone else.”

Uhmmm.... [citation needed]?

I don’t have a single Mac app that assumes anything about 96 dpi and pretty much the entire Internet is 2x now.

Windows on the other hand... yes. That’s a clusterfk in HiDPI.


> I don’t have a single Mac app that assumes anything about 96 dpi...

Yeah, this is the point where I was like: I'm sorry, but this seems so wrong that it casts the whole article into doubt for me. Anything written in the OS X era doesn't have any assumptions about DPI baked into it -- and if any application in the "classic" MacOS era baked in an assumption about DPI, it would have been 72, not 96, because Apple stuck with a "1 pixel = 1 point" mantra in the (very) early years. One of the entire points of OS X's Quartz rendering system, which was basically "Display PDF," was to be effectively device-independent. There were no enforced assumptions about resolution.

There are arguments to be made that ClearType is a better technology for rendering text on displays than Quartz, but it's folly to pretend that there's not subjectivity to it. Personally, I've long preferred Quartz's rendering approach in nearly all cases; the text is arguably softer, but particularly on non-HiDPI displays, the letter spacing with ClearType often seems subtly off in ways that I find more distracting.


> Personally, I've long preferred Quartz's rendering approach in nearly all cases.

All cases? Have you ever compared them side by side on a low-DPI monitor? I have an old but high-end 24" 1080p monitor that I bought for its color fidelity. When I'm in OS X, the fonts are atrocious compared to Win 10. Yet I prefer OS X's fonts when they are displayed on the 4K screen.

Apple moved to HiDPI long before Windows PCs, so you need a very old Mac or an external monitor to see what he's talking about.


Why was this downvoted? This paragraph was one of the worst in the article.

> Since Windows never had the font rendering problems that Apple did

Uhm, I cannot reliably use Windows with any sort of scaling. It's not as bad as Linux, but it's still very bad.

> Software maintainers too didn’t buy into the idea of massive code rewrites and UI tinkering to support DPIs other than 96

(on Windows)


I presume the author downvoted me, which is fine. But that entire paragraph shows a misunderstanding of the Mac world — it’s just not accurate.


> Uhm, I cannot reliably use Windows with any sort of scaling. It's not as bad as Linux, but it's still very bad.

That seems like a massive exaggeration. While there are still rough edges, running Windows on my MacBook's Retina screen has been just fine for the majority of popular apps, including mixed HiDPI/normal-DPI use cases.


Doesn't that just reiterate my point? "While there are still rough edges" == "I cannot reliably use windows with any sort of scaling"

Take QGIS, for example. It doesn't scale properly. On Windows you get a minuscule interface; on Mac it's the same size as all other apps, just blurry. I'll take that any day.


I agree. If people are still complaining about Windows fonts, the last version of Windows they ran was probably Vista. Windows has come a long way, and the entire Surface product line depends on beautiful font rendering on high-DPI screens.


> Uhm, I cannot reliably use Windows with any sort of scaling. It's not as bad as Linux, but it's still very bad.

Why? Bad font rendering, or bad app scaling? I've found most if not all apps scale well. Some of them require you to go to compatibility settings and tell Windows that yes, the app can resize by itself. After you do that, they won't look blurry anymore. An example that comes to my mind is eMule.

There are a few apps that won't scale well and widgets will look misaligned, but they are very, very rare.


> OS X objectively has the absolute worst font rendering. It does not use hinting at all and relies only on grayscale anti-aliasing. This makes everything appear very bold and blurry.

This almost reads like satire.

It's well-known that Windows prefers to distort letterforms for the sake of crispness, while Macs preserve letterforms for the sake of fidelity.

Saying that one is better than the other is entirely subjective -- there are many, many articles on the subject. There's absolutely nothing objective about it.

If it were so "objective", then it seems quite odd that Macs would be the predominant choice of graphic designers and type designers, who would be expected to care about this the most...


Steve Jobs took typefaces and font rendering more seriously than anyone else in the computer industry, except for maybe Donald Knuth. He discovered calligraphy while at Reed College and it changed him forever.

<aside> I am still amazed at how crappy MS Word is at typesetting. I write these beautiful papers in LaTeX with incredible math typesetting, and then sometimes I am asked to convert to Word -- it almost makes me cry to ruin my "art" by converting it to Word. </aside>

It is an affront to my aesthetic taste to look at a Windows UI.


Not defending Windows, but I find the common LaTeX font clumsy and hard to read.


Computer Modern is very thin. I personally prefer Computer Concrete, in a similar vein, because it is simpler and easier to read on low-DPI displays. Although I’m not sure about this, it feels as though Computer Modern was designed to take into account a bit of bleed, which is why it looks better as a high-resolution inkjet print.

Regardless of which font you use, TeX will produce a much higher quality document (esp. math) than Word, at the expense of an upfront time investment in learning to use it.


I am personally also not a big fan of Computer Modern, but I've heard graphic people praise it. It is also very easy to use a different font.


> This almost reads like satire.

After I spent a big part of my day with adjustments trying to make Windows font rendering suck less so I could get some work done, I also thought it might be satire, but it feels more like reading someone's antivaxx conspiracy theory.


I worked on an application that converts HTML to PDF, and font rendering on Windows was incredibly frustrating. The kerning made text unreadable at smaller font sizes. It's difficult for me to believe someone would actually prefer the way Windows does font rendering.


As others have noted this is a hilariously bad post, because in practice Windows (which is so enthusiastically praised here) is incredibly inconsistent in font rendering program by program, and on most non-hidpi screens looks like trash. Especially when using scaled displays (such as 1.25 or 1.5x scaling on a 4k screen), where half of software turns into a blurry, unreadable mess.


As seen even in Windows itself, including but not limited to various places in the Control Panel. The situation is even worse if you are a laptop user who sometimes connects to external displays with different pixel densities. Things will often get either upscaled or downscaled depending on the display in use at the point the application was opened, sometimes requiring the application to be closed and re-opened to render at the correct resolution or - worse still - requiring you to log out of Windows and back in again! Trying to use high-DPI displays on Windows is an unmitigated nightmare.


I think Microsoft did the best they could, since HiDPI requires the app to cooperate with the OS, and supporting three separate modes means more applications can actually work, even if it's annoying:

- Some apps are DPI unaware and just get scaled by the OS at the pixel level.

- Some apps are DPI aware but only at startup. If you move screens they need to be restarted.

- Some apps can dynamically switch their DPI and respond to the OS prods to change.
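A minimal model of those three modes (illustrative only; the names and strings below are mine, not the Win32 API):

```python
# Toy model of the three Windows DPI-awareness modes described above.
# "awareness" is one of: "unaware", "system", "per-monitor".

def render_result(awareness, startup_scale, current_scale):
    """What the user sees when a window sits on a monitor whose scale
    differs from the scale the app started on."""
    if awareness == "unaware":
        # App always draws at 100%; the OS bitmap-stretches the result.
        return "blurry (OS-stretched bitmap)"
    if awareness == "system":
        # App laid itself out once, at startup, for one scale.
        if current_scale == startup_scale:
            return "crisp"
        return "blurry until restarted"
    if awareness == "per-monitor":
        # App re-lays itself out whenever the OS signals a DPI change.
        return "crisp"
    raise ValueError(awareness)

# A system-aware app started on a 100% monitor, dragged to 150%:
print(render_result("system", 1.0, 1.5))
```

The point of keeping all three paths is backward compatibility: old apps stay usable (blurrily) instead of breaking outright.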


My big question about this is: why?

Why is Windows such a mess about this, and how did Apple manage to pull it off with virtually no fanfare?

For a while I was connecting a Retina MBP to two non-Retina monitors. Sometimes, when dragging a window from the laptop display to a non-Retina monitor I'd see a flash of everything scaled up. Other than that, I've seen none of the problems that seem to still plague Windows 10.

I'm sure the answer has to do with the history of Windows - I'd love to see someone like Raymond Chen dig deep into this.


The why has to do with how Windows and macOS approached scaling.

On macOS, to get anything above 100%, you render the whole interface at 200% (including layout), and the OS scales the resulting pixel output to the display's scale.

On Windows, every application has to detect the scale of the monitor it is being displayed on and do layout at that exact scale. (Actually, that isn't entirely true: if you wrote an app before the high-DPI additions, you always render at 100% and Windows scales it up for you, which results in a blurry mess.) It is very easy for layouts at 125% to have gaps and varying-width lines depending on where on the device pixels the coordinates land.

Adding mixed high DPI support to Sublime Text for Mac was very simple compared to adding it for Windows. On Mac the OS tells us when to render at 200%, and everything in regards to layout is then doubled. On Windows we have to re-layout and snap dimensions to device pixel boundaries to get something that looks crisp when moving a window from one display to another. The code changes were far more invasive for Windows.

We support high dpi on Linux, but haven't added mixed high DPI support yet. Gnome limits users to integer scales (1.0, 2.0, 3.0), but with tweak tools it is possible to get fractional scaling. All of those are fine, it is just when you need to re-layout in fractional scales on the fly that you have to expose the scale of the current display to the entire UI library, or sometimes have gaps and overlaps.
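A small sketch of the pitfall described above (my own toy numbers, not Sublime's code): at 125%, rounding each element's device width independently accumulates error, while rounding only the cumulative edge positions keeps every edge within half a device pixel of where it belongs.

```python
# Sketch of the fractional-scaling pitfall: eight adjacent cells,
# each 7 logical pixels wide, laid out at 125% scale.
SCALE = 1.25
CELLS = [7] * 8          # 56 logical px total -> 70.0 device px exactly

# Naive: round each cell's device width independently and stack them.
naive_edges = [0]
for w in CELLS:
    naive_edges.append(naive_edges[-1] + round(w * SCALE))

# Snapped: compute exact edge positions, then round each edge to the
# device grid. Cell widths vary by a pixel, but nothing drifts, and
# adjacent cells always share an edge (no gaps or overlaps).
snapped_edges = [round(sum(CELLS[:i]) * SCALE)
                 for i in range(len(CELLS) + 1)]

print(naive_edges[-1], snapped_edges[-1])  # drift vs. the true 70.0
```

The naive layout ends up 2 device pixels too wide after only eight cells; the snapped layout lands exactly on the true total.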


I haven’t worked on Windows in quite a while, but IIRC one problem is there are three different font renderers in there. And nobody ever goes back and fixes old parts of Windows when new ways of doing things are invented. So it’s pretty easy to end up with a screen showing as many as three different renderings of three different system fonts.

I used to have a fun way to demo this where you dive through the archaeological UI and rendering eras (including XP-style control panel, with a hideous sidebar link hover rendering bug that I don’t think ever got fixed) until you end up at the iSCSI control panel, which I think may be literally unchanged since the year 2000.


Actually, Windows 7 was better than Windows 10 here. Technically W10 is superior (a different scaling percentage per display, because every display will have a different DPI), but in practice, on W7 you had the same scaling everywhere (less ideal) but also crisp text everywhere. In W10 you get per-display scaling, but the display with the lowest scaling factor will have blurry text everywhere...


Try looking at Computer Management on a partial-scaled display (125% ish). It’s impressively bad.


This guy has some pretty weird opinions. Not only do I much prefer macOS's accurate font rendering aesthetically, the entire issue of sub pixel aliasing is moot with high-DPI screens anyway.


I look forward to the day when I can drop in a high-DPI screen to my home office setup that works just like my current non high-DPI screen, but with a higher pixel density.

(5K ultrawides to replace my 34" monitor simply don't exist yet, and the current 12" MacBook can't drive a regular 5K monitor.)


> Some users even disable all ClearType rendering and anti-aliasing, claiming that it reduces eyestrain and that anti-aliasing damages eyesight. It’s kind of like anti-vaxxing (hello from 2019 if you are reading this in the future).

I hope the anti-vaxxing comparison is only for the "eyesight damage" part; otherwise this paragraph is bullshit in its entirety. From my own personal experience, I vastly prefer anti-aliasing disabled, assuming that the fonts used have hinting that supports it. Or, even better, bitmap fonts made to be clean and crisp (as opposed to just being rasterized versions of vector fonts). Of course, if you try to use fonts that disregard hinting and aren't compatible (or even tested at all) with anti-aliasing disabled -- like the font the author's site uses, which looks like total crap on my PC -- then, yeah, compared to that I prefer AA with ClearType.

I find it annoying that you need to mess with the registry to disable anti-aliasing in Windows 10, but at least the option is still there.


SPR absolutely gives me eyestrain, though I can't blind test it because on a 1920x1200 display I can easily see the fringing.


I'm surprised there was hardly any mention of pixel grid fitting, or even just accuracy. My favorite article on font rendering is one from Antigrain Geometry[1] that goes a fair bit further in improving subpixel rendering imo.

Nonetheless, their use of the word "objective" is very annoying, since I greatly prefer grayscale with no hinting, even on Linux at 96 DPI. It's a bit blurry, but it consistently looks right: no kerning issues.

Also, it is laughable to suggest that Windows never had issues with font rendering. Just look at Microsoft Word struggling to balance accurately displaying documents against rendering the fonts with ClearType, as shown in the AGG article.

[1]: http://www.antigrain.com/research/font_rasterization/


I don't know...turn on subpixel with slight hinting, disable autohinter, enable lcdfilter and the result is pretty damn good. For monospace truetype fonts I use grayscale instead of subpixel, which I think is cleaner in the terminal... unless of course I'm using a bitmapped font. It took me quite a while to get it looking good, but ultimately I guess it's a matter of taste. Do you want to preserve the look of the font? Do you want to smash it into the grid? Do you want colors on the fringes? With Linux I can choose.
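For anyone wanting to try that setup, a per-user fontconfig along these lines (typically `~/.config/fontconfig/fonts.conf`) expresses those choices. The property names (`antialias`, `hintstyle`, `autohint`, `rgba`, `lcdfilter`) are standard fontconfig ones, but the exact values here are just one plausible combination, a matter of taste as the parent says:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <!-- subpixel rendering with slight hinting, native hinter only -->
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting" mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
    <edit name="autohint" mode="assign"><bool>false</bool></edit>
    <edit name="rgba" mode="assign"><const>rgb</const></edit>
    <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
  </match>
</fontconfig>
```

Swapping `rgba` to `none` for specific monospace families would give the grayscale-in-the-terminal setup the parent describes.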


> Do you want to preserve the look of the font? Do you want to smash it into the grid? Do you want colors on the fringes? With Linux I can choose.

I have no idea. I want it to look good. With macOS I don't have to choose.


The idea is that "look good" is subjective, case in point: you apparently think the font rendering on macOS looks good, whereas the author of the linked article thinks that font rendering on macOS looks bad and instead likes the font rendering on Windows 7.


I am of the same opinion as the author of the article. Mac fonts look blurry compared to ClearType. I usually use bitmapped fonts in Emacs and xterm, however, to sidestep the issue altogether. Fontconfig is always an option if needed. On the other hand, I really can’t tell the difference on an HiDPI display, even though I don’t use one (or want one, for that matter).

Whatever the defaults are in Firefox on OpenBSD look superb on my monitor.


> The traditional way of [installing the Windows core Web fonts] is through installing ttf-mscorefonts-installer or msttcorefonts. The msttcorefonts package looks like some Shenzhen basement knockoff that renders poorly and doesn’t support Unicode. I suspect that these fonts have gone through multiple iterations in every Windows release and that the versions available through the repositories must be from around Windows 95 days.

That does seem to be the case. Per Wiki (https://en.wikipedia.org/wiki/Core_fonts_for_the_Web):

> The latest font-versions that were available from Microsoft's Core fonts for the Web project were 2.x (e.g. 2.82 for Arial, Times New Roman and Courier New for MS Windows), published in 2000. Later versions (such as version 3 or version 5 with many new characters) were not available from this project. A Microsoft spokesman declared in 2002 that members of the open source community "will have to find different sources for updated fonts."

So while Windows and MacOS both have up-to-date versions of these fonts (Windows because Microsoft owns them, and MacOS because Apple licenses them from MS), the best Linux distributors can do is to package the last versions released before the 2002 re-licensing. (Or at least, that's the best they can do without paying Microsoft.)


There’s not enough resolution at 96dpi to render fonts correctly. You have to make compromises one way or another. Windows snaps to pixel boundaries more strongly, while Mac does no hinting to keep shapes correct.

The author, though his opinions should be taken with a grain of salt, is trying to get subpixel positioning into Chrome and GTK. That is something positive, and something Windows and macOS already have. Qt can do it too, but as of a few years ago KDE/Plasma was actively disabling it because of inconsistencies between Xlib and image rendering backends.
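To see why subpixel positioning matters, here's a toy illustration (Python, with made-up fractional advance widths, not real font metrics) of how rounding each glyph advance to the pixel grid accumulates error compared to positioning glyphs at fractional coordinates:

```python
# Hypothetical fractional advance widths, in pixels, for a run of glyphs.
advances = [6.4, 6.4, 2.3, 2.3, 6.1, 6.4, 2.3]

# Subpixel positioning: accumulate fractional positions,
# round only at rasterization time.
subpixel_positions = []
x = 0.0
for a in advances:
    subpixel_positions.append(round(x, 3))
    x += a

# Pixel snapping: round each advance to the grid before adding it,
# so the rounding error compounds across the line.
snapped_positions = []
x = 0
for a in advances:
    snapped_positions.append(x)
    x += round(a)

print(subpixel_positions[-1])  # 29.9 -- true position of the last glyph
print(snapped_positions[-1])   # 28   -- snapped position has drifted ~2px
```

Almost a two-pixel drift over seven glyphs; that's the kind of error that shows up as uneven spacing inside a single word.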


I actually do prefer font rendering on Linux to Windows. It strikes a good balance between Mac and Windows rendering.


He lost me at

> OS X objectively has the absolute worst font rendering.


"Do a controlled reading speed experiment against Windows."

Nobody will do this.


s/font rendering/wifi

s/font rendering/bluetooth

s/font rendering/sleep mode

s/font rendering/battery life

I've been using Linux for 15 years now (started on Mandrake), but those issues haven't been fixed in a decade.

I really want to support free software, so I keep using it, reporting bugs and donating again and again. Many do. We hope that like with Firefox, if we support it consistently, it will eventually overcome the situation.

But damn it's hard sometimes.


I don't use BT so I can't speak to it, but I've been dual Linux/Mac for ~20 yrs, and the linux laptop situation is spectacularly good if you use thinkpads. At least it has been for me. Dell ships some good ones too, but I haven't tried them.

On non-thinkpad/xps machines (which I'm not saying you're using or talking about, I'm just gonna assume), it's a crapshoot. But I think it's no better than running macos on a pc laptop or windows on a mac laptop (at least the last time I tried years ago, via their dual-boot thing).


Like the others, jumping in to say "it's fine" is not going to help.

I'm writing this from Ubuntu 16.04 on an XPS 13. Sometimes it wakes up from sleep with no wifi or bluetooth, and I have to force a reboot. Bluetooth doesn't work out of the box; I have to use blueman just to have a working mouse. But even with that, headphones won't work.

I had an XPS 15 before. It was no better.


I have the 2018 XPS 13, running Ubuntu. It worked pretty well on 16.04, until I upgraded past kernel 4.15. Now there's a bug where it either won't stay asleep, or when it wakes up the WiFi and Bluetooth devices are non-functional (even reloading the modules does nothing). I have recently discovered that disabling BT+WiFi via the keyboard function key before suspending avoids the problem, but it's annoying. I think it's also related to Intel making firmware changes for a Windows sleep mode that stays connected to WiFi.

I think this bug report is tracking the issue: https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1799988


I'm not trying to deny your experiences, but sometimes you just have to accept the way the world is: there's no big company behind Linux support on laptops, so you have to go where the most users are. If you want to use OSX, you buy macbooks. If you want to use Linux, you buy thinkpads. It's sad, I wish I had more hardware choice, but I accept that I don't. Windows users are the only ones that do.

For the record, I'm current using Ubuntu on an X1 carbon 4th gen and it has worked literally flawlessly since day 1, no issues with wifi, bluetooth, or sleep, and it has great battery life. My X220 before that was 98% perfect.

Thank you for reporting bugs, though!


I often use Linux on various laptops and have never had Bluetooth, wifi, or suspend problems.

And I'm not even using ThinkPad or Dells.

People talk about those issues like they happen to everyone or are very common but they aren't in my experience.


Precision 5520 (similar to XPS 15), 18.04.2, everything working perfectly.


Wifi and Bluetooth seem largely solved to me, at least on Intel chipsets. Sleep mode has been rock-solid on every Intel chipset I've had in the last decade.

Battery life has been a consistent problem, historically. However, I've moved to Thinkpads again and my T480, with the big external battery, gets a solid nine hours when working (running dev stuff in Docker, VSCode, a browser with a ton of tabs, etc.). It gets maybe ten and a half or eleven hours doing similar on Windows, and that delta isn't perfect but it's not that bad anymore.


There is always somebody who pops in to say "x works in my case". Like those "oh my zsh" users who never ever had their prompt break, or those vi lovers who "just sync their config file on GitHub and that's it".


I find that more often there's somebody who pops in to say "x doesn't work." The people who've never had an issue will seldom go and say things are working like they should.

Sure, it sucks that wifi, bluetooth, and sleep mode don't work for you - so maybe Linux isn't a good choice for you if those are important. And the Linux community should definitely work towards fixing those issues, but the same stuff happens on every operating system.

I've had hardware just straight-up not work in Windows, then I head into Linux and it works immediately - and I'm not saying that means Linux is necessarily better, it means different hardware works better with different software.

And yes, the "works on my machine" comments are useless, but IMO no more useless than "it doesn't work for me," as if that somehow means... anything? My touchpad doesn't work on Windows; does that say anything about the viability of the OS? No, my touchpad just doesn't work well with it.

Though it is admirable you continue to use Linux, most people would just go back to what works at that point.


It probably has something to do with Linux Desktop's long, long, history of evangelism and condescension. Basically nobody goes around telling people they're idiots for using Linux or Mac and should switch to glorious Windows, but you get plenty of people still to this day doing that for Linux Desktop. So then someone will say "I can't use Linux Desktop because X", and the evangelist will proceed to blame that person for not being good enough to use their personal OS of choice.

There's also the fact that Linux likes to tout how it is made by the community. If I have a problem with Windows, I blame Microsoft. If I have a problem with a driver on Windows, I blame the company that made the driver. But if I have a problem on Linux, who is to blame other than the community? And then the community turns around and blames everyone else.


> it sucks that wifi, bluetooth, and sleep mode don't work for you - so maybe Linux isn't a good choice for you if those are important.

...or maybe his hardware's not well-supported. And what's well-supported is by any metric significantly improved from, say, five years ago. If you buy fairly mainstream hardware, at least on the Intel side of things, I think it tends to be fine. My experience has led me to think that a lot of these ongoing complaints (the ones that aren't actually stale) are some weird hardware somewhere along the line.

I didn't do any sort of Linux compatibility check before buying either of my last two laptops (HP Spectre x360, Lenovo ThinkPad T480). For the ThinkPad I figured it'd be fine because Lenovo historically has had solid support, but I kind of expected there to be problems with the x360 that never manifested. I got rid of it for other reasons, mostly around HP's support being pretty lousy and demanding I go without a work machine for two weeks because a keycap had come broken from the factory, but the Linux support? Just fine.


And when talking about other platforms, you'll get plenty of "x doesn't work in my case" posts. What's your point?


That "it works on my machine" is quite possibly the most useless, middlebrow-dismissal comment that could be made when discussing problems. It working for you doesn't make it work for the other guy.


"My case" is "my current company deploys thousands of Linux notebooks" and what I am saying is that while there are certainly problems with the Linux end-user experience, it's not generally the baseline stuff anymore.

Desktops have been fine--not great, depending on your DE and your preferences, but fine--for a while now. (I'd say that Ubuntu 14.04 was the first widely stable Linux distro I ran into.) Laptops are getting there. There's room to improve, but a lot of these complaints are getting stale.


I have suffered from all those issues on non-Linux OSes as well (Linux has never been my primary OS. I’ve split between Windows and OSX). While I’m sure the fix in Linux may be a little involved, the fix on the Mac/Windows side can sometimes mean waiting months for an update.

So it’s not all that bad and all OSes have their warts.


Yeah, yeah, I know the song.

I'm always in dual or triple boot. This month I had 5 laptops just in my home.

The last time I had wifi problems on Windows was with XP, my friends with Macs always have better battery, wifi and sleep than I do, and battery life on the same machine is always better on the Windows partition.

You are just repeating what you can read on so many forums. Something that I used to repeat to encourage people to keep trying.

But it's delusional. And if we want to solve those problems, we need to acknowledge they exist.


Try a laptop with Linux installed/supported at the factory. Return when it doesn't work. Repeat.


These things have been slowly improving over time. Pretty much every item you mention (as well as font rendering) has improved over the last 15 years. 15 years ago it was common for laptops to just not wake up at all from sleep; now the problem is that some functionality doesn't wake up correctly. Not perfect, but better. Battery life has also improved a lot, though still has room for improvement. Etc.

The problem is a combination of priorities (server functionality takes precedence) and the fact that it's good enough to get work done. It will continue to slowly improve over time, and the only way that will change is if someone decides it's important enough to spend the time to work out all the issues with each of these subsystems. Given how much fun that sounds (not much), it will probably have to wait for someone with $$$ to pay to have it addressed.

Either way Linux still does the best out of the competition that I've seen. The BSDs don't do as well as Linux and there are no other real non-Proprietary options at this point.


I haven't encountered issues with WiFi in years. But that depends on the driver. Intel are quite good with their iwlwifi. If you are using some garbage driver on the other hand - then your WiFi experience will be very miserable.

This has nothing to do with Linux, but with WiFi makers who can't make drivers. So pick your hardware after doing some research first.


> So pick your hardware after doing some research first.

The big linux fallacy.

Firstly, there is no hardware that can guarantee that everything will work. None. No Dell. No System76. There are just some that have fewer problems than others.

I have many friends who say they never have any problems. I also know they lie, because I come to their homes and see them using workarounds every time for issues they don't even notice anymore, as those have become part of their workflow.

Secondly, there is no hardware that is guaranteed to keep working. Each update may break your machine in some imaginative way. The last time it did, I lost ethernet support over thunderbolt. It's like a personal chaos monkey, checking that my work ethic is resilient to random havoc.

And lastly, "This has nothing to do with Linux, but with WiFi makers who can't make drivers" is not true anymore, with most phones using Linux and having a wifi chip. The problem comes back with a different chip anyway.


If you buy hardware of those who make junk drivers - you are shooting yourself in the foot. Same applies on any OS without exception. No point to complain, if you don't want to do any research before buying.


If the drivers are so junk, why were they accepted into the Kernel? Isn't part of the point of forcing all drivers to be open source so that they can be kept to a certain standard?


> If the drivers are so junk, why were they accepted into the Kernel?

It's possible they weren't junk originally, but stopped being maintained. Or some other reason.

Also, WiFi drivers can rely on firmware, which Linux developers have no control over. Which is a bad thing, but it's quite common unfortunately.

In general, when choosing WiFi hardware, find a driver which the manufacturer actively maintains. Intel is a good example of such.


Funny, "so they can be maintained" is another supposed reason that Linux forces drivers to be part of the kernel. Seems like that justification isn't holding up to scrutiny.


That's a necessary but not a sufficient requirement for quality. The driver needs to be both upstreamed and actively maintained, obviously, to retain its quality. If you think just getting upstreamed will magically make it good without any further effort, then you think wrong.


Right, so why force everyone who wants to write a driver to upstream it? Just make a stable driver ABI they can interface with and be done with it. If their driver is good, it'll work well; if it isn't, it won't. Same as today really, but without the need to constantly recompile and force people to open their code (like that even matters because, as you said: firmware blobs).


If you don't have both - quality will suffer because developers will make any mess they want without aligning with the kernel. One infamous example is Nvidia's "official" GPU driver. All its problems are caused by it not being upstreamed.

> Just make a stable driver ABI they can interface with and be done with it.

That will also hold the kernel back. So it's better for the driver to be compatible within some version window, but not forever, even if it's provided with dkms. That's a good thing, not a bad one. Freezing driver interfaces forever would turn the kernel itself into a mess. No one does it, including Windows.


> One infamous example is Nvidia's "official" GPU driver. All its problems are caused by it not being upstreamed.

Alternative perspective: the lack of any kind of stability in the driver ABI causes all the problems with the nVidia driver.

> That will also hold the kernel back. So let the driver better be compatible for some version window, but not forever [...]

Who said it had to be forever?


> Alternative perspective: the lack of any kind of stability in the driver ABI causes all the problems with the nVidia driver.

Intel and AMD maintain their drivers upstream, and don't have anything resembling Nvidia's problems (general lack of integration). The latter are very clearly the result of Nvidia refusing to upstream their driver.

To put it differently, Nvidia can make their driver integrated, they just don't want to, and it has nothing to do with how fast kernel interfaces are evolving. Case in point: Nvidia do update the dkms-exposed parts of their driver to address kernel changes. If they didn't, their driver wouldn't work at all, even through dkms.

> Who said it had to be forever?

You said "be done with it", which implies no one will maintain it, expecting it to stay compatible forever? Otherwise I'm not sure what your point was.


I'm giving nVidia the benefit of the doubt and assuming they have legitimate reasons for not wanting to open source all their code. That makes this a legitimate conflict, Linux enforces all driver code must be open, and nVidia doesn't want to open their code. No one is at fault per-se, there are two solutions to the problem and neither party is interested in compromise.

> You said "be done with it", which means no one will maintain it expecting it to be forever compatible?

I don't know why you'd immediately jump to "forever". Nothing is forever. I just meant "a sufficiently long time". Like with Windows APIs, which still have a lot of compatibility with applications compiled 20 years ago.


> I'm giving nVidia the benefit of the doubt and assuming they have legitimate reasons for not wanting to open source all their code.

They don't, because there is nouveau already and Nvidia are being rather nasty in preventing it from fully working.

See: https://www.phoronix.com/forums/forum/linux-graphics-x-org-d...

> Like with Windows APIs, which still have a lot of compatibility with applications compiled 20 years ago.

Try using Windows drivers from older Windows kernels with newer Windows. They won't work. So it's not anywhere close to 20 years.


True about drivers, but Windows drivers do work for much longer stretches of time than Linux drivers do simply because you don't have to recompile them for every minor change to the kernel. There's no reason Linux couldn't have stable ABIs for major kernel versions other than that they don't want to.


> Windows drivers do work for much longer stretches of time than Linux drivers do simply because you don't have to recompile them for every minor change to the kernel.

So do Linux dkms drivers. Windows is just using "dkms" like approach for everything. But it doesn't have any advantage in compatibility time period.

> There's no reason Linux couldn't have stable ABIs for major kernel versions

Not sure what you are talking about. Linux does have such stable interfaces.


> Not sure what you are talking about. Linux does have such stable interfaces.

Not sure what you're talking about either. I've never had to write a Linux driver, but it is a frequent complaint that one has to recompile drivers for new kernel versions. If the ABI was stable, that wouldn't be necessary.


> one has to recompile drivers for new kernel versions.

dkms requires compilation, but it happens automatically with kernel upgrade. Read more about its design:

* https://wiki.archlinux.org/index.php/Dynamic_Kernel_Module_S...

* https://en.wikipedia.org/wiki/Dynamic_Kernel_Module_Support

* https://github.com/dell/dkms

So not sure where you saw "frequent complaints", those are likely from people who have no clue what they are talking about.


> dkms requires compilation

So not a stable ABI at all then.


Stable API, not stable ABI. Installation through compilation is normal. Those who complain about it have no clue what they are complaining about, because they don't even see it when dkms drivers are recompiled. So again, go learn about it, before saying it's a problem. Otherwise you are tilting at windmills.


I don't think you're understanding anything I'm saying if you think that an API that requires recompilation from source of drivers on every kernel release is at all comparable to a multi-year stable driver ABI.

What was this all about again? Oh right: Linux has bad support for some hardware, you blame the "driver manufacturers" for writing bad drivers, I blamed the kernel for accepting bad drivers.

Here's the crux of my argument: If it is the manufacturer's responsibility to create and maintain drivers, then they shouldn't have to open source them and mainline them because they did all the work and took responsibility.

However, the kernel devs refuse to provide a stable ABI for drivers, making this nigh impossible. They claim that this is because it ensures higher quality drivers that can be maintained without the manufacturer, but that is clearly false because many drivers are crap and not maintained despite being mainlined.

So instead of taking any responsibility for bad drivers, people like you just blame everyone else. A tried and true practice of Linux Desktop evangelists.


> Here's the crux of my argument: If it is the manufacturer's responsibility to create and maintain drivers, then they shouldn't have to open source them and mainline them because they did all the work and took responsibility.

I already said about both requirements. Only those who don't upstream or don't maintain their drivers have such problems. Remove either one - and problems are guaranteed. Stable ABI is only going to cause issues because developers won't care to fix anything thinking that "they are done with it" like you said. And review by the kernel maintainers is a good thing. There should be no trust for blobs.

TL;DR: things are working well with responsible hardware makers. So use their hardware, and avoid irresponsible ones. There is no reason for Linux maintainers to change their approach which works.


When I got my 2015 Windows laptop, sleep mode and wifi worked much better on Linux than Windows, where they were basically broken. It seems simply incorrect to say these things are broken in any real general way in Linux. It just depends.


It's crazy how consistently I, and thousands of people on launchpad.net, have been this unlucky for so long.

We really should start to take this karma thing seriously, because I keep getting answers like yours saying I'm just not seeing things properly. So I surely must be wrong.


It’s not that you’re wrong. And it’s almost certain Linux has additional issues with IO stuff due to manufacturers not prioritizing Linux drivers.

However, Windows and OSX have their fair share of problems as well. One of the differences is that in the Linux world folks largely try and find fixes themselves, so they have more postings on the internet, whereas in the OSX/Windows world the discussion basically ends at "wait for an update".


What are you comparing it to? Every Windows tech support forum, and every driver/hardware vendor support forum? There isn't a centralised equivalent for Windows so what are you using to get a sense of how many people have equivalent issues with it?


I had Ubuntu giving me 10 actual hours including a heavy IDE, a browser with lots of tabs, docker-compose with a Rails app and Postgres, etc.

(This was a gigabyte aero 15x with a 94(?)Wh battery)

I was stunned.


> but those issues haven't been fixed in a decade.

This is far too broad of a statement to make, hence why you get all the "works on my machine" posts. Your post is just as anecdotal as others', with no actual numbers or anything to back up the claim.

Anecdata - I own a Lenovo y50-70 running arch with an Intel igpu, hidpi display, and Intel wifi. Everything worked flawlessly from day one. Surprisingly, the hidpi display worked far better than on Windows and I could leave behind the telemetry, updates, and poor input latency.


No, I get those comments because Linux users are driven by an ideology and want to defend it.

The thing is, even among all those comments' authors, I know all of them are at best talking about the times when it worked for them. They all lived through the problems I'm talking about, and are filtering.

Take Arch for example. 3 years ago, I was at a friend's flat. He was using Arch because "it just works, never have a problem". And of course, when we sat down to watch a video, it didn't work. "Codec problem. Just a quick fix he had to do later. Usually it works fine."

I do training all the time. Half of my job is training people, so I see many laptops with many OSs. I always see the same issues.

It feels like a recurring joke in a TV show.


As I mentioned above, a particular Linux kernel version not supporting your hardware well doesn't mean that other hardware is equally poorly supported. That's the reason you get rebuttals, because your "doesn't work for me" is no more valid than someone else chiming in with "just works for me".

> Taking arch for example. 3 years ago, I was at a friend's flat. He was using arch because "it just works, never have a problem". And of course, when we set for watching a video, it didn't work. "Codec problem. Just a quick fix he had to do later. Usually it works fine.".

This has nothing to do with Arch. And presumably if that codec issue even happened, if you want everything under the sun installed for you out of the box you should be using something like Mint or Ubuntu. I've had zero codec problems myself, things always just worked with VLC and mpv.

> They all lived the problems I'm talking about, and are filtering

No one's saying that Linux works perfectly for all hardware. But it's not broken for everyone either as you claim.

Besides, I'd much rather spend extra 5 minutes installing tlp and tweaking sysctl for battery life than sit through hours of Windows Update installing candy crush and blocking me from doing work.


Font rendering is solvable with user configuration. Some of the options were literally forbidden from being distributed by patents, but could be installed by the user.

Google infinality. Exact advice depends on distro.

Sleep is legitimately a pain.


I've read the same comments since 2004.

It doesn't help to solve the problem at all.

At best, sometimes, it encourages you to spend a lot of time trying to find a solution that may work, or wreck your system (Exact result depends on distro).


I've never spent more than 5 minutes setting up font rendering on a new distro using the package manager and the configuration gui.

I cannot imagine how I could break my system. It's probably true that there is a ton of bad advice a quick Google away, especially since it's easy for advice to persist for many years after its use-by date.

I guess the lesson is don't enter random commands you don't understand from guides from 2008.


I used to use Windows and Linux for a very, very, very long time. Never even thought "hey, those fonts don't look quite as good as they could". Then one day I tried Mac. Period.

Oh boy, ever since I just can't use anything else because of the fonts. It visually hurts my eyes.


Font rendering is one of a handful of things that keep my Mac as my daily driver, and my PC is just for gaming (and occasional experiments with Windows dev, just to learn new things).


https://www.freetype.org/freetype2/docs/text-rendering-gener...

> It turns out that font rendering in the Linux ecosystem has been wrong since scalable fonts were introduced to it. Text must be rendered with linear alpha blending and gamma correction, which no toolkit or rendering library does by default on X11, even though Qt5 and Skia (as used by Google Chrome and other browsers) can do it.

The primary effect of no gamma correction is that light-on-dark text is too thin (and dark-on-light is thick but readable).
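For anyone curious, the math is simple to check. This sketch uses the standard sRGB transfer functions to compare a naive blend of a 50%-coverage white glyph pixel over black with a gamma-correct blend done in linear light:

```python
def srgb_to_linear(c):
    # Standard sRGB decoding, c in [0, 1]
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Standard sRGB encoding, c in [0, 1]
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

alpha = 0.5        # 50% glyph coverage at this pixel
fg, bg = 1.0, 0.0  # white text on a black background

# Naive blend directly on sRGB values (what unaware toolkits do):
naive = alpha * fg + (1 - alpha) * bg

# Correct blend: decode to linear light, blend, re-encode:
correct = linear_to_srgb(alpha * srgb_to_linear(fg)
                         + (1 - alpha) * srgb_to_linear(bg))

print(round(naive, 3))    # 0.5
print(round(correct, 3))  # 0.735
```

The naive result is 0.5 but the perceptually correct value is about 0.735, i.e. naive blending renders light-on-dark coverage much darker than it should be, which is exactly the "too thin" stroke effect.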


I used to care a great deal about font rendering, but when I got eyeglasses I had trouble learning to focus with them. To make it easier, I disabled anti-aliasing, giving the fonts maximum sharpness, and making it obvious if I was focusing correctly.

Surprisingly, I got used to this and never switched back. The most legible font is the one you're most used to, but if there's any objective measure of quality, it has to be sharpness. No anti-aliasing = maximum sharpness. It might look ugly at first, but you get used to it quickly. I recommend trying it.


OS X objectively has the absolute worst font rendering. It does not use hinting at all and relies only on grayscale anti-aliasing.

And yet OS X's font rendering being by far the best for me is a key thing that has kept me on the Mac despite all the other warts. I find text elsewhere horribly blocky. It's all very subjective, it seems. Also, I think the grayscale antialiasing is a new thing, it certainly didn't used to be the case, but maybe they switched how it worked once retina screens became the norm.


My subjective take on font rendering, based on tweaking and hinting web fonts, is that MacOS has the best font rendering (since their laptops have been using retina displays, with the exception of the Macbook Air; even here the font rendering may look a little blurry but the fonts look quite good and are readable); Linux has quite good font rendering (they may make the letters look bigger on low resolution displays, but things are nice and readable); and Windows font rendering is very uneven.

Most browsers have their own take on ClearType font rendering when rendering web fonts. While some make web fonts look quite good (Firefox, Internet Explorer/Edge), Chrome has had issues with settings which make fonts harder to read; I had to increase the weight of the font I use to compensate for this. ClearType, on the default settings Chrome used for a long time, is really great if you're rendering a Windows font like Calibri or Cambria. For anything else, the results are uneven. (I think Chrome finally started tweaking things in Windows to look better.)

In terms of the linked webpage, his comparison is unfair: He is comparing how Arial, a Microsoft font, looks in Linux compared to how it looks in Windows. Liberation Sans has the same metrics as Arial, so is not a good comparison font; he should have used something more OS-agnostic, such as Bitstream Vera Sans (DejaVu Sans if you want more languages).


For everyone criticizing his views on MacOS X, try using an external display with a lower PPI. It's not obvious on native laptop screens with 220+ppi.

On my Dell U3415W (109ppi) the issues he pointed out are very obvious. An equal sign (=) for example has a much thicker and blurrier bottom bar than the top. The rendering of the H in "History" is different than the H in "Help" in the menu bar.


Personally, the font rendering in a Windows terminal looks abysmal to me. I hugely prefer my linux terminals (out of the box).

Just my 2c


That's not a terminal, but a console, and uses ancient GDI graphics.


Even then, it depends on your configuration. I was amazed at how different the fonts looked on Fedora vs Ubuntu on identical laptops.


Weird. I just opened a PDF on my linux PC, everything displayed perfectly. I haven't done anything with fonts, everything is default Fedora install.

Linux seems to have a lot of detractors around these parts. This person obviously cares more about fonts than freedom, privacy, and respect for the end user.


I absolutely can reproduce the main issue he is describing at the end. Open https://trailofbits.github.io/ctf/ and look at "Willing". Compare with https://pandasauce.org/images/fonts/72-liberation-browser.pn.... On my linux system - where I spent considerable time making the fonts look acceptable after moving to it from Ubuntu - there is also the kerning issue in that word. That's a shame.

There might be inaccuracies and subjectivity in the article, but if really all engines have that issue that's a real problem.


If I look at 'willing' closely on that page, there is what I perceive to be about a 20% increase of spacing between the second 'l' and the second 'i' character.

I didn't notice it at first, I had to look at it very closely.


That's horrible, isn't it? The author does have a valid bug there that really is a big issue if you care about this sort of thing (some people just don't care how fonts look and can't notice kerning issues), but the whole comment thread is discussing very subjective rendering differences between Windows, Linux and OSX, where Linux isn't even bad. Though I think the author did himself no favor by not leading with that bug.


Meh. Not a huge deal, like I said, didn't even notice it. If I zoom to 110% or more, the spacing is fine.


The kerning is absolutely perfect on my system, both Firefox and Chrome, both portrait and landscape. It looks nothing like the disaster on the article's snapshot.

(Debian Stable, 3x 1920x1200 monitors, very likely I have tweaked most tweakable things at some point or another -- this system is 15 years old.)


Could you post a screenshot? If the system is very old, it's possible the bug isn't present yet: the author mentioned it's caused by modern changes colliding with hardcoded assumptions about grid placement. If those are still current on your system, it might just still work as it was supposed to.


When I say "the system is 15 years old" what I mean is that there is a continuity between the Pentium-III-550 with 128MB RAM -- running Debian 3 on a spinning IDE disk -- and the i5-4570 on my desk now with Debian 9.8 on 16GB and ZFS on two SSDs. The software has been continuously upgraded, the hardware has been occasionally upgraded; everything is up to date, but if I did something exciting to /etc/fonts.conf in 2014, that's probably still there.
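For anyone wanting to check what a decade of accumulated tweaks like that looks like: per-user fontconfig overrides usually live in ~/.config/fontconfig/fonts.conf these days (the old ~/.fonts.conf location still works on many setups). A minimal sketch of the kind of stanza involved — the specific values here are my assumptions, not the parent's actual config:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Apply antialiasing, slight hinting, and RGB subpixel rendering
       to every matched font. Adjust "rgba" to match your panel layout. -->
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
    <edit name="rgba" mode="assign"><const>rgb</const></edit>
    <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
  </match>
</fontconfig>
```

Applications pick this up on restart; `fc-match --verbose <family>` shows which values actually won after all config files are merged.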

> everything displayed perfectly

But what do you consider 'perfect' font rendering? Maybe it's a lower bar than the author.


I use Fedora and I've never noticed anything wrong with the font rendering. However, now that I've read the article, I'm guessing that's why I notice more eye strain when doing lots of reading on my computer versus the other (Windows) computer in the house.


Fedora includes fonts covering nearly every common Unicode glyph. Visiting my HN profile page, I see fewer boxes on Fedora than on Android or Debian.

Additionally, their font hinting looks good out of the box and it can be configured in GNOME Tweak Tool.

I’m impressed at the Fedora team’s attention to visual detail.


I'm probably less picky than the author of the article, but issues I do notice (on Linux; since it's the only OS I use, I don't know whether they exist elsewhere too) are:

- some websites render with a very strange font, including quote symbols '"' being very tiny (I'm not sure if that's actually intended or a Linux-specific problem), and sometimes all letters rendered in such an ugly way as to be hard to read

- symbols in mathematical formulas (some types of arrow, ...) rendered as colored emoji, even though they shouldn't be in that context, since it's not a chat program, according to the Unicode spec

- unicode chars sometimes becoming a square box, even with tons of fonts installed


> according to the unicode spec

Does the Unicode specification cover how glyphs should look and what colour they should be?


> Does the Unicode specification cover how glyphs should look and what colour they should be?

Umm... yes? Unicode includes “reference glyphs” in its character charts. They’re called “reference” for a reason: implementations of those glyphs should appear substantially similar.


I recently moved to NixOS with Sway for my desktop, and after trying Firefox on Wayland there I realized I had never seen text on Linux look so sharp.

I read the whole post but could not tell whether what he describes applies to both Xorg and Wayland.


This article is "OBJECTIVELY" bad.


I came here to bitch about the fact that Mac has "objectively" the worst fonts, when I've preferred it over Windows or Linux for years. Glad to see everybody has my back.


Really interesting read on how fonts render on different systems. However a huge reason why I went from Win to macOS are the incredibly frustrating scaling issues. At work I gave up and just set my laptop display to 100% since I’d rather deal with the occasional tiny text than always having incredibly blurry fonts in certain crucial apps like Outlook.

I do agree that macOS fonts are nearly unreadable without a Retina/4K display. However I’ve never noticed any scaling issues on macOS.


There is a way to restore Infinality-like font rendering on freetype2, which improves text readability. Linux users may find these instructions helpful:

https://gist.github.com/cryzed/e002e7057435f02cc7894b9e748c5...

Skip the "Removing the infinality-bundle" section if you don't currently use Infinality.
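A sketch of the mechanism behind instructions like that (my assumption, not the gist's exact steps): since FreeType 2.7, the library reads the FREETYPE_PROPERTIES environment variable at startup, and interpreter-version=38 selects the Infinality-derived subpixel hinting mode. Putting something like this in ~/.profile applies it session-wide:

```shell
# FreeType >= 2.7: select the v38 ("Infinality-like") TrueType interpreter.
# 40 is the current minimal default; 35 is classic full hinting.
export FREETYPE_PROPERTIES="truetype:interpreter-version=38"
echo "$FREETYPE_PROPERTIES"
```

Applications started after this is set will render with the chosen interpreter; already-running ones need a restart.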


I honestly think freetype on Linux looks much better than windows (too thin and sharp) and mac (too muddy).


A related article from AGG talking about some of the subtleties of font rasterization: http://www.antigrain.com/research/font_rasterization/


"The sad state of <minor desktop feature> on Linux"

Linux on the desktop is death by a thousand cuts.


Yep.

I just tried a three month experiment with using Linux as the primary OS on my laptop. That experiment came to an end last week, and I'm back to macOS.

What killed me was:

- wake-on-open working about 33% of the time. The other 66% of the time required a reboot.

- substantially worse battery life

- clunky handling of Exchange calendars. The major Linux applications are fine for personal use, but they really struggle in an enterprise environment.

- buggy rendering of Word docs (both OpenOffice and Abiword, but the bugs were different)

- no easy way to change screen resolution over VNC. This turns out to be important if you actually try to use screen sharing to get stuff done, and use two or more different platforms as the client.

- the straw that broke the camel's back: my linux email client worked well (for emails - not calendars, etc) for almost the entire three month experiment, until last Friday it somehow decided that my password was different from what it actually was, and insisted on interrogating the enterprise email server several times a second using the bad password. The Exchange server here is configured to lock an account after N incorrect passwords. So effectively the client auto-locked my account every time I fired it up. I still have no idea why it's using an incorrect password, and debugging it would require my sysadmin to essentially stay on the phone with me and keep hitting the "unlock account" button, or whatever he has to do. My relationship with him would not survive this, and using Linux isn't worth pissing him off.

So, back to the Mac for me. I'll try again in another five years and see if anything's improved. It wasn't any one big thing. It was a lot of little things. Bugs that probably were work-around-able, but not worth it. Little bits of friction in the office applications interfaces that made using them just a little more painful. Multiply by a thousand, and you have a substantially worse productivity environment (unless you're a software developer 100% of the time, which I'm not).


Considering the problems... were you actually trying to run Linux on a MacBook, which is pretty much a fully proprietary, undocumented platform with a bunch of Apple proprietary and undocumented power management controllers?


No. Dell XPS 15.


Regarding the VNC point in your post, I’ve experimented with this a whole lot in recent times getting a good remote desktop solution going. TigerVNC server, using its embedded X server, plus the TigerVNC viewer, work absolutely perfectly. Even including the viewer supporting proper dual head, resizing, switching and all.

Of course, if you want to use your existing X server it might not work as well, I haven’t tried it.

For most of the rest of your complaints, I use web apps for everything, even on OS X / Windows. Sad at the state of native apps on both Windows and Linux.


I can't use web apps. My org doesn't allow any proprietary data to exist on the cloud.


#s 1 and 2 are hardware. I genuinely haven't had those problems in ~4 thinkpads I've used over the years that have been linux-only.

Linux is a lot easier on g-suite enterprises than in exchange/office ones. Considering how much of the latter's going online now, maybe it won't matter at all later.


I think it depends on one's requirements.

If you want a system that can be customized heavily (for example you don't like using the mouse and want to use the keyboard for everything), Linux is a godsend.

If you want perennial consistency in your work env, it's very helpful too. For example, I've been running the same setup for 12 years, bringing it over from the old laptop every time I got a new one.

In exchange for that convenience, I have to wrestle with wifi/ethernet settings once every few years (when I change my hardware); for me it's a small price to pay.


Absolutely, the trade-off is a no-brainer to me. As a software engineer I would hate to be stuck on windows and see my productivity suffer.

I don't think Windows is inherently worse, though; it's probably great for how some programmers work or think, but to me Linux distros feel like a better match.


"I think it depends on one's requirements."

Computers are tools, endpoints these days. I have been happy with Linux but then again my needs are not super-complex.

I have a recent freetype with a defaultish configuration and the results on my small by today's standards monitors are readable.


I would not use it on a laptop, but it is my OS of choice for my desktop. Battery life, wifi, power management, etc. are all terrible. On a desktop, almost none of those things matter enough.


I quite like the fonts on Linux with my 4k monitors, subpixel smoothing, and slight hinting. Mac and Windows are also fine on a ~200 dpi monitor. I believe this is a solved problem for those who do not obsess about fonts.


The bad part is that you need to do a s/objective/subjective/g on it. Once you realize it's subjective, it's a fine, informative rant.


macOS had subpixel rendering until the most recent version (Mojave), where it's disabled by default.


Funny how you singled out Linux, a free/open source OS.

Because just a few days ago I read that fonts, as well as UI elements in general, are a complete mess in both Mac and Windows, two paid/closed-source OSes, when you try to use them with Hi DPI displays.



