MacOS-like Fonts on Manjaro/Arch Linux (aswinmohan.me)
170 points by aswinmohanme on Sept 25, 2020 | 158 comments



The headline's a little misleading IMO - this article isn't about making Linux render text like macOS does (hinting, anti-aliasing methods, etc.); it's just about replacing the default fonts.

Edit: the headline's been changed; it was "Get macOS like font rendering on Linux"


It was only a few years ago that, if you didn't want everything to look like crap, you needed to install Infinality-patched libs and explicitly configure font hinting. How far we've come.


It was also astonishingly hard to do. Turns out just because you can tell things don't look quite as nice doesn't mean you know how to tweak the Infinality settings to achieve a better look - or at least I didn't. I finally gave up and went back to the default fonts. (Ubuntu looked much nicer at the time, but even the package that claimed to reproduce Ubuntu fonts on Arch didn't quite work.)


Few years? The last time I had to install the Infinality patchset was about 10 years ago. Font rendering on Arch Linux has looked great for years now.


I was thrown by this, too, hoping to read about rasterization.


Ah, I went through this - then trying to get the trackpad to work as nicely as macOS's, then trying to get universal shortcuts that work nicely across the whole system... and then I realised I'd created a crappier version of macOS inside Linux, and switched back.

Edit: just to clarify, I'm not bashing Linux, more my use case.


There's a project working on exactly that, improved trackpad designs. Support varies between trackpads, but from what I've seen there's a lot of progress in this area.

As far as I know, shortcuts are already pretty much global. Keyboard navigation, copy/paste, open/close/save, it all pretty much follows the same Windows standards. It takes time to learn to use all the shortcuts though, and a lot of them aren't intuitive. I've had the same experience when I tried macOS, none of the shortcuts made sense and behaviour was often non-intuitive coming from a non-Apple platform.

If you're happy with macOS you should totally stick with macOS, but luckily there are people working hard on solving the most glaring issues with modern UIs. Once trackpad support improves, a lot of people might feel a lot more at home than they do right now.


Command+, to bring up preferences is surefire - does Linux/Windows have the same yet? I remember Command+W not working as consistently, either.

Word jumps and the like in Windows also felt backwards to me, doing the equivalent of vim's w instead of e, and they couldn't be consistently combined with Shift.

Support for Alt+Dpad(+Shift) and Command+Dpad(+Shift) is important IMO. So is Double/Triple-click+Drag: it should select additional units, not letters.

These last points are OS-level behaviours, but I found the features missing or inconsistent.


> Command+, to bring up preferences is surefire - does Linux/Windows have the same yet?

I just tried this in macOS (Catalina), in Chrome, while I happened to be looking at a design page on figma.com, and nothing happened.

Not so surefire I guess. To be fair - I assume the web page was capturing that shortcut. It worked on the new tab page.

The point though is that this is an app thing, not a macOS thing. If every app on Windows and Linux decided to use the same exact shortcut for opening preferences, then we'd have that. On macOS, Apple does not strictly enforce Cmd+, for opening preferences - any macOS app can use that for whatever they want.

> Word jumps and the like in Windows also felt backwards to me...

Jumping words is done with Ctrl+left/right arrow in Windows and Linux. On macOS it's Option+left/right arrow. On all systems, it can be combined with Shift to highlight the word.

> Support for Alt+Dpad(+Shift) and Command+Dpad(+Shift) is important IMO. So is Double/Triple-click+Drag: it should select additional units, not letters.

Not sure what this means, but on my Linux desktops I have absolute freedom to make my keyboard and mouse do just about anything I can dream of. Meanwhile, in macOS I am often told that if I am not loving the way that Apple has chosen for me to behave, then I must be expecting the wrong thing...


I really miss the compose key functionality from X11 whenever I use Mac or Windows. No need to remember weird number codes or open applets.


You can do very similar things on macOS. `Option+E, A` generates á, `Option+U, U` yields ü, and so forth. There are tons of alternate characters available like this using the Option key, generally covering the most-used glyphs I've needed. Beyond that, the Rocket app is fantastic for finding and inserting emojis and more complicated emoticons by human-friendly name.


That’s not the same; that’s basically just AltGr. In X you can do ‘Compose, o, o’ → °, ‘Compose, C, o’ → ©, ‘Compose, s, o’ → §, ‘Compose, p, o, o’ → «a character HN doesn’t allow (U1F4A9)» and ~6000¹ other combinations. X also has a “mac” layout variant I’d like to hear a macOS user’s opinion of (haven’t ever wanted to use it myself).

[1]: ± some depending on if you count multiple ways of creating the same character, etc. Eg. ‘≠’ can be made by combining / and = in either order. Also ‘Compose, number, s’ for footnotes, ‘Compose, +, -’ → ±, ‘Compose, <, '’ → “‘”, ‘Compose, -, >’ → “→”, ‘Compose, =, >’ → “⇒” are nice. And υɳⅰеηⅽоⅾе.
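
For anyone who wants to extend this: custom sequences live in ~/.XCompose. A minimal sketch (whether applications honor it depends on the input method in use, e.g. plain xim vs. ibus):

    # keep the default locale's compose table, then add your own
    include "%L"
    <Multi_key> <o> <o> : "°"
    <Multi_key> <minus> <greater> : "→"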


Have you checked out Karabiner[0]? I've read[1] about how you can do some pretty neat things with it.

0: https://karabiner-elements.pqrs.org/

1: https://blog.jkl.gg/hacking-your-keyboard/
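
For flavor, rules of the kind described there go under complex_modifications in ~/.config/karabiner/karabiner.json. This fragment is a hedged sketch (not from the linked post) that swaps Caps Lock and Escape:

    {
      "description": "Swap caps_lock and escape",
      "manipulators": [
        { "type": "basic",
          "from": { "key_code": "caps_lock" },
          "to": [ { "key_code": "escape" } ] },
        { "type": "basic",
          "from": { "key_code": "escape" },
          "to": [ { "key_code": "caps_lock" } ] }
      ]
    }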


I just tried Cmd-, in both Safari and Chrome, and in both cases it brought up preferences. That a website might interrupt that doesn't prevent it from being a MacOS standard.


> it all pretty much follows the same Windows standards

This is why I don't use Linux on the desktop. (About half of my servers are Linux.)

Every couple of years I'll bring out my old HP laptop and spend most of a week installing all the newest desktop environments to see how they've progressed. I'm always disappointed.

I've been counting on Linux to be the big disruptor for two decades now. But more and more it's just becoming the poor man's Windows, by aping Windows conventions.

If I want Windows, I'll run Windows. If I want macOS, I'll run macOS. What I want is something different, not something that tries to be a watered-down version of both.


There are so many possible graphical environments on Linux. GNOME, for all its faults, isn't really Windows lite. KDE has a LOT more functionality built in than Windows is ever liable to, just with KWin alone. With KDE + Compiz + plugins the amount of functionality explodes - you can even enable the close animation where all windows break into pieces and literally fly off screen.

In 2009ish I had a UI where you could zoom out to a giant wall and rearrange all the windows on your virtual desktops, and use a Mac-ish Exposé bound to a mouse key to pick from the windows on the current desktop. The visual metaphor was better than the Mac's or Windows', and to boot there were so many knobs you could tweak virtually anything.

Tiling window managers like i3wm (and over a dozen others) are both minimal and powerful. Notably, i3 has MUCH more powerful keybindings and treats individual monitors as virtual desktops.
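
For a taste of what that looks like, a fragment of ~/.config/i3/config (hypothetical bindings; $mod is conventionally the Super key, and the terminal choice is arbitrary):

    set $mod Mod4
    bindsym $mod+Return exec alacritty
    bindsym $mod+Shift+q kill
    bindsym $mod+h focus left
    bindsym $mod+1 workspace number 1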

With xmonad your window manager is a Haskell program; with awesome, a Lua one.

If you want to go all in on emacs you can make it your window manager too.

Honestly, it's only Windows lite if you want it to be, and really everyone is Xerox lite to some degree or another, because we are all using the same basic metaphors described therein.


I've actually seen the trackpad project, which is really cool. The issue I had with shortcuts was finding consistency across different software - the basics, but also all the text manipulation shortcuts that exist globally on macOS. It felt as though every time I downloaded something new I needed to set it up again. It's quite possible I hadn't configured everything correctly, though.


I've been using Linux distros for like 10 years now, and recently I (again) didn't even know how to close a window. Is it Super+W? Super+Q? Alt+F4? All three might work. How do I access app preferences? Ctrl+Alt+P? Ctrl+Alt+O? Ctrl+Alt+S? You know, because "preferences", "options" and "settings". All those shortcuts work, depending on the app.


At least Linux and Windows shortcuts are easier to remember, as they are based on words rather than a combination of weird symbols you never learn because you never look at them (who looks at the keyboard when they type?).


Not everyone is an English speaker. Mac shortcuts are very easy to remember because they are consistent. I'm not sure which weird symbols we are speaking about - the Cmd and Option keys? They have both a symbol and a name. What about Ctrl and Alt? What's the difference?

I used to hate Mac shortcuts, because I never really spent more than 30 minutes with the system. I think there is nothing to hate - they are just different and IMO better.


That's kinda been my experience, too. The amount of detail Apple has put into their drivers and software is truly underrated (and overshadowed by their more recent scummy business practices).

The surprising thing to me is that most of the small details I rely on were in OS X 10-15 years ago. If a Linux distro were created to mimic OS X 10.4, I would be perfectly happy using it as an everyday OS and recommending it to others. I keep hitting dead ends in my research, and it seems like the major desktop environments have a lot of architectural problems preventing them from implementing some basic UX improvements.

The barrier of entry to help develop these features also seems high to me, coming from web development.


What recent scummy business practices did Apple do?


Trying to run and enforce a monopoly on repairs, for one.


Their products are manufactured by literal slaves in concentration camps.

https://www.businessinsider.com/apple-forced-uighur-labor-ip...


By which you mean a single organization has claimed that many Uighur people were forcibly moved to new areas, where they found work in factories making goods for a variety of western companies, including Apple.

Apple and Foxconn both claim to be unaware of any irregularities, and Apple said they're investigating.

I rather think the OP meant something more like their practices surrounding their App Store than this not-quite-substantiated and hyper-sensationalized claim.


We have many credible reports that slaves have been placed in concentration camps virtually entirely because of their ethnicity and/or religion, where they are obliged to work for free. The malfeasance of China in this regard and others (for example, using religious minorities as literal organ banks to be slaughtered for parts) has been discussed for years. Anyone who claims at this point to be "investigating" is being entirely dishonest. Characterizing this as one report is entirely incorrect.

They haven't changed because moving away from China would cost them billions, and they, like most people, would rather keep themselves warm with a pile of money than worry about the ethical implications.


Nobody here is disputing that China is abusing the Uighur people.

But now you've shifted from the part I found dubious (that companies are knowingly engaging in slave labor) to a more clearly established pattern. One can be both horrified by China's treatment of the Uighur people and skeptical that Foxconn, Apple, or any one of the 82 companies to which you've referred are complicit.


The linked Business Insider piece falls into a category of stories that come out fairly regularly which could be described as "exposés on truly bad things about the global supply chain that many technology companies benefit from, but we've learned we get way more attention if we say 'APPLE <sub>and also many others</sub>' and focus on Apple specifically." This is a pretty classic example on BI's part.

The NGO in Australia that the BI piece links to describes its own findings with: "The Australian Strategic Policy Institute (ASPI) has identified 27 factories in nine Chinese provinces that are using Uyghur labour transferred from Xinjiang since 2017. Those factories claim to be part of the supply chain of 82 well-known global brands."

BI puts the focus specifically on one of those 82 well-known brands, writing, "Although the report highlights that the displaced Uighur workers are present in many different companies' supply chains including Nike, BMW, and Amazon, Apple features prominently as a case study." That's both true and misleading -- they could also have written "Nike features prominently as a case study" or "Adidas features prominently as a case study." The big Apple suppliers mentioned in case study #3 also contract for Samsung, HTC, HP, Microsoft, Oppo, Nintendo, Sony, Oculus, and others.

None of this "excuses" Apple; it's just that there's something faintly disingenuous about focusing on them as if they were a unique source of this problem. We could just as easily say "your Nintendo Switch is manufactured by literal slaves in concentration camps," but we don't, because... it's not as satisfying to be outraged at Nintendo, I guess?

At any rate: I strongly suspect the OP was referring to recent controversy surrounding the way Apple runs the App Store.


"82 global brands are profiting from Chinese slaves" would be even more accurate, and doesn't in any way diminish the guilt of both Apple and literally everyone who bought a phone manufactured by slaves.


The keyboard shortcuts are a major pet peeve of mine on Linux. It wouldn't be technically hard to make the default text editing/WM keys use Super, but I think that Windows-style and Emacs-style shortcuts are too much of an ingrained assumption at this point for it to be feasible to offer system-wide Mac-style keybindings too.

The trackpad woes are a clear priority for some members of the Linux community and I can visualize a future in which trackpads work nearly as well on Linux as they do on macOS. Offering consistent Mac-style keybindings everywhere (maybe through a GTK setting, like switching between Windows-style and Emacs-style?) seems like a harder problem, especially because of the politics and ingrained assumptions.


Yeah, I went down that path, too. Super puts less strain on my fingers and feels more natural (also, I've been using 95% Macs for half my lifetime, so... it's a lot of retraining).

It surprises me that there isn't a distro or easy-to-use tool for Mac-style keybindings, given the overlap between Mac and Linux users in tech.


You should look into kinto; it completely remaps the modifier keys system-wide and worked pretty well when I tried it (I switched to i3wm, so I use Super for i3 shortcuts now).

https://github.com/rbreaves/kinto


Ooh, that's really cool. I hadn't even thought of dynamic rebinding as a potential solution—it seems a little more hacky than proper native functionality (especially since application menus will still display the CUA/Windows shortcuts), but it sounds like a great way for people used to Macs to comfortably use Linux and Windows. Thanks for sharing!


I for one would love for [a version of] Linux to use the keybindings of macOS (in particular, emacs bindings in all text fields) and to have really good trackpad support.

There are many subtle things that work amazingly well on macOS. Most immediately, scrolling with the trackpad is smooth and responsive. On Ubuntu 20.04 (XPS 15 9550), my experience was pretty much the opposite, with visible tearing. I'm not trying to bash Linux - I want to move to it - but these are the things holding me back.


The shortcuts thing stung me at first too, but I pushed through it and I'm glad I did. There were two places I used Mac-style, Super-key bindings: text navigation and copy/paste. For copy/paste, Ctrl really is fine. I don't use it much and it was just a matter of retraining muscle memory.

For navigation (end/beginning of line/word) I just learned Vim bindings, which are better anyway and work on any OS for pretty much every text editor.


Even if I wanted to relearn (I don't), you still have the problem of inconsistencies. In the terminal, copy is Shift-Ctrl-C; in Firefox it's Ctrl-C; and you don't have all of the emacs bindings my muscle memory relies on (which is a big deal for me - I use them all, all the time).


Literally the only app I've ever used in all my years of Linux that doesn't have Ctrl-C/V is the terminal (for obvious reasons). But that's so easy to re-map that it's not worth complaining about:

https://github.com/pkulak/dotfiles/blob/master/.config/alacr...
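
For anyone curious, the swap looks roughly like this in alacritty.yml (a sketch against the YAML config Alacritty used at the time; "\x03" is the literal ^C byte the shell expects):

    key_bindings:
      - { key: C, mods: Control, action: Copy }
      - { key: V, mods: Control, action: Paste }
      - { key: C, mods: Control|Shift, chars: "\x03" }  # still delivers the interrupt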

I had no idea that Mac used emacs bindings all over the place though. If I was used to using them, that would probably have been a deal breaker for me too.


That is amazing. Pray tell, what exactly do you use the terminal for?

All the control characters have pre-existing meanings; for example, in the shell Ctrl-C sends SIGINT to the foreground process, while emacs uses it as a prefix command, etc.

This is the crux of the problem. You _cannot_ steal the control characters if you want the terminal to be a terminal, which means you either end up with an inconsistent mess or you pick a new and neutral modifier (say, Cmd) for your UI needs. Apple got it; Microsoft didn't (because they didn't do anything Unix-like). That Linux chose to imitate Microsoft is equal parts history and tragedy. EDIT: typos'R'Us


Hmm... not sure what you mean. I just swapped Ctrl-C and Ctrl-Shift-C so that copy would be the same in every app.

Not to drag us too deep into philosophy, but you can make a pretty good case for the Linux way as well. On my machine, the Super key is mapped exclusively to window manager shortcuts. It's how I move, close and resize windows, change splits, open apps, switch workspaces, pop open temp terminals, emoji choosers, a calculator - all that quality-of-life stuff. Copying and pasting is an app thing, and apps get Ctrl and Alt as their modifiers. It does make sense to me.


You clearly aren't an emacs user. Remapping the Control and Alt keys will make it very difficult to use, and the same goes for a few other "TUI" applications like it.


No, not at all. I won't use Mac-only software. ;)


Or just highlight the text and press the middle mouse button to paste. A standard in X11 for a long time, and one that drives me up the wall on Windows and macOS when I wonder why the text I just "copied" isn't pasting.


I don't want to sound snarky or overly skeptical, but I've seen a lot of people say that the trackpad under macOS is a lot nicer than under Linux, and I don't see what they mean. I have a ~6yo Linux laptop as my primary machine and I pretty regularly help my mother-in-law with her ~1yo MacBook Pro and, other than the Mac's trackpad being quite a bit larger, feel no difference between them. I grew up using Macs, so I get that they often have that extra bit of quality that's not always easy to put into words, but I just don't feel that with the trackpad. Just curious what the difference is for you?


There isn't much difference nowadays. I think it's just a matter of preference, as long as the trackpad is made of glass, which isn't hard to find on PCs anymore.

I even prefer my PC trackpad on Linux vs my Mac trackpad on macOS because on Linux there are many more settings I can configure to get it working exactly to my liking.

Personally, I don't like the cursor acceleration profile in macOS by default. You can change certain values through the terminal, but the Synaptics and libinput options on Linux are way more in depth and packaged in a nice GUI (at least in KDE Plasma).
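
Outside the GUI, the same knobs are reachable from a shell via xinput (the device name below is hypothetical; get yours from `xinput list`):

    xinput list
    xinput list-props "ELAN0501:00 04F3:3060 Touchpad"
    # select the flat (non-accelerated) profile, then a modest speed bump
    xinput set-prop "ELAN0501:00 04F3:3060 Touchpad" "libinput Accel Profile Enabled" 0 1
    xinput set-prop "ELAN0501:00 04F3:3060 Touchpad" "libinput Accel Speed" 0.4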


I haven't used very many laptops, but I am one of the people that will say the mac trackpad is better. I can only give anecdotes, but I have thought about what feels different.

2012: Sony Vaio, running linux: Trackpad was infuriating, so I used an external keyboard + mouse. I'm not sure what it was - the sensitivity of the trackpad was very high so it was hard to point at things, combined with separate buttons.

2015: Macbook pro. The trackpad was so good, and gestures so well done and supported, that I _won't use a mouse_ anymore on my work machine. (Five years later, still feel the same.) Clicking was as easy as tapping on the trackpad, and scrolling with two fingers (as well as two-finger click as right-click) works flawlessly.

2015: My wife has a windows laptop, and its trackpad is infuriating. While in theory it can be clicked anywhere, I always have to _Very Deliberately_ click (significantly harder than on my macbook's trackpad) before it can reliably register.

2018: Using a newer (2016) macbook, with touchbar and a larger trackpad. Despite the size, it somehow doesn't register incidental thumb-bumps as clicks (an issue that my older mac sometimes had).

2020: I moved to a 2020 macbook pro, my wife has a new windows laptop, and we have a Chromebook. I've used them all -- and still notice a strong preference for the mac's trackpad.

Windows trackpad handling has improved a TON since her old laptop, and it no longer feels like I need to press super hard to get clicks to register, but it still feels more finicky about recognizing clicks than the mac does, while still being harder to get the pointer where I want it to be. It also seems to react to accidental touches on the trackpad that the mac would ignore while typing. The chromebook's trackpad seems better than I expected for a $300 machine, and is better than her old laptop's trackpad, but it still doesn't seem quite as responsive as the mac's.

I'm not sure how much of this is hardware versus drivers. Windows laptops seem to be Very Good, in fact, these days. I still prefer the mac's trackpad behavior, though.


In a word, precision.

PC laptops are catching up lately, for sure. Almost there. (I haven't even tried some of the latest stuff; I hear Razer has really good "Precision" trackpads?)


There's a level of precision and responsiveness that I wasn't able to replicate. I even installed Arch on a macbook pro once and attempted to replicate it, but it never felt right no matter how much I fiddled with parameters. This was years ago, however, and I know there's been some more work done in this area; someone with better linux skills might be able to do a better job than me.


What gestures were you missing?

I have libinput-gestures working just fine with 2, 3 and 4 finger events, although I vastly prefer a 5-button vertical mouse (Evoluent) and keeping my fingers on the keyboard over a touchpad...

With some small scripts I was able to have things that macOS can't do out of the box at all, like maximizing windows, snapping windows - pretty much anything I could script, I could do with a touchpad gesture.

One thing that really matters - you have to have the right brand of touchpad for it to work. Mine is an Elantech (ELAN0501).
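
For reference, the config is just one line per gesture in ~/.config/libinput-gestures.conf; the xdotool targets below are made-up examples, not anything macOS-specific:

    gesture swipe up 3 xdotool key super+Up
    gesture swipe down 3 xdotool key super+Down
    gesture pinch in xdotool key super+d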


>trying to get universal shortcuts that work nicely across the whole system

Did you succeed? For me that was the point where I gave up and just went back to OSX


I agree that MacOS fonts look nice, and also that Linux can look really bad if not configured properly. But that has been less of an issue with modern GUI-oriented distributions, which configure fonts quite well out of the box.

It is still very cool to be able to tune fonts that precisely on Linux, and it's a major feature.

Also, on many Linux distributions you can modify your font configuration per user, here in ~/.config/fontconfig/fonts.conf (the path may vary).
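
For anyone who hasn't seen one, a minimal per-user fonts.conf looks roughly like this (values illustrative, not recommendations):

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <match target="font">
        <edit name="antialias" mode="assign"><bool>true</bool></edit>
        <edit name="hinting" mode="assign"><bool>true</bool></edit>
        <edit name="rgba" mode="assign"><const>rgb</const></edit>
      </match>
    </fontconfig>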


I find Linux's font handling weird. I use Consolas for programming, and on the same system it can look different in different editors. And then it never really looks the same as it does on Windows, despite being the same TTF.

I don't really have the patience to faff about with font config files; it should 'just work'.


3rd party applications/editors can render fonts however they'd like. VS Code's font rendering is almost certainly not native and I don't think Chrome's font rendering is either.


Chrome uses Skia which - on Windows - uses DirectWrite which is basically native however...

DirectWrite has a whole bunch of flags that configure sub-pixel and anti-aliasing as well as natural widths etc., so it might not look the same. To make matters worse, it changes what settings it uses depending on the size of the type IIRC, and even has a few font-specific tweaks there as well.

Windows apps in general might not look the same depending on whether they are using GDI, GDI+, DirectWrite (all considered native) and any combination of those flags... and of course it depends on whether the user has ClearType turned on and if they have used the Tuner to calibrate the rendering further for their preference...


I've found it's not so much the distribution as the desktop environment, combined with a HiDPI display. I don't think I've needed any special changes in years, other than maybe changing the hinting style based on personal preference.


> It is still very cool to be able to tune fonts that precisely on Linux, and it's a major feature.

Agreed. While much of the discussion surrounds making font rendering more Mac-like, fontconfig enables people to modify font rendering to suit their personal needs. For example, one user can turn off anti-aliasing to enjoy sharper fonts, while another turns it on to enjoy more accurate font rendering. Heck, you can even tweak things down to the font and application level without relying upon each program exposing that functionality.

This is not the sort of feature that will appeal to many people, but it is incredibly useful for those who use it.
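
The per-font part looks something like this (Terminus is just an arbitrary example family; application-level tweaks generally go through each app's own config rather than fontconfig itself):

    <match target="font">
      <test name="family"><string>Terminus</string></test>
      <edit name="antialias" mode="assign"><bool>false</bool></edit>
    </match>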


Personally all my Linux machines use Adobe Source Sans as the default font, but Inter would be my replacement for San Francisco.

https://rsms.me/inter/


+1 for Inter. I’ve been using it as my system font on KDE for years and it renders really well at every display resolution I’ve tried. I’m picky about my OS fonts too.


Nice font! What are you using as your terminal font?


I use JetBrains Mono now but I've also used Source Code Pro before. Really good fonts.


I use Adobe Source Sans and Source Code.

With Inter, I'd probably use Hack, an update of the great DejaVu Sans Mono, which was also the source of Apple's Menlo, their previous default. It shares some features with Inter, like a single-story g.


Why don't people just grab the San Francisco fonts off GitHub?

[1]: https://github.com/search?q=san+francisco+font


I don’t believe Apple’s license grants you those rights.

https://stackoverflow.com/questions/32178464/is-possible-to-...


You can use it on your own computer, you just can't publish content with it that isn't iOS-related. They're not going to come knock on your door because you've set SF to be your system font.


The result would be the San Francisco font... rendered terribly.


Yes, I have tried San Francisco and the results were terrible. Also, I didn't want to look at information rendered by a stolen font.


The "SF Pro" font family is directly downloadable from Apple and free for personal use. https://developer.apple.com/fonts/


[flagged]


I would argue that it can. The question is whether an individual finds it unethical or not, not whether it's being stolen. Legally speaking, at least.


The argument is that for something to be "stolen" it has to be missing afterwards. An unauthorized copy is copyright infringement, legally speaking, not theft.

Words mean something, and there is a reason why rights holders tried to label this "piracy" (a word connoting pillaging and murdering): to give it a bad connotation. Then the various "pirate parties" said: if you want to call large swaths of the population pirates, then we will be taking that word from you (somewhat similar to pride movements).


No contest. You've convinced me.

For the record, I have little to no issue with "unauthorized copies", for the most part. I just disagreed with some of the phrasing.


Interesting argument.

Secrets can be stolen, but they're not missing after.

Spies steal secrets, not copyright infringement them :)


> Secrets can be stolen, but they're not missing after.

You technically lose the secret in this case, since it's not a secret anymore.


Espionage is also not theft in a legal sense


> The argument is that for something to be "stolen" it has to be missing afterwards.

...assuming the word has only one usage -- which it does not. The word can also be correctly used to describe an act of appropriation.

"Stolen" is not necessarily a synonym of "theft"


> ...assuming the word has only one usage -- which it does not.

To me it sounds like propaganda against information sharing, by coining negative words, like "piracy":

https://en.wikipedia.org/wiki/Copyright_infringement#%22Pira...


Digital information cannot be stolen.

The courts ruled on this last century, at least as far back as the '80s, when people were stealing HBO from satellites with TVRO dishes in their backyards.

It's time to stop beating that horse.


The courts ruled it a crime; they did not rule it to be theft. Presenting it as such is a politically motivated distortion of the facts.


I'd go for Lucida Grande. Never should've changed from that.


Some people prefer to respect copyright.


Eh, I don't get it. When looking at the before and after screenshots of Wikipedia, I greatly prefer the "before" fonts. It looks easier to read and less squished. Though they both suffer from the much bigger problem that the lowercase L is indistinguishable from the capital I.


Back when flat-panel displays first became mainstream, Microsoft and Apple took different directions as to how to utilize subpixel rendering.

Microsoft chose to align strokes as close as possible to pixel boundaries for a sharp, crisp look even if it meant distorting the shape of the glyphs. Apple on the other hand decided to render fonts as close as possible to their original shapes even if it caused some blurriness around pixel boundaries.

Linux made its own flavor of both options, and left it to the user to decide which one they wanted. The default on many distros is closer to the Windows look, though. So if you are used to the Windows look, you will probably prefer the default option to the tweaked-to-look-like-Mac option.


I was going to make the same point that this only really makes sense if you are used to the way that Mac OS renders fonts. I switched back to Linux a couple of years ago and this is one reason why at this point I just couldn't go back.


> Either you love gorgeous typography or just don’t care. If you are the former read ahead on how to make the font rendering on your Linux look just as awesome as that on macOS, else read on to find out what beauty you have been missing.

Not an impressive way to start. I do not like the fonts on MacOS at all. IMO Mac fonts are the worst of all the major OSes. Windows, Linux, and ChromeOS are all better than the atrocity that is Mac fonts. Of course, that is because I use those other OSes more than MacOS, so they're the basis for me to decide what is "correct".


Obviously this just comes down to personal preference.

All the major companies have poured lots of money into engineering typefaces specifically for their UI and brand. They're all designed to be easy to read and give off just enough personality to not be distracting. Objectively, I'd be surprised if one were better than another.

I like some of the open source sans serif fonts just as much as Apple's SF fonts. Google's Roboto has a bit too much personality IMO, but it might be better tuned for Android than the web.


I don't care much about the fonts - I guess I'm the person who doesn't care about typography. I want my Linux mouse pointer to behave as it does under macOS.

I can't quite put my finger on it, but somehow on Linux (Mint cinnamon) the pointer always ends up in a place just slightly off from where I expect it to be. Also the movement feels jerky...


It's the acceleration curve. MacOS is slightly exponential.


Which can be adjusted (at least in GNOME)


It's adjustable on every Linux desktop variant that I've used (currently Cinnamon). I think they probably just don't know how mouse acceleration works.


Well, if you find it, let me know. Also normal wheel scrolling.


I've thrown away half a day trying to figure out how to get the scroll wheel to behave. No cigar. It still scrolls one line at a time in a terminal, and two pages at a time in firefox.


Try severely reducing the value of mousewheel.default.delta_multiplier_y in your Firefox about:config.


Thanks! Never realised there could be a per-program setting to fiddle with!


Most programs don't have internal settings for this, but Firefox does things its own way--and tends to break regularly because of it.

Also try reducing mousewheel.min_line_scroll_amount if changing the other setting doesn't help. Try a value of 16.


Is there a number you suggest?


Firefox's scroll behavior seems to be non-deterministic on Linux, so there doesn't appear to be a universal answer.

The default is 100, which seems to work fine most of the time, but for the OP's issue, reducing it by a factor of ten might be a good starting point.



That's amazing. Thank you for bringing a hearty chuckle to my morning coffee.


I still don't understand how the removal of subpixel rendering in macOS 10.14 is not yet reversed. I can visually relax much more when I work on my 10.11 El Capitan systems.


Because

1) the Mac world has moved onto high-DPI screens mainly,

2) subpixel rendering provides virtually no benefit on high-DPI screens, and

3) subpixel rendering adds a ton of complications across the board (you can't scale anything, for starters), and was always hacky to begin with -- e.g. taking a screenshot and putting it on the internet gives it a rainbow halo effect on everyone else's screen

Honestly, these days with graphics cards that support Retina, the "right" way to do subpixel rendering on non-Retina screens wouldn't be with fonts at all. It would be to render everything internally normally at 3x resolution (3x horizontal by 1x vertical), and then apply subpixel rendering to the whole screen when downscaling for display. But with so few non-Retina screens being used by Mac users, why bother?
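
The naive core of that idea fits in a few lines (a sketch assuming an RGB-stripe panel and a grayscale coverage buffer; real implementations also run an LCD filter over the result to tame color fringing):

    import numpy as np

    def subpixel_downscale(coverage_3x: np.ndarray) -> np.ndarray:
        """coverage_3x: (H, 3*W) buffer rendered at 3x horizontal resolution.
        Returns (H, W, 3) RGB: columns 3i, 3i+1, 3i+2 feed the R, G, B
        subpixels of output pixel i."""
        h, w3 = coverage_3x.shape
        return coverage_3x.reshape(h, w3 // 3, 3)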


Thank you for your explanations. I particularly like your comment on downscaling the triple resolution.

Actually I am using a Retina display. I’m still noticing the difference very much. Hence I don’t think subpixel rendering would be only a technique for obsolete devices.

But I don’t think subpixel rendering is a performance issue; my MacBook Pro running 10.11 displays everything fast even though it's 6 years old.

And if the complications were already there, perfectly working, why remove them? I understand that it is hard to add these features into the iOS libraries, but then the others should have been kept as is.


> Actually I am using a Retina display. I’m still noticing the difference very much.

I'm curious why you think you notice a difference? Macs have never had subpixel rendering for Retina, so if I'm understanding you correctly, it's not something you could know from experience. Why do you think it would make a difference you could notice if they did?

(And of course, Retina rendering is far sharper than lo-dpi subpixel rendering.)

I suppose you might be able to take an image of subpixel-rendered text e.g. at 24 pt, scale the bitmap to 50% width/height on a Retina display, and put it next to the same text at 12 pt, and see if you could tell the difference. Although you'd need to be careful to get the same precise vertical baseline alignment.


Using a macro lens, I took photographs of text on macOS 10.11. I don't know where your statement originated, but here the text is indeed displayed with subpixel rendering - on a Retina display.


Apologies, you're correct -- my information was wrong. [1] Now I'm curious to experiment myself to see if I could tell the difference! And congratulations on your excellent eyesight. :)

The only seeming authoritative source I could find that justifies why it was ultimately removed is from an "ex-macOS SWE" here [2].

[1] https://graphicdesign.stackexchange.com/questions/8277/how-d...

[2] https://news.ycombinator.com/item?id=17477526


Retina and other Hi-DPI displays don't generally use the RGB column layout that ClearType and other subpixel rendering algorithms abuse to provide higher luminance resolution. Using ClearType on a PenTile or AMOLED display is just going to give you color fringing with no actual increase in resolution.


Indeed. On the other hand, all Mac displays and almost all external desktop displays provide the RGB pattern. Otherwise, there is still the option to disable the sub pixel antialiasing.


> Actually I am using a Retina display. I’m still noticing the difference very much.

I suspect you are not running at the "correct" 2x Retina resolutions, which for whatever reason are no longer the default on Apple MacBooks. Instead, they end up running at 1.75x or 1.5x of native resolution, which results in slightly less crisp rendering for everything, not just fonts.


Yes, I am running at the "correct" 2x Retina resolution, indeed. I understand your suspicion, and I confirm that deviating from that yields worse results.

Actually, the requirement to stay with the 2x is a reason I dislike macOS 11.0 Big Sur (even more than 10.14+), because it increases paddings everywhere. Hence, it's effectively a loss of real estate. This loss can be mitigated by increasing the resolution to <2x retina, but of course with subpar visuals, unfortunately.


I wish Apple would fix this so 1.75x or 1.5x were equally crisp.

The solution is conceptually simple: just like the iPhone X renders at 3x, Macs should be able to render internally at 3x as well, so that downsampling to 1.75x or 1.5x will still have full detail, zero blurriness.

I wonder if the reason it can't is performance, if it's battery, or if there isn't enough memory on the video card or something.

But seeing as MacBook Pros can support multiple 4K monitors... it seems like the memory and performance are there, no?


What you are describing is essentially what MacOS does. E.g. on a Retina display with a 2880x1800 screen, Apple used to render “2x” assets to a 2880x1800 buffer, so that it has the same amount of space as a 1x 1440x900 screen. If you want more space (which is now the default), it renders using 2x assets to a 3400x2000 or 3840x2400 buffer (these numbers are approximate), then scales it down to 2880x1800. So it’s never scaling anything up, only down. Of course it’s still not as sharp as rendering at the native resolution. Using 3x assets wouldn’t help unless the actual resolution of the screen was higher.


Do you have a source for that?

Because I'd love if that were true, but every explanation I've seen contradicts that.

The default isn't more space, as you write -- it's actually less space, for bigger elements. Which is why you would need to render at more than 2x. It does, indeed, scale up -- there are plenty of articles from when Retina's scaling options came out that state it does lead to a slight amount of blurriness because of this.

To be clear, under Display > Scaled, the "More Space" option is true 2x retina, while "Default" through "Larger Text" are the ones that upscale.

You can actually verify this yourself - if you take a screenshot of any UX element at "More Space", and then take a screenshot of the same element at "Larger Text", they're pixel-for-pixel identical. For everything less than "More Space", MacOS is scaling up.


What? The default is most definitely "more space" as of a couple versions of MacOS ago (or maybe it was based on the product, e.g. when they came out with a new version of the MacBook Pro in 2016 or so). I know on my 2018 MacBook Pro 13" the default was one notch over on the "More Space" side vs. exact 2x Retina. And that only makes sense, as running as if you only have the space of a 1280x800 1x screen would make me go nuts, you can hardly fit anything on the screen. I think that's what drove Apple to change the defaults from the exact 2x Retina, despite the minor loss in quality from having to render at the larger size and scale down. On iMacs which have bigger screens (I'm typing this now on a 5K iMac) exact 2x Retina is the default.

You are correct that if you go down to the "bigger text" side of things it does scale things up, and for those sizes using 3x would give a sharper image. I hadn't even considered that, though, because I think most people think either the exact 2x retina resolution is fine, or if anything they want more space. The only people who would use the "bigger text" option are probably people with poor eyesight, in which case it doesn't matter if it's slightly more blurry.

EDIT: see screenshot here: http://imgbox.com/uxcHERt3 Default for MacBook Pro 13" is "Looks like 1440x900" which requires rendering at a resolution of 2880x1800, which is then scaled DOWN to the native resolution of 2560x1600.


We're in agreement on how it works technically:

> I know on my 2018 MacBook Pro 13" the default was one notch over on the "More Space" side vs. exact 2x Retina

That's what I meant -- it's less space (one notch over, the one labeled "Default") compared to exact 2x which is labeled the "More Space" option.

> The only people who would use the "bigger text" option are probably people with poor eyesight in which case it doesn't matter if its slightly more blurry.

I guess that's where we disagree -- my eyesight is great but I like the text on my screen to be comparable with the size of body text in books, not the size of footnotes. I like a comfortable amount of text information on screen, not crammed. And the fact this is the default option makes it seem that Apple agrees.

And that's precisely why I wish it didn't add the bluriness from the upscaling, why 3x internal would be valuable.


No, the one labeled default is one notch higher on the more-space continuum than exact 2x Retina (at least on the 13” and 15” MacBook Pros). On other machines, like my iMac, the default notch is exactly 2x Retina. Check out my screenshot and do the math yourself. “Looks like 1440x900” is the default notch, which means rendering at 2880x1800, which is higher resolution than the MacBook Pro 13”’s 2560x1600 screen.


I stand corrected, thank you.

There were a bunch of articles way back when the scaling options were introduced that claimed anything less than "maximum space" introduced blurriness... but they were obviously wrong.

I just did the math and double-checked with screenshots, and indeed — on my 13" MacBook Pro the default is higher than 2x, not lower. It's only at the leftmost "larger text" that blurriness is introduced.

Thanks so much for the info, and I'm happy to know I am getting maximum clarity out of my Mac after all! Always good to get my misinformation corrected.

And so, never mind about the whole 3x thing... you're right, unless you need text extremely large. Cheers!


It wasn't "perfectly working"; on macOS there was a lot of text on translucent views either looking horrible or changing antialiasing type when animated.


I was not aware of that. On the other hand, I am mainly looking at static text with an opaque background (including PDFs). At least there, the subpixel rendering would benefit the keen reader's eyes.


If you're reading a ton of PDF's, try comparing Acrobat Reader with Preview -- they use entirely different font rendering. Preview is unhinted undistorted macOS rendering that preserves exact letterform positioning and widths, while Acrobat uses hinting to distort letterforms but align closer to pixels.

If you're really looking for maximum crispness you might prefer Acrobat. It makes subpixel rendering unnecessary for all perfectly vertical and horizontal strokes, since it's trying to avoid antialiasing them in the first place.


Indeed, Acrobat's rendering is vastly superior in that respect. Thank you for mentioning that.

Apart from the Acrobat solution, on macOS 10.11 it is possible to turn off text smoothing in Preview.app's PDF settings (which retains subpixel rendering). This gives, to my eyes, the best PDF rendering that macOS has had up to now.


I'm pretty sure that most developers are not using retina screens as their second display. Maybe things are different in The Valley, though...

But I get it. From their perspective, those aren't Apple screens, so they don't need to support them.


As screens gain resolution, pixels get smaller, so at some point there's a trade-off between computation power and effect: on higher-resolution screens, sub-pixel rendering becomes indistinguishable from pure integer pixel rendering to the human eye.

100% speculation: Apple has a history of very narrowly supporting only their latest few generations of equipment. It's possible that they're ignoring "legacy" devices in these decisions with the justification probably being that "They'll buy a new one eventually anyway."

Edit: added missing words.


>Apple has a history of very narrowly supporting only their latest few generations of equipment.

It has no such history, quite the opposite. You can install the latest OS on 5 year old phones and 8 year old Macs.


I mean, that's only the latest few generations. Android is worse for sure, but Windows does seem to support old PCs as long as the specs aren't so bad as to be beaten by an older Raspberry Pi.


> I mean, that's only the latest few generations.

What do you mean by few? Apple releases new phones every single year and has for the past 12.

5 years means 5 generations.


It's hard to go back to scalable fonts after getting used to an absolutely crisp bitmap font in my text editor on Linux. I don't care about the font shape, but the difference in sharpness on regular screens is so startling.


And no screenshot? That would be useful.


Back when I was a Linux user, I used to do almost the same thing. With regards to the fonts, though, I preferred the URW Nimbus family[1], which I found by accident in a pre-installed Debian package. I have also experimented with the GNU FreeFont family[2], although I recommend the former.

[1] https://www.urwtype.com/en/shop/?fontshop=datei%3Asearch_db%... --- Sorry I couldn't find a better link.

[2] https://www.gnu.org/software/freefont/


I really like the Nimbus family; it was my go-to interface font before TeX Gyre came along.

Also loved the FreeFont family; it was what I used before Nimbus.


To tweak font settings, see

https://wiki.archlinux.org/index.php/fonts

I currently use

https://github.com/belluzj/fantasque-sans as my monospaced font.

For everything else, I use Noto Sans. It looks good.
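
Setting those as the defaults takes a couple of alias blocks in fontconfig (a sketch; family names must match what `fc-list` reports):

    <alias>
      <family>sans-serif</family>
      <prefer><family>Noto Sans</family></prefer>
    </alias>
    <alias>
      <family>monospace</family>
      <prefer><family>Fantasque Sans Mono</family></prefer>
    </alias>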


Fantasque Sans used to be my favorite monospaced font - I still like it a lot. Nowadays I use Fira Code, which also looks quite good.


> <edit mode="assign" name="hintstyle"><const>hintslight</const></edit>

Neither GTK nor Qt supports "full" hinting as of the last few releases.

Pango and HarfBuzz sacrificed that ability in exchange for subpixel letter positioning.
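
For context, that edit normally sits inside a match block like this (a minimal sketch):

    <match target="font">
      <edit mode="assign" name="hinting"><bool>true</bool></edit>
      <edit mode="assign" name="hintstyle"><const>hintslight</const></edit>
    </match>

hintslight keeps only vertical hinting, which is why it coexists with subpixel positioning better than hintfull does.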


If I remember correctly, the consensus was that font rendering is even better on Windows?


There's no consensus. ClearType is the most advanced technology out there AFAIK, because it considers many things that macOS's font rendering method ignores. However, the end result is that fonts get distorted heavily, whereas other OSes such as macOS, iOS, Linux and Android respect the fonts' original shapes.


Here's my (somewhat oversimplified) take:

- macOS has used no sub pixel hinting since Mojave. Instead, it's just grayscale antialiasing. To make up for the unnatural thinness that occurs when grayscale antialiasing is used, fonts are artificially bolded by a small amount.

- Linux/FreeType uses very nice-looking sub pixel hinting and maintains the clarity of fonts pretty well. The spacing between characters may not be pixel-perfect, if I remember correctly.

- Windows/ClearType snaps fonts to the pixel grid very aggressively, resulting in some amount of distortion. Many of the default Windows fonts look really good with ClearType, while others appear strange, especially at smaller sizes. In terms of technology, ClearType is the most advanced.


I’m not satisfied with your macOS glyph dilation explanations.

I don’t know the timelines involved, but my impression was that macOS’s glyph dilation has been there from the start, or at the very least waaaaay before they threw away their subpixel antialiasing. (When you wrote “sub pixel hinting”, I’m presuming you actually meant subpixel antialiasing, though feel free to correct me; but hinting and antialiasing are entirely different things, and “subpixel hinting”, which I imagine would mean hinting to subpixel boundaries (which would also seem a weird thing to do), isn’t a standard term.)

Using greyscale antialiasing doesn’t inherently make things appear thinner or thicker. The only thing that will make things appear thinner or thicker is if the antialiasing is done in the wrong colour space, typically meaning linear versus gamma-corrected.


> (When you wrote “sub pixel hinting”, I’m presuming you actually meant subpixel antialiasing, though feel free to correct me; but hinting and antialiasing are entirely different things, and “subpixel hinting”, which I imagine would mean hinting to subpixel boundaries (which would also seem a weird thing to do), isn’t a standard term.)

Yep, I think you're right about this. I can no longer edit my comment, but pretend I did a quick s/hinting/rendering/g.

> Using greyscale antialiasing doesn’t inherently make things appear thinner or thicker. The only thing that will make things appear thinner or thicker is if the antialiasing is done in the wrong colour space, typically meaning linear versus gamma-corrected.

Huh, thanks for the correction. Do you have any idea what the "Use font smoothing when available" setting actually does, given that it makes fonts thinner and more similar-looking to Windows rendering when turned off?


That’s the checkbox that enables glyph dilation. See my other comment in this thread for a couple of links about that.

Bit of a crock calling it “font smoothing”, I reckon, but that’s what they call it there and I can’t stop them.

(In all this about macOS, I’m going almost entirely on what I’ve seen and heard from others. I’ve never had a Mac.)


In the macOS User Guide, it says "Font smoothing reduces jagged edges for some fonts. When text smoothing (or “antialiasing”) is on, smaller fonts may be harder to read."

That seems misleading, though, because there is clearly some amount of antialiasing happening regardless of which option is chosen.

I think you're probably right. To be honest, everything in macOS seems to be designed with that switch on, so turning it off makes fonts in many areas look noticeably worse.


> The spacing between characters may not be pixel-perfect, if I remember correctly.

Behdad has since axed the usage of hints from True/OpenType fonts, though the rationale of "more precise positioning" is kind of irrelevant on modern HiDPI screens.


macOS does its own peculiar brand of glyph dilation. A couple of links about this: https://twitter.com/pcwalton/status/918991457532354560, https://twitter.com/pcwalton/status/918593367914803201. I would count this as heavy distortion, far heavier than anything any other platform may do.


That's definitely not the consensus.

Both macOS and Windows were designed at a time when screen resolutions were under 100dpi. My take is that Apple favored font rendering to mimic printed text as much as possible and Microsoft went with legibility at all costs with ClearType.

But to me, Windows font rendering is awful, just terrible taste.

High-resolution screens make it all much less of an issue, but both Linux and Windows still struggle in that regard as well.


Was never aware of such a consensus.

Whenever I sent Linux screenshots, my Windows colleagues remarked how much better the fonts looked in my screenshots.

Sub pixel rendering, which they all had (likely a Windows default at the time?), sounds good but just looks blurry in practice.


I think it's possible that people perceive subpixel-rendered fonts to be uglier in some cases, but I know early on Microsoft aggressively marketed the studies they'd done showing that reading speed and comprehension were definitely superior. Not sure if that research has been replicated recently.


> Sub pixel rendering, which they all had (Likely a Windows default at the time?) sounds good but just looks blurry in practice.

It adds some funny color artifacts when the capture is not perfectly aligned to the screen because subpixel information is encoded as color variations.


Yeah, I often hear about how much better font rendering is on Windows and macOS, but I actually find them inferior to the clean font rendering I have on Linux; blurry red & blue around my text is really not my definition of beautiful text, but I suppose this is about taste...


Basically the default configuration on each is different, and some have different features.

On macOS there is little or no hinting; the result is less contrast and, at smaller text sizes on standard-resolution displays, eye strain. The upshot is that the shape of the fonts is generally more faithful - though of course macOS undermines this by completely changing the weight of fonts when you enable subpixel antialiasing (which is why there's CSS all over the internet turning that off).

On FreeType2 w/ Pango/Skia/etc (Linux/BSD/Android/iOS), the subpixel rendering actually works properly and doesn't change the weight of the font; the horizontal hinting is mild or completely off, and the vertical hinting is less mild or quite aggressive. Most FreeType2-based renderers now position glyphs on thirds of a pixel, when the display supports that. I personally prefer the default fontconfig on Arch Linux over every other text renderer.

Windows has ClearType; it is by far the sharpest text of the bunch, and at standard text sizes that can make it very legible. The tradeoff is that it completely mangles the shape of fonts, and the result only vaguely resembles them; furthermore, there is such a thing as too much contrast, and I find that ClearType can feel harsh and cause me a form of... aesthetic fatigue?

Overall it seems that there is a whole spectrum of ruthless efficiency vs. aesthetic considerations. I don't think there is a universal scientific answer to these questions.

Of course, once you have 3x standard resolution displays it doesn't much matter, as long as you aren't mangling the fonts some other way; at some point you don't even need to antialias.


There's no consensus really. People have different preferences. And even in case of preferences, lots of people are just used to what they see every day.


From what I remember this is holy war territory.


Whenever I'm sent a screenshot from a Windows user, the text looks quite janky, with a red and blue blur around letters. Though I don't know if that's actually what they see on their screen.


It may have happened when Windows got subpixel smoothing and major Linux distros were still using GNOME 1.x, but I think even Mac OS 9 was much better than Windows at that time.


I liked RiscOS best. Definitely on CRTs.


If you want to make this configuration local to your login (user) only, put the XML config into $HOME/.config/fontconfig/fonts.conf rather than the global one in /etc/fonts.

Reference: https://wiki.archlinux.org/index.php/Font_configuration#Font...
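
In shell terms (fc-cache and fc-match ship with fontconfig):

    mkdir -p ~/.config/fontconfig
    $EDITOR ~/.config/fontconfig/fonts.conf
    fc-cache -f          # rebuild the font cache
    fc-match sans        # verify what "sans" resolves to now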


Some ways this article could be improved:

- /et/... should be /etc/...

- not assuming your editor is neovim.

- not assuming those fonts are installed.

Some people like the font hinting and antialiasing in macOS. I don't. Fonts get really thick and hard to read.

Retina mode makes fonts vastly more readable, but it requires a $1000+ Apple monitor. On Linux I can plug in a $300 monitor, switch the resolution to 4k, set display scaling to 2x, and save myself $700+.


The way Arial (or "sans serif" in general) renders by default on Linux has always irked me. It's got annoying proportions, and just seems off.

What really sucks is that websites like Wikipedia use it, and you have to suffer reading this horrible font.

I'm super-thankful for this post. I'm definitely going to try this out.

Thank you again!


Here's how to install more default font replacements on Fedora: https://github.com/silenc3r/fedora-better-fonts

Would be great if Linux distributions provided similar expanded font setup out of the box.


After using hiDPI monitors for years (Mac, Windows, Android, iOS) both examples look horrible to me.


I'm more of a Fira Sans / Fira Code guy.

However, setting up the fonts you want on Linux can be a bit tricky, and I've lost many hours trying to get a config that does it for me, so good job on the guidance you provide.


Is there any way to make web browser rendering on Ubuntu (any browser) closer to MacOS? When I browse the Web on my MacBook Pro, it's really pleasing to the eye. When I do the same on Ubuntu - not so much. Is there any way to fix it?


One of the top three reasons I use Ubuntu: no fiddling with fonts. They look good out of the box.

There are enough things to fiddle with on Linux.


Why not use Roboto, which is shockingly similar?



