MacOS Mojave removes subpixel anti-aliasing, making non-retina displays blurry (reddit.com)
466 points by laurentdc on July 7, 2018 | 473 comments



I released a build of iTerm2 last year that accidentally disabled subpixel AA and everyone flipped out. People definitely notice.

But the reason I’m sad is that this wonderful hack was short lived: https://docs.google.com/document/d/1vfBq6vg409Zky-IQ7ne-Yy7o...

My pet theory is that macOS is going to pull in a bunch of iOS code and iOS has never had subpixel AA.


That must have been one of the nightly builds from end of September 2017.

For anyone who wants to see what's coming when Mojave gets released without subpixel AA, I made a few comparison screenshots on a non-Retina display:

iTerm2 dark subpixel AA enabled: https://static.imeos.com/images/iTerm2-3.1.7.png

iTerm2 dark subpixel AA disabled: https://static.imeos.com/images/iTerm2-3_1_20170924-nightly....

iTerm2 light subpixel AA enabled: https://static.imeos.com/images/iTerm2-3.1.7-light.png

iTerm2 light subpixel AA disabled: https://static.imeos.com/images/iTerm2-3_1_20170924-nightly-...

On Retina screens, I don't see much of a difference:

iTerm2 light subpixel AA enabled on Retina display: https://static.imeos.com/images/iTerm2-3.1.7-light-retina.pn...

iTerm2 light subpixel AA disabled on Retina display: https://static.imeos.com/images/iTerm2-3_1_20170924-nightly-...


I'm confused. Are the screenshots somehow different, so that the retina and non-retina screenshots will appear different even when viewing both on a non-retina (or retina) display?


It depends on your hardware. If your RGB subpixels are in a different spatial order from the originator's, it actually makes it worse.

Consider the 3rd and 4th images, the "on light" AA and non-AA versions… Look at line number 10, the "let arr1 =…" line. Look at the vertical stroke of the "l" of "let". In the anti-aliased version there is a red glow on the left side and a blue-green glow on the right when you zoom in. On the non-anti-aliased one there is no such glow. Now let's lay that out as pixels on a scanline…

    RGBRGBR_________GBRGBRGB   subpixel AA
    RGBRGB_________RGBRGBRGB   non-subpixel AA
So both are just a black gap in an otherwise solid line of GBRGBRG dots (I'm deliberately not grouping the subpixels into 'rgb' pixels there; that grouping doesn't exist in the real world on the LCDs where subpixel AA works)

If we turn that back into what the computer abstracts as pixels we get…

    RGB RGB R__ ___ ___ _GB RGB RGB   subpixel AA  (see the R and GB halos?)
    RGB RGB ___ ___ ___ RGB RGB RGB   non-subpixel AA
On 150dpi desktop displays it doesn't matter; your eyes aren't good enough. On displays which can be rotated it is a bad idea, because the subpixel AA hack only works in one orientation, and that is just confusing.
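
A minimal Python sketch of the diagram above (purely illustrative, not taken from the screenshots): it packs that row of subpixels back into whole framebuffer pixels and shows where the red and cyan halo pixels come from.

    def pack_pixels(subpixels):
        """Group a flat list of subpixel intensities (R,G,B,R,G,B,...) into pixels."""
        return [tuple(subpixels[i:i + 3]) for i in range(0, len(subpixels), 3)]

    ON, OFF = 255, 0

    # "RGBRGBR_________GBRGBRGB": the gap starts one subpixel into a pixel
    # and ends one subpixel before another, as subpixel AA allows.
    subpixel_aa = [ON] * 7 + [OFF] * 9 + [ON] * 8

    # "RGBRGB_________RGBRGBRGB": the same gap snapped to pixel boundaries.
    grayscale_aa = [ON] * 6 + [OFF] * 9 + [ON] * 9

    print(pack_pixels(subpixel_aa))
    # ... (255, 0, 0), (0, 0, 0), (0, 0, 0), (0, 255, 255) ...
    # The framebuffer holds a pure red and a pure cyan pixel. On an
    # RGB-striped panel those light exactly the subpixels at the edges of
    # the gap; on a BGR or rotated panel they show up as colour fringes.

    print(pack_pixels(grayscale_aa))
    # ... (255, 255, 255), (0, 0, 0), (0, 0, 0), (0, 0, 0), (255, 255, 255) ...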


On my desktop display I see a difference in the light screens. It's a very subtle difference, but enough that it might theoretically irritate me if I had to stare at text all day. Probably not in practice; it's not a massive problem or anything, but yeah, AA looks better.

Where I notice it the most is on the curly quotes and parentheses. Without AA they look... harsh I guess? Or maybe blocky. You notice the vertical lines making up the curves more. You also lose some of the definition around letters - for example without AA the dot on the "i" is much less noticeable.

On the dark theme it doesn't look nearly as bad, I have a really hard time seeing the difference there at all. Maybe that's because they're scaled differently? Or maybe dark themes just don't need it as much?


> “Without AA they look... harsh I guess?”

I’m noticing this misunderstanding quite a bit in the thread. None of those images have antialiasing disabled. What’s been disabled is subpixel antialiasing. So those images are comparing subpixel antialiasing and greyscale antialiasing.

Opinions between the two are mixed:

I’m one of the people that hates greyscale antialiasing (I find it makes text look fuzzier) and prefer subpixel antialiasing (which does a much better job of preserving the intended shape of the font glyphs).

On the other side, there are people who hate subpixel antialiasing (because they can see the colour fringes on the glyphs) and prefer greyscale antialiasing. Here’s an example from 1897235235 lower down in the thread:

This is actually better for me. I wrote to Steve Jobs a long time ago and asked him if he could switch to gray-scale font smoothing because the tricks they use with colours don't actually work with people like me who are red green colour blind. The end result was that text on apple products looked terrible to me and I couldn't use any of their products until the retina displays came out. In windows, you can use grayscale font smoothing easily.

Anyway, he replied "we don't have any plans to do this" or something like that. Turns out I won in the end.

Now if antialiasing was actually fully disabled, the difference would be extremely obvious, to say the least.


Thanks for the catch.

I wonder if this is part of the reason why I can't really tell the difference in the dark mode pictures, at least without zooming in all the way. I guess the colors from subpixel antialiasing would be less noticeable when overlaid on or transitioning from dark to light rather than from light to dark?

> the tricks they use with colours don't actually work with people like me who are red green colour blind.

Oh crud, this never occurred to me. It doesn't seem to matter how many times I remind myself that red/green doesn't work for everyone, I still find myself forgetting about it all the time.


I can see a difference between pairs of each screen shot. I'm pretty confident though that I won't be able to tell when I don't have these nice comparison images, or care about the difference in practice.


I can't tell which is which even when I blow them up to the point where I can see the individual pixels.


I opened them in my browser in separate tabs and flipped back and forth between them. Very subtle difference for me, but I could see it.


Yes, I can see the difference too. I just can't tell which one is supposed to be the "blurry" one. They both look equally sharp to me.


Zoom in on the pics.


Yeah, I put them both at normal zoom level and realized that they're both the same, except the retina is much higher pixel density so the text looks a lot bigger when it is set at 1x zoom.


>My pet theory is that macOS is going to pull in a bunch of iOS code and iOS has never had subpixel AA.

Some Apple engineer said on reddit that it's because subpixel AA is not very useful on Retina and HiDPI displays, but slows down processing and complicates pipelines.

So it's part of the move to Metal and increasing graphics performance, even if it means external lo-res monitors will suffer.


> but slows down processing and complicates pipelines.

Rendering in monochrome is also faster than true 32-bit color, but we use 32-bit color because it provides a better experience to the user who is the ultimate consumer of the graphics pipeline.


>Rendering in monochrome is also faster than true 32-bit color, but we use 32-bit color because it provides a better experience to the user who is the ultimate consumer of the graphics pipeline.

It's almost as if it's a trade-off, and monochrome so laughably doesn't cut it that it's a totally contrived counterexample. Almost.


I actually use Nocturne pretty frequently to turn my screen monochrome if I am doing stuff outside of iTerm or my text editor — I find the use of color unmotivated and distracting in most programs, and especially websites


Here, here for Nocturne! I use it at night, switching to monochrome red night mode, and brightness down to the last setting. Sometimes in the morning I forget and think the screen isn't working.


It’s "hear, hear."


I just downloaded Nocturne and it's not working for me on High Sierra. Is the most recent version really from 2009? May I ask what system you're using?



The UX gains from mono -> color vastly outshine slightly smoother fonts on older external monitors.


Those with older external monitors might take issue with that.

I could replace my old monitors, but they still work well with good colour and brightness, so it's not exactly environmentally considerate, or even slightly necessary.

Adding colour has added very little to UX aside from true-colour icons; the UX itself is essentially the same. Colour is used as a theme on top of a UI that is perfectly recognisable as, and much the same as, the mono and 4-colour interfaces of the 80s and 90s.


I'd rather spend the bandwidth on resolution; I'm perfectly content reading text in black and white.

A current dense desktop monitor seems to be around 160ppi, so 8-bit grayscale would be around 280ppi, which is starting to approach a low-end printer.
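
The arithmetic behind that estimate, as a rough sketch (the 160 ppi figure is the one above; the assumption is equal bandwidth per frame):

    import math

    ppi_color = 160                      # dense desktop monitor, 24-bit colour
    bits_per_pixel_color = 24
    bits_per_pixel_gray = 8

    # Same bandwidth buys 24/8 = 3x the pixels, i.e. sqrt(3) ~ 1.73x the
    # linear resolution.
    ppi_gray = ppi_color * math.sqrt(bits_per_pixel_color / bits_per_pixel_gray)
    print(round(ppi_gray))               # ~277, i.e. the "around 280ppi" above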


Actually you really want more bits per channel; "32-bit color" is rather meaningless. HDR is 10 bits per channel.


I imagine subpixel AA is at least a bit trickier on phones and tablets because of device rotation.


Hit the mark there


Screen rotation is also a thing on desktop. My Dell has three external monitors, side by side, all rotated into portrait; use a decent window manager and you can have a 3×2 grid of windows that easily leads to Perfect Window Placement™ just by using Move Window to Top (Bottom) Half. Gnome and pals know how to handle both horizontal and vertical RGB subpixel arrangements and automatically switch configuration as you rotate the screen.

Mobiles and tablets have such high resolution monitors nowadays they probably run without subpixel AA at all…


Just a small nuance correction: monitors will not suffer, users will.


“Monitors will suffer” is a metonym; there’s no need to correct it, as it was already correct. Metonymy is common in casual speech but less so in formal writing.

https://en.wikipedia.org/wiki/Metonymy


Thank you for the link, it is very interesting. Having read the Wikipedia article, I think that metonym here is a bit too heavyweight for the simple thing expressed in the comment.

Edit: personally I value simple and precise language. Constructs such as those mentioned in the Wikipedia article may convey similar meaning but the fact that they exist means that they allow for some variation in meaning and color which is unnecessary in this case.


You value precise and simple language, but other people value other things like clarity and brevity. It’s a tradeoff. Making something precise can mean adding extra words, which sometimes, paradoxically, makes it less clear and more difficult to understand. Even in extremely formal contexts like mathematical papers, it's inappropriate to be completely precise because it gets in the way of communicating ideas. And if we strongly preferred simple language, we would use https://simple.wikipedia.org/ instead of https://en.wikipedia.org/

The original statement is clear from context, since the literal meaning is semantically impossible (monitors are incapable of suffering).


So you're saying that overly pedantic writing causes clarity to suffer?


>Thank you for the link, it is very interesting. Having read the Wikipedia article, I think that metonym here is a bit too heavyweight for the simple thing expressed in the comment.

Metonym is just a linguistic term. Such terms are not constrained to describe high uses of language by great masters of writing. They are merely names for specific constructs or linguistic phenomena.

A drunken sailor swearing at someone at 3am could be using a metonym just as easily as Wallace Stevens.


‘[The performance of] the monitors will suffer’ is implicated here.


They’re not suffering much though. All these comparisons to USB and monochrome are ridiculous.


Informative as always, George.

> I released a build of iTerm2 last year that accidentally disabled subpixel AA and everyone flipped out. People definitely notice.

On a Retina display at least I find that the most striking difference is that most text appears to have a lighter weight. Maybe that's the difference people are perceiving, rather than color fringing or whatever?


It's simply a different setting for stem thickening; the subpixel AA modes added a lot of it. It would definitely be possible to set the same stem thickening for grayscale rendering, and on hi-res monitors it would look pretty similar.


They give justification for it in the video linked somewhere else in the comments. They say "it works better on a wider variety of displays" in reference to the gray-scale approach. I am guessing they are probably switching to OLED across all of their products.


> They say "it works better on a wider variety of displays"

That's PR-speak. Translation: "It works acceptably well for all pixel layouts, and it is easier for us to implement."


ex-MacOS SWE here. Subpixel antialiasing is obnoxious to implement. It requires threading physical pixel geometry up through multiple graphics layers, geometry which is screen-dependent (think multi-monitor). It multiplies your glyph caches: glyph * subpixel offset. It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing. There are tons of ways to fall off of the subpixel antialiased quality path, and there are weird graphical artifacts when switching from static to animated text, or the other way. What a pain!

Nevertheless there's no denying that subpixel-AA text looks better on 1x displays. Everyone notices when it's not working, and macOS will look worse without it (on 1x displays).
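
To make the glyph-cache point concrete, here is a hypothetical sketch of the bookkeeping; the key layouts are made up for illustration and don't correspond to any real macOS API.

    glyph_id, size_px, subpixel_x_bucket = 42, 13.0, 2   # example values

    # Grayscale AA: a cached glyph is a plain coverage mask. One bitmap per
    # (font, glyph, size, fractional x-position) works for any foreground
    # colour and any display, and can be tinted at composite time on the GPU.
    grayscale_key = ("Menlo", glyph_id, size_px, subpixel_x_bucket)

    # Subpixel AA: the bitmap is per-channel and only valid for one physical
    # subpixel layout, and compositing it correctly needs the text colour
    # (and, in effect, the background), so the cache fans out further and the
    # compositor has to know things it otherwise wouldn't.
    subpixel_key = ("Menlo", glyph_id, size_px, subpixel_x_bucket,
                    "RGB-horizontal",        # per-display pixel geometry
                    (0xEE, 0xEE, 0xEE))      # foreground colour

    print(len(grayscale_key), "vs", len(subpixel_key), "key dimensions")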


"Listen guys the code for this is a mess. Can you just buy the new Macbook?"

(love my latest gen Macbook Pro, but this is the classic problem with Apple's approach to product design)


>"Listen guys the code for this is a mess. Can you just buy the new Macbook?"

More like:

"Listen guys, the code for this is a mess AND slows your graphics down even when it's not needed. We already have had Retina displays for 6+ years already, and Hi-Dpi displays are the norm these days for third party too, so there's no reason for everybody to suffer from old tech, like there was no reason for everybody to have a floppy disk or CD-ROM on their laptop just because 5% had a real use for it. Part of the way it is with Apple, and has always been is that we move fast, and cut things off first. Until now, this exact tactic got us from near bankruptcy to being the #1 company on the planet. So, if you don't like it, don't let he door hit you on your way out".


> Hi-Dpi displays are the norm these days for third party

Are they? Nearly every monitor in my company's mainly mac based office is a 1080p dell. All the monitors I personally own are 1x.

I would even go so far as to say that the majority of people who want to buy a monitor for doing text based things (ie business) will buy a bog standard monitor.


It's taken a while partly because of cost and perhaps a bit more because of the horrible ways Windows and Linux deal with HiDPI. You wouldn't want heterogeneous DPIs or non-integer scales on those platforms. On Linux it seems heterogeneous DPI is still very experimental and ugly. On Windows some apps are buggy and others are ugly when dealing with heterogeneous DPI. On Windows non-integer scaling does actually work, but it makes some apps size things horrifically. Needless to say, Microsoft's multiple approaches to DPI scaling have made a mess, and Linux never really had a unified way of dealing with it.

If you're on a MacOS platform, with the current price of HiDPI IPS displays, the time is right to grab just about anything. If you're on Windows or Linux, it's still a great time so as long as you're keeping all monitors the same DPI and probably integer scale.


I'd kill for a 4K 100 Hz MacBook display in the current 15 inch form factor.


Heterogeneous DPI works fine on GNOME3/Wayland these days. I had a 4K laptop running at 150% (it was a tiny screen) and it was perfect.


The only HiDPI monitor I'm willing to use is $2000. I consider 60hz to be unacceptable for work.


>I consider 60hz to be unacceptable for work.

Well, the vast majority of people consider 60 Hz totally acceptable for work - and work fine with it.

For the huge majority of people, a refresh rate above 60 Hz isn't even a concern.

Better resolution on the other hand is a marked improvement (either as more screen real estate or as finer detail retina style).


You're absolutely right, but I'd wager it's mostly because they haven't been exposed to 120Hz yet. The moment Apple introduces a 120Hz screen on their iPhones, people are going to want it everywhere. Much like HiDPI displays.


They have it on the iPad IIRC.

But isn't that more for quick moving stuff, like games or (in the iPhone) quicker visual response to pen input and such?

For regular computing (reading webpages, editing stuff, programming, watching movies) I don't see much of a difference.

Heck, movies are 24p and we're fine with it.


I absolutely love scrolling on 120Hz displays. It feels so much more natural when the letters aren't blurry as they move under your fingers. Indeed, the iPad Pros have the feature, but they aren't nearly as popular as iPhones. I tried on the Razer Phone, can't wait to have it on mine.


100-200 Hz displays are the next logical step for laptops and mobile phones.

Check out iPad Pro devices with 120 Hz display. It makes a big difference in readability of scrolling text (try to read a web page while it's scrolling), responsiveness and smoothness of motion.


And even if you got that monitor, would a MacBook Pro be able to drive it at its refresh rate?


These days even mobile phones can drive external monitors at 4k 60 Hz. I think it's reasonable to expect next gen MacBook Pros to be able to drive two 4k monitors at 120 Hz+.


Looks like the 2018 model still only supports 4K at up to 60Hz [1].

[1] https://www.apple.com/lae/macbook-pro/specs/


And the majority of people (and businesses) won't care.

The ones that do already have "Hi-DPI terminal windows and really sharp text" as an almost-mandatory requirement.

Think about it. Sub-pixel text rendering (and AFAIK it's just text, not images w/ subpixel positioning? although that's an incredibly interesting idea for a hack!). Maybe subpixel vectors/lines?

Either you care about high-quality lines or you don't... what Apple's basically doing here is moving the cost to the hardware instead of the software, and the win is a simplified, faster, less energy-hungry device.

I'm not personally agreeing with Apple's decision, I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks (don't make me source my dates on retina displays). It's the norm for "executive" users (ie: software developers, graphics users, finance, book editors, etc.).

If you're happy w/ a bog-standard 1080p output device, then obviously you're not "in search of" the best quality text you can get.

However, thinking it through, I would _really_ like a "subpixel mode" for either the OS as a whole (kind of like FSAA?), or especially certain windows. Maybe there's a way to do that as a hack?


>If you're happy w/ a bog-standard 1080p output device, then obviously you're not "in search of" the best quality text you can get.

True, but not being in search of the best quality doesn't mean that any quality will do.

I've been happy with text rendering on my bog-standard monitor so far. I'm not sure I will still be happy after this degradation.

But it seems my current Mac Mini is the last of its kind anyway, so I'm going to have to move on from Apple.


>I'm not personally agreeing with Apple's decision, I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks

That's Apple's market, not people holding on to 10-year-old computers. Not even those that still buy the MBA.


> Are they? Nearly every monitor in my company's mainly mac based office is a 1080p dell. All the monitors I personally own are 1x.

They are not. But I guess the point of the parent poster is that Apple requires its customers to spend on the latest items instead of supporting old tech.

Basically, it is saying: If you can't afford this, don't buy it. It is fair given that Apple products are already marked up in price.


A 1x monitor is only acceptable for gaming in my opinion. I cannot believe people are working on $200 screens. I mean you can also code on an 80-char terminal in theory, but why would you?


16:9 monitor is only acceptable for watching movies in my opinion. I cannot believe people are working on TV screens. I mean you can also code on a Commodore 64 in theory, but why would you?


Why? I find it useful to have two pages or terminals side by side. 16:9 matches the human FOV pretty well, which is a reason it's so popular.


More lines of text.

Historically 16:10 monitors have had the same width as 16:9 ones (e.g. 1920×1200 vs 1920×1080) so there's no difference as far as having things side by side.


I know right, can you believe some people pay less than $3,000 for a basic workstation?


I think you're being sarcastic, but I honestly can't. I recently bought a new workstation with a $3k budget, and I regret going with a laptop because the budget wasn't enough for me to get an adequate laptop that will still perform well in 3-5 years and get nice monitors.


I was being sarcastic, but also not taking into account currency conversion rates (and possibly the availability of stuff in your country)

Regardless of that, if all you're doing is General Business™ then surely all you need to do is grab an all-in-one box, shove as much 3rd party ram in it as will fit & then nab a couple of monitors and an unbranded mechanical keyboard off amazon.

I did a lot of work on a £700(~$900) Optiplex with 16gb of ram and that was more than capable of running ubuntu on 3 monitors while being small enough to fit into a handbag


It's not that a lot of people have a choice.


It's not that people who don't have a choice spend $2k and $3k on Macs.


> AND slows your graphics down even when it's not needed

So just let me turn it off. Or turn it off automatically when I'm scaling. Or leave it as it is because no one's ever cared about it. Removing it is the worst option of all.

> We already have had Retina displays for 6+ years already

External displays are the thing here.


This is 'Xbox One is online-only #DealWithIt' level sentiment. Fuck the consumer, right?

I really hope the market punishes you for this sentiment.

And P.S. Apple is strong because of the iPod and iPhone - macOS is a tiny part of their business and Apple is strong in spite of their shitting on Mac customers, not because of it.


The MacBook Air, which Apple is still selling, is NOT Retina...


They just sell it as a lower-tier option for developing markets and those entering the ecosystem. If they so decide, they could drop support for it in a heartbeat and leave the users only running HS.


I suspect it won’t be supported by Mojave. If so, it may be a moot point.


The MBA (2012 and later) is supported by Mojave.


Slightly awkward that the current MacBook Air isn’t retina, however.


It won't be awkward once Apple updates that computer's display before the release of Mojave. I'm looking forward to that.


If the "update" also includes removing all the ports except USB-C, getting rid of MagSafe, and switching to an inferior keyboard design to save a few fractions of a millimeter, I don't see much to look forward to.

Seems more likely that Apple will simply discontinue the Air and that the MacBook represents their new vision of what laptops ought to be like.


Agreed. Discontinuing the Air seems much more likely. This is so aggravating to me. Air + retina display would have been the perfect laptop for years now, but they simply refused to make it. I hate many of the changes Apple has been making instead, and the new keyboard is an absolute deal-breaker. Don't know how I will ever willingly give Apple my money again...


Your general skepticism about other aspects of the coming revision of the MacBook Air may be warranted -- nevertheless a retina display would be a great improvement.

Further, if your prediction is correct, why wouldn't Apple already have canceled the Air? What you describe is already on the market: the escape-less MacBook.

I don't feel as skeptical or negative about Apple's plans for new laptops. We shall see, soon!


I do actually suspect the Air will disappear as an offer in due time.

When the Air was released, it had too many limitations, and too high a cost, to take over the macbook's role. Its primary market was people who really valued the mobility over all else. But as they iterated it closed the gap, until it overtook the macbook, and the macbook offer shrivelled up and died.

If you swap 'macbook' and 'Air' in that narrative, you see history repeating - today we're at the first step of that. "The new macbook" is too limited and too expensive to take the Air's place, but give them a few iterations, and I suspect the Air will become unnecessary in the line-up.


I still believe that a big problem MS had was their relentless commitment to backward compatibility. I agree at some point you have to dump last year's legacy, but Apple seem to want to dump last _week's_ legacy.


Promising/maintaining backwards compatibility is how you market and sustain enterprise software contracts. For consumer products, it's a bonus if you still get updates after 2 years.


Apple still sells a whole line of non-Hi-DPI machines; it's arguably a little early for them to kill this stuff off. They should drop that line or upgrade it to Hi-DPI, and then in 2-3 years kill off non-Hi-DPI support.


Meh. I see their point. Subpixel AA was about using software to simulate better hardware. But, at the end of the day, you really do want the better hardware so that you can use your software budget to simulate an even better generation of hardware. Repeat ad infinitum, modulo scaling limits.

Pretty sure this is what Alan Kay means when he says that people who are serious about software will make their own hardware.


Maybe we could hold off on the step where we stop simulating better hardware until (almost) everybody has upgraded to hardware that makes it unnecessary.


You can use Windows for that. Apple has a different mindset, and it shows in many aspects, e.g. removing big USB ports and audio jacks.


The vast majority of Mac sales are machines with Retina displays, though.

A 1x monitor will be a cheap second monitor or something attached to a Mac Mini (please keep it alive Apple) or Mac Pro (one day it will come). From Apple's point of view I imagine all of these are just super small niches.


One problem: people have no good reason to.

The software folks have a good reason to remove subpixel AA; it's a hack to get better horizontal spatial resolution out of cheap monitors. The users have no reason, however, to buy better, now reasonably-priced, displays that would make subpixel AA unnecessary.

By removing subpixel AA at a time when HiDPI monitors are cheap, Apple is probably doing us a service. In a few years, hopefully everyone will be on a DPI that can render nice text. You may think it's unnecessary, but consider complex glyphs like Hanzi and Kanji, which really don't get enough mileage out of subpixel AA.

I know it's pretty irritating to suggest that forcing users to buy new hardware is doing them "a service." But really, I see no conflict of interest here. Apple themselves probably haven't shipped a 96 DPI display in half a decade, and they don't even sell their own external displays, so they don't stand to gain that much here other than software performance.


> In a few years, hopefully everyone will be on a DPI that can render nice text.

My 1x monitor can render text just fine using sub-pixel AA. It's a proven technique that has been around for ages and the text looks great on it.

> consider complex glyphs like Hanzi and Kanji, which really don't get enough mileage out of subpixel AA really.

If I start to become a heavy user of Hanzi and Kanji then I will consider purchasing a monitor that displays it better, but currently not an issue.

> Apple is probably doing us a service.

Requiring me to spend money on new hardware in order to avoid a regression in my experience is somehow doing me a favour? Really?

> so they don't stand to gain that much here other than software performance.

I don't understand the performance argument people keep bringing up. Subpixel AA has been around for ages and worked fine on much older hardware. Why is it suddenly a bottleneck that needs to be removed when my decade old hardware can still do it?


>Requiring me to spend money on new hardware in order to avoid a regression in my experience is somehow doing me a favour? Really?

No, of course not. If you phrase it like that, that is, ignore the point of removing it entirely and all of the benefits that come with it, of course it's not beneficial.

However, 96 DPI was never ideal. HiDPI won't just avoid a regression, it'll improve your experience, period. And with this new software change, you'll get a performance improvement, from better caching and easier GPU acceleration, and software engineers can delete thousands of lines of potentially buggy code made to work around the interaction of subpixel rendering with other graphics code.

Like, subpixel rendering is easy to understand in theory and that might delude one into thinking that it's just a minor annoyance, but it's an annoyance that permeates the stack. You have to special case the entire stack wherever text is concerned. Effectively, the text is being rendered at 3x the horizontal resolution of everything else. This brings in complication across the board. An example is that you pretty much have to do the alpha blending in software, and when you do it requires knowledge of the final composition. Doing alpha blending in software sounds easy... But it's only easy if you do it wrong. Dealing with gamma for example. Which makes life worse because you already have to do this for the other part of your graphics library, probably with a case for both GPU acceleration and software.
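
For example, here is a rough sketch of the kind of per-channel, gamma-aware blend being described; illustrative Python only, not Apple's actual code.

    def srgb_to_linear(c):                      # c in 0..1
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    def blend_subpixel(dst, fg, coverage):
        """dst, fg: (r, g, b) sRGB values in 0..1. coverage: per-channel glyph
        coverage in 0..1, i.e. text effectively rasterized at 3x horizontal
        resolution. Each channel carries its own alpha and the mix happens in
        linear light, which is why this wants the final destination colour and
        doesn't map onto ordinary single-alpha GPU blending."""
        return tuple(
            linear_to_srgb(a * srgb_to_linear(f) + (1 - a) * srgb_to_linear(d))
            for d, f, a in zip(dst, fg, coverage)
        )

    # Left edge of a black stem on a white background:
    print(blend_subpixel(dst=(1, 1, 1), fg=(0, 0, 0), coverage=(0.2, 0.6, 1.0)))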

Basically, you'd have to render the whole screen at 3x horizontal resolution to remove the need for all of these compositing hacks. Nobody does this because subpixel AA was a hack from the get go.

In an era where battery life and TDP are actually important, finite resources, we want to remove these hacks.

P.S.: if you've never tried rendering large amounts of text on screen before, I highly recommend you try writing a piece of software that needs to do this. A few years ago I struggled to get 60fps rendering text for an IRC client, and of course caching the glyphs was horrendously complex thanks to subpixel AA. It would've been easy if I could've just used GPU compositing like all other graphics, but instead I resorted to hacks everywhere. Maybe today's computers don't need the hacks, but it's worth noting that yes, rendering fonts is resource intensive - you're just living on top of decades of optimizations and hacks to make it look like it's easy.


> No, of course not. If you phrase it like that, that is

But that's what it is, that's not just clever phrasing on my part.

> HiDPI won't just avoid a regression, it'll improve your experience, period.

I get that, but if I wanted to spend money on that improvement I would do that. The point is my choice is no longer between maintaining my current standard or upgrading; it's between regressing my current standard or upgrading.


> The point is my choice is no longer between maintaining my current standard or upgrading

Do you have to upgrade the OS? Nope. It's a choice and a tradeoff.


Unless you want to selflessly support some botnet you have to update your OS pretty soon after it's released. Certainly more often than you have to upgrade your screen.


How well does Apple support old OSes?


I'm not sure if this was sarcastic or not, but I'd have to say pretty well. I have a 2008 iMac that's stuck on El Cap, and it still gets OS and Safari security updates.


Battery life and TDP are irrelevant with a docked laptop + external monitor, which is specifically what people are concerned about here.


Well, I'm not, and neither is Apple, because we are not the common case. I, too, have my Thinkpad docked all the time, connected to two external monitors that are 96 DPI. That does not mean I do not support the move past 96 DPI, I absolutely do despite the extra cost, because I do have HiDPI setups and after having one I have little interest in putting any more money into low DPI equipment.


The cheapest HiDPI monitor I would even consider buying is $2000. There are no ultrawide HiDPI monitors in existence at the moment.

Of course HiDPI is a regression in some ways. You can get far better non-retina monitors (low latency, higher refresh rate, wider range of aspect ratios) at any price point.


> The users have no reason, however, to buy better, now reasonably-priced, displays that would make subpixel AA unnecessary.

Please point me to a reasonably-priced HiDPI/Retina ultrawide (at least 34" and 3440x1440 resolution) monitor. I couldn't find one.


34” 3440x1440 is 110 dpi, which is 1x, not retina. To do retina at those sizes you're looking at 8K. Only Dell has an 8K monitor, and it's not ultrawide, is very expensive, and most of the Mac line-up probably can't drive that many pixels.


And please point me to a reasonably-priced HiDPI display that's 16:10 or better. Anything over 20" will do.


Of course, I don't have answers. I didn't say HiDPI displays that suit everyone's needs are available for cheap. However, the main problem is that everyday people have a lot of 1080p 24" monitors sitting around. There were probably fifty to a hundred at my old office. Because we didn't have any huge size requirements or uncommon aspect-ratio needs, we were able to take advantage of HiDPI displays that cost on the order of $100. To me that's a price point the average person can afford. If it causes some folks to have to shell out a lot of money to stay on 16:10 or have a reasonably large monitor, that is a reasonable trade-off; most users don't need them, hence why it's difficult to find good 16:10 monitors in the first place, for example.


> The users have no reason, however, to buy better, now reasonably-priced, displays that would make subpixel AA unnecessary.

> Of course, I dont have answers

The questions were related to backing up what you claimed. You obviously could not do this.


I don't see USB drives with USB-B ports or headphones with jacks going away. I don't think that it'll be different with displays in the foreseeable future. macOS users are forced to buy hidpi displays, but they are a tiny minority; other people will continue to buy cheaper displays, and it won't turn the industry. Microsoft could force everyone to upgrade by disabling anti-aliasing in a new Windows version, but I highly doubt that they would do that.


> Microsoft could force everyone to upgrade by disable anti-aliasing in new Windows version

Given how shoddy Win10's support for HiDPI is across applications at the moment (currently I have a choice between running my 4k laptop at full resolution and having a correctly-sized UI in all the applications I use), I seriously hope they don't do that.


Do you deal with mixed DPI, for example via external displays? I have very few issues with Windows 10 on 4K, as long as I don't mix DPIs. Maybe it's because we're using different software, though.


I see absolutely nothing wrong with USB-B or headphone jacks. I'm far from an Apple fanboy, running a Thinkpad with Linux and a Google Pixel 2 (and yes, I am pissed about not having a headphone jack.) I'm not just embracing whatever Apple says. But once I switched all of my machines to HiDPI, I saw the light. This, to me, is not a minor improvement in experience. It is a huge improvement in experience, and in my opinion essential for people who deal with languages that have a more dense script.


Japanese looks fine on my 2009 Mac.


Japanese can be rendered acceptably with subpixel AA, but it is significantly worse, especially for complex kanji, than a HiDPI display. For Chinese where you don't have less dense character sets like kana, it's even worse.


Well, now is close to that time. Everybody who cares for graphics already has a Retina Mac or a hi-dpi display.


> Everybody who cares for graphics already has a Retina Mac or a hi-dpi display.

Elementary level of product-owner analysis, I must say. Proactively constraining your product into a niche category... tsk, tsk.


>Proactively constraint your product into a niche category.. tsk, tsk.

Yeah, it hasn't been working for Apple this past 2 decades. They're only near the $1 trillion mark, could have been way better...


It’s less “the code for this is a mess”, and more “the code for this can’t be anything but a mess”.


> It’s less “the code for this is a mess”, and more “the code for this can’t be anything but a mess”.

Ok. But so what? As a customer that's not my problem, and Apple has a HUGE war chest to pay developers to maintain the mess. I have nothing but 1x displays externally and this makes Mojave a non-starter.


>Ok. But so what? As a customer that's not my problem,

It is. Customers pay for code debt too -- with bugs, slow performance, and slower moving features.


I really don't believe that subpixel rendering is such an inherently complex problem that there is no clean way to implement it.

From the OP it sounds more like it's a pain to implement due to the number and design of abstraction layers in the modern macOS graphics stack.


The GP's exposition is explicit but makes no sense to me; in a scene-graph architecture, for example, all of that would be very straightforward.


It's not like tech debt has no cost. This is engineering effort that can be spent in other places; lots of other tasks become easier because this complication doesn't need to be supported. It's reasonable to argue that'll turn into real benefits in the software in other ways.


Personally, I think that is moving in the wrong direction. This is the reason text looks bad on transformed HTML elements in browsers, or when you click a button on Windows and it tilts slightly.

Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.

If we ever want to have a hope of computers looking like, e.g. this Microsoft fluent design mockup [1], which is clearly window-based, but the windows are tilted slightly and shaded giving it a physical appearance, we'd have to go in that direction. If we stay in the model where everything is a texture that gets composited, the only thing we can do is to use higher resolution / retina display.

Also, one thing I've never seen discussed are virtual displays inside a VR world. The VR headset has plenty of pixels, but you can't simulate a full HD screen at a few feet distance, let alone a retina screen.

[1] https://cdn0.tnwcdn.com/wp-content/blogs.dir/1/files/2017/05...


> This is the reason text looks bad on transformed HTML elements in browsers, or when you click a button on Windows and it tilts slightly.

It's pretty common for regular AA to be borked in those situations, browser support for AA under transforms hasn't ever been awesome, and it's only been decent in the major browsers pretty recently.

> Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.

Text & vector rendering with regular AA are currently aware of virtual sub-pixels, when they're not borked by transforms like in your examples. But making the browser rendering engine aware of the order and orientation of your monitor's specific LCD hardware sub-pixels would be very difficult for browsers to support. There is a reason subpixel AA has never worked everywhere.

When I first glanced at this article yesterday, I got the mistaken impression that Apple was turning off all AA, not the LCD specific ClearType style of subpixel AA. That would be a big deal, but that's not what's happening. The difference between regular AA and ClearType style AA does exist, but it's pretty small. I'm a graphics person, and I'm very much in favor of AA, but the outrage at losing subpixel AA feels overblown to me. It helps a little, but also has the downside of chromatic aberration. I'm honestly wondering if most of the hubbub here is people making the same mistake I did initially and thinking that they're losing all anti-aliasing?


Obviously the right direction is replacing sub-pixels with pixels. Maybe you meant this is premature?


>If we stay in the model where everything is a texture that gets composited, the only thing we can do is to use higher resolution / retina display.

Which is what we _should_ be doing. What do you expect, some resurrection of the low res display market, or perhaps the return of VGA?


> Subpixel antialiasing is obnoxious to implement

Totally, but

> there's no denying that subpixel-AA text looks better on 1x displays.

I get that it's hard, but at the end of the day what users are going to notice is, when they dock their laptop at work, their display is going to look like fuzzy crap. Maybe most people won't notice? It's hard to tell without seeing a side-by-side comparison.


I guess Apple just wants everyone to buy retina displays, and don't care about people using what they believe is "legacy" hardware.


This wouldn't be a problem, if

1) They had a Retina MacBook Air. 2) They actually had a monitor line that is Retina+, with super great colour accuracy, etc.

For the 1st point Apple doesn't seem to care, and for the 2nd, Apple should at least have a list of monitors that work best or are up to their "standards", or that are on sale at the Apple Store.

These lapses in attention to detail in its ecosystem weren't as obvious when Steve was here. But recently cracks have started to show everywhere.


This is a good indication they will remove the Air from the lineup this fall and replace it with a lower-end MacBook.


Who is buying the MacBook Air right now anyway? The Pro is lighter, smaller, faster, and offers better performance for the money. They are very similarly priced; I am not sure why anyone would purchase the Air given that the updated 13" MBP is available.


> Who is buying the MacBook Air right now anyway?

I am. I want a keyboard that's not absolute garbage to type on.


That keyboard was awesome! My new MacBook has lost 3 keys in 1 year. Welcome to the future.

I really miss my MacBook air.


I've opted to use the MBA exclusively at work for years now. I have not found anything that beats the size and weight of it while still giving that level of power. The form factor is one of my top considerations, so a Pro isn't really something I need or want because the extra horsepower doesn't really add much for my needs.

I can dock it at my desk and hook it up to two huge displays and a bigger keyboard, and then not be burdened when traveling.


> I have not found anything that beats the size and weight of it while still giving that level of power

The current MacBook Pro is only .05 pounds heavier than the 13-inch MacBook Air, and noticeably thinner than the Air's thickest point. They are basically the same form factor at this point.

The 13" MacBook Pro with no touchbar is more or less the upgraded MacBook Air with Retina Display that everyone has been asking for for years.


I have to disagree. I bought both a 2017 air and a 2017 mbp 13” to compare, and ended up returning the MBP. The screen on the MBP is far and away better, which made it a painful choice. On the other hand, the air was superior for the keyboard, and also for noise/heat, and of course several hundred dollars in price. The MBP didn’t really feel any faster for the type of work I was doing on it.

Most annoying was that the MBP fan was kicking on in high gear consistently when I was doing nothing of note, while the air very rarely does so. It’s always cool and quiet, which I love.

If they just threw a retina screen on it, the current Air could be a hell of a computer for several more years.


I recently went from an Air to an MBP 13" (not out of choice, a work requirement) and the keyboards are horrible on both. The only noteworthy thing about the MBP one to me is the reliability issues; comparing the actual typing quality is apples to oranges.

But what I have realized is that in reality it's very rare that you have to actually use a laptop keyboard. Maybe only like once a week do I not use a proper USB keyboard, i.e. if having to visit a client or something.


> the extra horsepower doesn't really add much for my needs

Can you say what your needs are? I'm skeptical that I would be happy with the speed of my compiler on an Air.


Mainly I've got multiple browser windows open with way too many tabs. I use a plugin to prevent them all from being actively loaded, but still have a ton active at any one time.

I also usually have one or more Excel windows open for data analysis, possibly Powerpoint or Keynote, and then some other lightweight stuff (Spotify, Sublime, etc.).

I also am most frequently using this across two 27" Thunderbolt displays (and if I'm desperate for a 3rd, I can open the MBA screen in addition).

I'm not a developer, so my needs may be less intensive than yours.


> I'm not a developer

Yep, I guess that was the root of my question, thanks. :)


Do you travel a lot? The only upside I can see of the lighter form factor is if you have to carry it around in airports quite a bit.


No, but I do schlep it home every day. Plus, when I am running around to meetings all day, it is just convenient.


The smallest (11") macbook air weighs about 1kg, and it doesn't have that fucking retarded keyboard.

I have chronic pain and the idea of paying an extra almost half kilo in weight for my laptop is enough that I gave a Windows PC a try for a year.


Sadly they don’t make the 11” anymore, and the retina 12” is too crippled for ‘power user’ use.


They still sell the 11-inch Air through education channels. You can buy one from, e.g., a university's store if you or someone you know is student/staff/faculty/alum.


The non-Touch Bar MacBook Pro is £1249 vs the Air’s £949. That’s a big difference for a lot of people. People like my partner, who doesn’t care that it’s essentially 4-5 year old hardware with minor revisions.

I would never buy an Air myself because that screen is pure trash, but she genuinely doesn’t care.


It's cheaper, fast enough, and has a non-garbage keyboard.


For the majority of consumers, which is 80% of the market (nerds and professionals are about 10% each), many are price sensitive, and the $999 MBA is the entry-level machine which Apple offers. And as of right now, it is ridiculously overpriced even by Apple standards.


I bought one two months ago as a secondary machine because my old Air died.


There are the two LG ultrafine displays specially made for MacBooks that are sold solely through Apple.


And look at their stellar ratings and reviews.

"Hey, we know you probably spent $1200-2500 on a laptop. Buy a $700 or $1300 monitor to go with it!" (And sorry, $700 for a 21" monitor is ... meh).


I run 32 inch displays at work at 4k at 1x. It is great. Luckily I use Windows so I get crisp text.


> buy retina displays

External retina displays don't exist.

HiDPI displays aren't available in a number of common monitor sizes.


I have a 2015 rMBP with an nVidia 750m, and recently I switched from an Apple Cinema Display to a 4K screen. At any of the graphical modes between the full 2x retina and 1x, the whole OS gets extremely sluggish, to the point of being unusable.

Some people from Jetbrains have looked into it and concluded it was the font rendering in IDEA bringing down the entire system.

Do you think this change can improve this? Because it certainly sounds like it.


Sorry for the rant, but given the price of the JetBrains Toolbox, it infuriates me that they've been giving us HDPI users the middle finger for a couple of years now.

First of all, switching to grayscale helps A LOT. Mojave does this system-wide, but the JVM (or IntelliJ, I don't really remember) has its own font rendering system, so you need to change that in IntelliJ directly.

I was hurt terribly by this issue. My problems have been solved by three things:
- An IDEA update that reworked the status bar's indeterminate progress bar, which maxed my CPU just for a dumb animation
- Going from an iGPU to a dGPU Mac (you seem to have one, so congrats; it would be way worse with only your iGPU)
- Using an experimental JVM from this thread: https://youtrack.jetbrains.com/issue/IDEA-144261

That said, JetBrains messed up with their 2018.1 release again. I'm mostly using Android Studio, which lags behind IntelliJ's versions. 3.1 merged the stupid progress bar fix, but I'm not looking forward to when 3.2 hits stable, with 2018.1 merged in.

Bottom line is that there are multiple factors in JetBrains' custom OpenGL UI inefficiencies, combined with Oracle's JVM bugs/inefficiencies, that take down the whole system, but it's ultimately not only a font smoothing issue.

JetBrains doesn't really care, or at least doesn't show that they care. They first said "we don't notice that", and then released some JVM patches in the comments of an issue.

Of course, Oracle also shares part of the blame: none of this would have happened had they kept Apple's JVM Quartz support.


Top comment on the bug tracker:

> Joshua Austill commented 24 Jun 2018 06:21

> @Gani Taskynbekuly OH MAN! I knew I was having this issue way worse at some times than others, but I'd never made the connection to sleep mode. Since reading your comment I've just been shutting my laptop down instead of closing the lid and letting it sleep and the problem has all but gone away. I searched High Sierra sleep issue and this is a well-known issue with bug reports. This means this isn't a JetBrains issue, it's a mac issue. It is annoying to turn off graphics switching and sleep mode and have crappy battery life, but hey, I'll pay that price for a usable laptop. I owe you a drink of your choice!

Seems like a macOS issue. I heard even rumors that's not the only issue with that OS. ;-)


> I heard even rumors that's not the only issue with that OS. ;-)

Sigh. Could we please not fall into macOS trolling? I believe that it is a totally different issue than the one I'm suffering from.

I haven't had this issue with any other program. There is no excuse for IntelliJ when Xcode, VSCode, Sublime Text, and even Eclipse perform 10 times better than IDEA does.

Even if I do a cold boot and open IntelliJ directly, I will still get unbearable slowness. Sure, there might be a bug, but it doesn't affect me. If it was only a macOS bug, how would you explain that JetBrain's patched JDK works way better, and that 2018.1 is a downgrade?

My old laptop didn't even have a dGPU, so turning off graphics switching wasn't the problem. Many apps don't handle graphics switching correctly, and it's 100% their fault.

Also, do not confuse "top comment" with "latest comment".


Had this problem. On the latest version, it seems to be resolved for me, though I have an AMD GPU.

One thing that worked for me while it was a problem was switching off ALL antialiasing in IntelliJ, or at least going down to greyscale.


^ Fixed it for me. It was very frustrating though; I ended up buying a whole new MacBook trying to fix this before finding the software fix.


So the Apple philosophy is, "when things get hard, don't bother, just walk away and abandon your current customers"? Sounds like a real winner.


Not so much hard but a hack. There is also no clear non-hackish way to implement sub-pixel rendering after 20 years of experience across at least three platforms.

Also since we no longer have consistent sub-pixel layouts, sub-pixel rendered text is no longer portable to another screen, such as you get with an image containing text on a web page.


Is that really what you took away from this?


> Subpixel antialiasing is obnoxious to implement.

Isn't it completely abstracted away from 99%+ of macOS development?


Yes, but if it dramatically increases complexity for all the lower levels of the rendering stack, it ends up creating extra work for those OS/framework developers, requiring separate versions of the code targeting different displays which must be maintained separately with different sets of bugs in each, preventing them from making large-scale architectural changes they want to make, restricting some types of significant optimizations, etc.


Apple has a $200B war chest. If they can't afford to deal with this problem and instead choose to make my experience worse that says something about their priorities as a company.


You must realize that not all problems can be paved over via money and engineering bodies, right?


Their priority is not just everybody (and never has been). It's high-end users, who can afford $2K/$3K laptops and hi-dpi displays.

Do they always get things right for them? No. But those are their users, not the average joe.


Yeah but isn't it already a solved problem with 1x monitors? Who is implementing things multiple times for no reason? I'm glad in the future it will be easier for Apple because they just won't hire devs to do that part then, but does this imply that 1x monitors will simply look inferior no matter what is done with newer versions of OSX+?


There's a broader change that is an improvement to the abstraction devs work with, which simplifies coding and results in more GPU acceleration.

Essentially, this article: http://darknoon.com/2011/02/07/mac-ui-in-the-age-of-ios/ will be obsolete.

If it's possible to realise those benefits while retaining subpixel antialiasing, it would entail a complete reimplementation.

On the other side of things -- light-on-dark subpixel antialiasing has always been a little wonky, and Apple disables it in most dark UIs. Without significant changes, dark mode would have been greyscale either way.


This complexity is why we can't have windows that spread between 1x and 2x monitors since the introduction of Retina displays.

Ideally everything should be rendered (and cached) to the maximum resolution available (minimum trace widths considered) and then downsampled to the specific screens with subpixel data applied, but that's a lot of processing to do and battery life would suffer.


Eh? Before Yosemite was released in 2014, one window could span between a 1x and 2x monitor. I forget if it was always rendered at 1x, or if it was rendered at 2x and scaled for the 1x, but yeah it looked like crap on one or the other. Actually I think it switched depending on which monitor had more of the window?

But they only eliminated windows spanning across multiple monitors when they made spaces per-display.


It's still there iirc if you disable separate spaces per display.


I don't remember that, but, if you do, I trust your memory (and your hypothesis) over mine.


It’s already a solved problem if you want to keep using the software you already wrote, and never do anything different. (Nobody is stopping people from running OS X 10.13 indefinitely on their 1x displays.)

If you want to rearchitect your whole text / vector graphics rendering pipeline to take proper advantage of GPUs, then you are going to be much happier with your life if you don’t have to double your work to support legacy hardware.

I would love to see Apple implement this kind of thing http://w3.impa.br/~diego/projects/GanEtAl14/


>It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing

Most Apple hardware supports ARB_blend_func_extended or even EXT_framebuffer_fetch, which lets you implement subpixel blending with ease. Otherwise you just have to fall back to traditional alpha blending and render each glyph in three passes, one for each color channel, which is not that hard either.
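
A toy sketch of what that three-pass fallback computes; the GL state itself isn't shown, this is just the per-channel math, with each pass write-masked to a single channel and using that channel's coverage as its alpha.

    def alpha_blend(dst, src, alpha):
        return alpha * src + (1 - alpha) * dst

    def blend_three_pass(dst, fg, coverage):
        out = list(dst)
        for ch in range(3):          # one draw per colour channel
            out[ch] = alpha_blend(dst[ch], fg[ch], coverage[ch])
        return tuple(out)

    # Same per-channel result a dual-source blend produces in one pass:
    print(blend_three_pass((1.0, 1.0, 1.0), (0.0, 0.0, 0.0), (0.2, 0.6, 1.0)))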


If this was hard for Apple, imagine how much work it was for Microsoft, with their complete lack of control over monitor hardware. But they pulled it off.


Pulled it off, or they just don't care to streamline their code and get the best performance, but instead hold tight to all kinds of legacy crap for the sake of compatibility?


Holding tight to 'all that legacy crap' is what makes Windows the de facto standard for desktop.

Despite people insisting it's down to business practices from twenty years ago, the simple fact is that Microsoft dominated, and still dominates, because of backwards compatibility, a lesson that Apple has never learned (or never cared to learn) and why they will never have more than a few percent market share on the desktop.

And despite the gossip bloggers' constant nonsense, the desktop market is still very much alive and thriving.


>Holding tight to 'all that legacy crap' is what makes Windows the de facto standard for desktop.

Well, macOS doesn't aspire to be the "de facto standard for desktop", or to be usable on 10-year-old machines in some corporation.

Isn't it nice that there are options with different priorities?


Well, I may run dual 4K 32-inch monitors at work, but I don't think that makes me a desktop user. I run them off my laptop.


>..but instead hold tight to all kinds of legacy crap for the sake of compatibility?

Yes, they pulled that *hit off... imagine all the combinations of components and drivers they needed to support. Way more of a challenge than a closed system like the Mac.


This change seems to be dumping on Steve Jobs' commencement speech, where he talks about taking calligraphy classes and how it made him realize the importance of good font rendering.


If our LG 21:9 displays all look like crap when we dock our MacBook Pros, then it's just one more justification for the Windows lovers (yes, they exist) to move to Surfaces. Adobe doesn't care which we run.


I find Windows anti aliasing to already look like crap vs. subpixel on a Mac, so is that what you mean?

On the other hand, I won’t tolerate 1x displays on either platform anymore, so this change hardly affects me.



> On the other hand, I won’t tolerate 1x displays on either platform anymore, so this change hardly affects me.

Well, buying 60 4K displays was out of our budget and the LG displays looked very nice. I guess we're just not rich enough.


Are 4K displays still that expensive?

You can't buy 1024x768 displays anymore (well, outside of projectors) even though they would be cheaper. The same will be true for 1x sooner or later.


If you aren't after a certain size or refresh rate, then 4K displays have gotten pretty affordable -- but if you want one with high refresh or a more exotic resolution things get quite expensive.

Example, the Acer X27 which offers 144 Hz is $2000: https://www.newegg.com/Product/Product.aspx?Item=N82E1682401...

LG's 34WK95U (5K ultrawide, not out yet but supposedly will be next month), is $1500.

Personally, I have a 38UC99 right now and I want to stick with the 21:9 aspect ratio, so I'm waiting for that 34WK95U.


I wish most conference venues would realize this. It is a real drag giving demos on projectors these days.


Also, it only works on specific screens.

On my main computer I am still using a CRT, and some of the flat panels I own have pixels in funky patterns; when subpixel AA is enabled, the image instead gains lots of random color fringes.

And you can't use rotated screens either. I've seen people rotate multiple screens to read source code more easily, and rotated screens are used in some games: Ikaruga for Windows, for example, explicitly supports a rotated screen mode.


This and the document @gnachman posted have me intrigued. Are you saying that subpixel AA is not just a multichannel mask? I would expect the amount of R/G/B foreground and background color for a given pixel to be the same regardless of what those colors actually are.


(note I haven't worked on subpixel AA for font rendering, but have for something different)

The amount of color you apply to each subpixel has a different intensity depending on how deeply the shape goes into that part of the pixel, so you need to blend with whatever is behind/in front of it. Also, you can't just write (128,0,0) to a pixel and leave it at that -- you'll have a red pixel. It needs to be blended with what's already there.


I expect this would be the case with alpha compositing at least...

AFAICT subpixel rendering is supersampling the width by a multiple of 3, folding samples into RGB triplets via the horizontal coordinate modulo 3, and dividing by the supersampling factor. I guess if kerning and positioning are also subpixel it might become trickier to cache results efficiently.
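A toy sketch of that description (Python, not how any real rasterizer is written): render a scanline at 3x horizontal resolution, then fold each run of three samples into an (R, G, B) coverage triple, assuming an RGB-ordered panel.

    def subpixel_coverage(samples_3x):
        """Fold a scanline rasterized at 3x horizontal resolution into
        per-pixel (R, G, B) coverage triples: sample i belongs to
        subpixel i % 3 of pixel i // 3 on an RGB-ordered panel."""
        assert len(samples_3x) % 3 == 0
        return [tuple(samples_3x[i:i + 3]) for i in range(0, len(samples_3x), 3)]

    # A black stem whose left edge lands a third of the way into a pixel:
    scanline = [0.0, 0.0, 0.0,   # blank pixel
                0.0, 1.0, 1.0,   # edge pixel: only the G and B subpixels covered
                1.0, 1.0, 1.0]   # fully covered pixel
    print(subpixel_coverage(scanline))
    # [(0.0, 0.0, 0.0), (0.0, 1.0, 1.0), (1.0, 1.0, 1.0)]

A real rasterizer would typically sample more finely, average down, and filter coverage across neighbouring subpixels to tame the colour fringing, but the modulo-3 folding is the core of it.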


> Subpixel antialiasing is obnoxious to implement.

I imagine the 10 engineers who worked out antialiasing at Apple are handsomely compensated for their suffering.

I hook my Mac up to a cheap ultrawide. Will Apple give me 1k EUR for a new monitor with 4x the pixel density so I can continue using the computer I bought from them only a few months ago, or will my Mac become obsolete? This is unworkable.


You sound very dramatic. You know, you could just not update the OS if it's that important to you.

As someone with a non-retina iMac I'm also affected, but I understand that non-retina displays are a niche category these days. Thinking about it, I guess the Macbook Air is the only computer Apple is selling that doesn't use a retina display.


> I understand that non-retina displays are a niche category these days

Social bubble detected


That's pretty interesting..

I already thought it was a reasonable engineering decision, though, because of one other factor: the bar for what a screen should look like has gotten massively higher over the last several years.

On 1x displays, yes, subpixel antialiased text absolutely looks better... but it still looks like shit compared to any modern phone, any MacBook Pro or high-res Dell laptop from the past several years, to any 27" 5K display or 20" 4K display, etc.

Subpixel-AA used to be the way to get the best text rendering possible, for all normal computer users.

But now, it sounds like it requires maintaining some bug-prone, nightmarish architecture, just to get slightly-better-but-still-terrible text rendering. That probably isn't worth it.

Love fish shell, btw!


I have not a single high-res display that's not a phone. Not at work, not at home. Nor does anyone else at my work or anyone else I know at home.


It's also easier to implement non-gamma correct blending, but seeing as it looks terrible and uneven, Apple so far has not done that.

What's more, most technical people don't even understand that one, so good luck getting a non-Jobs type to see the importance.

Non-Retina displays are going to stick around for years. Not everyone wants high DPI; I for one much prefer working on a giant non-Retina monitor, for more real estate instead of detail. I don't know what they put in the water in the Bay Area, but everyone seems to be moving towards a lowest-common-denominator approach.

This is a terrible decision, and if they don't fix it, it'll be just another reason to abandon the platform.


> It's also easier to implement non-gamma correct blending,

Gamma correct blending doesn’t require threading the information through multiple graphics layers, as the parent comment put it. The difficulty of a gamma correct blending implementation isn’t even remotely close to the difficulty of subpixel AA.
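A minimal sketch of what gamma-correct blending means here (toy Python; the 2.2 exponent is a stand-in for the real sRGB transfer curve): decode to linear light, mix, re-encode. It is purely local per-pixel arithmetic, which is why it doesn't need information threaded through the compositor the way subpixel AA does.

    def gamma_correct_blend(fg, bg, alpha, gamma=2.2):
        """Blend two sRGB-encoded channel values: decode to linear light,
        mix there, then re-encode. Blending the encoded values directly is
        what makes midtone text edges look too heavy or too thin."""
        out_linear = alpha * fg ** gamma + (1.0 - alpha) * bg ** gamma
        return out_linear ** (1.0 / gamma)

    # 50% coverage of black text over a white background:
    print(gamma_correct_blend(0.0, 1.0, 0.5))  # ~0.73 in sRGB
    print(0.5 * 0.0 + 0.5 * 1.0)               # naive blend says 0.5: too dark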

> so good luck getting a non-Jobs type to see the importance.

I’m not sure I understand what you mean. Mac was the first major PC platform that had configurable gamma correction and color calibration in the OS. It used to be known as the platform for designers, and more people using Mac cared about color and gamma for display and print than for any other platform.

> Not everyone wants high DPI... I prefer more real estate instead of detail

You actually don’t want high DPI even if you can have it on a large monitor? Or do you mean you prioritize monitor size over resolution?

Low DPI is not going to be a choice in the future... the near future. And once it’s gone, it’s never coming back.

> This is a terrible decision

Can you elaborate on how it affects you personally? Since you know about gamma, you do know the difference between LCD subpixel AA and regular AA, right? For some people, subpixel AA is blurrier than regular AA. For me, I see slight improvement, but it is by no means some sort of deal breaker. I’m not clear what the outrage is about, can you explain?


>Subpixel antialiasing is obnoxious to implement.

Yet, somehow, the FreeType guys did it all.


Oh gee we didn't realise it was haaaaard.


Subpixel-AA was not removed from macOS Mojave so the title is completely wrong anyway.

This was covered at WWDC. It uses grayscale-AA now which is much easier to hardware accelerate and gives a more consistent experience across different types of displays.


If it's greyscale, it's not subpixel antialiasing.


You are right. But I think parent comment understands the concepts, and was misled by the name, which to be fair, is a misleading name. The names "subpixel AA" and "grayscale AA" are both terrible names.

Grayscale AA isn't somehow gray or monochromatic, it is regular anti-aliasing, it works in color, and it works by averaging (virtual) sub-pixel samples. The name is trying to refer to how it's not making use of the physical LCD color subpixels.

Both names assume that the term "sub-pixel" is primarily referring to LCD hardware, rather than general sub-area portions of a pixel. From my (graphics) point of view, the LCD case is a specific and rare subset of what people mean by "subpixel", which is what makes these terms seem misleading and awkward.


I hope we are all somehow misunderstanding how terrible this is.

The idea that they would downgrade display support so that non-Retina monitors -- and let's be serious, that is nearly all monitors that people dock into at work or at home -- will look worse in Mojave is almost too absurd to be true.

I want to see this for myself.

Does anyone know if you can simulate this now, without installing the beta? Or if you can install or somehow use the beta without nuking what you already have?

Edit: I am still on Sierra, because the buzz around High Sierra made it sound like it was a bad idea to upgrade. This is sounding more and more like Mojave is a no-go as well.


I fucking hated Lion when I got it on a new MacBook in 2012. I had been running Snow Leopard on my Hackintosh. I decided to buy a new Mac when I went overseas, and Lion removed Exposé and replaced it with that (still) terrible Mission Control garbage. I spent a few weeks trying to downgrade it to 10.6 and it was impossible.

Final Cut X was ... something else. Anyone who claims you just need to get used to it must not like having good tools. It's a terrible video editor, and Apple killed off their only pro video tool.

Around that time I just ran Linux in a VM on my Mac and got into tiling window managers. I tried a bunch and liked a lot of them, but eventually settled on i3 (which I still use today). I ran openSUSE at work. Someone at work was throwing out an old IBM dual-Xeon box and that became my primary machine; I used my MacBook as a Windows gaming laptop, up until it was stolen:

https://khanism.org/faith/fate-and-destiny/

At my most recent job, I asked for a PC laptop (Dell or HP) and they gave me a terrible MacBook. I run Gentoo on it:

https://penguindreams.org/blog/linux-on-a-macbook-pro-14-3/

but I still hate the hardware and it's more difficult to get Linux on a MacBook than any other x86 laptop I've owned, including two Dells, an HP and an MSI.

I don't want Apple to go away, because we do need competition in the market, but their operating system and hardware seriously need to stop being a pile of poop.


I won't have anything bad said about FCPX. Yes, they botched the launch by releasing an unfinished, beta-quality first version. But every criticism of it was addressed within short order and it's solid as hell now.

The only reason anyone continues to hate it today is that they're used to the terrible, horrible and stupid way all the other NLEs work (including FCP6). It can be disconcerting to use FCPX because everything looks like a fancier version of iMovie, so people assume it must not be for "pros" or must be missing lots of "pro" features.

But it's not. Terrible, horrible and stupid NLE design isn't a feature. FCPX is the first one to get it right, but the rusted on Premiere / Avid "professionals" refuse to see it.


You can simulate this by adding the `-webkit-font-smoothing: antialiased` CSS rule to any website when using Safari or Chrome. Lots of websites already do this though, so you can use the value `subpixel-antialiased` to revert the text to that mode instead.

https://developer.mozilla.org/en-US/docs/Web/CSS/font-smooth

To really appreciate the difference, I recommend turning on screen zoom with ctrl-scroll in the accessibility system prefs. You’ll see the effect of SPAA as colouring on the left and right edges of text.


Wow, that's terrible. It would be really sad. Even on my non-Retina MacBook's display it drastically reduces readability.


The quality of sub-pixel AA depends A LOT on the physical parameters of the display. It's basically the OS saying "I know that pixels aren't actually square, and I'll use that knowledge to better digitize the font's curves." That worked fine when everything was a CRT or an LCD of the same basic technology. However, the faults of sub-pixel AA show through pretty strongly on the variety of modern displays (OLED vs. AMOLED vs. ...), and on tablet displays that can rotate. Apple is in the process of switching display technologies across their products.

That said, if there's anyone in the position to exploit their knowledge of the ACTUAL physical display and its sub-pixel layout, it's Apple. I'd expect Windows or Android to drop sub-pixel AA... but iOS/macOS? I'm mystified.


Subpixel AA was never about CRTs; they don't have a consistent mapping between logical pixels and the physical RGB holes in the shadow mask. It was always about LCDs.


Having actually been in the graphics industry before LCDs were prevalent, we did sub-pixel rendering. It just wasn't built in as a standard feature in OS font rendering until LCDs made it necessary.


I think Windows has a feature to configure the subpixel arrangement; what about macOS? Sure, they know about the internal screen, but not the one you plug in.


Don't screens tell the OS about their physical characteristics? At least the native resolution and the physical screen size (or dpi) are available to the OS somehow. I'd guess the sub-pixel arrangement might also be.


That's EDID you're talking about, and AFAIK it doesn't support that. I'm on my Android right now, but if anyone wants to fiddle around with what their screen provides, on Linux there are read-edid and parse-edid to query and interpret the raw data directly. It's not terribly much, since it's 256 bytes max IIRC.
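For anyone curious, the size information at least is in there. Here's a rough Python sketch of pulling it out of a raw EDID dump (assuming a standard 128-byte base block; the sysfs path below is just an illustrative example):

    def edid_physical_size(edid: bytes):
        """Return (width_cm, height_cm) from an EDID base block.

        Bytes 21 and 22 of the 128-byte base block hold the maximum image
        size in centimetres (0 means unknown, e.g. a projector). Nothing in
        the base block describes the subpixel layout, which is the point
        made above.
        """
        if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
            raise ValueError("not an EDID base block")
        return edid[21], edid[22]

    # Hypothetical usage; on Linux the raw blob is exposed via sysfs, e.g.:
    # with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
    #     print(edid_physical_size(f.read()))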


I thought that subpixel AA smoothed by using the individual channels (R, G, B) of a pixel, not by taking into account the physical shape of a pixel.


I think we are saying the same thing. A pixel is actually three separate fixed color lights. By shape I meant their layout/arrangement and size.


Look at your screen with a magnifying glass. Then look at your phone's screen. And your TV screen.


Yes. But how do you know if it’s RGB or BGR or GBR etc? Sometimes there’s a fourth subpixel for white too.

On Windows there used to be a wizard where you had to choose which one looks better to let the system figure out your display’s subpixel arrangement.


The wizard is still there in the latest Win10


¯\_(ツ)_/¯

I’ve never used Windows beyond 7.


You can simulate this by going to System Preferences-->General-->Use LCD Font Smoothing When Available and unchecking it.

It also makes Retina-class devices look far worse. The fonts on my 5K iMac became much less readable after this change.


I do not think this is an accurate way to simulate the issue reported. This simulation will look worse than the actual issue, which is not present on 4K+ displays.


Yeah – on Mojave that checkbox still exists and unchecking it still makes fonts look different (less bold).


That's good to know! Then the difference shouldn't be that big on Retina displays.

I just compared on High Sierra, and it's mostly that without subpixel AA the type looks thinner.


I confirmed that the issue still exists by installing the Mojave beta. The fonts look terrible -- very thin and blurry.


OS X has never done gamma correction properly with subpixel antialiasing. Instead, it “dilates” the shapes slightly, but this is actually wrong. I suppose half the problem is getting used to the new, correct, line weights. This is more problematic on retina displays because the dilation affected the weight far more than the gamma deviations would’ve at high resolution.


What do you mean by dilation?


This turns off smoothing across pixels, rather than subpixel antialiasing. Not the same thing.


>> It also makes Retina-class devices look far worse. The fonts on my 5K iMac became much less readable after this change.

I actually prefer this setting to be off on Retina screens. You don't need to anti-alias fonts on a 230ppi display.


You might not need it. I do.


> that is nearly all monitors that people dock into at work or at home

And 1080p projectors in conference rooms.


That would be a dream. So many are still on 720p or lower, especially in schools etc.


> Or if you can install or somehow use the beta without nuking what you already have?

Yes, I'm running the beta right now. If you're scared about your data, create a new partition or virtual machine and install macOS to that.


I tried the beta when it came out and immediately downgraded because of the font, it looked worse even on the 5k iMac display.


This is the software equivalent to moving exclusively to USB-C. 18 months after the 2016 MacBook Pro, the vast majority of hardware currently in use still requires a dongle, and 18 months from Mojave's release, the vast majority of external monitors in use will still look blurry without subpixel AA.

Someone at Apple really seems to believe in pain as a motivator. And it may work in the short term, but in the long term, it always drives customers to other platforms.


> Someone at Apple really seems to believe in pain as a motivator.

I think they simply believe they're a large enough player to make such calls unilaterally, and that the rest of the industry will follow. And I fear they might be right -- such decisions enter into the calculations of more "commoditized" vendors like peripheral manufacturers when they design the next iterations of their products.

I mean, that's why removing the headphone jack is such a big deal even for people not using iPhones. Not just because we sympathize with iPhone owners - but because Apple is big enough to make this stupidity a new standard everywhere.


Recently switched to the Manjaro Deepin operating system. Everything looks nice and quite polished once inside the OS. I have a great pull-down terminal, I can snap windows side by side, and almost everything is configurable without digging into configuration files. It's a rolling release, so you don't get stuck on a particular version of a package until the next major release; you get the latest stable release of all software monthly or sooner using the built-in package manager.

Most of my development tools I install using the package manager: git, docker, docker-compose, virtualbox, openjdk, node, go, python (don't use pip, use pacman to install python packages), sdkman for kotlin, gradle and maven, visual studio code, postman, and dbeaver as a GUI for SQL, Cassandra, mongo, big table, redis and neo4j. Download and install IntelliJ and Sublime. WPS Office is better than Microsoft Office in my opinion. There's also the AUR repository of community-built packages, which can be enabled in the package manager settings. Everything is really stable.

There's a built-in screen recorder and screenshots with annotations. You can map your keys to work like a Mac or a Windows computer, even on a Mac. The latest GIMP is actually really good (not yet available on macOS). There's also Krita for image editing, but it doesn't handle large PSD files well. Overall the development experience is much less frustrating, and Docker runs natively, which means everything is much faster. I've had a lot of luck running music production DAWs, VSTs and games under Wine.

There's a great guide on the techonia website about how to dual boot Manjaro Deepin safely on a MacBook Pro. The best part is I can have this environment on any computer, so I can have the latest Intel processor if I want, and the OS won't make my company-issued external monitor look like crap. Unfortunately Apple runs more like a fashion company than a technology company, making people feel compelled to buy the latest product.


Attempting take this idea seriously, I just catalogued apps I have open right now that I know/suspect have no good equivalent on Linux:

  * Safari - might seem unnecessary, but it's a lot of the web
  * Discord - I hear they have a Linux version in alpha, but...alpha
  * iTerm - best there is after years, why the hell can't Linux win at this?!
  * 1Password - guess I use the CLI? ugh.
  * iTunes - largely used for watching movies
  * Messages - This is surprisingly nice to have on the desktop
  * Microsoft Outlook - guess you are forced to use the web client
  * Xcode - heh
Stuff I didn't expect to see but was pleasantly surprised to see:

  * Spotify
  * Hipchat
  * Zoom
  * Skype
I could live without Discord, Messages, Safari, iTunes and Outlook if I got motivated (Outlook I have to have, but the web client is halfway decent). That leaves Xcode, iTerm and 1Password as dealbreakers. We know one of those isn't going to change!

I'm of course not including the apps I use less often but really like when I need them, like Lightroom, Photoshop, Illustrator, Keynote, Excel/Numbers, OmniGraffle, Sketch, and Things.

I think Linux is safe from me, except as the system I have next to my daily driver, and which I use for Windows (when I need to look at something there) and dual boot for Tensorflow.


Okay, I'll take a stab at this:

Safari - either Google Chrome or Chromium (depending on how open you like things) can keep pace with Safari in terms of general usability and extension ecosystem. At worst I think you'd find the experience on par, but when I owned a Mac, I found Google Chrome to be more performant than Safari on the regular, and its Linux build is just as quick. A lot of Linux folks recommend Firefox as well but I still am not convinced it beats Chrome in terms of out-of-the-box magical "Just Works" factor.

Discord - The client may be alpha, but this works just fine honestly. I've never had a problem. This is largely because the desktop app is Electron based, so it's running on a web browser anyway; it's very, very cross platform friendly. Slack too, if that's your thing.

iTerm - There are MANY good terminal clients for Linux. I personally use Terminator, which I find has a good balance of power features and stability. I particularly enjoy its terminal broadcasting implementation, but I'm in the unusual position of working on many parallel servers in tandem during my day job, so this feature is very important to me.

iTunes - If it's just for movie watching, I've found VLC to be perfectly serviceable. MPlayer is also quite popular and there are many frontends.

Messages, Outlook - Here you've got a fair point. Outlook in particular is a pain point; there are workarounds to get it working in Thunderbird and Evolution, but they're just that - workarounds. Anything beyond basic email will need the web app; fortunately the web app isn't _terrible_, but yeah. Fair complaint.

Xcode - If your goal is to build Mac / iOS apps, there is no substitute thanks to Apple's EULA. For everything else, pick your poison; there are more code editors and IDEs on Linux than one can count, many of them excellent. Personally I'm happy with Sublime Text (paid, worth every penny) and a Terminator window, but I hear VSCode is also excellent, which is odd considering that's a Microsoft endeavor. (Now if we could just get them to port Outlook...)


> Google Chrome or Chromium (depending on how open you like things) can keep pace with Safari in terms of general usability and extension ecosystem

Google Chrome's extension ecosystem is undoubtedly far, far ahead of what Safari has. As for usability, and…

> I found Google Chrome to be more performant than Safari on the regular

People use Safari because it integrates so well with macOS, is performant, and doesn't kill resources (CPU, RAM, battery, you name it). No other browser comes close, even on other platforms. Apple's just spent too much time here optimizing their browser that nobody else can match it (maybe Edge on Windows?).

> There are MANY good terminal clients for Linux.

iTerm just has everything and the kitchen sink. Like, it has some flaws, but it just does so many things that I haven't seen any other emulator do. Plus the author is really smart (he's at the top of the comments currently, if you want to check his stuff out)

> Xcode

Xcode can be surprisingly nice for C/C++ development, when it decides it wants to work.


> I haven't seen any other emulator do

Check back with Konsole.

I don't think I know all features, but just the things I know it does, because I use them:

- Tabs, obviously. Movable between windows, too.

- Arbitrary in-window splitting (tiling)

- Monitors

- Broadcasting

- Signals

- Profiles, obviously.

- Copy stuff as HTML & export whole scrollback as HTML and also print it / convert to PDF

- Can basically configure everything


Also peep Tilix. It's like the GTK answer to Konsole: super configurable yet accessible, with a sharp GUI. I think it's arguably better than Konsole, except it doesn't scale as well as Konsole does in Plasma (Plasma scaling is wretched).


> (maybe Edge on Windows?)

Edge feels like poor man's Safari. However the rest of Mac OS (in terms of GUI, not underlying OS) feels like poor man's Windows. :D


Give Qt Creator a try. It's really a nice integrated C/C++ dev environment. Plus it's entirely cross-platform.


> Outlook in particular is a pain point

That depends on the environment. I assume you're talking about Outlook as a frontend for an enterprise Exchange setup?

If your mail server admin configures IMAP and SMTP correctly, it's a breeze to get it set up (you will need SSL though). Use "DOMAIN\user.name" as username together with your AD password (and for team/shared mailboxes, use DOMAIN\user.name\mailbox.name, where mailbox.name is the cn or sAMAccountName attribute of the mailbox AD entry). Thunderbird can handle everyday emailing as well as responding to calendar events that way; I'm not sure about integrating "real" calendar stuff. Auto-completion of mail addresses must be done separately in Thunderbird, you'll need the AD structure information (root DN, plus the DN of your AD account so you can log in).

What you'll miss is forwarding rules access, but that can be done via webmail if the need arises.


Their Safari comment implies the need isn't due to a preference for Chrome, but rather for front-end testing. So yes, Chrome might make a fine daily driver, but those of us who have to occasionally code for the web need access to all of these browsers.


Frankly, no. If you (as a user) choose to use a platform-specific browser then that's your problem.

Even IE will at least provide a free VM image these days.


The idea of swapping away from macOS seems more enticing by the day, given the development directions of macOS (this is just one of the many nails; the main one is that Apple is likely going to merge macOS with iOS [or the other way around]).

Discord works perfectly fine in a web browser. IIRC the platform-specific clients are just running a web browser themselves; I'd rather use the PWA then.

As for iTerm, you don't need that when you use i3 but perhaps you can specify the features you need.

1Password, I recommend Bitwarden. Cheaper, open source, you can even run an open source backend.

Messages, no idea why you want that. I only use WhatsApp for IM, and even that is just a web app.

For development I recommend Sublime Text. You can use it for free, though I did buy it (as someone else commented; worth every penny).

Skype for Linux has actually fallen behind the other ports. Though I don't use Skype, I've heard complaints about that.


I have been using Discord on Ubuntu for some time without experiencing any problems.


For passwords I use Enpass. It stores and encrypts passwords in your own personal cloud storage account, e.g. Dropbox or Google Drive. The desktop version is free; the mobile version has a one-off fee to buy the app.

For office I use WPS office (word/excel/power point). It has tabs for documents and I have never had document compatibility issues (like I have had with libre office). I believe office 2013 runs under Wine too if you really need office.

I prefer the web clients for email anyway.

I use wavebox to integrate web based chat and email with the operating system for notifications.

Safari uses WebKit, so it's very similar to the engine Chrome uses -- though Google forked it at some point.

Deepin Terminal is an amazing terminal.

Can't help you with Xcode unless you switch programming languages :)

It would be great if Adobe released Linux versions of their products.

I don't do much video editing but I think there are good options on Linux.

Try it as a challenge -- if you use all the above apps, I think you might be pleasantly surprised at how far Linux has come. Most of these apps work on Mac as well, so you may find a few new good tools.


I recently switched my primary machine to Linux and was pleasantly surprised to learn that 1Password has a web version (which does require a subscription) and the 1Password X extension for Chrome and Firefox to go with that.


1Password 4 runs fine on Wine (the Windows version, of course).


Safari and iTunes?!


I made the switch to Linux (Netrunner Rolling with KDE) in the last year and have not looked back. It is definitely not for everyone as some things just have no good replacements, but it suits all my needs.

I do a ton of things in the browser, love the availability of a good Terminal at my fingertips (same as macOS), develop webapps and love the simple availability of a full local web server stack to develop on and so on. I use LibreOffice for my very few "Office" needs. I use Firefox as my primary browser (moved away from Chrome to Firefox before I made the switch to Linux). I use Visual Studio Code as my primary editor. I use Krita for my very few photo editing needs - simple stuff really like cropping or color picking. That's about it. Everything else happens in the Browser or on the command line.

As Netrunner is a derivative of Manjaro and thus Arch it's very simple to access a plethora of packages from their AUR system. It's like Brew on steroids.

Have been really happy with the switch!


USB-C still hasn’t arrived as the definitive standard.

Apple isn’t quite big enough to force an industry change in personal computers.


Actually Apple is definitely big enough to force change—they've done it multiple times in the past. Apple helped push the original USB standard into common use with the original iMac. Obviously it would have happened eventually, but the iMac made the transition occur much faster than it would have otherwise.

The problem for the USB-C standard is that USB-A now extends far beyond the personal computer industry. It's a power adapter standard for mobile phones and all sorts of gizmos, gadgets and accessories for the home and car. USB-A isn't going anywhere for a long time.


> Apple helped push the original USB standard into common use with the original iMac.

I don't think that's true. USB became ubiquitous because literally every PC included it after 1998. Yes, it might have helped that USB also worked with Macs, unlike previous connectors.


The USB port did start appearing on some PCs before the iMac—largely because Intel put it in its chipsets—but it was practically abandonware with precious few peripherals prior to the iMac. When the flood of peripherals did arrive, many of them were styled with translucent cables and casings to match the iMac, even when sold for Windows PCs.


> Actually Apple is definitely big enough to force change—they've done it multiple times in the past.

In many cases it's arguable whether Apple forced a change or it was a case of Apple "skating where the puck is going" and doing something that was inevitably going to happen given time.


I conceded as much in the paragraph you quoted from. What Apple did is force the market to move faster that it would have otherwise.[0]

Similarly, if Tesla never existed the electric car would still have been an inevitability. Perhaps it would have taken another decade to emerge. Perhaps two decades. But it would have eventually. On electric propulsion, Tesla is most definitely just "skating where the puck is going."

In both cases—Tesla and Apple—they're not just skating where the puck is going, they're also in control of the puck, even if only a little bit.

[0] And thinking about it further, I shouldn't have conceded the assumption that USB's dominance was inevitable. It seems that way in hindsight but was it really? It could have fizzled like Firewire. It is entirely possible that new incremental standards—perhaps an ultra-fast nine pin serial port protocol—could have filled the gap before another connector emerged to dominate.


I'd argue it is both. Apple isn't going to convince an industry to move to a closed standard, however they may accelerate the move to new open standards. For instance, moving away from floppy disks to compact disk media, and moving from serial/parallel/PS2 ports to USB.

Often these technologies are stuck until there is a first big adopter. Now Google, Apple Macs, and most of Android is behind USB-C for instance, there will be a lot more accessories that use it. Right now IMHO it is being held back mostly by cost - micro-usb and an A to micro-B cable are still the cheapest parts.


If you want USB 3.1 Gen 2 speeds, you pretty much have to use Type-C. The only alternatives are a full USB Type-A connector or the awkward Micro-B connector (which is rare).


Doesn't almost every external HD have micro B?


USB-C clearly is the way forward.


I'm not sure. As more devices and peripherals have gained USB-C, lots of horror stories of chargers frying devices and other catastrophic failures have shown up. This isn't my wheelhouse, but just from the reading I've done, it sounds like USB-C is a difficult-to-implement standard and that peripheral makers are more interested in getting something out the door quickly and cheaply. With the amount of power that USB-C can drive, this is risky.

If the end result is that you can't tell if plugging into your friend's charger is going to blow up your phone (or video game console), I think USB-C is going to fall apart.


Cheap chargers and cables that aren't compliant are not unique to USB-C, and why we have things like UL. There are a lot of people using quite frankly dangerous USB-A chargers for their cell phones because they found them obnoxiously cheap.


The USB-C-only MacBook was released 3 years ago.

PCs have been slowly adopting it. Give it a couple more years.


The funny thing is that with macOS, they're NOT a big player at all.


With macOS, I believe, they still know they're the best. As much as I'd want to move away from Apple because of their recent maneuvers, there's nothing that comes close to macOS -- it just works great. Other Unix-like operating systems would see more appreciation from users if they caught up on the GUI side. Once a BSD or Linux (I'd really want it to be a BSD!) gets a great UI, I'm switching.


GNOME is great, I personally like it as much as macOS. Try running it on Fedora. The latest Ubuntu also ships GNOME by default but I haven't tried it and I'm not sure if it's a tweaked version (if it's tweaked it's surely for the worse).


Gnome really is great. It's very simple though. If you transition to a search based launcher then you won't go back.


Well, the dock is the worst task manager/launcher of any major desktop UI, for starters.


Try out Deepin - I prefer it to any gui. I hear good things about the latest KDE too.


First time I heard about Deepin. Looked at a few vids and it looks good.

Another sign that a lot of innovation is coming from China these days... The MateBook X Pro looks great too.


I don't see how they could "know they're the best" when Windows is the best. If you want the CLI tools from 1979 that come with your Mac, you can easily add them to Windows, the best and most widely used desktop operating system on the planet.

Seriously, there's nothing that comes close to Windows - it just works great and the manufacturer doesn't pull the rug out from under your feet every 5 years like Apple does. If there were something better I'd switch to that too, but there's not.


And even worse than outdated CLI -- the keyboard is a tool from like two hundred years ago (barely changed from movable type tech). Can't believe all these computers still use such old technology. Just because it works is a terrible reason for not updating it.


Hmmm, well I'd say that beyond a very superficial comparison with "keyboards" from 200 years ago (if there even were such a thing), modern keyboards have changed a lot and for good reason.

I'm sure glad that my ancestors decided to move out of their caves.


USB-C is great. Putting only a single port on a laptop is stupid.

Retina screens are great. Disabling the better font-rendering tech is stupid.

And doubling the resolution doesn't make subpixel AA obsolete either.


Microsoft has been having similar font rendering issues for years, so I guess Apple felt left out of this club. They put all this awesome work & research into creating ClearType, and then when Vista(?) happened they suddenly decided screw everything, let's switch back to grayscale antialiasing for no reason... now everything on the taskbar (or seemingly anything DWM-rendered) is grayscale-smoothed, which makes me want to tear my eyes out.


> let's switch back to grayscale antialiasing for no reason...

The actual reason is that switching to hardware-accelerated font compositing means that you lose out on subpixel antialiasing, because of complications related to blending. The classic way of doing subpixel text rendering means you need three distinct alpha channels, and most hardware only lets you supply just one. There are ways of hacking around this with dual-source blending, but such a feature was not supported at the time of Vista. In Windows 8 and up, DirectWrite can now support hardware-accelerated subpixel AA, and the OS does so by default.


> The actual reason is that switching to hardware-accelerated font compositing means that you lose out on subpixel antialiasing

But what was this crazy obsession with hardware-acceleration of every rendered character and its grandmother? While you're busy Flippin' 3D I get it, you can temporarily render aliased Wingdings if that makes your life easier, but at least for Pete's sake I just want to enjoy not tearing my eyes out the other 99.99999% of the time. If there was anything slow or poorly implemented in XP it honestly wasn't the taskbar font rendering.

(Not angry at you; just annoyed at the situation they caused.)

> DirectWrite can now support hardware-accelerated subpixel AA, and the OS does so by default.

No, it (still) doesn't. This is my taskbar: https://i.imgur.com/zzIusQD.png


I can't substantiate my comment with academic sources, but here's my memory from the past decade: the people designing interface guidelines for mobile operating systems decided that carefully designing static visual assets for UIs was a waste of time (with similar technical complexity reasonings due to the number of different display resolutions/pixels per inch/aspect ratios)

Market context: Microsoft yearned to implement the Metro UI they'd been researching for a long time (Zune was its public appearance) — and Vista kind of soured people on Aero by osmosis; third-party Android applications always had consistency issues, so Google wanted to mandate something that would be generic enough to work (and be easily implemented) for all apps but also look distinctly specific to Android; Apple wanted to signal some shift to upscale high fashion (parallel to "everything is black glossy glass and aluminum" in its industrial design) and understood fashion-minded upscale as a mix of modernist typography with neon bright contemporary color schemes.

To make up for the boring visual aesthetic that often resulted, they recommended adding animations throughout apps. Most of these animations aren't very helpful, but they keep the user distracted for a moment.


I chose my words carefully: font compositing, not rendering. Font rendering still happens in software, but in order to composite it correctly into the surface requires DSB. I (hopefully) shouldn't have to explain the advantages of using hardware acceleration for app rendering in general.


High quality GPU font rendering is possible now.

http://sluglibrary.com/


I'm aware of Slug (and other GPU font rendering techniques). Slug, however, doesn't have subpixel AA (it doesn't make sense for it to, since its main goal is fitting in a 3D scene), and requires a moderately expensive (read: hard to do realtime) up-front bake process which converts the font into a pair of textures.


UWP for the most part doesn't seem to use subpixel AA, though (Edge only does so in the content, but not the window chrome).


And that was the exact reason why I left Windows as a workstation for Ubuntu. It tortured my eyes every time I looked at a UWP app.


> then when Vista(?) happened they suddenly decided screw everything

What? ClearType was basically introduced in Vista and refined in Win 7.

Win 8 is when they started to remove it for various applications (UWP, and some others) due to its poor performance on rotatable devices (mainly tablets).


ClearType definitely existed for XP


That's why I say "basically". They only made it the default starting from Vista.


I am actually quite happy with USB-C: charging has become easier since I have ports on both sides of my MacBook. Dongles are necessary, but they have been necessary for as long as I can remember with Macs. Of course, a native HDMI port would still be great …


MacBook Pro user of 8 years here. Aside from Ethernet/DVI at an office, not once have I used a dongle, and tbh those two dongles aren't all that bad because I would just leave them at my desk.

I love my MagSafe charger + 2 Thunderbolt + 2 USB + HDMI ports and can't imagine life with fewer.


There's one MAJOR difference though -- you'd have to deliberately buy a USB-C-only Mac to enter the dongle world. This change will be pushed to everyone with a software update, and we'll be constantly nagged with popups if we don't update.


I use a USB-C only macbook pro and I absolutely love it, I would not want to go back to a laptop with a lot of IO.

The main reason is that when I have to move somewhere, it is only one thing to remove (and then add back in) compared to the 5 minute shuffle I used to do.

There have only been one or two situations in a year where it caused a slight annoyance.


That is nonsensical, nobody is putting a gun to your head to use all the ports. Having them available is definitely better than not. And if you prefer dongles, keep using them.


I think what he is saying is that he can plug everything into his one dongle and then unplug everything just by unplugging his dongle.


For that to work you have to have three dongles then: one at home, one at work and one to carry around. I have it like that, and it sux.


I just bought a second one to leave at work, I don't think it sucks at all.


I have 4 USB-C ports on my MacBook; one is used for a USB-C-to-DisplayPort cable to connect my 4K monitor and another is used for a dongle. I don't know what this guy is talking about. To be honest, I would love to have one regular USB port so I don't have to carry the adapter or dongles everywhere.


I had all the ports, and now I don't. I'm just saying that I don't miss them and that I prefer the way it is now with the dongle usage.


> but in the long term, it always drives customers to other platforms

I'm not a huge fan of Apple, but they've been pulling this stuff for ages, and it doesn't look like people are going anywhere.


Just install ChromeOS on your Macbooks, forget about macOS and you will be fine.


Or any GNU/Linux distribution, which actually offers a superior developer experience in my opinion.

"People complaining about their Macs"-type posts constantly top HN. Usually the comments are filled with further moaning about how things have been so bad for so long with Apple etc., usually accompanied by threats of moving to another platform.


I certainly prefer Linux over native OS X as my development environment, if I ignore the issue of using the hardware.

In particular, with OS X I never need to worry about video drivers, switching dGPU on/off, etc. just to use my laptop in typical ways.


I complain about the Mac from time to time. I just think the alternatives are much worse.


Followed by comments about how Mac OS is somehow “the best” despite the fact that the vast majority of developers prefer Windows.


Or examples from longer ago: Apple ditching DVD drives, or Apple ditching floppy drives.


"Someone at Apple really seems to believe in pain as a motivator. And it may work in the short term, but in the long term, it always drives customers to other platforms."

Interestingly, it drove me to buy three maxed-out 11" MBAs before they were discontinued.

So I have not left the Apple computer platform but I won't be buying another laptop until I burn through these three - which could be as late as 2022-2023 ...

As for fonts/aliasing - I just won't upgrade past Sierra (or whatever ... I use Mavericks now ...)


Just like they did with USB and Displayport, while PCs continued to come with PS/2 and VGA ports for 10 years after Apple switched.

Legacy is technical debt. You have to move on eventually; for many, the pain is worth doing it early.


Not as long as the competition lags behind.


One guy's Dell monitor doesn't -- subjectively -- render output from a beta OS as prettily as it did on the non-beta, and suddenly everybody is bringing out the pitchforks.

If you read through the comments, you'll find this link buried, which shows a developer showcasing the difference in the new AA. Basically font-smoothing is grayscale-only in Mojave.

* https://developer.apple.com/videos/play/wwdc2018/209/?time=1...

If I take a screenshot of some text and zoom in, I can see the colours, just like in the video. If I zoom back to normal size and change the saturation to zero (only greyscale) I cannot see a change in sharpness of the text on my 2009 macbook pro.


I'm on the Mojave beta and the difference is noticeable–it's just not the end of the world as many here are complaining.


>Basically font-smoothing is grayscale-only in Mojave.

This is actually better for me. I wrote to Steve Jobs a long time ago and asked him if he could switch to gray-scale font smoothing, because the tricks they use with colours don't actually work for people like me who are red-green colour blind. The end result was that text on Apple products looked terrible to me and I couldn't use any of their products until the Retina displays came out. In Windows, you can use grayscale font smoothing easily.

Anyway, he replied "we don't have any plans to do this" or something like that. Turns out I won in the end.


Very strange, this has been configurable in System Preferences > General (currently "Use LCD font smoothing when available", previously there were more choices) for ages.


Yes I remember back in Leopard it wasn't just a checkbox, but a drop down menu allowing you turn it off completely or choose "Light", "Medium" or "Strong." I've always found the Strong setting the best for me and have kept it ever since. (There's a `defaults write` command to set this value still, but the effect is a lot less noticeable on Retina.)


I don't think that setting switches between grayscale and subpixel anti-aliasing.



I'm curious: do you mean the colour tricks appear the same to you as turning off antialiasing? Or is it worse than nothing?


For most people who don't like it, it makes the text look like you are watching one of those 3d movies without the glasses. It looks all blurry with red and blue fringing.


Subpixel AA only works when you know exactly what the output pixel layout is, when you aren't being scaled by some other technology and when you aren't showing the same thing on disparate displays, and it requires you to adapt whenever the screen rotates, which is common on tablet-style devices. Also, some people (like me) can see the discoloration it imbues. So I'll be glad to see it go.

(Attempts to disable it tended to get various results in various operating systems, but rarely consistently disabled it)


Really? Somehow that 'only' has included every LCD monitor I've used for more than a decade. Windows AA is more aggressive, but it is still a huge improvement over the jagged over-hinted font rendering you get without ClearType.

No AA makes sense for DPI greater than about 300, but for anything less, with visible pixels, which is probably most non-4K monitors, I would bet most people prefer AA on. In general, people always prefer AA for 3D and all sorts of other rendering, at lower DPI.

So, this is a typical Apple decision to abandon anyone who doesn't have the latest device from them.


> So, this is a typical Apple decision to abandon anyone who doesn't have the latest device from them.

And those who bought a Macbook Air from an Apple Store yesterday.


What about today? Are they updating all the Airs with retina displays?


There's a two week return window for MacBooks.


Personally I prefer to disable antialiasing and, if possible, use bitmap fonts designed for clarity. Sadly almost nothing properly supports bitmap fonts these days (assuming it supports them at all -- I think only GDI and the X11 core do, but those are increasingly ignored by programs), and even when you can find support, finding good bitmap fonts is hard because it looks like most bitmap fonts are converted from outline fonts -- but with jaggies.

As for what I consider a good bitmap font: on Windows, MS Sans (the one you could find in Win9x; it was removed in later versions of Windows and aliased to MS Sans Serif, which is an outline font), plain Courier and Fixedsys. On Linux I like the console font that you get when booting Debian (not sure what it's called) as well as the Terminus fonts. On X11 I like the helv* fonts (usually registered as adobe helvetica -- honestly, why is font registration on Xorg such a clusterfuck?) and some of the Fixed fonts (mainly the smaller ones; the larger ones are a bit too jaggy). On Mac... pretty much all of the original Macintosh fonts. I also like the "WarpSans" font that OS/2 (and eComStation) has; it's very compact and I like compact fonts (although I'm not a fan of a few details, like the w being a bit too jaggy).


> On Linux I like the console font that you get when booting Debian (not sure what it's called)

That's probably Unifont: http://unifoundry.com/unifont.html

The exact font used is configurable but Debian uses Unifont by default.


Yes, it looks like it.


This isn't 'no AA'.


> but it is still a huge improvement over the jagged over-hinted font rendering you get without ClearType.

There is no font smoothing at all (the jagged fonts), and then there is grayscale font smoothing, and then there is coloured font smoothing. Apple is switching from coloured to gray-scale, which like OP, is great for me personally as I can see all the colours in the other approach.


I don't see why they couldn't make it configurable though


They did, but it was inconsistent [1]. Microsoft had the same problem [2]. In both cases, poorly-documented APIs meant many applications still used subpixel antialiasing.

[1] https://apple.stackexchange.com/questions/110750/how-do-i-di...

[2] https://social.technet.microsoft.com/Forums/en-US/768bd013-c...


MacBook Air.

PDFs look very blurry and dirty since High Sierra (Mojave too), both in Preview.app and in Safari. You can compare by opening the PDF in Chrome (it looks sharp in Chrome). https://imgur.com/a/TRpk1Oi

Fonts on some websites look blurry too: https://imgur.com/a/o450Mlr


Is the first image for real? Like, that's not a temporary "hold on a second I'm working here" rendering that's replaced after a split second with a proper version?

Wow.


Seems similar to this issue with reading PDFs in Emacs (predates High Sierra, though): https://github.com/politza/pdf-tools/issues/51


So are you saying the High Sierra PDF rendering catastrophe is not a bug but a precursor of this "feature"?

If that is true and the blurred text is going to be the new normal, I do not see how any professional user who looks at type all day long can stay on their platform.


Every time I open a PDF I think about installing El Capitan again. The one thing that stops me is picture-in-picture mode; I use it heavily on YouTube and it's only in Sierra and later.


El Capitan had the same issue on my Mac. It was fixed in Sierra, and came back in High Sierra.


No, Preview awful pdf rendering is unrelated to antialiasing.


The problem is that there are no external 2x displays on the market (apart from the recalled trainwreck that was the LG UltraFine 5K): https://bjango.com/articles/macexternaldisplays/

Scaling is horrible on macOS, since it basically bumps the 1x resolution up to the next integer multiple and then does a per-frame raster resize, leading to performance drops, blurry edges and other rendering artifacts.
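To put rough numbers on that (a back-of-the-envelope Python sketch of the scheme as described above, using a hypothetical "looks like 2560x1440" mode on a 4K panel):

    def macos_scaled_mode(looks_like, native):
        """Rough model of the non-integer scaling described above: render a
        2x backing store for the chosen 'looks like' size, then resample it
        to the panel's native resolution every frame."""
        backing = (looks_like[0] * 2, looks_like[1] * 2)
        downsample = backing[0] / native[0]
        return backing, downsample

    backing, ratio = macos_scaled_mode((2560, 1440), (3840, 2160))
    print(backing)  # (5120, 2880) pixels rendered per frame
    print(ratio)    # ~1.33x non-integer downsample: the blur and the GPU cost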

Meaning that your only reasonable option was to run a 1x monitor without scaling. Now they killed that option too.

Basically what they did, if your primary work machine was a MacBook hooked to a 27" display, is tell you "Buy an iMac and use Continuity".

I mean, I'm all for removing subpixel antialiasing at the point where everyone is running a 220+ ppi monitor. However, we're not at that point yet.

This is a horrible move that may cause me to drop the Mac as a development platform, since I'm looking at text for 8+ hours a day, and I'm not planning on shelling out 3k€ for an iMac or 2k€+ if they get back into the display business, just to fix something that they broke.


I use a display in the 'red zone' described in the article (a Dell P2415) daily, paired with a Retina MBP; there are no issues from not being an even multiple, and it's the sharpest monitor I've ever had.


I recently worked on a 4k 24" monitor (~180ppi) on a 2017 MBP.

The text was either too small when rendered at 1x, or the framerate dropped to a crawl when using a scaling option. Everything was rendering at under 60 fps.

Plus I could clearly see the rendering artifacts when scrolling, mentioned in the article.


Here's a comparison, so we can comment on the actual difference:

http://www.framecompare.com/image-compare/screenshotcomparis...


Or slide 125 from this PDF: https://devstreaming-cdn.apple.com/videos/wwdc/2018/209pydir...

Which isn't a perfect comparison because of image compression, but at least on my retina MB they look identical to me.

EDIT: You can also see the side-by-side at 28:03 in this video https://developer.apple.com/videos/play/wwdc2018/209


Thanks.

On my MacBook Air, the Mojave capture looks blockier.

I won't update (I'm still on Sierra, actually; I usually wait a year before updating anyway).


Unfortunately on my high-dpi monitor I can't get that page to display the screenshots at 1:1, so I can't see the difference due to filtering :/


Enlarge the page using browser zoom. Subpixel AA will make the text look red on one edge and cyan on the other. Without it, the text is perfectly greyscale. This is visible even with extensive bilinear filtering.


Set your browser zoom to the inverse of your scaling factor e.g. if you have a 2x scale set the zoom on the page to 50%.


Here's one where the text windows don't move, so you can actually see the difference:

http://www.framecompare.com/image-compare/screenshotcomparis...


Good idea, but all I can see is JPEG artifacts. They drown the actual differences.


Ah, unfortunate, as I uploaded PNG files.


You aren't going to capture the details of subpixel AA in a screenshot.


You most certainly can. The effect is applied when shading the text, so it does appear in screenshots, as would any other in-software AA technique (AFAIK).

You can test this yourself in the browser. For instance, toggle subpixel AA using the CSS -webkit-font-smoothing property in Safari. There is a clear visible difference that you can screenshot. On some websites with a light-on-dark theme, disabling subpixel AA with CSS was an important part of improving visual consistency between browsers. It's also important to control it explicitly when working with animations that change text scale or rotation.

It’s easier to see with an enlarged screenshot, but you can totally see the difference at 1x.
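
If anyone wants to try it, here's a minimal sketch of what I mean (the class names are made up; -webkit-font-smoothing is a non-standard WebKit property, so expect it to work only in Safari and other WebKit/Blink browsers on macOS):

    /* Hypothetical test classes for comparing the smoothing modes side by side. */
    .aa-subpixel  { -webkit-font-smoothing: subpixel-antialiased; } /* the old macOS default */
    .aa-grayscale { -webkit-font-smoothing: antialiased; }          /* grayscale-only smoothing */
    .aa-none      { -webkit-font-smoothing: none; }                 /* no smoothing at all */

Put the same paragraph of text in each class, take a screenshot, and zoom in: the subpixel version shows the red/cyan fringes on vertical strokes, the other two stay pure grayscale.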


I don't know what you can toggle with CSS but I'm pretty sure you can't replicate 'thing rendering a font with knowledge of your physical display' with 'bitmap made of pixels and some CSS'.


The “knowledge” is applied when the text is rendered to the bitmap, not when the bitmap is pushed to the display. You’d be right if the effect happened quite late in the rendering process. It doesn’t — it happens really early.


Screenshots aren't very useful for meets-the-eye visual quality comparisons of subpixel AA unless your display has the same pixel arrangement and resolution as the display the screenshot was rendered on. Even then, there's probably other issues.

I agree that they work for technical comparisons of rendered output.


Maybe I'm confused about something then but you seem to be telling me the subpixel AA is positionally independent. I can take the subpixel AA-ed bitmap and plop it in a page, open it on some other browser and OS and be certain everything is the same down to each coloured subpixel. You'd think every, say, vertical line would have the same colour artifacting, if that were the case. But they don't.


They do? They're just coloured pixels. When you open a bitmap image on different PCs and screens and display it at 1x wrt the monitor's pixels, you would expect every pixel to have the same colour across every PC/screen, no?


The pixels will be the same on every PC/screen. The perceived sharpness/blurriness won't be–the whole point of the "color fringing" is that the subpixels of any particular display have an arrangement that can be exploited to add additional sharpness: https://upload.wikimedia.org/wikipedia/commons/0/0f/Antialia...

If the display you're viewing the image on has a different subpixel layout than the one that the image was rendered for, it won't look sharper.


It's visible in any Windows screenshot


Okay, this might be an unpopular opinion, but I've always actually preferred plain grayscale AA over subpixel AA, especially on a low-res screen. I discovered this preference back in the flip-phone days, when a certain J2ME reading app offered a choice between grayscale AA and subpixel AA. I immediately noticed that with subpixel AA there is a lot of red and blue even when the text is black on white. Those colors just make me nauseated. Grayscale AA makes the text a bit more blurry, but I can actually read that text for hours without feeling nauseated.

When I was using Ubuntu, I was offered four choices for font rendering. The subpixel rendering choice also looked terrible to me.

On macOS, Safari (and maybe Chrome too) exposes a CSS property called -webkit-font-smoothing. Many websites set this to antialiased, including many built by me, because I truly think this antialias mode looks the best.


From a comment someone else made, I'm gonna guess you're red green color blind :P


In addition to the MacBook Air and ordinary non-4K external displays, it seems that this will also affect projectors. Presenting from macOS Mojave will therefore just make your slides/Keynote look worse and less professional. Moreover, the projector is not something you can upgrade: it is usually preinstalled in whatever venue you give your presentation in, and totally outside your control.


Many projectors don't use RGB striped subpixel grids anyway. They use color wheels or multiple chips to achieve full coverage, and in that case you want grayscale antialiasing.

And on top of usually not using RGB stripes, they also often use software keystone correction, which means that you don't know the native pixel grid anyway.

So there's no disadvantage for projectors, not at all.


I doubt that many people will be able to tell the difference in your presentation if they're sitting twenty feet away.


Oh no no, on a low-res projector it is still definitely noticeable, even from twenty feet away.


Apple said during WWDC that the change was made to support a wider variety of display technologies and scaling modes. It’s an odd choice considering they still sell the non-retina MacBook Air.


It's kind of a funny definition of support.


Forcing app developers to live with the same lowest-common-denominator aesthetics that most users see will make them more likely to raise the bar for the aesthetics on lowest-common-denominator devices.

It's the same reason that video game developers shouldn't be allowed to have powerful GPUs. They need to be testing how the game looks (and works) for most people, not for a group of people only slightly larger than themselves.


I don't think most game shops do. Their devs are typically comparing rendering on mid-range Nvidia and AMD cards, as well as Intel integrated graphics. Indie titles have to, because they know their games go for $9-$15 and need to look consistent, or at least decent, on a lot of off-the-shelf laptops. Triple-A titles can afford huge testing teams that take on various Intel/AMD/Nvidia cards and report consistency issues.

Sure if the title has a huge AMD or nvidia logo when it starts, it's going to use a lot of specific functionality from that card, whatever those shops pay them specifically to advertise for. But devs need to ensure titles at least look okay and run at at least 60fps to get the largest sales audience.


Do you think most developers are using a non-retina screen? Compared to the general population I bet they skew very heavily towards newer, retina machines. Which means they'll barely notice anything. This is just plain lowering the bar, not raising the average.


I'd say most developers are docking with an external screen, and there are very few external screens that are retina. So yeah, I'd say most developers are using a non-retina screen.


> I'd say most developers are using a non-retina screen

I don't see this as a problem with 4K screens starting at 260 euros (~ US$ 306).


4k screens mostly run at 60hz, making them terrible for anything but watching movies. I don't like coding on a screen where I can see the refresh as I scroll.


Yes, that is currently a problem for those who want that 120 Hz goodness.


If you work as a dev in a company, they're not going to be issuing 4k screens.


You're kidding, right?

I get my devs whatever they want for hardware, within reason. $500 or $1000 monitors? No questions asked. The price of hardware that will last years is nothing compared to the value of engineer happiness and productivity, especially when compared to how costly good engineers are in the USA and Canada.


I know of many companies that issue their developers 4K screens, so…


So get your own monitor. I understand that many employees decide they will not pay for their own equipment. However, it's still a decision.


Why in the world would I want to use a 4K screen when I can get a 2560x1440 screen?


Hmm, I can't reply to Symbiote for some reason. But yes, 2560x1440 screens and other wide variants are really preferred by gamers or people who want high refresh rates. 4K is limited to 60 Hz unless you use DisplayPort 1.4 and some very new monitors.

There are quite a few users who prefer the wider format screens for a variety of reasons.


That's a silly question. More stuff fits in more pixels. Even if you have to compensate for readability by scaling fonts larger, so that text is the same size as it was on the lower-dpi screen, the end result isn't equivalent to the lower dpi: the higher-dpi screen is on average more functional because more fits on screen. Especially on the dual/triple 27" external monitors on my desk: they are large and I am close to them, and a mere 1440 vertical pixels would be tolerable but crap when better is available. I usually have 4 to 10 terminal windows open where I want to see as much code, documentation, and log output as possible, not a movie.


A 4K screen is 3840 × 2160, why would you not prefer that?


Because "retina" can imply pixel doubling, making it more like an extra-crisp 1920x1080. The typical 1440p screen is nice and big, and replacing it with such a setup at the same physical size would needlessly cut down on what you can fit. If you could trust non-integer scaling to work properly then 4K would resoundingly beat 1440, but that's a pipe dream right now.

You can get the best of both with a 5K-6K screen, but that's currently a very expensive price point not on most people's radar.


Because some computers can't drive them above 30 Hz.

Most 4K monitors are too small to run at native resolution.

The rest are too large to run at 2x.

And those monitors have too low a ppi to run scaled.


These are cheap 27" screens though, which are not ideal for macOS, which is optimized for running at either 100% or 200% zoom. It's a shame that 4K screens at 24" and 5K screens at 27" are so hard to find. Apple uses these panels in their iMac line, but refuses to sell them as standalone displays.


I've scaled one to a virtual 3000-ish pixels, and some of my colleagues do so as well. It's not so bad.


Are they forcing app developers to not use retina screens?


Dark mode in Mojave points at OLED MacBooks. LCD subpixel antialiasing doesn't work on OLED displays.


The intent is to provide users with a sense of pride and accomplishment for buying new devices.


Can we not just meme here please? This doesn't belong on HN.


As long as it's not a monitor.


One thing that I noticed when I had to switch from a Linux desktop to an MBP is how much worse text anti-aliasing is. The crispness just isn't there, especially at smaller font sizes.

Certainly the MBP has a high-res screen and can live without it, but it's highly reflective, and the anti-glare external monitors provided by most employers are mere FHD, where most fonts become unpleasant and hard to read without good anti-aliasing.

Maybe subpixel anti-aliasing was just hard to implement nicely in the OS X graphics stack; they struggled with it, and decided to remove it.

I suspect that most of Apple's "proper computer" sales are laptops where they put a retina display anyway, and if you shell out $2k for a laptop, you can afford to spend $300 for a 4K external display.

In other words, they (correctly) decided that theirs is the premium segment, and that segment can afford to switch to retina-only hardware.


>I suspect that most of Apple's "proper computer" sales are laptops where they put a retina display anyway, and if you shell out $2k for a laptop, you can afford to spend $300 for a 4K external display.

My old Windows computer is dying and I am thinking about getting a new Mac. It seems the choice is a $2200 iMac, or a $2700 MBP plus a $1000 4K 21.5-inch monitor.

LG is one of the very few companies that sells 4K 21.5-inch and 5K 27-inch monitors. Even 4K 24-inch monitors, which I am guessing you are referring to, are quite rare. They are also not really approved by Apple. I mean, 4K at 24 inches will make everything look too big, I assume, and 4K at 27 inches isn't retina quality.

So what exactly are we supposed to do? There are no external monitors to buy. Apple doesn't sell any, and the third-party options are few and far between. Maybe they want everyone to buy an iMac and an MBP...


Fortunately, there are quite a few Windows-oriented machines with very decent specs now; some of them have an explicit Linux option, too.

21-22" 4k monitors are indeed pretty rare; 23-24" 4k are more widespread. I would gladly buy a 15-17" 3k / 4k display, like another laptop screen, but they seem to be absent from the market for some reason, despite the wealth of ready-made LCD panels for them.


Because most people buying monitors want something bigger than a laptop screen.


This seems certain to be related to integration of UIKit-on-macOS ("Marzipan") apps. (UIKit has never had subpixel antialiasing.)


The reason being that it tends to want to push pre-rendered bitmaps around, and sub-pixel anti-aliased pre-rendered bitmaps don't work well under rotation...


No.


High Sierra's Metal rewrite of windowserver still has serious bugs nearly a year after shipping, even on a brand new MacBook Pro.

They managed to ship 10.13 with bugs like this that made it into the Ars Technica review: http://cdn.arstechnica.net/wp-content/uploads/2017/09/Sep-23...


I still use a non-retina 13-inch from 2012, as well as an ultrawide monitor (3440x1440), both of which I apparently won't be using with macOS Mojave.

I can't help but think this is just another in a long line of paper cuts endured by professional users causing them to switch to a Windows machine.


Or you know... Linux. Whatever. It's 2018.


Signed up for HN (again) just to vent about this terrible news. This is a change that will be hard for me to live with, and I will probably end up selling all my Apple gear.

This is the same change/downgrade that occurs if you go to System Preferences --> General and uncheck "Use LCD font smoothing when available".

I did this and the fonts on my 5K iMac display looked horrible. Just atrocious. My plan is to not upgrade to Mojave, and then within a year or so sell all my Apple gear and move back to Linux.

I don't understand Apple's thinking, but I believe a lot of people will do the same.


> but I believe a lot of people will do the same

No they won’t. People don’t notice this stuff. I pointed out the weird font smoothing issue in Finder to my best mate – a professional photographer – and even then she barely had any idea what I was on about.


This doesn't affect retina devices so your 5k iMac will look the same it does now...


It does affect Retina devices. That's the whole point. I made the same change that Apple is going to make and the fonts looked far worse (almost unreadable). You can observe the same thing on any Apple device by going to System Preferences --> General and unchecking "Use LCD font smoothing when available".

Apple is removing sub-pixel anti-aliasing for all devices, not just non-Retina ones.



Thanks. The fonts are noticeably blurrier/less defined in the Mojave capture.


Come on, no they're not. Those screenshots are virtually identical.


Maybe this person's eyesight is better or worse than yours, or perhaps just different. You might be surprised what some people are bothered by, or trained to spot, that others aren't...

Here's a possibly better comparison, because it annoyed me that the windows weren't aligned: http://www.framecompare.com/image-compare/screenshotcomparis...


Thanks for the improved comparison.

Zoomed in on my 10" iPad Pro the main thing I notice is kerning differences.


Thanks, that's much better. I think it shows that Mojave renders the fonts clearly better: check out for instance 'll' in 'ullamco'. Generally, almost identical pictures though.


Try the beta before venting. It’s not the same thing.


Downloading the beta now, but have seen screenshots of both (here and elsewhere).


I've been on the beta for a couple days (using a 2016 12" MacBook) and I haven't noticed anything. There's a visible difference when I disable LCD font smoothing in settings, but not much changes besides fonts appearing somewhat thinner.


I installed Mojave beta on my MacBook Pro. It is the same thing. The fonts look identical to toggling off the setting "Use LCD font smoothing when available" in High Sierra. And by the same thing, I mean they look very bad in Mojave.


No it’s not the same as disabling lcd font smoothing. Try the public beta if you don’t believe it.


Mojave beta installed, and the fonts look equally atrocious. They are blurry, indistinct...and just bad. One of the reasons I bought a 5K iMac (three of them, actually) was to have great fonts. I am beyond angry that I will now have to sell them.


Can you post some screenshots? I find it hard to believe that fonts on a retina display can be blurry and indistinct when the antialiased text weight is almost the same.


Screenshots available below. This is the only screenshot I took beforehand, so sorry it has some cursing in it:

http://www.technologyasnature.com/screenshot-comparison


Looks like Safari ignores the "LCD font smoothing" setting (I assume that was a webpage in Safari). How does the normal UI look?


Those screenshots were both in Waterfox. I also tried in Safari (didn't take any shots, though) and the fonts looked identically bad in that browser as well.

The normal UI's fonts are also worse -- about the same as the browser screenshots. Fonts are indistinct, blurry, and very light. All the result of greyscale anti-aliasing only, I presume.


Either Apple disabled "LCD font smoothing" for retina displays, or there is a bug somewhere, because in Mojave on a non-retina screen "LCD font smoothing" has the same weight as before, and on your screen it clearly doesn't.


I hope it is a bug. If the weight were the same, it'd at least perhaps be tolerable on a Retina screen. I know the technical reasons, but I just don't understand why Apple can't leave it alone for desktop systems.


They don't remove font smoothing; they switch it from subpixel to grayscale only. Technically subpixel should produce better results than grayscale, but IMO the subpixel antialiasing in macOS looks horrible, so you might not even see the difference: it looks exactly as horrible as before.

I'd be more outraged if Windows removed ClearType, because their subpixel font antialiasing actually works and is configurable.


Greyscale anti-aliasing is far inferior to sub-pixel. The fonts look much worse.

BTW, I did install Mojave beta and the fonts look just terrible even on a Retina display. Shockingly bad. I can't believe Apple is doing this. I just bought a new 5K iMac a few months ago. I wish it were still in the return period...but in other news, I am now selling all of my Apple equipment as it's pointless to have it. Without the great fonts I purchased it for, it's so much junk to me.


Subpixel font rendering was a cool hack back in the day, but it’s no match for high dpi monitors. Personally I could never get past the color fringing artefacts. Bill Hill himself talked about it as an interim measure to get people reading on screen.

Here he is talking about the original implementation of ClearType: https://channel9.msdn.com/Blogs/TheChannel9Team/Bill-Hill-Ho...

Joel Spolsky on the different sub pixel AA approaches taken by Apple and Microsoft:

https://www.joelonsoftware.com/2007/06/12/font-smoothing-ant...


I just installed the Mojave beta on my Macbook Pro. The fonts are just as blurry as if I toggle off "Use LCD font smoothing when available" in High Sierra.

So the contention that it won't be the same in Mojave is disproven. Even on Retina displays, disabling sub-pixel anti-aliasing leads to blurry, indistinct fonts.


Mojave doesn't ship with subpixel antialiasing. The "LCD font smoothing" checkbox disables greyscale smoothing.


I know Mojave doesn't ship with subpixel anti-aliasing. That is my whole complaint.

My point was if that box is checked in Mojave (that is, greyscale anti-aliasing is on) the fonts still look atrocious. Sorry if that was not clear.


From the perspective of web designers and web devs who obsess over font rendering, the way I see it:

-webkit-font-smoothing: antialiased; will be the default now as opposed to subpixel-antialiased. And you will no longer have control over it.

Seems like it DOES affect Retina displays too. That's a bummer as quite a lot of fonts get too thin without subpixel-antialiasing even on MBP Retina.
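
For anyone who hasn't dealt with this property before, a rough sketch of what that means in practice (the selectors are just illustrative):

    /* What many sites ship today, e.g. to keep light-on-dark text from looking too heavy. */
    .hero-dark { -webkit-font-smoothing: antialiased; }

    /* Under Mojave this reportedly becomes a no-op: text falls back to grayscale smoothing
       regardless, so the two declarations stop being distinguishable. */
    .body-copy { -webkit-font-smoothing: subpixel-antialiased; }

The practical upshot is that fonts tuned with subpixel rendering in mind will simply come out thinner.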


I've been using Linux for 2 decades, and this year I moved to macOS, with the newest MacBook and a 5K iMac. Everything pretty much just works, and it's much prettier by default than I ever got my Linux installations to be.

Given my experiences, I have full trust that the MacOS engineers will do good stuff with Mojave too, at least eventually. I still like Desktop Linux, though, and sincerely wish it will be as good some day. It most definitely is not today.


Am I misreading this thread, or was this one person's complaint on reddit about beta software (with a single commenter saying they've seen the same behavior)?

Are there any more cases, or could this be a bug? Cause it seems like a bug to me.



Oh my goodness. And here I was thinking that I'd love the next macOS due to its alleged focus on stability. What the heck....


I never upgraded past Mavericks. It seems that the OS has been going downhill. Thinking about installing Ubuntu on my MacBook.


I'm not surprised; Apple loves removing configurability and thinks "one size fits all". It would be far better to make the font rendering settings more configurable instead, but sadly this trend is not limited to Apple.

...and I'm saying this as one of those for whom any form of antialiasing is blurry, subpixel being worse (my eyes water and I feel dizzy after prolonged exposure); I prefer AA off and pixel fonts with sharp, high-contrast edges instead. Let people have the font rendering they want.


Seems like a further shift from user friendliness towards consumer friendliness. That is, alienating core and power users in favour of the larger consumer masses.


So what's the solution here? Buy a new monitor if we want to use up-to-date versions of macOS?

Anyone who uses Apple's FCPX or Logic knows we'll have to update to Mojave because eventually we'll try to open the latest version of said software and be told "You must be running Mojave to open this"

I'd ditch Apple over stunts like this, but I really love Logic and FCPX, so I'm stuck with it.


Nothing like some frothy sight-unseen outrage on a Saturday morning.


I wonder what panel technologies this is supposed to improve? The only one I can think of is a portrait rotated LCD.


PenTile, RGBW Quad, etc -- basically everything except plain old RGB stripes. Of all the possible display layouts, SPAA really only works with one (or it has to be re-designed for each, which realistically will never happen).

The stats I've seen are that most Mac users, as of this year, are on @2x displays.


The real question is whether they're on @2x displays exclusively.

I have a Retina MacBook Pro; I mostly use it plugged into a 27" Dell 1440p monitor.

Every office I’ve worked in that provided Macs also provided a 1x monitor of some kind. Never 4k or 5k.


It works (worked :/) just fine when switching to portrait mode. I use a secondary Dell monitor for that.


Not cool at all. I don’t want a high dpi external monitor because it’s too laggy. Very annoying change.


So is Apple going to stop selling the 21" iMac that has a 1920x1080 panel?

What about someone who buys one today?


I so knew it! I even put a reminder in my calendar some time ago (not knowing what the next update holds) to not upgrade until I know for sure that my experience won't get worse.


Ok, so what are the options in terms of external displays? I guess this will make my Dell 1080p monitor much less appealing... I planned to jump to a 4K 24-inch anyway... ¯\_(ツ)_/¯


That must be personal preference, but I've always preferred grayscale AA.


What's the last time you got an eye exam?


I'm a pilot, so every year.


Wow. The biggest advantage my previous MBP had over my current Arch Linux + i3 setup on a Carbon X1 is the natural support for HiDPI, as I connect my laptop to external monitors all the time (the only other one is the built-in dictionary). Now that Apple is going further and further in this direction, I definitely don't see myself returning to the Mac any time soon, or indeed ever.


So THIS explains why the beta Apple News app on Mojave looks _so bad_ on my 1x display...


Wasn't typography Steve Jobs' "big thing" that he wanted to do well on the Mac?


Seeing mac screenshots with amazing font rendering was one of the things that made me switch to mac. I guess I won't be updating past high sierra.


In other news, how is Linux's sub-pixel anti-aliasing lately? I haven't used it in years, and never on a 4K or above display.

Since I will now be switching off Mac, Linux is looking likely again. It will probably be two 27" 4K monitors.


As of "lately" in the sense of "in the last 10 years", it was always great. Freetype always had great rendering, on part and often superior to both ClearType and OSX to my eyes.

I attribute the bad perceived performance of freetype to the lack of good and/or commercial fonts.

The default configuration in most distributions is decent, but with a little tuning everything can be changed to your taste. I have run with grayscale AA since the beginning because I find the color fringing annoying.

I also used to like the bytecode hinter, but in recent years, with my laptop having 120+ dpi, I find the freetype autohinter to be actually superior, as it better preserves the letterforms and provides sharper results even without subpixel AA.

The settings can also be tailored per-font, and apply system-wide, with the exception of some stupid QML and Electron apps.


Thanks for that detailed response. I last used Linux full-time as my main machine in 2009 or so. It has indeed been a while. Good to know that it has improved since that time.


Freetype has been superior to Microsoft's and Apple's implementations for years, with incremental improvements ever since.

One of the reasons I would never buy a Windows or macOS machine. These companies really need to sort out their quality issues with font rendering.


I'll be that guy and say that I don't like sub-pixel anti-aliasing—I find the color fringes distracting. I turn it off wherever possible. That said, people seem to like it so it's a shame the option is going away.


It doesn't surprise me, since even Microsoft is slowly dropping ClearType.


I don't know how you can justify making devices so expensive. It's almost like buying Apple is a claim to social status. That brand will always remind me of class warfare.


Could this be a measure to combat the Hackintosh installed base?


How so?


This is sad. It should be an option when using 1x displays.


I'm betting Apple will launch a family of Retina desktop monitors.

Also, remember it's still beta software. People do public betas to surface issues like these.


I must be the odd one out in that I actually prefer my fonts to look “soft”?


As long as they find a fix I’ll be happy. I want things to look crisp.


And here I was thinking I was just getting worse eyesight. Lovely.


Since going retina, I've disabled it myself.


It’s a beta. Report it as a bug.


It's not a bug. It was announced as a change at WWDC. It's believed to be a consequence of the move to merge iOS code into macOS; iOS doesn't support sub-pixel antialiasing.


That's no reason not to report it either as a bug or file as an enhancement request.


Apple strikes again... they recently broke using two external displays with an update, and now they're messing up single external displays too.

I have a MacBook Air connected to a 2560x1440 screen and the fonts look good. I was looking forward to Mojave's dark mode.

Now I'm afraid to install Mojave. I understand that Apple wants to simplify their code, but they can't just remove things before having an alternative solution to a problem that exists.

Font rendering is really super important... I really hope they reconsider.


Planned obsolescence



