ex-macOS SWE here. Subpixel antialiasing is obnoxious to implement. It requires threading physical pixel geometry up through multiple graphics layers, geometry which is screen-dependent (think multi-monitor). It multiplies your glyph caches: glyph * subpixel offset. It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing. There are tons of ways to fall off the subpixel-antialiased quality path, and there are weird graphical artifacts when switching from static to animated text, or the other way around. What a pain!
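To make the cache-multiplication and color-dependence points concrete, here's a minimal hypothetical sketch (my own illustration, not Apple's code; every name is made up) of what a glyph-cache key tends to grow into once subpixel AA is involved:

    // Hypothetical sketch: a glyph-cache key once subpixel AA is in play.
    // Every field below multiplies the number of distinct cache entries
    // for the *same* glyph.
    #include <cstdint>
    #include <tuple>

    struct GlyphCacheKey {
        uint32_t fontID;          // which font face
        uint32_t glyphID;         // which glyph in that face
        uint16_t sizePx;          // rasterized size in device pixels
        uint8_t  subpixelOffsetX; // quantized fractional x-position (e.g. quarter pixels)
        uint8_t  subpixelOrder;   // RGB vs BGR vs vertical stripes -- screen-dependent
        uint32_t fgColor;         // needed at raster time for per-channel blending
        uint32_t bgColor;         // ditto; unknowable up front with GPU compositing

        bool operator==(const GlyphCacheKey& o) const {
            return std::tie(fontID, glyphID, sizePx, subpixelOffsetX,
                            subpixelOrder, fgColor, bgColor) ==
                   std::tie(o.fontID, o.glyphID, o.sizePx, o.subpixelOffsetX,
                            o.subpixelOrder, o.fgColor, o.bgColor);
        }
    };
    // With grayscale AA the last three fields drop out: one alpha mask per
    // glyph/size/offset can be tinted to any color at composite time.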
Nevertheless there's no denying that subpixel-AA text looks better on 1x displays. Everyone notices when it's not working, and macOS will look worse without it (on 1x displays).
>"Listen guys the code for this is a mess. Can you just buy the new Macbook?"
More like:
"Listen guys, the code for this is a mess AND slows your graphics down even when it's not needed. We already have had Retina displays for 6+ years already, and Hi-Dpi displays are the norm these days for third party too, so there's no reason for everybody to suffer from old tech, like there was no reason for everybody to have a floppy disk or CD-ROM on their laptop just because 5% had a real use for it. Part of the way it is with Apple, and has always been is that we move fast, and cut things off first. Until now, this exact tactic got us from near bankruptcy to being the #1 company on the planet. So, if you don't like it, don't let he door hit you on your way out".
> Hi-Dpi displays are the norm these days for third party
Are they? Nearly every monitor in my company's mainly Mac-based office is a 1080p Dell. All the monitors I personally own are 1x.
I would even go so far as to say that the majority of people who want to buy a monitor for doing text-based things (i.e. business) will buy a bog-standard monitor.
It's taken a while partly because of cost and perhaps a bit more because of the horrible ways Windows and Linux deal with HiDPI. You wouldn't want heterogeneous DPIs or non-integer scales on those platforms. On Linux, heterogeneous DPI still seems very experimental and ugly. On Windows, some apps are buggy and others are ugly when dealing with heterogeneous DPI. On Windows non-integer scaling does actually work, but it makes some apps size things horrifically. Needless to say, Microsoft's multiple approaches to DPI scaling have made a mess, and Linux never really had a unified way of dealing with it.
If you're on macOS, with the current price of HiDPI IPS displays, the time is right to grab just about anything. If you're on Windows or Linux, it's still a great time so long as you keep all monitors at the same DPI and probably at an integer scale.
You're absolutely right, but I'd wager it's mostly because they haven't been exposed to 120Hz yet. The moment Apple introduces a 120Hz screen on their iPhones, people are going to want it everywhere. Much like HiDPI displays.
I absolutely love scrolling on 120Hz displays. It feels so much more natural when the letters aren't blurry as they move under your fingers. Indeed, the iPad Pros have the feature, but they aren't nearly as popular as iPhones. I tried on the Razer Phone, can't wait to have it on mine.
100-200 Hz displays are the next logical step for laptops and mobile phones.
Check out iPad Pro devices with 120 Hz display. It makes a big difference in readability of scrolling text (try to read a web page while it's scrolling), responsiveness and smoothness of motion.
These days even mobile phones can drive external monitors at 4k 60 Hz. I think it's reasonable to expect next gen MacBook Pros to be able to drive two 4k monitors at 120 Hz+.
And the majority of people (and businesses) won't care.
The ones that do already have "Hi-DPI terminal windows and really sharp text" as an almost-mandatory requirement.
Think about it. Sub-pixel rendering is just for text (AFAIK it's not applied to images with subpixel positioning? Although that's an incredibly interesting idea for a hack!). Maybe subpixel vectors/lines?
Either you care about high-quality lines or you don't... what Apple's basically doing here is moving the cost to the hardware instead of the software, and the win is a simpler, faster, less power-hungry device.
I'm not personally agreeing with Apple's decision, I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks (don't make me source my dates on retina displays). It's the norm for "executive" users (ie: software developers, graphics users, finance, book editors, etc.).
If you're happy w/ a bog-standard 1080p output device, then obviously you're not "in search of" the best quality text you can get.
However, thinking it through, I would _really_ like then a "subpixel mode" for either the OS as a whole (kindof like an FSAA?), or especially certain windows. Maybe there's a way to do that as a hack?
>I'm not personally agreeing with Apple's decision, I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks
That's Apple's market, not people holding on to 10-year-old computers. Not even those that still buy the MBA.
A 1x monitor is only acceptable for gaming, in my opinion. I cannot believe people are working on $200 screens. I mean, you can also code on an 80-char terminal in theory, but why would you?
16:9 monitor is only acceptable for watching movies in my opinion. I cannot believe people are working on TV screens. I mean you can also code on a Commodore 64 in theory, but why would you?
Historically 16:10 monitors have had the same width as 16:9 ones (e.g. 1920×1200 vs 1920×1080) so there's no difference as far as having things side by side.
I think you're being sarcastic, but I honestly can't. I recently bought a new workstation with a $3k budget, and I regret going with a laptop because the budget wasn't enough for me to get an adequate laptop that will still perform well in 3-5 years and get nice monitors.
I was being sarcastic, but also not taking into account currency conversion rates (and possibly the availability of stuff in your country)
Regardless of that, if all you're doing is General Business™ then surely all you need to do is grab an all-in-one box, shove as much 3rd party ram in it as will fit & then nab a couple of monitors and an unbranded mechanical keyboard off amazon.
I did a lot of work on a £700 (~$900) OptiPlex with 16GB of RAM and that was more than capable of running Ubuntu on 3 monitors while being small enough to fit into a handbag.
> AND slows your graphics down even when it's not needed
So just let me turn it off. Or turn it off automatically when I'm scaling. Or leave it as it is because no one's ever cared about it. Removing it is the worst option of all.
> We already have had Retina displays for 6+ years already
This is 'Xbox One is online-only #DealWithIt' level sentiment. Fuck the consumer, right?
I really hope the market punishes you for this sentiment.
And P.S. Apple is strong because of the iPod and iPhone - macOS is a tiny part of their business and Apple is strong in spite of their shitting on Mac customers, not because of it.
They just sell it as a lower-tier option for developing markets and those entering the ecosystem. If they so decide they could drop support for it in a heartbeat and leave the users only running HS.
If the "update" also includes removing all the ports except USB-C, getting rid of MagSafe, and switching to an inferior keyboard design to save a few fractions of a millimeter, I don't see much to look forward to.
Seems more likely that Apple will simply discontinue the Air and that the MacBook represents their new vision of what laptops ought to be like.
Agreed. Discontinuing the Air seems much more likely. This is so aggravating to me. Air + retina display would have been the perfect laptop for years now, but they simply refused to make it. I hate many of the changes Apple has been making instead, and the new keyboard is an absolute deal-breaker. Don't know how I will ever willingly give Apple my money again...
Your general skepticism about other aspects of the coming revision of the MacBook Air may be warranted -- nevertheless a retina display would be a great improvement.
Further, if your prediction is correct, why wouldn't Apple already have canceled the Air? What you describe is already on the market: the escape-less MacBook.
I don't feel as skeptical or negative about Apple's plans for new laptops. We shall see, soon!
I do actually suspect the Air will disappear as an offer in due time.
When the Air was released, it had too many limitations, and too high a cost, to take over the macbook's role. Its primary market was people who really valued the mobility over all else. But as they iterated it closed the gap, until it overtook the macbook, and the macbook offer shrivelled up and died.
If you swap 'macbook' and 'Air' in that narrative, you see history repeating - today we're at the first step of that. "The new macbook" is too limited and too expensive to take the Air's place, but give them a few iterations, and I suspect the Air will become unnecessary in the line-up.
I still believe that a big problem MS had was their relentless commitment to backward compatibility. I agree at some point you have to dump last year's legacy, but Apple seem to want to dump last _week's_ legacy.
Promising/maintaining backwards compatibility is how you market and sustain Enterprise software contracts. For consumer products, it's a bonus if you still get updates after 2 years.
Apple still sells a whole line of non-Hi-DPI machines, so it's arguably a little early for them to kill this stuff off. They should drop that line or upgrade it to Hi-DPI, and then in 2-3 years kill off non-Hi-DPI support.
Meh. I see their point. Subpixel AA was about using software to simulate better hardware. But, at the end of the day you really do want the better hardware so that you can use your software budget to simulate an even better generation of hardware. Repeat ad-infinitum, modulo scaling limits.
Pretty sure this is what Alan Kay means when he says that people who are serious about software will make their own hardware.
Mac sales overwhelmingly happen on machines with Retina displays, though.
A 1x monitor will be a cheap second monitor or something attached to a Mac Mini (please keep it alive Apple) or Mac Pro (one day it will come). From Apple's point of view I imagine all of these are just super small niches.
The software folks have a good reason to remove subpixel AA; it's a hack to get better horizontal spatial resolution out of cheap monitors. The users, however, have no reason on their own to buy the better, now reasonably priced displays that would make subpixel AA unnecessary.
By removing subpixel AA at a time when HiDPI monitors are cheap, Apple is probably doing us a service. In a few years, hopefully everyone will be on a DPI that can render nice text. You may think it's unnecessary, but consider complex glyphs like Hanzi and Kanji, which really don't get enough mileage out of subpixel AA.
I know it's pretty irritating to suggest that forcing users to buy new hardware is doing them "a service." But really, I see no conflict of interest here. Apple themselves probably haven't shipped a 96 DPI display in half a decade, and they don't even sell their own external displays, so they don't stand to gain that much here other than software performance.
> In a few years, hopefully everyone will be on a DPI that can render nice text.
My 1x monitor can render text just fine using sub-pixel AA. It's a proven technique that has been around for ages and the text looks great on it.
> consider complex glyphs like Hanzi and Kanji, which really don't get enough mileage out of subpixel AA really.
If I start to become a heavy user of Hanzi and Kanji then I will consider purchasing a monitor that displays it better, but currently not an issue.
> Apple is probably doing us a service.
Requiring me to spend money on new hardware in order to avoid a regression in my experience is somehow doing me a favour? Really?
> so they don't stand to gain that much here other than software performance.
I don't understand the performance argument people keep bringing up. Subpixel AA has been around for ages and worked fine on much older hardware. Why is it suddenly a bottleneck that needs to be removed when my decade old hardware can still do it?
>Requiring me to spend money on new hardware in order to avoid a regression in my experience is somehow doing me a favour? Really?
No, of course not. If you phrase it like that (that is, ignoring the point of removing it entirely and all of the benefits that come with it), of course it's not beneficial.
However, 96 DPI was never ideal. HiDPI won't just avoid a regression, it'll improve your experience, period. And with this new software change, you'll get a performance improvement, from better caching and easier GPU acceleration, and software engineers can delete thousands of lines of potentially buggy code made to work around the interaction of subpixel rendering with other graphics code.
Like, subpixel rendering is easy to understand in theory, and that might delude one into thinking it's just a minor annoyance, but it's an annoyance that permeates the stack. You have to special-case the entire stack wherever text is concerned. Effectively, the text is being rendered at 3x the horizontal resolution of everything else. This brings in complication across the board. An example is that you pretty much have to do the alpha blending in software, and when you do, it requires knowledge of the final composition. Doing alpha blending in software sounds easy... but it's only easy if you do it wrong. Dealing with gamma, for example. Which makes life worse, because you already have to do this for the other parts of your graphics library, probably with a case for both GPU acceleration and software.
Basically, you'd have to render the whole screen at 3x horizontal resolution to remove the need for all of these compositing hacks. Nobody does this because subpixel AA was a hack from the get go.
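To illustrate why this blend can't just go through a standard GPU alpha-blend unit, here's a rough sketch of compositing a single subpixel-AA'd text pixel (my own simplification, assuming an RGB-stripe panel, an sRGB framebuffer, and a gamma-2.2 approximation):

    #include <cmath>

    // Rough illustrative sketch, not macOS's actual code. Subpixel AA produces
    // a *separate* coverage value per color channel, so the blend needs the
    // text color, the destination pixel, and gamma handling all at once.

    static float srgbToLinear(float c) { return std::pow(c, 2.2f); }
    static float linearToSrgb(float c) { return std::pow(c, 1.0f / 2.2f); }

    struct RGB { float r, g, b; };

    // coverage.{r,g,b} come from rasterizing the glyph at 3x the horizontal
    // resolution: one value per LCD stripe instead of one alpha per pixel.
    RGB blendSubpixel(RGB text, RGB dst, RGB coverage) {
        auto blend1 = [](float t, float d, float a) {
            // Blend in linear light, then convert back; blending directly in
            // sRGB space is the "easy but wrong" version with uneven stems.
            float tl = srgbToLinear(t), dl = srgbToLinear(d);
            return linearToSrgb(tl * a + dl * (1.0f - a));
        };
        return { blend1(text.r, dst.r, coverage.r),
                 blend1(text.g, dst.g, coverage.g),
                 blend1(text.b, dst.b, coverage.b) };
    }
    // Note the asymmetry with ordinary alpha blending: a standard GPU blend
    // applies *one* alpha to all channels, so this per-channel blend needs
    // dual-source blending, multiple passes, or software compositing.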
In an era where battery life and TDP are actually important, finite resources, we want to remove these hacks.
P.S.: if you've never tried rendering large amounts of text on screen before, I highly recommend you try writing a piece of software that needs to do this. A few years ago I struggled to get 60fps rendering text for an IRC client, and of course caching the glyphs was horrendously complex thanks to subpixel AA. It would've been easy if I could've just used GPU compositing like all other graphics, but instead I resorted to hacks everywhere. Maybe today's computers don't need the hacks, but it's worth noting that yes, rendering fonts is resource intensive - you're just living on top of decades of optimizations and hacks to make it look like it's easy.
> No, of course not. If you phrase it like that, that is
But that's what it is, that's not just clever phrasing on my part.
> HiDPI won't just avoid a regression, it'll improve your experience, period.
I get that, but if I wanted to spend money on that improvement I would do that. The point is my choice is no longer between maintaining my current standard or upgrading, it's between regressing my current standard or upgrading.
Unless you want to selflessly support some botnet you have to update your OS pretty soon after it's released. Certainly more often than you have to upgrade your screen.
I'm not sure if this was sarcastic or not, but I'd have to say pretty well. I have a 2008 iMac that's stuck on El Cap, and it still gets OS and Safari security updates.
Well, I'm not, and neither is Apple, because we are not the common case. I, too, have my Thinkpad docked all the time, connected to two external monitors that are 96 DPI. That does not mean I do not support the move past 96 DPI, I absolutely do despite the extra cost, because I do have HiDPI setups and after having one I have little interest in putting any more money into low DPI equipment.
The cheapest HiDPI monitor I would even consider buying is $2000. There are no ultrawide HiDPI monitors in existence at the moment.
Of course HiDPI is a regression in some ways. You can get far better non-retina monitors (low latency, higher refresh rate, wider range of aspect ratios) at any price point.
34” 3440x1440 is 110 dpi, which is 1x, not retina. To do retina at those sizes you're looking at 8K. Only Dell has an 8K monitor, and it's not ultrawide, is very expensive, and most of the Mac line-up probably can't drive that many pixels.
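For what it's worth, the 110 dpi figure checks out; a quick worked calculation, treating 34 inches as the diagonal:

    \mathrm{PPI} = \frac{\sqrt{3440^2 + 1440^2}}{34} \approx \frac{3729}{34} \approx 110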
Of course, I don't have answers. I didn't say HiDPI displays that suit everyone's needs are available for cheap. However, the main problem is that everyday people have a lot of 1080p 24" monitors sitting around. There were probably fifty to a hundred at my old office. Because we didn't have any huge size requirements or uncommon aspect ratio needs, we were able to take advantage of HiDPI displays that cost on the order of $100. To me that's a price point the average person can afford. If it causes some folks to have to shell out a lot of money to stay on 16:10 or have a reasonably large monitor, that is a reasonable trade-off; most users don't need them, which is why it's difficult to find good 16:10 monitors in the first place, for example.
I don't see USB drives with USB-B ports or headphones with jacks going away. I don't think it'll be different with displays in the foreseeable future. macOS users are forced to buy HiDPI displays, but they are a tiny minority, and other people will continue to buy cheaper displays; it won't turn the industry. Microsoft could force everyone to upgrade by disabling anti-aliasing in a new Windows version, but I highly doubt that they would do that.
> Microsoft could force everyone to upgrade by disable anti-aliasing in new Windows version
Given how shoddy Win10's support for HiDPI is across applications at the moment (currently I have to choose between running my 4K laptop at full resolution and having a correctly sized UI in all the applications I use), I seriously hope they don't do that.
Do you deal with mixed DPI, for example via external displays? I have very few issues with Windows 10 on 4K, so long as I don't mix DPIs. Maybe it's because we're using different software, though.
I see absolutely nothing wrong with USB-B or headphone jacks. I'm far from an Apple fanboy, running a Thinkpad with Linux and a Google Pixel 2 (and yes, I am pissed about not having a headphone jack.) I'm not just embracing whatever Apple says. But once I switched all of my machines to HiDPI, I saw the light. This, to me, is not a minor improvement in experience. It is a huge improvement in experience, and in my opinion essential for people who deal with languages that have a more dense script.
Japanese can be rendered acceptably with subpixel AA, but it is significantly worse, especially for complex kanji, than a HiDPI display. For Chinese where you don't have less dense character sets like kana, it's even worse.
> It’s less “the code for this is a mess”, and more “the code for this can’t be anything but a mess”.
Ok. But so what? As a customer that's not my problem, and Apple has a HUGE war chest to pay developers to maintain the mess. I have nothing but 1x displays externally and this makes Mojave a non-starter.
It's not like tech debt has no cost. This is engineering effort that can be spent in other places; it's lots of other tasks that become easier because this complication doesn't need to be supported. It's reasonable to argue that'll turn into real benefits in the software in other ways.
Personally, I think that is moving in the wrong direction. This is the reason text looks bad on transformed HTML elements in browsers, or when you click a button on Windows and it tilts slightly.
Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.
If we ever want to have a hope of computers looking like, e.g. this Microsoft fluent design mockup [1], which is clearly window-based, but the windows are tilted slightly and shaded giving it a physical appearance, we'd have to go in that direction. If we stay in the model where everything is a texture that gets composited, the only thing we can do is to use higher resolution / retina display.
Also, one thing I've never seen discussed are virtual displays inside a VR world. The VR headset has plenty of pixels, but you can't simulate a full HD screen at a few feet distance, let alone a retina screen.
> This is the reason text looks bad on transformed HTML elements in browsers, or when you click a button on Windows and it tilts slightly.
It's pretty common for regular AA to be borked in those situations; browser support for AA under transforms has never been great, and it's only become decent in the major browsers pretty recently.
> Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.
Text & vector rendering with regular AA are currently aware of virtual sub-pixels, when they're not borked by transforms like in your examples. But making the browser rendering engine have to be aware of the order and orientation of your specific LCD hardware sub-pixels of your monitor would be very difficult for browsers to support. There is a reason subpixel AA has never worked everywhere.
When I first glanced at this article yesterday, I got the mistaken impression that Apple was turning off all AA, not the LCD-specific ClearType style of subpixel AA. That would be a big deal, but that's not what's happening. The difference between regular AA and ClearType-style AA does exist, but it's pretty small. I'm a graphics person, and I'm very much in favor of AA, but the outrage at losing subpixel AA feels overblown to me. It helps a little, but also has the downside of chromatic aberration. I'm honestly wondering if most of the hubbub here is people making the same mistake I did initially and thinking that they're losing all anti-aliasing?
> there's no denying that subpixel-AA text looks better on 1x displays.
I get that it's hard, but at the end of the day what users are going to notice is, when they dock their laptop at work, their display is going to look like fuzzy crap. Maybe most people won't notice? It's hard to tell without seeing a side-by-side comparison.
1) They had Retina MacBook Air
2) They actually have a Monitor Line, that is Retina+ Display, Super great colour accuracy etc.
For the 1st point Apple doesn't seem to care, and for the 2nd point, Apple should at least have a list of monitors that work best or are up to their "standards", or have them on sale at the Apple Store.
These lapses in attention to detail across its ecosystem weren't as obvious when Steve was here. But recently cracks have started to show everywhere.
Who is buying the MacBook Air right now anyway? The Pro is lighter, smaller, faster, and offers better performance for the money. They are very similarly priced; I am not sure why anyone would purchase the Air given the updated 13" MBP is available.
I've opted to use the MBA exclusively at work for years now. I have not found anything that beats the size and weight of it while still giving that level of power. The form factor is one of my top considerations, so a Pro isn't really something I need or want because the extra horsepower doesn't really add much for my needs.
I can dock it at my desk and hook it up to two huge displays and a bigger keyboard, and then not be burdened when traveling.
> I have not found anything that beats the size and weight of it while still giving that level of power
The current MacBook Pro is only .05 pounds heavier than the 13-inch MacBook Air, and noticeably thinner than the Air's thickest point. They are basically the same form factor at this point.
The 13" MacBook Pro with no touchbar is more or less the upgraded MacBook Air with Retina Display that everyone has been asking for for years.
I have to disagree. I bought both a 2017 air and a 2017 mbp 13” to compare, and ended up returning the MBP. The screen on the MBP is far and away better, which made it a painful choice. On the other hand, the air was superior for the keyboard, and also for noise/heat, and of course several hundred dollars in price. The MBP didn’t really feel any faster for the type of work I was doing on it.
Most annoying was that the MBP fan was kicking on in high gear consistently when I was doing nothing of note, while the air very rarely does so. It’s always cool and quiet, which I love.
If they just threw a retina screen on it, the current Air could be a hell of a computer for several more years.
I recently went from an Air to a 13" MBP (not out of choice, work requirement) and the keyboards are horrible on both. The only noteworthy thing about the MBP's to me is the reliability issues; comparing the actual typing quality is apples to oranges.
But what I have realized is that in reality it's very rare that you have to actually use a laptop keyboard. Maybe only like once a week do I not use a proper USB keyboard, e.g. if I have to visit a client or something.
Mainly I've got multiple browser windows open with way too many tabs. I use a plugin to prevent them all from being actively loaded, but still have a ton active at any one time.
I also usually have one or more Excel windows open for data analysis, possibly Powerpoint or Keynote, and then some other lightweight stuff (Spotify, Sublime, etc.).
I also am most frequently using this across two 27" Thunderbolt displays (and if I'm desperate for a 3rd, I can open the MBA screen in addition).
I'm not a developer, so my needs may be less intensive than yours.
They still sell the 11-inch Air through education channels. You can buy one from, e.g., a university's store if you or someone you know is a student/staff/faculty/alum.
The non-Touch Bar MacBook Pro is £1249 vs the Air’s £949. That’s a big difference for a lot of people. People like my partner, who doesn’t care that it’s essentially 4-5 year old hardware with minor revisions.
I would never buy an Air myself because that screen is pure trash, but she genuinely doesn’t care.
The majority of the market, maybe 80%, is ordinary consumers; nerds and professionals are perhaps 10% each. Many of those consumers are price sensitive, and the $999 MBA is the entry-level machine Apple offers. And as of right now, it is ridiculously overpriced even by Apple standards.
I have a 2015 rMBP with an nVidia 750m, and recently I switched from an Apple Cinema Display to a 4K screen. At any of the graphical modes between the full 2x retina and 1x, the whole OS gets extremely sluggish, to the point of being unusable.
Some people from Jetbrains have looked into it and concluded it was the font rendering in IDEA bringing down the entire system.
Do you think this change can improve this? Because it certainly sounds like it.
Sorry for the rant, but given the price of the JetBrains Toolbox, it infuriates me that they've been giving us HiDPI users the middle finger for a couple of years now.
First of all, switching to grayscale helps A LOT. Mojave does this system-wide, but the JVM (or IntelliJ, I don't really remember) has its own font rendering system, so you need to change that in IntelliJ directly.
I was hurt terribly by this issue. My problems have been solved by three things:
- An IDEA update that reworked the status bar's indeterminate progress bar, which maxed out my CPU just for a dumb animation
- Going from an iGPU-only to a dGPU Mac (you seem to have one, so congrats; it would be way worse with only your iGPU)
- Using an experimental JVM from this thread: https://youtrack.jetbrains.com/issue/IDEA-144261
That said, JetBrains messed up with their 2018.1 release again. I'm mostly using Android Studio, which lags behind IntelliJ's versions. 3.1 merged the stupid progress bar fix, but I'm not looking forward to when 3.2 hits stable, with 2018.1 merged in.
The bottom line is that there are multiple factors, inefficiencies in JetBrains' custom OpenGL UI combined with Oracle's JVM bugs/inefficiencies, that take down the whole system, but it's ultimately not only a font smoothing issue.
JetBrains doesn't really care, or at least doesn't show that they care. They first said "we don't notice that", and then released some JVM patches in the comments of an issue.
Of course, Oracle also shares part of the blame: none of this would have happened had they kept Apple's JVM Quartz support.
> @Gani Taskynbekuly OH MAN! I knew I was having this issue way worse at some times than others, but I'd never made the connection to sleep mode. Since reading your comment I've just been shutting my laptop down instead of closing the lid and letting it sleep and the problem has all but gone away. I searched High Sierra sleep issue and this is a well-known issue with bug reports. This means this isn't a JetBrains issue, it's a mac issue. It is annoying to turn off graphics switching and sleep mode and have crappy battery life, but hey, I'll pay that price for a usable laptop. I owe you a drink of your choice!
Seems like a macOS issue. I've even heard rumors that's not the only issue with that OS. ;-)
> I've even heard rumors that's not the only issue with that OS. ;-)
Sigh. Could we please not fall into macOS trolling? I believe that it is a totally different issue than the one I'm suffering from.
I haven't had this issue with any other program. There is no excuse for IntelliJ when Xcode, VSCode, Sublime Text and even Eclipse perform 10 times better than IDEA does.
Even if I do a cold boot and open IntelliJ directly, I will still get unbearable slowness.
Sure, there might be a bug, but it doesn't affect me. If it was only a macOS bug, how would you explain that JetBrains' patched JDK works way better, and that 2018.1 is a downgrade?
My old laptop didn't even have a dGPU, so turning off graphics switching wasn't the problem. Many apps don't handle graphics switching correctly, and it's 100% their fault.
Also, do not confuse "top comment" with "latest comment".
Not so much hard as a hack. There is also no clear non-hackish way to implement sub-pixel rendering after 20 years of experience across at least three platforms.
Also since we no longer have consistent sub-pixel layouts, sub-pixel rendered text is no longer portable to another screen, such as you get with an image containing text on a web page.
Yes, but if it dramatically increases complexity for all the lower levels of the rendering stack, it ends up creating extra work for those OS/framework developers, requiring separate versions of the code targeting different displays which must be maintained separately with different sets of bugs in each, preventing them from making large-scale architectural changes they want to make, restricting some types of significant optimizations, etc.
Apple has a $200B war chest. If they can't afford to deal with this problem and instead choose to make my experience worse that says something about their priorities as a company.
Yeah, but isn't it already a solved problem with 1x monitors? Who is implementing things multiple times for no reason? I'm glad in the future it will be easier for Apple because they just won't hire devs to do that part then, but does this imply that 1x monitors will simply look inferior no matter what is done with newer versions of macOS?
Even if it's possible to realise those benefits while retaining subpixel antialiasing, it would entail a complete reimplementation.
On the other side of things -- light-on-dark subpixel antialiasing has always been a little wonky, and Apple disables it in most dark UIs. Without significant changes, dark mode would've been greyscale either way.
This complexity is why we can't have windows that spread between 1x and 2x monitors since the introduction of Retina displays.
Ideally everything should be rendered (and cached) to the maximum resolution available (minimum trace widths considered) and then downsampled to the specific screens with subpixel data applied, but that's a lot of processing to do and battery life would suffer.
Eh? Before Yosemite was released in 2014, one window could span between a 1x and 2x monitor. I forget if it was always rendered at 1x, or if it was rendered at 2x and scaled for the 1x, but yeah it looked like crap on one or the other. Actually I think it switched depending on which monitor had more of the window?
But they only eliminated windows spanning across multiple monitors when they made spaces per-display.
It’s already a solved problem if you want to keep using the software you already wrote, and never do anything different. (Nobody is stopping people from running OS X 10.13 indefinitely on their 1x displays.)
If you want to rearchitect your whole text / vector graphics rendering pipeline to take proper advantage of GPUs, then you are going to be much happier with your life if you don’t have to double your work to support legacy hardware.
>It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing
Most Apple hardware supports ARB_blend_func_extended or even EXT_framebuffer_fetch, which lets you implement subpixel blending with ease. Otherwise you just have to fall back to traditional alpha blending and render each glyph in three passes, one per color channel, which is not that hard either.
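For the curious, here's roughly what the dual-source-blending path looks like in OpenGL; a sketch from memory, assuming a GL 3.3+ context with a loader already set up (the uniform and variable names are mine, not from any particular engine):

    // Fragment shader: output 0 carries the text color, output 1 carries the
    // per-channel coverage mask. The blend unit then computes, per channel,
    //   dst = textColor * coverage + dst * (1 - coverage)
    // which is the subpixel blend -- no background color needed in the shader.
    const char* fragSrc = R"(
        #version 330 core
        uniform sampler2D glyphMask;   // RGB coverage from the rasterizer
        uniform vec4 textColor;
        in vec2 uv;
        layout(location = 0, index = 0) out vec4 outColor;
        layout(location = 0, index = 1) out vec4 outCoverage;
        void main() {
            vec3 mask = texture(glyphMask, uv).rgb;
            outColor    = vec4(textColor.rgb, 1.0);
            outCoverage = vec4(mask * textColor.a, 1.0);
        }
    )";

    // ...after compiling/linking the program and before drawing glyph quads:
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC1_COLOR, GL_ONE_MINUS_SRC1_COLOR);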
If this was hard for Apple, imagine how much work it was for Microsoft with their complete lack of control over monitors. But they pulled it off.
Pulled it off, or do they just not care to streamline their code and get the best performance, and instead hold tight to all kinds of legacy crap for the sake of compatibility?
Holding tight to 'all that legacy crap' is what makes Windows the de facto standard for desktop.
Despite people trying to insist it's about business practices from twenty years ago, the simple fact is that Microsoft dominated, and still dominates, because of backwards compatibility, a lesson that Apple has never learned (or never cared to learn) and why they will never have more than a few percent market share on the desktop.
And despite the gossip bloggers' constant nonsense, the desktop market is still very much alive and thriving.
>..but instead hold tight to all kinds of legacy crap in sake of compatibility?
Yes, they pulled that *hit off... imagine all the combinations of components and drivers they needed to support. Way more of a challenge than a closed system like the Mac.
This change seems to be dumping on Steve Jobs' graduation speech where he talks about taking calligraphy classes and how it made him realize the importance of good font rendering.
If our LG 21x9 displays all look like crap when we dock our MacBook Pros, then it’s just one more justification for the Windows lovers (yes, they exist) to move to Surfaces. Adobe doesn’t care which we run.
If you aren't after a certain size or refresh rate, then 4K displays have gotten pretty affordable -- but if you want one with high refresh or a more exotic resolution things get quite expensive.
On my main computer I am still using a CRT, and some of the flat panels I own have pixels in funky patterns; when I enable subpixel rendering, the image instead gains lots of random color fringes.
And you can't use rotated screens either. I've seen people do that to read source code more easily, with multiple screens rotated, and rotated screens are used in some games; Ikaruga for Windows, for example, explicitly supports a rotated screen mode.
This and the document @gnachman posted has me intrigued. Are you saying that subpixel AA is not just like a multichannel mask? I would expect the amount of R/G/B foreground and background color for a given pixel to be the same regardless of what those colors actually are?
(note I haven't worked on subpixel AA for font rendering, but have for something different)
The amount of color you apply to each subpixel has a different level of intensity depending on how deeply into the pixel the shape goes, so you would need to blend with whatever is behind/in front of it. Also, you can't just write something like (128,0,0) to a pixel and leave it at that, or you'll have a red pixel. It would need to be blended with what's already there.
I expect this would be the case with alpha compositing at least...
AFAICT subpixel rendering is supersampling the width by a multiple of 3, then binning samples by horizontal coordinate modulo 3 into RGB triplets and dividing by the supersampling factor. I guess if kerning and positioning are also subpixel it might become more tricky to cache results efficiently.
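That matches my understanding. A tiny sketch of the packing step (my own illustration, assuming an RGB-stripe panel and exactly 3x horizontal supersampling):

    #include <cstddef>
    #include <vector>

    struct RGBCoverage { float r, g, b; };

    // 'samples' is one row of glyph coverage rasterized at 3x the horizontal
    // resolution (values in [0,1]). Each group of three consecutive samples
    // maps onto the R, G and B stripes of one output pixel. For a BGR panel
    // you'd swap the first and third elements; that hardware dependence is
    // part of why this never generalized cleanly.
    std::vector<RGBCoverage> packSubpixelRow(const std::vector<float>& samples) {
        std::vector<RGBCoverage> out;
        out.reserve(samples.size() / 3);
        for (std::size_t x = 0; x + 2 < samples.size(); x += 3) {
            out.push_back({ samples[x], samples[x + 1], samples[x + 2] });
        }
        return out;
    }
    // (Real implementations also run a low-pass filter across neighboring
    // stripes to tame color fringing, e.g. a 1/9-2/9-3/9-2/9-1/9 FIR filter.)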
> Subpixel antialiasing is obnoxious to implement.
I imagine the 10 engineers who worked out antialiasing at Apple are handsomely compensated for their suffering.
I hook my Mac up to a cheap ultrawide. Will Apple give me €1k to get a new monitor with 4x the pixel density so I can continue using the computer I got from them only a few months ago, or will my Mac become obsolete? This is unworkable.
You sound very dramatic. You know, you could just not update the OS if it's that important to you.
As someone with a non-retina iMac I'm also affected, but I understand that non-retina displays are a niche category these days. Thinking about it, I guess the Macbook Air is the only computer Apple is selling that doesn't use a retina display.
I already thought it was a reasonable engineering decision, though, because of one other factor: the bar for what a screen should look like has gotten massively higher over the last several years.
On 1x displays, yes, subpixel antialiased text absolutely looks better... but it still looks like shit compared to any modern phone, any MacBook Pro or high-res Dell laptop from the past several years, to any 27" 5K display or 20" 4K display, etc.
Subpixel-AA used to be the way to get the best text rendering possible, for all normal computer users.
But now, it sounds like it requires maintaining some bug-prone, nightmarish architecture, just to get slightly-better-but-still-terrible text rendering. That probably isn't worth it.
It's also easier to implement non-gamma-correct blending, but seeing as it looks terrible and uneven, Apple so far has not done that.
More so, most technical people don't even understand that one, so good luck getting a non-Jobs type to see the importance.
Non-retina displays are going to stick around for years. Not everyone wants high DPI, I for one much prefer working on a giant non-Retina monitor instead, for more real estate instead of detail. I don't know what they put in the water in the Bay Area, but everyone seems to be moving towards a lowest common denominator approach.
This is a terrible decision, and if they don't fix it, it'll be just another reason to abandon the platform.
> It's also easier to implement non-gamma correct blending,
Gamma correct blending doesn’t require threading the information through multiple graphics layers, as the parent comment put it. The difficulty of a gamma correct blending implementation isn’t even remotely close to the difficulty of subpixel AA.
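To underline that point, gamma-correct blending of a single channel is a purely local operation; a minimal sketch (approximating sRGB as gamma 2.2):

    #include <cmath>

    // Tiny sketch of gamma-correct alpha blending for one channel (sRGB in/out).
    // The point: it's a local, self-contained fix; no extra state has to be
    // threaded through the rest of the rendering stack, unlike subpixel AA.
    float blendGammaCorrect(float srcSrgb, float dstSrgb, float alpha) {
        const float g = 2.2f;                  // sRGB approximated as gamma 2.2
        float srcLin = std::pow(srcSrgb, g);
        float dstLin = std::pow(dstSrgb, g);
        float outLin = srcLin * alpha + dstLin * (1.0f - alpha);
        return std::pow(outLin, 1.0f / g);
    }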
> so good luck getting a non-Jobs type to see the importance.
I’m not sure I understand what you mean. Mac was the first major PC platform that had configurable gamma correction and color calibration in the OS. It used to be known as the platform for designers, and more people using Mac cared about color and gamma for display and print than for any other platform.
> Not everyone wants high DPI... I prefer more real estate instead of detail
You actually don’t want high DPI even if you can have it on a large monitor? Or do you mean you prioritize monitor size over resolution?
Low DPI is not going to be a choice in the future... the near future. And once it’s gone, it’s never coming back.
> This is a terrible decision
Can you elaborate on how it affects you personally? Since you know about gamma, you do know the difference between LCD subpixel AA and regular AA, right? For some people, subpixel AA is blurrier than regular AA. For me, I see slight improvement, but it is by no means some sort of deal breaker. I’m not clear what the outrage is about, can you explain?
Subpixel-AA was not removed from macOS Mojave so the title is completely wrong anyway.
This was covered at WWDC. It uses grayscale-AA now which is much easier to hardware accelerate and gives a more consistent experience across different types of displays.
You are right. But I think parent comment understands the concepts, and was misled by the name, which to be fair, is a misleading name. The names "subpixel AA" and "grayscale AA" are both terrible names.
Grayscale AA isn't somehow gray or monochromatic, it is regular anti-aliasing, it works in color, and it works by averaging (virtual) sub-pixel samples. The name is trying to refer to how it's not making use of the physical LCD color subpixels.
Both names assume that the term "sub-pixel" is primarily referring to LCD hardware, rather than general sub-area portions of a pixel. From my (graphics) point of view, the LCD case is a specific and rare subset of what people mean by "subpixel", which is what makes these terms seem misleading and awkward.
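A tiny sketch of what "grayscale" AA actually computes for one pixel (my own illustration, using simple box-filter averaging of coverage samples):

    #include <cstddef>
    #include <vector>

    // "Grayscale" AA for one output pixel: average N x N coverage samples taken
    // inside the pixel into a single alpha value. Nothing gray about it: that
    // one alpha is later used to tint *any* text color over *any* background,
    // which is exactly what makes it GPU- and cache-friendly.
    float grayscaleCoverage(const std::vector<std::vector<bool>>& insideGlyph) {
        std::size_t hits = 0, total = 0;
        for (const auto& row : insideGlyph) {
            for (bool covered : row) {
                hits += covered ? 1 : 0;
                ++total;
            }
        }
        return total ? static_cast<float>(hits) / static_cast<float>(total) : 0.0f;
    }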