Nevertheless there's no denying that subpixel-AA text looks better on 1x displays. Everyone notices when it's not working, and macOS will look worse without it (on 1x displays).
(love my latest gen Macbook Pro, but this is the classic problem with Apple's approach to product design)
"Listen guys, the code for this is a mess AND slows your graphics down even when it's not needed. We already have had Retina displays for 6+ years already, and Hi-Dpi displays are the norm these days for third party too, so there's no reason for everybody to suffer from old tech, like there was no reason for everybody to have a floppy disk or CD-ROM on their laptop just because 5% had a real use for it. Part of the way it is with Apple, and has always been is that we move fast, and cut things off first. Until now, this exact tactic got us from near bankruptcy to being the #1 company on the planet. So, if you don't like it, don't let he door hit you on your way out".
Are they? Nearly every monitor in my company's mainly mac based office is a 1080p dell. All the monitors I personally own are 1x.
I would even go so far as to say that the majority of people who want to buy a monitor for doing text-based things (i.e. business) will buy a bog-standard monitor.
If you're on a macOS platform, with the current price of HiDPI IPS displays, the time is right to grab just about anything. If you're on Windows or Linux, it's still a great time, so long as you're keeping all monitors the same DPI and probably at integer scale.
Well, the vast majority of people consider 60Hz totally acceptable for work - and work fine with it.
For the huge majority of people, a refresh rate > 60Hz isn't even a concern.
Better resolution on the other hand is a marked improvement (either as more screen real estate or as finer detail retina style).
But isn't that more for quick moving stuff, like games or (in the iPhone) quicker visual response to pen input and such?
For regular computing (reading webpages, editing stuff, programming, watching movies) I don't see much of a difference.
Heck, movies are 24p and we're fine with it.
Check out iPad Pro devices with 120 Hz display. It makes a big difference in readability of scrolling text (try to read a web page while it's scrolling), responsiveness and smoothness of motion.
The ones that do already have "Hi-DPI terminal windows and really sharp text" as an almost-mandatory requirement.
Think about it. Sub-pixel text rendering (and AFAIK it's just text, not images w/ subpixel positioning? although that's an incredibly interesting idea for a hack!). Maybe subpixel vectors/lines?
Either you care about high-quality lines or you don't... what Apple's basically doing here is moving the cost to the hardware instead of the software, and the win is simplified, faster, less-energy-using device.
I'm not personally agreeing with Apple's decision, I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks (don't make me source my dates on retina displays). It's the norm for "executive" users (ie: software developers, graphics users, finance, book editors, etc.).
If you're happy w/ a bog-standard 1080p output device, then obviously you're not "in search of" the best quality text you can get.
However, thinking it through, I would _really_ like a "subpixel mode" for either the OS as a whole (kind of like FSAA?), or especially for certain windows. Maybe there's a way to do that as a hack?
True, but not being in search of the best quality doesn't mean that any quality will do.
I've been happy with text rendering on my bog-standard monitor so far. I'm not sure I will still be happy after this degradation.
But as it seems my current Mac Mini is the last of its kind anyway and I'm going to have to move on from Apple.
That's Apple's market, not people holding on to 10-year-old computers. Not even those that still buy the MBA.
They are not. But I guess the point of the parent poster is that Apple requires its customers to spend on the latest items instead of supporting old tech.
Basically, it is saying: If you can't afford this, don't buy it. It is fair given that Apple products are already marked up in price.
Historically 16:10 monitors have had the same width as 16:9 ones (e.g. 1920×1200 vs 1920×1080) so there's no difference as far as having things side by side.
Regardless of that, if all you're doing is General Business™ then surely all you need to do is grab an all-in-one box, shove as much 3rd party ram in it as will fit & then nab a couple of monitors and an unbranded mechanical keyboard off amazon.
I did a lot of work on a £700(~$900) Optiplex with 16gb of ram and that was more than capable of running ubuntu on 3 monitors while being small enough to fit into a handbag
So just let me turn it off. Or turn it off automatically when I'm scaling. Or leave it as it is because no one's ever cared about it. Removing it is the worst option of all.
> We already have had Retina displays for 6+ years already
External displays are the thing here.
I really hope the market punishes you for this sentiment.
And P.S. Apple is strong because of the iPod and iPhone - macOS is a tiny part of their business and Apple is strong in spite of their shitting on Mac customers, not because of it.
Seems more likely that Apple will simply discontinue the Air and that the MacBook represents their new vision of what laptops ought to be like.
Further, if your prediction is correct, why wouldn't Apple already have canceled the Air? What you describe is already on the market: the escape-less MacBook.
I don't feel as skeptical or negative about Apple's plans for new laptops. We shall see, soon!
When the Air was released, it had too many limitations, and too high a cost, to take over the macbook's role. Its primary market was people who really valued the mobility over all else. But as they iterated it closed the gap, until it overtook the macbook, and the macbook offer shrivelled up and died.
If you swap 'macbook' and 'Air' in that narrative, you see history repeating - today we're at the first step of that. "The new macbook" is too limited and too expensive to take the Air's place, but give them a few iterations, and I suspect the Air will become unnecessary in the line-up.
Pretty sure this is what Alan Kay means when he says that people who are serious about software will make their own hardware.
A 1x monitor will be a cheap second monitor or something attached to a Mac Mini (please keep it alive Apple) or Mac Pro (one day it will come). From Apple's point of view I imagine all of these are just super small niches.
The software folks have a good reason to remove subpixel AA; it's a hack to get better horizontal spatial resolution out of cheap monitors. The users, however, have no reason to buy the better, now reasonably priced, displays that would make subpixel AA unnecessary.
By removing subpixel AA at a time when HiDPI monitors are cheap, Apple is probably doing us a service. In a few years, hopefully everyone will be on a DPI that can render nice text. You may think it's unnecessary, but consider complex glyphs like Hanzi and Kanji, which really don't get enough mileage out of subpixel AA.
I know it's pretty irritating to suggest that forcing users to buy new hardware is doing them "a service." But really, I see no conflict of interest here. Apple themselves probably haven't shipped a 96 DPI display in half a decade, and they don't even sell their own external displays, so they don't stand to gain that much here other than software performance.
My 1x monitor can render text just fine using sub-pixel AA. It's a proven technique that has been around for ages and the text looks great on it.
> consider complex glyphs like Hanzi and Kanji, which really don't get enough mileage out of subpixel AA.
If I start to become a heavy user of Hanzi and Kanji then I will consider purchasing a monitor that displays it better, but currently not an issue.
> Apple is probably doing us a service.
Requiring me to spend money on new hardware in order to avoid a regression in my experience is somehow doing me a favour? Really?
> so they don't stand to gain that much here other than software performance.
I don't understand the performance argument people keep bringing up. Subpixel AA has been around for ages and worked fine on much older hardware. Why is it suddenly a bottleneck that needs to be removed when my decade old hardware can still do it?
No, of course not. If you phrase it like that, that is, ignore the point of removing it entirely and all of the benefits that come with it, of course it's not beneficial.
However, 96 DPI was never ideal. HiDPI won't just avoid a regression, it'll improve your experience, period. And with this new software change, you'll get a performance improvement, from better caching and easier GPU acceleration, and software engineers can delete thousands of lines of potentially buggy code made to work around the interaction of subpixel rendering with other graphics code.
Like, subpixel rendering is easy to understand in theory, and that might delude one into thinking it's just a minor annoyance, but it's an annoyance that permeates the stack. You have to special-case the entire stack wherever text is concerned. Effectively, the text is being rendered at 3x the horizontal resolution of everything else. This brings in complication across the board. One example: you pretty much have to do the alpha blending in software, and when you do, it requires knowledge of the final composition. Doing alpha blending in software sounds easy... but it's only easy if you do it wrong; dealing with gamma, for example. Which makes life worse, because you already have to do this for the other parts of your graphics library, probably with a case for both GPU acceleration and software.
Basically, you'd have to render the whole screen at 3x horizontal resolution to remove the need for all of these compositing hacks. Nobody does this because subpixel AA was a hack from the get go.
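To make the gamma point above concrete, here's a minimal sketch (hypothetical function names, and a simplified 2.2 power-law transfer rather than the full sRGB curve) of why the "easy" software blend is the wrong one:

```python
# Sketch of gamma-correct vs. naive alpha blending for one channel of
# an anti-aliased glyph pixel. Values are floats in [0, 1];
# "coverage" is the AA alpha produced by the rasterizer.
# Assumes a simplified 2.2 power law (real sRGB adds a linear toe
# segment near black).

GAMMA = 2.2

def srgb_to_linear(c):
    return c ** GAMMA

def linear_to_srgb(c):
    return c ** (1.0 / GAMMA)

def blend_glyph_pixel(text, background, coverage):
    """Blend in linear light: decode, lerp, re-encode."""
    t = srgb_to_linear(text)
    b = srgb_to_linear(background)
    return linear_to_srgb(t * coverage + b * (1.0 - coverage))

def blend_naive(text, background, coverage):
    """The 'easy but wrong' lerp directly on gamma-encoded values."""
    return text * coverage + background * (1.0 - coverage)

# Black text over white at 50% coverage: the naive blend gives 0.5,
# noticeably darker than the gamma-correct ~0.73, which is why
# naively blended text looks too heavy on light backgrounds.
```

The subpixel case is this same problem times three, with a separate coverage value per channel, which is why it can't be expressed as ordinary (r, g, b, a) compositing.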
In an era where battery life and TDP are actually important, finite resources, we want to remove these hacks.
P.S.: if you've never tried rendering large amounts of text on screen before, I highly recommend you try writing a piece of software that needs to do this. A few years ago I struggled to get 60fps rendering text for an IRC client, and of course caching the glyphs was horrendously complex thanks to subpixel AA. It would've been easy if I could've just used GPU compositing like all other graphics, but instead I resorted to hacks everywhere. Maybe today's computers don't need the hacks, but it's worth noting that yes, rendering fonts is resource intensive - you're just living on top of decades of optimizations and hacks to make it look like it's easy.
But that's what it is, that's not just clever phrasing on my part.
> HiDPI won't just avoid a regression, it'll improve your experience, period.
I get that, but if I wanted to spend money on that improvement I would do that. The point is my choice is no longer between maintaining my current standard or upgrading; it's between regressing my current standard or upgrading.
Do you have to upgrade the OS? Nope. It's a choice and a tradeoff.
Of course HiDPI is a regression in some ways. You can get far better non-retina monitors (low latency, higher refresh rate, wider range of aspect ratios) at any price point.
Please point me to a reasonably-priced HiDPI/Retina ultrawide (at least 34" and 3440x1440 resolution) monitor. I couldn't find one.
> Of course, I dont have answers
The questions were related to backing up what you claimed. You obviously could not do this.
Given how shoddy Win10's support for HiDPI is across applications at the moment (currently I have a choice between running my 4k laptop at full resolution and having a correctly-sized UI in all the applications I use), I seriously hope they don't do that.
Elementary-level product owner analysis, I must say. Proactively constraining your product into a niche category.. tsk, tsk.
Yeah, it hasn't been working for Apple these past 2 decades. They're only near the $1 trillion mark; could have been way better...
Ok. But so what? As a customer that's not my problem, and Apple has a HUGE war chest to pay developers to maintain the mess. I have nothing but 1x displays externally and this makes Mojave a non-starter.
It is. Customers pay for code debt too -- with bugs, slow performance, and slower moving features.
From the OP it sounds more like it's a pain to implement due to the number and design of abstraction layers in the modern macOS graphics stack.
Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.
If we ever want to have a hope of computers looking like, e.g., this Microsoft fluent design mockup, which is clearly window-based, but the windows are tilted slightly and shaded giving it a physical appearance, we'd have to go in that direction. If we stay in the model where everything is a texture that gets composited, the only thing we can do is to use higher resolution / retina display.
Also, one thing I've never seen discussed are virtual displays inside a VR world. The VR headset has plenty of pixels, but you can't simulate a full HD screen at a few feet distance, let alone a retina screen.
It's pretty common for regular AA to be borked in those situations, browser support for AA under transforms hasn't ever been awesome, and it's only been decent in the major browsers pretty recently.
> Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.
Text & vector rendering with regular AA are currently aware of virtual sub-pixels, when they're not borked by transforms like in your examples. But making the browser rendering engine aware of the order and orientation of your specific monitor's LCD hardware sub-pixels would be very difficult for browsers to support. There is a reason subpixel AA has never worked everywhere.
When I first glanced at this article yesterday, I got the mistaken impression that Apple was turning off all AA, not the LCD specific ClearType style of subpixel AA. That would be a big deal, but that's not what's happening. The difference between regular AA and ClearType style AA does exist, but it's pretty small. I'm a graphics person, and I'm very much in favor of AA, but the outrage at losing subpixel AA feels overblown to me. It helps a little, but also has the downside of chromatic aberration. I'm honestly wondering if most of the hubbub here is people making the same mistake I did initially and thinking that they're losing all anti-aliasing?
Which is what we _should_ be doing. What do you expect, some resurrection of the low res display market, or perhaps the return of VGA?
> there's no denying that subpixel-AA text looks better on 1x displays.
I get that it's hard, but at the end of the day what users are going to notice is, when they dock their laptop at work, their display is going to look like fuzzy crap. Maybe most people won't notice? It's hard to tell without seeing a side-by-side comparison.
1) They should have a Retina MacBook Air.
2) They should actually have a monitor line: a Retina+ display, super great colour accuracy, etc.
On the 1st point Apple doesn't seem to care, and on the 2nd, Apple should at least have a list of monitors that work best or are up to their "standards", or on sale at the Apple Store.
These lapses in attention to detail across its ecosystem weren't as obvious when Steve was here. But recently they have started to show cracks everywhere.
I am. I want a keyboard that's not absolute garbage to type on.
I really miss my MacBook air.
I can dock it at my desk and hook it up to two huge displays and a bigger keyboard, and then not be burdened when traveling.
The current MacBook Pro is only .05 pounds heavier than the 13 inch MacBook Air, and noticeably thinner than the Air's thickest point. They are basically the same form factor at this point.
The 13" MacBook Pro with no touchbar is more or less the upgraded MacBook Air with Retina Display that everyone has been asking for for years.
Most annoying was that the MBP fan was kicking on in high gear consistently when I was doing nothing of note, while the air very rarely does so. It’s always cool and quiet, which I love.
If they just threw a retina screen on it, the current Air could be a hell of a computer for several more years.
But what I have realized is that in reality it's very rare that you actually have to use a laptop keyboard. Maybe only like once a week do I not use a proper USB keyboard, e.g. when visiting a client or something.
Can you say what your needs are? I'm skeptical that I would be happy with the speed of my compiler on an Air.
I also usually have one or more Excel windows open for data analysis, possibly Powerpoint or Keynote, and then some other lightweight stuff (Spotify, Sublime, etc.).
I also am most frequently using this across two 27" Thunderbolt displays (and if I'm desperate for a 3rd, I can open the MBA screen in addition).
I'm not a developer, so my needs may be less intensive than yours.
Yep, I guess that was the root of my question, thanks. :)
I have chronic pain and the idea of paying an extra almost half kilo in weight for my laptop is enough that I gave a Windows PC a try for a year.
I would never buy an Air myself because that screen is pure trash, but she genuinely doesn’t care.
"Hey, we know you probably spent $1200-2500 on a laptop. Buy a $700 or $1300 monitor to go with it!" (And sorry, $700 for a 21" monitor is ... meh).
External retina displays don't exist.
HiDPI displays aren't available in a number of common monitor sizes.
Some people from Jetbrains have looked into it and concluded it was the font rendering in IDEA bringing down the entire system.
Do you think this change can improve this? Because it certainly sounds like it.
First of all, switching to grayscale helps A LOT. Mojave does this system-wide, but the JVM (or IntelliJ, I don't really remember) has its own font rendering system, so you need to change that in IntelliJ directly.
I was hurt terribly by this issue. My problems have been solved by three things:
- IDEA update that reworks the status bar's indeterminate progress bar, which maxed my CPU just for a dumb animation
- Going from an iGPU to dGPU mac (you seem to have one, so congrats, it would be way worse with only your iGPU)
- Using an experimental JVM from this thread: https://youtrack.jetbrains.com/issue/IDEA-144261
That said, JetBrains messed up with their 2018.1 release again. I'm mostly using Android Studio, which lags behind IntelliJ's versions. 3.1 merged the stupid progress bar fix, but I'm not looking forward to when 3.2 hits stable, with 2018.1 merged in.
Bottom line is that multiple factors in JetBrains' custom OpenGL UI inefficiencies, combined with Oracle's JVM bugs/inefficiencies, take down the whole system, but it's ultimately not only a font smoothing issue.
JetBrains doesn't really care, or at least doesn't show that they care. They first said "we don't notice that", and then released some JVM patches in the comments of an issue.
Of course, Oracle also shares part of the blame: none of this would have happened had they kept Apple's JVM Quartz support.
> Joshua Austill commented 24 Jun 2018 06:21
> @Gani Taskynbekuly OH MAN! I knew I was having this issue way worse at some times than others, but I'd never made the connection to sleep mode. Since reading your comment I've just been shutting my laptop down instead of closing the lid and letting it sleep and the problem has all but gone away. I searched High Sierra sleep issue and this is a well-known issue with bug reports. This means this isn't a JetBrains issue, it's a mac issue. It is annoying to turn off graphics switching and sleep mode and have crappy battery life, but hey, I'll pay that price for a usable laptop. I owe you a drink of your choice!
Seems like a macOS issue. I heard rumors that's not the only issue with that OS. ;-)
Sigh. Could we please not fall into macOS trolling? I believe that it is a totally different issue than the one I'm suffering from.
I haven't had this issue with any other program. There is no excuse for IntelliJ when Xcode, VSCode, Sublime Text and even Eclipse perform 10 times better than IDEA does.
Even if I do a cold boot and open IntelliJ directly, I will still get unbearable slowness.
Sure, there might be a bug, but it doesn't affect me. If it was only a macOS bug, how would you explain that JetBrain's patched JDK works way better, and that 2018.1 is a downgrade?
My old laptop didn't even have a dGPU, so turning off graphics switching wasn't the problem. Many apps don't handle graphics switching correctly, and it's 100% their fault.
Also, do not confuse "top comment" with "latest comment".
One thing that worked for me while it was a problem was switching off ALL antialiasing in IntelliJ, or at least going down to greyscale.
Also since we no longer have consistent sub-pixel layouts, sub-pixel rendered text is no longer portable to another screen, such as you get with an image containing text on a web page.
Isn't it completely abstracted away from 99%+ of macOS development?
Do they always get things right for them? No. But those are their users, not the average joe.
Essentially, this article: http://darknoon.com/2011/02/07/mac-ui-in-the-age-of-ios/ will be obsolete.
If it's possible to realise those benefits while retaining subpixel antialiasing, it would entail a complete reimplementation.
On the other side of things -- Light-on-dark subpixel antialiasing has always been a little wonky, and Apple disables it in most dark UIs. Without significant changes, dark mode would've been greyscale either way.
Ideally everything should be rendered (and cached) to the maximum resolution available (minimum trace widths considered) and then downsampled to the specific screens with subpixel data applied, but that's a lot of processing to do and battery life would suffer.
But they only eliminated windows spanning across multiple monitors when they made spaces per-display.
If you want to rearchitect your whole text / vector graphics rendering pipeline to take proper advantage of GPUs, then you are going to be much happier with your life if you don’t have to double your work to support legacy hardware.
I would love to see Apple implement this kind of thing http://w3.impa.br/~diego/projects/GanEtAl14/
Most Apple hardware supports ARB_blend_func_extended or even EXT_framebuffer_fetch, which lets you implement subpixel blending with ease. Otherwise you just have to fall back to traditional alpha blending and render each glyph in three passes, one per color channel, which is not that hard either.
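For illustration, here's a CPU-side sketch (hypothetical function names; real implementations do this in the GPU's blend stage) of the per-channel "over" blend that dual-source blending performs in a single pass and the fallback needs three passes for:

```python
# Per-channel alpha blend for a subpixel-AA glyph fragment.
# Unlike regular AA, the glyph's "alpha" is a separate coverage
# value for each of R, G and B (one per LCD stripe), so each
# channel must be blended with its own alpha.

def blend_subpixel(text_rgb, bg_rgb, coverage_rgb):
    """coverage_rgb is the glyph's per-subpixel coverage mask -
    the value dual-source blending would supply as a second
    color output to the hardware blender."""
    return tuple(
        t * a + b * (1.0 - a)
        for t, b, a in zip(text_rgb, bg_rgb, coverage_rgb)
    )

# Black text over a white background at a glyph edge where the red
# stripe is fully covered, green half covered, blue not at all:
out = blend_subpixel((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), (1.0, 0.5, 0.0))
```

With ordinary fixed-function blending there is only one alpha per fragment, hence the three-pass fallback: one pass per channel with a color write mask.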
Despite people trying to insist on it being business practices from twenty years ago, the simple fact is that Microsoft dominated, and still dominates, because of backwards compatibility, a lesson that Apple has never learned (or never cared to learn) and why they will never have more than a few percent market share on the desktop.
And despite the gossip bloggers' constant nonsense, the desktop market is still very much alive and thriving.
Well, macOS doesn't aspire to be the "de facto standard for desktop", or to be usable on 10-year-old machines in some corporation.
Isn't it nice that there are options with different priorities?
Yes, they pulled that *hit off... imagine all the combinations of components and drivers they needed to support. Way more of a challenge than a closed system like the Mac.
On the other hand, I won’t tolerate 1x displays on either platform anymore, so this change hardly affects me.
Well, buying 60 4K displays was out of our budget and the LG displays looked very nice. I guess we're just not rich enough.
You can’t buy 1024x768 displays anymore (well, outside of projectors) even though they would be cheaper. The same will be true for 1x sooner or later.
Example, the Acer X27 which offers 144 Hz is $2000: https://www.newegg.com/Product/Product.aspx?Item=N82E1682401...
LG's 34WK95U (5K ultrawide, not out yet but supposedly will be next month), is $1500.
Personally, I have a 38UC99 right now and I want to stick with the 21:9 aspect ratio, so I'm waiting for that 34UK95U.
My main computer is still using a CRT, and some of the flat panels I own have pixels in funky patterns; when enabling subpixel AA, the image instead gains lots of random color fringes.
And you can't use rotated screens either. I've seen people do that to read source code more easily, with multiple rotated screens, and rotated screens are used in some games; Ikaruga for Windows, for example, explicitly supports a rotated screen mode.
The amount of color you apply to each subpixel has a different intensity depending on how deeply into the pixel the shape goes, so you need to blend with whatever is behind/in front of it. Also, you can't just do something like write (128,0,0) to a pixel and leave it at that, or you'll have a red pixel. It needs to be blended with what's already there.
AFAICT subpixel rendering is supersampling the width by a multiple of 3, then using the horizontal coordinate modulo 3 to assign samples to RGB triplets and dividing by the supersampling factor. I guess if kerning and positioning are also subpixel, it might become trickier to cache results efficiently.
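That mental model can be sketched like so (hypothetical function names; real rasterizers also run an LCD filter over the samples to reduce color fringing, which is omitted here):

```python
# The supersample-and-fold view of subpixel AA: rasterize coverage at
# 3x horizontal resolution, then fold each triple of samples into the
# R, G, B channels of one output pixel (assuming an RGB-ordered
# LCD stripe).

def subpixel_fold(coverage_3x):
    """coverage_3x: coverage samples at 3x horizontal resolution.
    Returns one (r, g, b) per-channel alpha per output pixel."""
    assert len(coverage_3x) % 3 == 0
    return [
        (coverage_3x[i], coverage_3x[i + 1], coverage_3x[i + 2])
        for i in range(0, len(coverage_3x), 3)
    ]

def grayscale_fold(coverage_3x):
    """Grayscale AA collapses the same samples to a single alpha by
    averaging - which is why it composites like any other texture."""
    return [sum(coverage_3x[i:i + 3]) / 3.0
            for i in range(0, len(coverage_3x), 3)]

# A hard glyph edge falling one-third into the second pixel:
samples = [1.0, 1.0, 1.0, 1.0, 0.0, 0.0]
# subpixel:  [(1,1,1), (1,0,0)] - second pixel lights only its red stripe
# grayscale: [1.0, 1/3]         - same edge, one alpha for the whole pixel
```

The caching pain follows directly: with subpixel positioning, the same glyph shifted by a third of a pixel folds into a different set of triples, so each fractional offset needs its own cached bitmap.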
I imagine 10 engineers that worked out antialiasing at Apple are handsomely compensated for their suffering.
I hook my mac to a cheap ultrawide. Will Apple give me 1k eur to get a new monitor with 4x pixel density so I can continue using my computer that I got from them only a few months ago or will my mac become obsolete? This is unworkable.
As someone with a non-retina iMac I'm also affected, but I understand that non-retina displays are a niche category these days. Thinking about it, I guess the Macbook Air is the only computer Apple is selling that doesn't use a retina display.
Social bubble detected
I already thought it was a reasonable engineering decision, though, because of one other factor: the bar for what a screen should look like has gotten massively higher over the last several years.
On 1x displays, yes, subpixel antialiased text absolutely looks better... but it still looks like shit compared to any modern phone, any MacBook Pro or high-res Dell laptop from the past several years, to any 27" 5K display or 20" 4K display, etc.
Subpixel-AA used to be the way to get the best text rendering possible, for all normal computer users.
But now, it sounds like it requires maintaining some bug-prone, nightmarish architecture, just to get slightly-better-but-still-terrible text rendering. That probably isn't worth it.
Love fish shell, btw!
More so, most technical people don't even understand that one, so good luck getting a non-Jobs type to see the importance.
Non-retina displays are going to stick around for years. Not everyone wants high DPI, I for one much prefer working on a giant non-Retina monitor instead, for more real estate instead of detail. I don't know what they put in the water in the Bay Area, but everyone seems to be moving towards a lowest common denominator approach.
This is a terrible decision, and if they don't fix it, it'll be just another reason to abandon the platform.
Gamma correct blending doesn’t require threading the information through multiple graphics layers, as the parent comment put it. The difficulty of a gamma correct blending implementation isn’t even remotely close to the difficulty of subpixel AA.
> so good luck getting a non-Jobs type to see the importance.
I’m not sure I understand what you mean. Mac was the first major PC platform that had configurable gamma correction and color calibration in the OS. It used to be known as the platform for designers, and more people using Mac cared about color and gamma for display and print than for any other platform.
> Not everyone wants high DPI... I prefer more real estate instead of detail
You actually don’t want high DPI even if you can have it on a large monitor? Or do you mean you prioritize monitor size over resolution?
Low DPI is not going to be a choice in the future... the near future. And once it’s gone, it’s never coming back.
> This is a terrible decision
Can you elaborate on how it affects you personally? Since you know about gamma, you do know the difference between LCD subpixel AA and regular AA, right? For some people, subpixel AA is blurrier than regular AA. For me, I see slight improvement, but it is by no means some sort of deal breaker. I’m not clear what the outrage is about, can you explain?
Yet, somehow, the FreeType guys did it all.
This was covered at WWDC. It uses grayscale-AA now which is much easier to hardware accelerate and gives a more consistent experience across different types of displays.
Grayscale AA isn't somehow gray or monochromatic, it is regular anti-aliasing, it works in color, and it works by averaging (virtual) sub-pixel samples. The name is trying to refer to how it's not making use of the physical LCD color subpixels.
Both names assume that the term "sub-pixel" is primarily referring to LCD hardware, rather than general sub-area portions of a pixel. From my (graphics) point of view, the LCD case is a specific and rare subset of what people mean by "subpixel", which is what makes these terms seem misleading and awkward.