But the reason I'm sad is that this wonderful hack was short-lived: https://docs.google.com/document/d/1vfBq6vg409Zky-IQ7ne-Yy7o...
My pet theory is that macOS is going to pull in a bunch of iOS code and iOS has never had subpixel AA.
For anyone who wants to see what's coming when Mojave gets released without subpixel AA, I made a few comparison screenshots on a non-Retina display:
iTerm2 dark subpixel AA enabled: https://static.imeos.com/images/iTerm2-3.1.7.png
iTerm2 dark subpixel AA disabled: https://static.imeos.com/images/iTerm2-3_1_20170924-nightly....
iTerm2 light subpixel AA enabled: https://static.imeos.com/images/iTerm2-3.1.7-light.png
iTerm2 light subpixel AA disabled: https://static.imeos.com/images/iTerm2-3_1_20170924-nightly-...
On Retina screens, I don't see much of a difference:
iTerm2 light subpixel AA enabled on Retina display: https://static.imeos.com/images/iTerm2-3.1.7-light-retina.pn...
iTerm2 light subpixel AA disabled on Retina display: https://static.imeos.com/images/iTerm2-3_1_20170924-nightly-...
Consider the 3rd and 4th images, the "on light" AA and non-AA versions… Look at line number 10, the "let arr1 =…" line. Look at the vertical stroke of the "l" of "let". In the anti-aliased version there is a red glow on the left side and a blue-green glow on the right when you zoom in. In the non-anti-aliased one there is no such glow. Now let's lay that into pixels on a scanline…
RGBRGBR_________GBRGBRGB subpixel AA
RGBRGB_________RGBRGBRGB non-subpixel AA
If we turn that back into what the computer abstracts as pixels we get…
RGB RGB R__ ___ ___ _GB RGB RGB subpixel AA (see the R and GB? those are the halos)
RGB RGB ___ ___ ___ RGB RGB RGB non-subpixel AA
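To make those halos concrete, here's a tiny C sketch (the stroke position is made up: a black vertical stroke on white whose edges land a third of the way into a pixel). With subpixel AA each LCD stripe gets its own coverage value, so the boundary pixels come out pure red on one side and cyan on the other; with greyscale AA the same pixels just get a single neutral coverage value.

    #include <stdio.h>

    /* Coverage of a black stroke over the sample interval [x0, x1).
     * The stroke spans [4 + 1/3, 6 + 1/3) in pixel units -- made-up numbers,
     * chosen so its edges land a third of the way into a pixel. */
    static double ink(double x0, double x1) {
        const double s0 = 4.0 + 1.0 / 3.0, s1 = 6.0 + 1.0 / 3.0;
        double lo = x0 > s0 ? x0 : s0;
        double hi = x1 < s1 ? x1 : s1;
        return hi > lo ? (hi - lo) / (x1 - x0) : 0.0;
    }

    int main(void) {
        for (int px = 3; px < 8; px++) {
            /* Subpixel AA: one coverage value per R, G, B stripe (white bg, black ink). */
            double r = 1.0 - ink(px,             px + 1.0 / 3.0);
            double g = 1.0 - ink(px + 1.0 / 3.0, px + 2.0 / 3.0);
            double b = 1.0 - ink(px + 2.0 / 3.0, px + 1.0);
            /* Greyscale AA: a single coverage value for the whole pixel. */
            double grey = 1.0 - ink(px, px + 1.0);
            printf("pixel %d  subpixel (%.2f %.2f %.2f)  greyscale %.2f\n",
                   px, r, g, b, grey);
        }
        return 0;
    }

Pixel 4 prints as (1.00 0.00 0.00) and pixel 6 as (0.00 1.00 1.00): the red and blue-green glows from the screenshots, which greyscale AA replaces with plain grey levels of 0.33 and 0.67.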
Where I notice it the most is on the curly quotes and parentheses. Without AA they look... harsh I guess? Or maybe blocky. You notice the vertical lines making up the curves more. You also lose some of the definition around letters - for example without AA the dot on the "i" is much less noticeable.
On the dark theme it doesn't look nearly as bad, I have a really hard time seeing the difference there at all. Maybe that's because they're scaled differently? Or maybe dark themes just don't need it as much?
I’m noticing this misunderstanding quite a bit in the thread. None of those images have antialiasing disabled. What’s been disabled is subpixel antialiasing. So those images are comparing subpixel antialiasing and greyscale antialiasing.
Opinions between the two are mixed:
I’m one of the people that hates greyscale antialiasing (I find it makes text look fuzzier) and prefer subpixel antialiasing (which does a much better job of preserving the intended shape of the font glyphs).
On the other side, there are people who hate subpixel antialiasing (because they can see the colour fringes on the glyphs) and prefer greyscale antialiasing. Here’s an example from 1897235235 lower down in the thread:
“This is actually better for me. I wrote to Steve Jobs a long time ago and asked him if he could switch to gray-scale font smoothing because the tricks they use with colours don't actually work with people like me who are red green colour blind. The end result was that text on apple products looked terrible to me and I couldn't use any of their products until the retina displays came out. In windows, you can use grayscale font smoothing easily.
Anyway, he replied "we don't have any plans to do this" or something like that. Turns out I won in the end.”
Now if antialiasing was actually fully disabled, the difference would be extremely obvious, to say the least.
I wonder if this is part of the reason why I can't really tell the difference in the dark mode pictures, at least without zooming in all the way. I guess the colors from subpixel antialiasing would be less noticeable when overlaid or transitioning from dark to light rather than from light to dark?
> the tricks they use with colours don't actually work with people like me who are red green colour blind.
Oh crud, this never occurred to me. It doesn't seem to matter how many times I remind myself that red/green doesn't work for everyone, I still find myself forgetting about it all the time.
Some Apple engineer said on Reddit that it's because subpixel AA is not so useful on Retina and HiDPI displays, but it slows down processing and complicates pipelines.
So it's part of the move to Metal and increasing graphics performance, even if it means external lo-res monitors will suffer.
Rendering in monochrome is also faster than true 32-bit color, but we use 32-bit color because it provides a better experience to the user who is the ultimate consumer of the graphics pipeline.
It's almost as if it's a trade-off, and monochrome so laughably doesn't cut it that it's a totally contrived counterexample. Almost.
I could replace my old monitors, but they still work well with good colour and brightness, so it's not exactly environmentally considerate, or even slightly necessary.
Adding colour has added very little to UX aside from true-colour icons; the UX itself is essentially the same. Colour is used as a theme on top of a UI that is perfectly recognisable as, and much the same as, the mono and 4-colour interfaces of the 80s and 90s.
A current dense desktop monitor seems to be around 160 ppi, so to match it with plain 8-bit grayscale you'd need something around 280 ppi (roughly a √3 gain, since subpixel AA only triples the horizontal sampling), which is starting to approach a low-end printer.
Mobiles and tablets have such high-resolution displays nowadays that they probably run without subpixel AA at all…
Edit: personally I value simple and precise language. Constructs such as those mentioned in the Wikipedia article may convey similar meaning but the fact that they exist means that they allow for some variation in meaning and color which is unnecessary in this case.
The original statement is clear from context, since the literal meaning is semantically impossible (monitors are incapable of suffering).
Metonym is just a linguistic term. Such terms are not constrained to describe high uses of language by great masters of writing. They are merely names for specific constructs or linguistic phenomena.
A drunken sailor swearing at someone at 3am could be using a metonym just as easily as Wallace Stevens.
> I released a build of iTerm2 last year that accidentally disabled subpixel AA and everyone flipped out. People definitely notice.
On a Retina display at least I find that the most striking difference is that most text appears to have a lighter weight. Maybe that's the difference people are perceiving, rather than color fringing or whatever?
That's PR-speak. Translation: "It works acceptably well for all pixel layouts, and it is easier for us to implement".
Nevertheless there's no denying that subpixel-AA text looks better on 1x displays. Everyone notices when it's not working, and macOS will look worse without it (on 1x displays).
(love my latest gen MacBook Pro, but this is the classic problem with Apple's approach to product design)
"Listen guys, the code for this is a mess AND slows your graphics down even when it's not needed. We've already had Retina displays for 6+ years, and Hi-DPI displays are the norm these days for third parties too, so there's no reason for everybody to suffer from old tech, like there was no reason for everybody to have a floppy disk or CD-ROM on their laptop just because 5% had a real use for it. Part of the way it is with Apple, and always has been, is that we move fast, and cut things off first. Until now, this exact tactic got us from near bankruptcy to being the #1 company on the planet. So, if you don't like it, don't let the door hit you on your way out".
Are they? Nearly every monitor in my company's mainly mac based office is a 1080p dell. All the monitors I personally own are 1x.
I would even go so far as to say that the majority of people who want to buy a monitor for doing text based things (ie business) will buy a bog standard monitor.
If you're on a macOS platform, with the current price of HiDPI IPS displays, the time is right to grab just about anything. If you're on Windows or Linux, it's still a great time, as long as you're keeping all monitors the same DPI and probably at integer scale.
Well, the vast majority of people consider 60 Hz totally acceptable for work - and work fine with it.
For the huge majority of people, refresh rates > 60 Hz aren't even a concern.
Better resolution on the other hand is a marked improvement (either as more screen real estate or as finer detail retina style).
But isn't that more for quick moving stuff, like games or (in the iPhone) quicker visual response to pen input and such?
For regular computing (reading webpages, editing stuff, programming, watching movies) I don't see much of a difference.
Heck, movies are 24p and we're fine with it.
Check out iPad Pro devices with 120 Hz display. It makes a big difference in readability of scrolling text (try to read a web page while it's scrolling), responsiveness and smoothness of motion.
The ones that do already have "Hi-DPI terminal windows and really sharp text" as an almost-mandatory requirement.
Think about it. Sub-pixel text rendering (and AFAIK it's just text, not images w/ subpixel positioning? although that's an incredibly interesting idea for a hack!). Maybe subpixel vectors/lines?
Either you care about high-quality lines or you don't... what Apple's basically doing here is moving the cost to the hardware instead of the software, and the win is simplified, faster, less-energy-using device.
I'm not personally agreeing with Apple's decision, I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks (don't make me source my dates on retina displays). It's the norm for "executive" users (ie: software developers, graphics users, finance, book editors, etc.).
If you're happy w/ a bog-standard 1080p output device, then obviously you're not "in search of" the best quality text you can get.
However, thinking it through, I would then _really_ like a "subpixel mode" for either the OS as a whole (kind of like FSAA?), or especially certain windows. Maybe there's a way to do that as a hack?
True, but not being in search of the best quality doesn't mean that any quality will do.
I've been happy with text rendering on my bog-standard monitor so far. I'm not sure I will still be happy after this degradation.
But as it seems my current Mac Mini is the last of its kind anyway and I'm going to have to move on from Apple.
That's Apple's market, not people holding on to 10-year-old computers. Not even those that still buy the MBA.
They are not. But I guess the point of the parent poster is that Apple requires its customers to spend on the latest items instead of supporting old tech.
Basically, it is saying: If you can't afford this, don't buy it. It is fair given that Apple products are already marked up in price.
Historically 16:10 monitors have had the same width as 16:9 ones (e.g. 1920×1200 vs 1920×1080) so there's no difference as far as having things side by side.
Regardless of that, if all you're doing is General Business™ then surely all you need to do is grab an all-in-one box, shove as much 3rd party ram in it as will fit & then nab a couple of monitors and an unbranded mechanical keyboard off amazon.
I did a lot of work on a £700 (~$900) Optiplex with 16GB of RAM and that was more than capable of running Ubuntu on 3 monitors while being small enough to fit into a handbag
So just let me turn it off. Or turn it off automatically when I'm scaling. Or leave it as it is because no one's ever cared about it. Removing it is the worst option of all.
> We already have had Retina displays for 6+ years already
External displays are the thing here.
I really hope the market punishes you for this sentiment.
And P.S. Apple is strong because of the iPod and iPhone - macOS is a tiny part of their business and Apple is strong in spite of their shitting on Mac customers, not because of it.
Seems more likely that Apple will simply discontinue the Air and that the MacBook represents their new vision of what laptops ought to be like.
Further, if your prediction is correct, why wouldn't Apple already have canceled the Air? What you describe is already on the market: the escape-less MacBook.
I don't feel as skeptical or negative about Apple's plans for new laptops. We shall see, soon!
When the Air was released, it had too many limitations, and too high a cost, to take over the macbook's role. Its primary market was people who really valued the mobility over all else. But as they iterated it closed the gap, until it overtook the macbook, and the macbook offer shrivelled up and died.
If you swap 'macbook' and 'Air' in that narrative, you see history repeating - today we're at the first step of that. "The new macbook" is too limited and too expensive to take the Air's place, but give them a few iterations, and I suspect the Air will become unnecessary in the line-up.
Pretty sure this is what Alan Kay means when he says that people who are serious about software will make their own hardware.
The software folks have a good reason to remove subpixel AA; it's a hack to get better horizontal spatial resolution out of cheap monitors. As long as the hack is there, though, users have no reason to buy the better, now reasonably-priced displays that would make subpixel AA unnecessary.
By removing subpixel AA at a time when HiDPI monitors are cheap, Apple is probably doing us a service. In a few years, hopefully everyone will be on a DPI that can render nice text. You may think it's unnecessary, but consider complex glyphs like Hanzi and Kanji, which don't really get enough mileage out of subpixel AA anyway.
I know it's pretty irritating to suggest that forcing users to buy new hardware is doing them "a service." But really, I see no conflict of interest here. Apple themselves probably haven't shipped a 96 DPI display in half a decade, and they don't even sell their own external displays, so they don't stand to gain much here other than software performance.
My 1x monitor can render text just fine using sub-pixel AA. It's a proven technique that has been around for ages and the text looks great on it.
> consider complex glyphs like Hanzi and Kanji, which don't really get enough mileage out of subpixel AA anyway.
If I start to become a heavy user of Hanzi and Kanji then I will consider purchasing a monitor that displays it better, but currently not an issue.
> Apple is probably doing us a service.
Requiring me to spend money on new hardware in order to avoid a regression in my experience is somehow doing me a favour? Really?
> so they don't stand to gain that much here other than software performance.
I don't understand the performance argument people keep bringing up. Subpixel AA has been around for ages and worked fine on much older hardware. Why is it suddenly a bottleneck that needs to be removed when my decade old hardware can still do it?
No, of course not. If you phrase it like that, that is, ignore the point of removing it entirely and all of the benefits that come with it, of course it's not beneficial.
However, 96 DPI was never ideal. HiDPI won't just avoid a regression, it'll improve your experience, period. And with this new software change, you'll get a performance improvement, from better caching and easier GPU acceleration, and software engineers can delete thousands of lines of potentially buggy code made to work around the interaction of subpixel rendering with other graphics code.
Like, subpixel rendering is easy to understand in theory, and that might delude one into thinking it's just a minor annoyance, but it's an annoyance that permeates the stack. You have to special-case the entire stack wherever text is concerned. Effectively, the text is being rendered at 3x the horizontal resolution of everything else, which brings complication across the board. For example, you pretty much have to do the alpha blending in software, and when you do, it requires knowledge of the final composition. Doing alpha blending in software sounds easy... but it's only easy if you do it wrong (dealing with gamma, for example), which makes life worse because you already have to do this for the rest of your graphics library, probably with separate paths for GPU acceleration and software.
Basically, you'd have to render the whole screen at 3x horizontal resolution to remove the need for all of these compositing hacks. Nobody does this because subpixel AA was a hack from the get go.
In an era where battery life and TDP are actually important, finite resources, we want to remove these hacks.
P.S.: if you've never tried rendering large amounts of text on screen before, I highly recommend you try writing a piece of software that needs to do this. A few years ago I struggled to get 60fps rendering text for an IRC client, and of course caching the glyphs was horrendously complex thanks to subpixel AA. It would've been easy if I could've just used GPU compositing like all other graphics, but instead I resorted to hacks everywhere. Maybe today's computers don't need the hacks, but it's worth noting that yes, rendering fonts is resource intensive - you're just living on top of decades of optimizations and hacks to make it look like it's easy.
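To make the "permeates the stack" point concrete, here's a minimal C sketch (made-up types, nobody's actual implementation) of what the innermost blend has to look like: three coverage values per pixel instead of one alpha, applied against the real destination pixel, ideally in linear light.

    #include <math.h>

    typedef struct { float r, g, b; } Color;   /* hypothetical sRGB-encoded colour */

    /* Approximate sRGB <-> linear conversions (2.2 gamma, good enough for a sketch). */
    static float to_linear(float c) { return powf(c, 2.2f); }
    static float to_srgb(float c)   { return powf(c, 1.0f / 2.2f); }

    /* Blend one channel in linear light, then re-encode. */
    static float blend_channel(float dst, float src, float cov) {
        return to_srgb(to_linear(dst) * (1.0f - cov) + to_linear(src) * cov);
    }

    /* Subpixel-AA blend: the glyph brings THREE coverage values per pixel, one per
     * LCD stripe, so the blend needs the actual destination pixel. With greyscale AA
     * cov_r == cov_g == cov_b and this collapses to ordinary alpha blending. */
    static Color blend_glyph(Color dst, Color text, float cov_r, float cov_g, float cov_b) {
        Color out = { blend_channel(dst.r, text.r, cov_r),
                      blend_channel(dst.g, text.g, cov_g),
                      blend_channel(dst.b, text.b, cov_b) };
        return out;
    }

None of that fits the usual "premultiplied RGBA texture, composited later on the GPU" model, which is exactly why the special cases spread through the whole stack.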
But that's what it is, that's not just clever phrasing on my part.
> HiDPI won't just avoid a regression, it'll improve your experience, period.
I get that, but if I wanted to spend money on that improvement I would do that. The point is my choice is no longer between maintaining my current standard or upgrading, its between regressing my current standard or upgrading.
Do you have to upgrade the OS? Nope. It's a choice and a tradeoff.
Of course HiDPI is a regression in some ways. You can get far better non-retina monitors (low latency, higher refresh rate, wider range of aspect ratios) at any price point.
Please point me to a reasonably-priced HiDPI/Retina ultrawide (at least 34" and 3440x1440 reso) monitor. I couldn't find one.
> Of course, I don't have answers
The questions were related to backing up what you claimed. You obviously could not do this.
Given how shoddy Win10's support for HiDPI is across applications at the moment (currently I have a choice between running my 4k laptop at full resolution and having a correctly-sized UI in all the applications I use), I seriously hope they don't do that.
A 1x monitor will be a cheap second monitor or something attached to a Mac Mini (please keep it alive Apple) or Mac Pro (one day it will come). From Apple's point of view I imagine all of these are just super small niches.
Elementary level of product owner analysis, I must say. Proactively constraining your product into a niche category... tsk, tsk.
Yeah, it hasn't been working for Apple this past 2 decades. They're only near the $1 trillion mark, could have been way better...
Ok. But so what? As a customer that's not my problem, and Apple has a HUGE war chest to pay developers to maintain the mess. I have nothing but 1x displays externally and this makes Mojave a non-starter.
It is. Customers pay for code debt too -- with bugs, slow performance, and slower moving features.
From the OP it sounds more like it's a pain to implement due to the number and design of abstraction layers in the modern macOS graphics stack.
Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.
If we ever want to have a hope of computers looking like, e.g., this Microsoft Fluent Design mockup, which is clearly window-based, but the windows are tilted slightly and shaded giving it a physical appearance, we'd have to go in that direction. If we stay in the model where everything is a texture that gets composited, the only thing we can do is use a higher resolution / retina display.
Also, one thing I've never seen discussed are virtual displays inside a VR world. The VR headset has plenty of pixels, but you can't simulate a full HD screen at a few feet distance, let alone a retina screen.
It's pretty common for regular AA to be borked in those situations, browser support for AA under transforms hasn't ever been awesome, and it's only been decent in the major browsers pretty recently.
> Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.
Text & vector rendering with regular AA are currently aware of virtual sub-pixels, when they're not borked by transforms like in your examples. But making the browser rendering engine aware of the order and orientation of your monitor's specific LCD hardware sub-pixels would be very difficult for browsers to support. There is a reason subpixel AA has never worked everywhere.
When I first glanced at this article yesterday, I got the mistaken impression that Apple was turning off all AA, not the LCD specific ClearType style of subpixel AA. That would be a big deal, but that's not what's happening. The difference between regular AA and ClearType style AA does exist, but it's pretty small. I'm a graphics person, and I'm very much in favor of AA, but the outrage at losing subpixel AA feels overblown to me. It helps a little, but also has the downside of chromatic aberration. I'm honestly wondering if most of the hubbub here is people making the same mistake I did initially and thinking that they're losing all anti-aliasing?
Which is what we _should_ be doing. What do you expect, some resurrection of the low res display market, or perhaps the return of VGA?
> there's no denying that subpixel-AA text looks better on 1x displays.
I get that it's hard, but at the end of the day what users are going to notice is, when they dock their laptop at work, their display is going to look like fuzzy crap. Maybe most people won't notice? It's hard to tell without seeing a side-by-side comparison.
1) They had Retina MacBook Air
2) They actually have a Monitor Line, that is Retina+ Display, Super great colour accuracy etc.
For the 1st point Apple doesn't seem to care, and for the 2nd, Apple should at least have a list of monitors that work best or are up to their "standards", or are on sale at the Apple Store.
These lapses in attention to detail across its ecosystem weren't as obvious when Steve was here. But recently they have started to show cracks everywhere.
I am. I want a keyboard that's not absolute garbage to type on.
I really miss my MacBook air.
I can dock it at my desk and hook it up to two huge displays and a bigger keyboard, and then not be burdened when traveling.
The current MacBook Pro is only .05 pounds heavier than the 13-inch MacBook Air, and noticeably thinner than the Air's thickest point. They are basically the same form factor at this point.
The 13" MacBook Pro with no touchbar is more or less the upgraded MacBook Air with Retina Display that everyone has been asking for for years.
Most annoying was that the MBP fan was kicking on in high gear consistently when I was doing nothing of note, while the air very rarely does so. It’s always cool and quiet, which I love.
If they just threw a retina screen on it, the current Air could be a hell of a computer for several more years.
But what I have realized is that in reality it's very rare that you have to actually use a laptop keyboard. Maybe only like once a week do I not use a proper USB keyboard, e.g. when having to visit a client or something.
Can you say what your needs are? I'm skeptical that I would be happy with the speed of my compiler on an Air.
I also usually have one or more Excel windows open for data analysis, possibly Powerpoint or Keynote, and then some other lightweight stuff (Spotify, Sublime, etc.).
I also am most frequently using this across two 27" Thunderbolt displays (and if I'm desperate for a 3rd, I can open the MBA screen in addition).
I'm not a developer, so my needs may be less intensive than yours.
Yep, I guess that was the root of my question, thanks. :)
I have chronic pain and the idea of paying an extra almost half kilo in weight for my laptop is enough that I gave a Windows PC a try for a year.
I would never buy an Air myself because that screen is pure trash, but she genuinely doesn’t care.
"Hey, we know you probably spent $1200-2500 on a laptop. Buy a $700 or $1300 monitor to go with it!" (And sorry, $700 for a 21" monitor is ... meh).
External retina displays don't exist.
HiDPI displays aren't available in a number of common monitor sizes.
Some people from Jetbrains have looked into it and concluded it was the font rendering in IDEA bringing down the entire system.
Do you think this change can improve this? Because it certainly sounds like it.
First of all, switching to grayscale helps A LOT. Mojave does this system-wide, but the JVM (or IntelliJ, I don't really remember) has its own font rendering system, so you need to change that in IntelliJ directly.
I was hurt terribly by this issue. My problems have been solved by three things:
- IDEA update that reworks the status bar's indeterminate progress bar, which maxed my CPU just for a dumb animation
- Going from an iGPU to dGPU mac (you seem to have one, so congrats, it would be way worse with only your iGPU)
- Using an experimental JVM from this thread: https://youtrack.jetbrains.com/issue/IDEA-144261
That said, JetBrains messed up with their 2018.1 release again. I'm mostly using Android studio, which lags behind IntelliJ's versions. 3.1 merged the stupid progressbar fix, but I'm not looking forward when 3.2 hits stable, with 2018.1 merged in.
Bottom line is that there are multiple factors in JetBrains' custom OpenGL UI inefficiencies combined with Oracle's JVM bugs/inefficiencies that take down the whole system, but it's ultimately not only a font smoothing issue.
JetBrains doesn't really care, or at least doesn't show that they care. They first said "we don't notice that", and then released some JVM patches in the comments of an issue.
Of course, Oracle also shares part of the blame: none of this would have happened had they kept Apple's JVM Quartz support.
> Joshua Austill commented 24 Jun 2018 06:21
> @Gani Taskynbekuly OH MAN! I knew I was having this issue way worse at some times than others, but I'd never made the connection to sleep mode. Since reading your comment I've just been shutting my laptop down instead of closing the lid and letting it sleep and the problem has all but gone away. I searched High Sierra sleep issue and this is a well-known issue with bug reports. This means this isn't a JetBrains issue, it's a mac issue. It is annoying to turn off graphics switching and sleep mode and have crappy battery life, but hey, I'll pay that price for a usable laptop. I owe you a drink of your choice!
Seems like a macOS issue. I've even heard rumors that's not the only issue with that OS. ;-)
Sigh. Could we please not fall into macOS trolling? I believe that it is a totally different issue than the one I'm suffering from.
I haven't had this issue with any other program. There is no excuse for IntelliJ when Xcode, VSCode, Sublime Text and even Eclipse perform 10 times better than IDEA does.
Even if I do a cold boot and open IntelliJ directly, I will still get unbearable slowness.
Sure, there might be a bug, but it doesn't affect me. If it was only a macOS bug, how would you explain that JetBrains' patched JDK works way better, and that 2018.1 is a downgrade?
My old laptop didn't even have a dGPU, so turning off graphics switching wasn't the problem. Many apps don't handle graphics switching correctly, and it's 100% their fault.
Also, do not confuse "top comment" with "latest comment".
One thing that worked for me while it was a problem was switching off ALL antialiasing in IntelliJ, or at least going down to greyscale.
Also since we no longer have consistent sub-pixel layouts, sub-pixel rendered text is no longer portable to another screen, such as you get with an image containing text on a web page.
Isn't it completely abstracted away from 99%+ of macOS development?
Do they always get things right for them? No. But those are their users, not the average joe.
Essentially, this article: http://darknoon.com/2011/02/07/mac-ui-in-the-age-of-ios/ will be obsolete.
If it's possible to realise those benefits while retaining subpixel antialiasing, it would entail a complete reimplementation.
On the other side of things -- Light-on-dark subpixel antialiasing has always been a little wonky, and Apple disables it in most dark UIs. Without significant changes, dark mode would've been greyscale either way.
Ideally everything should be rendered (and cached) to the maximum resolution available (minimum trace widths considered) and then downsampled to the specific screens with subpixel data applied, but that's a lot of processing to do and battery life would suffer.
But they only eliminated windows spanning across multiple monitors when they made spaces per-display.
If you want to rearchitect your whole text / vector graphics rendering pipeline to take proper advantage of GPUs, then you are going to be much happier with your life if you don’t have to double your work to support legacy hardware.
I would love to see Apple implement this kind of thing http://w3.impa.br/~diego/projects/GanEtAl14/
Despite people trying to insist on it being business practices from twenty years ago, the simple fact is that Microsoft dominated, and still dominates, because of backwards compatibility, a lesson that Apple has never learned (or never cared to learn) and why they will never have more than a few percent market share on the desktop.
And despite the gossip bloggers' constant nonsense, the desktop market is still very much alive and thriving.
Well, macOS doesn't aspire to be the "de facto standard for desktop", or to be usable on 10-year-old machines in some corporation.
Isn't it nice that there are options with different priorities?
Yes, they pulled that *hit off... imagine all the combinations of components and drivers they needed to support. Way more of a challenge than a closed system like the Mac.
Most of the Apple hardware supports ARB_blend_func_extended or even EXT_framebuffer_fetch, which lets you implement subpixel blending with ease. Otherwise you just have to fall back to traditional alpha blending and render each glyph in three passes, one for each color channel, which is not that hard either.
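For what it's worth, here's a rough, untested sketch of the dual-source route, assuming an OpenGL 3.3+ context and a loader like GLEW; the glyph_atlas/text_color names are just placeholders:

    #include <GL/glew.h>   /* or any loader that exposes OpenGL 3.3 / ARB_blend_func_extended */

    /* Dual-source blending sketch: output 0 carries the text colour, output 1 the
     * per-channel coverage; the blend unit then computes, per channel,
     *   dst = color * coverage + dst * (1 - coverage). */
    static const char *glyph_fragment_shader =
        "#version 330 core\n"
        "uniform sampler2D glyph_atlas;   // RGB coverage mask\n"
        "uniform vec4 text_color;\n"
        "in vec2 uv;\n"
        "layout(location = 0, index = 0) out vec4 color;\n"
        "layout(location = 0, index = 1) out vec4 coverage;\n"
        "void main() {\n"
        "    vec3 cov = texture(glyph_atlas, uv).rgb;\n"
        "    color    = vec4(text_color.rgb, 1.0);\n"
        "    coverage = vec4(cov * text_color.a, 1.0);\n"
        "}\n";

    static void enable_subpixel_blend(void) {
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC1_COLOR, GL_ONE_MINUS_SRC1_COLOR);
    }

The fallback mentioned above would presumably draw each glyph three times instead, with glColorMask restricting writes to one channel per pass.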
On the other hand, I won’t tolerate 1x displays on either platform anymore, so this change hardly affects me.
Well, buying 60 4K displays was out of our budget and the LG displays looked very nice. I guess we're just not rich enough.
You can't buy 1024x768 displays anymore (well, outside of projectors) even though they would be cheaper. The same will be true for 1x sooner or later.
Example, the Acer X27 which offers 144 Hz is $2000: https://www.newegg.com/Product/Product.aspx?Item=N82E1682401...
LG's 34WK95U (5K ultrawide, not out yet but supposedly will be next month), is $1500.
Personally, I have a 38UC99 right now and I want to stick with the 21:9 aspect ratio, so I'm waiting for that 34UK95U.
Yet, somehow, the FreeType guys did it all
On my main computer I am still using a CRT, and some of the flat panels I own have pixels in funky patterns; when I enable subpixel AA the image instead gains lots of random color fringes.
And you can't use rotated screens either. I saw people doing that to read source code more easily, with multiple rotated screens, and some games use rotated screens too; Ikaruga for Windows, for example, explicitly supports a rotated screen mode.
The amount of color you apply to each subpixel has different levels of intensity depending on how deeply into the pixel the shape goes, so you would need to blend with whatever is behind/in front of it. Also, you can't just do something like write (128,0,0) to a pixel and leave it at that, or you'll have a red pixel. It would need to be blended with what's already there.
AFAICT subpixel rendering is supersampling the width by a multiple of 3, then folding samples into RGB triplets via the horizontal coordinate modulo 3 and dividing by the supersampling factor. I guess if kerning and positioning are also subpixel it might become more tricky to cache results efficiently.
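That matches my understanding. A toy version (a C sketch with a hypothetical coverage() callback standing in for the real rasteriser, using a plain 3x factor) might look like:

    /* coverage(x, y): hypothetical callback into the rasteriser, returning glyph
     * coverage in [0, 1] at a sample point (x in pixel units). */
    typedef double (*coverage_fn)(double x, double y);

    /* Subpixel: sample the row at 3x the horizontal rate, then fold sample h into
     * channel h % 3 of pixel h / 3 (the "RGB triplets" described above). */
    void rasterize_row_subpixel(coverage_fn coverage, double y, int width, double rgb[][3]) {
        for (int h = 0; h < width * 3; h++)
            rgb[h / 3][h % 3] = coverage((h + 0.5) / 3.0, y);
    }

    /* Greyscale: same samples, averaged back down to a single alpha per pixel. */
    void rasterize_row_greyscale(coverage_fn coverage, double y, int width, double alpha[]) {
        for (int px = 0; px < width; px++) {
            double a = 0.0;
            for (int c = 0; c < 3; c++)
                a += coverage(px + (c + 0.5) / 3.0, y);
            alpha[px] = a / 3.0;
        }
    }

And yes, once glyph positions are themselves fractional you'd presumably have to cache several differently-phased copies of each glyph, which is where the caching gets painful.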
I imagine the 10 engineers that worked out antialiasing at Apple are handsomely compensated for their suffering.
I hook my mac to a cheap ultrawide. Will Apple give me 1k eur to get a new monitor with 4x pixel density so I can continue using my computer that I got from them only a few months ago or will my mac become obsolete? This is unworkable.
As someone with a non-retina iMac I'm also affected, but I understand that non-retina displays are a niche category these days. Thinking about it, I guess the Macbook Air is the only computer Apple is selling that doesn't use a retina display.
Social bubble detected
I already thought it was a reasonable engineering decision, though, because of one other factor: the bar for what a screen should look like has gotten massively higher over the last several years.
On 1x displays, yes, subpixel antialiased text absolutely looks better... but it still looks like shit compared to any modern phone, any MacBook Pro or high-res Dell laptop from the past several years, to any 27" 5K display or 20" 4K display, etc.
Subpixel-AA used to be the way to get the best text rendering possible, for all normal computer users.
But now, it sounds like it requires maintaining some bug-prone, nightmarish architecture, just to get slightly-better-but-still-terrible text rendering. That probably isn't worth it.
Love fish shell, btw!
More so, most technical people don't even understand that one, so good luck getting a non-Jobs type to see the importance.
Non-retina displays are going to stick around for years. Not everyone wants high DPI, I for one much prefer working on a giant non-Retina monitor instead, for more real estate instead of detail. I don't know what they put in the water in the Bay Area, but everyone seems to be moving towards a lowest common denominator approach.
This is a terrible decision, and if they don't fix it, it'll be just another reason to abandon the platform.
Gamma correct blending doesn’t require threading the information through multiple graphics layers, as the parent comment put it. The difficulty of a gamma correct blending implementation isn’t even remotely close to the difficulty of subpixel AA.
> so good luck getting a non-Jobs type to see the importance.
I’m not sure I understand what you mean. Mac was the first major PC platform that had configurable gamma correction and color calibration in the OS. It used to be known as the platform for designers, and more people using Mac cared about color and gamma for display and print than for any other platform.
> Not everyone wants high DPI... I prefer more real estate instead of detail
You actually don’t want high DPI even if you can have it on a large monitor? Or do you mean you prioritize monitor size over resolution?
Low DPI is not going to be a choice in the future... the near future. And once it’s gone, it’s never coming back.
> This is a terrible decision
Can you elaborate on how it affects you personally? Since you know about gamma, you do know the difference between LCD subpixel AA and regular AA, right? For some people, subpixel AA is blurrier than regular AA. For me, I see slight improvement, but it is by no means some sort of deal breaker. I’m not clear what the outrage is about, can you explain?
This was covered at WWDC. It uses grayscale-AA now which is much easier to hardware accelerate and gives a more consistent experience across different types of displays.
Grayscale AA isn't somehow gray or monochromatic, it is regular anti-aliasing, it works in color, and it works by averaging (virtual) sub-pixel samples. The name is trying to refer to how it's not making use of the physical LCD color subpixels.
Both names assume that the term "sub-pixel" is primarily referring to LCD hardware, rather than general sub-area portions of a pixel. From my (graphics) point of view, the LCD case is a specific and rare subset of what people mean by "subpixel", which is what makes these terms seem misleading and awkward.
The idea that they would downgrade the display support so that non-retina monitors (and let's be serious, that is nearly all monitors that people dock into at work or at home) are going to look worse in Mojave is almost too absurd to be true.
I want to see this for myself.
Does anyone know if you can simulate this now, without installing the beta? Or if you can install or somehow use the beta without nuking what you already have?
Edit: I am still on Sierra, because the buzz around High Sierra made it sounds like it was a bad idea to upgrade. This is sounding more and more like Mojave is a no-go as well.
Final Cut X was ... something else. Anyone who claims you just need to get used to it must not like having good tools. It's a terrible video editor and Apple killed off their only pro video tool.
Around that time I just ran Linux in a VM on my Mac and got into tiling window managers. I tried a bunch and liked a lot of them, but eventually settled on i3 (which I still use today). I ran OpenSUSE at work. Someone at work was throwing out an old IBM dual Xeon and that became my primary box, and I used my MacBook as a Windows gaming laptop, up until it was stolen.
At my most recent job, I asked for a PC laptop (Dell or HP) and they gave me a terrible MacBook. I run Gentoo on it,
but I still hate the hardware, and it's more difficult to get Linux on a MacBook than on any other x86 laptop I've owned, including two Dells, an HP and an MSI.
I don't want Apple to go away because we do need competition in the market, but their operating system and hardware seriously need to stop being a pile of poop.
The only reason anyone continues to hate it today is because they're used to the terrible, horrible and stupid way that all the other NLEs work (including FCP6). It can be disconcerting to use FCPX because everything seems like a fancier version of iMovie, so it must therefore not be for "pros", or must be missing lots of "pro" features.
But it's not. Terrible, horrible and stupid NLE design isn't a feature. FCPX is the first one to get it right, but the rusted on Premiere / Avid "professionals" refuse to see it.
To really appreciate the difference, I recommend turning on screen zoom with ctrl-scroll in the accessibility system prefs. You’ll see the effect of SPAA as colouring on the left and right edges of text.
That said, if there's anyone in the position to exploit their knowledge of the ACTUAL physical display and its sub-pixel layout, it's Apple. I'd expect Windows or Android to drop sub-pixel AA... but iOS/macOS? I'm mystified.
On Windows there used to be a wizard where you had to choose which one looks better to let the system figure out your display’s subpixel arrangement.
I’ve never used Windows beyond 7.
It also makes Retina-class devices look far worse. The fonts on my 5K iMac became much less readable after this change.
I just compared on High Sierra, and it's mostly that without subpixel AA the type looks thinner
I actually prefer this setting to be off on Retina screens. You don't need to anti-alias fonts on a 230ppi display.
And 1080p projectors in conference rooms.
Yes, I'm running the beta right now. If you're scared about your data, create a new partition or virtual machine and install macOS to that.
Someone at Apple really seems to believe in pain as a motivator. And it may work in the short term, but in the long term, it always drives customers to other platforms.
I think they simply believe they're large enough player to make such calls unilaterally, and that the rest of the industry will follow. And I fear they might be right - such decisions enter into the calculations of more "commoditized" vendors like peripheral manufacturers when they design next iterations of their products.
I mean, that's why removing the headphone jack is such a big deal even for people not using iPhones. Not just because we sympathize with iPhone owners - but because Apple is big enough to make this stupidity a new standard everywhere.
* Safari - might seem unnecessary, but it's a lot of the web
* Discord - I hear they have a Linux version in alpha, but...alpha
* iTerm - best there is after years, why the hell can't Linux win at this?!
* 1Password - guess I use the CLI? ugh.
* iTunes - largely used for watching movies
* Messages - This is surprisingly nice to have on the desktop
* Microsoft Outlook - guess you are forced to use the web client
* Xcode - heh
I'm of course not including the apps I use less often but really like when I need them, like Lightroom, Photoshop, Illustrator, Keynote, Excel/Numbers, OmniGraffle, Sketch, and Things.
I think Linux is safe from me, except as the system I have next to my daily driver, and which I use for Windows (when I need to look at something there) and dual boot for Tensorflow.
Safari - either Google Chrome or Chromium (depending on how open you like things) can keep pace with Safari in terms of general usability and extension ecosystem. At worst I think you'd find the experience on par, but when I owned a Mac, I found Google Chrome to be more performant than Safari on the regular, and its Linux build is just as quick. A lot of Linux folks recommend Firefox as well but I still am not convinced it beats Chrome in terms of out-of-the-box magical "Just Works" factor.
Discord - The client may be alpha, but this works just fine honestly. I've never had a problem. This is largely because the desktop app is Electron based, so it's running on a web browser anyway; it's very, very cross platform friendly. Slack too, if that's your thing.
iTerm - There are MANY good terminal clients for Linux. I personally use Terminator, which I find has a good balance of power features and stability. I particularly enjoy its terminal broadcasting implementation, but I'm in the unusual position of working on many parallel servers in tandem during my day job, so this feature is very important to me.
iTunes - If it's just for movie watching, I've found VLC to be perfectly serviceable. MPlayer is also quite popular and there are many frontends.
Messages, Outlook - Here you've got a fair point. Outlook in particular is a pain point; there are workarounds to get it working in Thunderbird and Evolution, but they're just that - workarounds. Anything beyond basic email will need the web app; fortunately the web app isn't _terrible_, but yeah. Fair complaint.
Xcode - If your goal is to build Mac / iOS apps, there is no substitute thanks to Apple's EULA. For everything else, pick your poison; there are more code editors and IDEs on Linux than one can count, many of them excellent. Personally I'm happy with Sublime Text (paid, worth every penny) and a Terminator window, but I hear VSCode is also excellent, which is odd considering that's a Microsoft endeavor. (Now if we could just get them to port Outlook...)
Google Chrome's extension ecosystem is undoubtedly far, far ahead of what Safari has. As for usability, and…
> I found Google Chrome to be more performant than Safari on the regular
People use Safari because it integrates so well with macOS, is performant, and doesn't kill resources (CPU, RAM, battery, you name it). No other browser comes close, even on other platforms. Apple's just spent too much time here optimizing their browser that nobody else can match it (maybe Edge on Windows?).
> There are MANY good terminal clients for Linux.
iTerm just has everything and the kitchen sink. Like, it has some flaws, but it just does so many things that I haven't seen any other emulator do. Plus the author is really smart (he's at the top of the comments currently, if you want to check his stuff out)
Xcode can be surprisingly nice for C/C++ development, when it decides it wants to work.
Check back with Konsole.
I don't think I know all features, but just the things I know it does, because I use them:
- Tabs, obviously. Movable between windows, too.
- Arbitrary in-window splitting (tiling)
- Profiles, obviously.
- Copy stuff as HTML & export whole scrollback as HTML and also print it / convert to PDF
- Can basically configure everything
Edge feels like poor man's Safari. However the rest of Mac OS (in terms of GUI, not underlying OS) feels like poor man's Windows. :D
That depends on the environment. I assume you're talking about Outlook as a frontend for an enterprise Exchange setup?
If your mail server admin configures IMAP and SMTP correctly, it's a breeze to get it set up (you will need SSL though). Use "DOMAIN\user.name" as username together with your AD password (and for team/shared mailboxes, use DOMAIN\user.name\mailbox.name, where mailbox.name is the cn or sAMAccountName attribute of the mailbox AD entry). Thunderbird can handle everyday emailing as well as responding to calendar events that way; I'm not sure about integrating "real" calendar stuff. Auto-completion of mail addresses must be done separately in Thunderbird, you'll need the AD structure information (root DN, plus the DN of your AD account so you can log in).
What you'll miss is forwarding rules access, but that can be done via webmail if the need arises.
Even IE will at least provide a free VM image these days.
Discord works perfectly fine in a web browser. IIRC the platform-specific clients are just running a web browser themselves. I'd rather use a PWA then.
As for iTerm, you don't need that when you use i3 but perhaps you can specify the features you need.
1Password, I recommend Bitwarden. Cheaper, open source, you can even run an open source backend.
Messages, no idea why you want that. I only use WhatsApp for IM, and even that is just a web app.
For development I recommend Sublime Text. You can use it for free, though I did buy it (as someone else commented; worth every penny).
Skype for Linux has actually fallen behind the other ports. Though I don't use Skype, I've heard complaints about that.
For office I use WPS Office (Word/Excel/PowerPoint equivalents). It has tabs for documents and I have never had document compatibility issues (like I have had with LibreOffice). I believe Office 2013 runs under Wine too if you really need Office.
I prefer the web clients for email anyway.
I use wavebox to integrate web based chat and email with the operating system for notifications.
Safari uses WebKit, so it's very similar to the engine Chrome uses - though Google forked it at some point.
Deepin Terminal is an amazing terminal.
Can't help you with Xcode unless you switch programming languages :)
It would be great if Adobe released Linux versions of their products.
I don't do much video editing but I think there are good options on Linux.
Try it as a challenge; if you use all the above apps, I think you might be pleasantly surprised by how far Linux has come. Most of these apps work on Mac as well, so you may find a few new good tools.
I do a ton of things in the browser, love the availability of a good Terminal at my fingertips (same as macOS), develop webapps and love the simple availability of a full local web server stack to develop on and so on. I use LibreOffice for my very few "Office" needs. I use Firefox as my primary browser (moved away from Chrome to Firefox before I made the switch to Linux). I use Visual Studio Code as my primary editor. I use Krita for my very few photo editing needs - simple stuff really like cropping or color picking. That's about it. Everything else happens in the Browser or on the command line.
As Netrunner is a derivative of Manjaro and thus Arch it's very simple to access a plethora of packages from their AUR system. It's like Brew on steroids.
Have been really happy with the switch!
Apple isn’t quite big enough to force an industry change in personal computers.
The problem with the USB-C standard is that USB-A now extends far beyond the personal computer industry. It's now a power adapter standard for mobile phones and all sorts of gizmos, gadgets and accessories for the home and car. USB-A isn't going anywhere for a long time.
I don't think that's true. USB became ubiquitous because literally every PC included it after 1998. Yes, it might have helped that USB also worked with Macs, unlike previous connectors.
In many cases it's arguable whether Apple forced a change or it was a case of Apple "skating where the puck is going" and doing something that was inevitably going to happen given time.
Similarly, if Tesla never existed the electric car would still have been an inevitability. Perhaps it would have taken another decade to emerge. Perhaps two decades. But it would have eventually. On electric propulsion, Tesla is most definitely just "skating where the puck is going."
In both cases—Tesla and Apple—they're not just skating where the puck is going, they're also in control of the puck, even if only a little bit.
 And thinking about it further, I shouldn't have conceded the assumption that USB's dominance was inevitable. It seems that way in hindsight but was it really? It could have fizzled like Firewire. It is entirely possible that new incremental standards—perhaps an ultra-fast nine pin serial port protocol—could have filled the gap before another connector emerged to dominate.
Often these technologies are stuck until there is a first big adopter. Now Google, Apple Macs, and most of Android is behind USB-C for instance, there will be a lot more accessories that use it. Right now IMHO it is being held back mostly by cost - micro-usb and an A to micro-B cable are still the cheapest parts.
PCs have been slowly adopting it. Give it a couple more years.
If the end result is that you can't tell if plugging into your friend's charger is going to blow up your phone (or video game console), I think USB-C is going to fall apart.
Another sign that a lot of innovation comes from China currently... The MateBook X Pro looks great also.
Seriously, there's nothing that comes close to Windows - it just works great and the manufacturer doesn't pull the rug out from under your feet every 5 years like Apple does. If there were something better I'd switch to that too, but there's not.
I'm sure glad that my ancestors decided to move out of their caves.
Retina screens are great. Disabling the better font-rendering tech is stupid.
And doubling the resolution doesn't make subpixel AA obsolete either.
The actual reason is that switching to hardware-accelerated font compositing means that you lose out on subpixel antialiasing, because of complications related to blending. The classic way of doing subpixel text rendering means you need three distinct alpha channels, and most hardware only lets you supply just one. There are ways of hacking around this with dual-source blending, but such a feature was not supported at the time of Vista. In Windows 8 and up, DirectWrite can now support hardware-accelerated subpixel AA, and the OS does so by default.
But what was this crazy obsession with hardware-acceleration of every rendered character and its grandmother? While you're busy Flippin' 3D, I get it, you can temporarily render aliased Wingdings if that makes your life easier, but for Pete's sake, I just want to enjoy not tearing my eyes out the other 99.99999% of the time. If there was anything slow or poorly implemented in XP, it honestly wasn't the taskbar font rendering.
(Not angry at you; just annoyed at the situation they caused.)
> DirectWrite can now support hardware-accelerated subpixel AA, and the OS does so by default.
No, it (still) doesn't. This is my taskbar: https://i.imgur.com/zzIusQD.png
Market context: Microsoft yearned to implement the Metro UI they'd been researching for a long time (Zune was its public appearance) — and Vista kind of soured people on Aero by osmosis; third-party Android applications always had consistency issues, so Google wanted to mandate something that would be generic enough to work (and be easily implemented) for all apps but also look distinctly specific to Android; Apple wanted to signal some shift to upscale high fashion (parallel to "everything is black glossy glass and aluminum" in its industrial design) and understood fashion-minded upscale as a mix of modernist typography with neon bright contemporary color schemes.
To make up for the boring visual aesthetic that often resulted, they recommended adding animations throughout apps. Most of these animations aren't very helpful, but they keep the user distracted for a moment.
What? ClearType was basically introduced in Vista and refined in Win 7.
Win 8 is when they started to remove it for various applications (UWP, and some others) due to its poor performance on rotatable devices (mainly tablets).
I love my magsafe charge + 2 thunder + 2 USB + HDMI ports and can't imagine life with fewer.
The main reason is that when I have to move somewhere, it is only one thing to remove (and then add back in) compared to the 5 minute shuffle I used to do.
There have only been one or two situations in a year where it caused a slight annoyance.
I'm not a huge fan of Apple, but they've been pulling this stuff for ages, and it doesn't look like people are going anywhere.
People complaining about their Macs type posts constantly top HN. Usually the comments are filled with further moaning about how things have been so bad for so long with Apple etc., usually accompanied by threats of moving to another platform.
In particular, with OS X I never need to worry about video drivers, switching dGPU on/off, etc. just to use my laptop in typical ways.
Interestingly, it drove me to buy three maxed-out 11" MBAs before they were discontinued.
So I have not left the Apple computer platform but I won't be buying another laptop until I burn through these three - which could be as late as 2022-2023 ...
As for fonts/aliasing - I just won't upgrade past Sierra (or whatever ... I use Mavericks now ...)
Legacy is technical debt. You have to move on eventually; for many, the pain is worth doing it early.