I'm confused. Are the screenshots somehow different so that the retina and non-retina screenshots will appear different even when viewing both on non-retina (or retina) display?
It depends on your hardware. If your display's RGB subpixels are in a different spatial order from the originator's, it actually makes it worse.
Consider the 3rd and 4th images, the "on light" AA and non-AA versions… Look at line number 10, the "let arr1 =…" line. Look at the vertical stroke of the "l" of "let". In the anti-aliased version there is a red glow on the left side and a blue-green glow on the right when you zoom in. On the non-anti-aliased one there is no such glow. Now let's lay that out as pixels on a scanline…
RGBRGBR_________GBRGBRGB subpixel AA
RGBRGB_________RGBRGBRGB non-subpixel AA
So both are just a black gap in an otherwise solid line of GBRGBRG dots (I'm hiding the 'rgb' deliberately there, it doesn't exist in the real world on the LCDs where AA works)
If we turn that back into what the computer abstracts as pixels we get…
RGB RGB R__ ___ ___ _GB RGB RGB subpixel AA (see R and GB? halos)
RGB RGB ___ ___ ___ RGB RGB RGB non-subpixel AA
On 150dpi desktop displays it doesn't matter, your eyes aren't good enough. On displays which can be rotated it is a bad idea because the sub pixel AA hack only works in one orientation and that is just confusing.
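To make the diagrams concrete, here's a quick Python sketch of the same idea (mine, not any real rasterizer's code): a bright scanline with a dark gap whose edges land mid-pixel, sampled three times per pixel, then turned into pixels either by giving each sample its own colour channel (subpixel AA) or by averaging the three samples into one grey value (greyscale AA).

```python
# A minimal sketch, assumptions mine: 8 pixels of a scanline, 3 samples each.
BRIGHT, DARK = 1.0, 0.0

samples = [BRIGHT] * 24          # 8 pixels * 3 subpixel samples
for i in range(7, 16):           # a gap whose edges fall off pixel boundaries
    samples[i] = DARK

def to_pixels_subpixel(s):
    # each group of three samples drives the R, G, B elements of one pixel
    return [tuple(s[i:i + 3]) for i in range(0, len(s), 3)]

def to_pixels_greyscale(s):
    # the three samples are averaged into one value used for all channels
    return [(sum(s[i:i + 3]) / 3,) * 3 for i in range(0, len(s), 3)]

print(to_pixels_subpixel(samples))
# -> edge pixels (1, 0, 0) and (0, 1, 1): the red and blue-green "halos"
print(to_pixels_greyscale(samples))
# -> edge pixels are plain 1/3 and 2/3 grey (the diagram above shows the
#    fully-snapped case with no grey step at all)
```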
On my desktop display I see a difference in the light screens. It's a very subtle difference, but enough that it might theoretically irritate me if I had to stare at text all day. Probably not in practice, it's not a massive problem or anything; but yeah, AA looks better.
Where I notice it the most is on the curly quotes and parentheses. Without AA they look... harsh I guess? Or maybe blocky. You notice the vertical lines making up the curves more. You also lose some of the definition around letters - for example without AA the dot on the "i" is much less noticeable.
On the dark theme it doesn't look nearly as bad, I have a really hard time seeing the difference there at all. Maybe that's because they're scaled differently? Or maybe dark themes just don't need it as much?
I’m noticing this misunderstanding quite a bit in the thread. None of those images have antialiasing disabled. What’s been disabled is subpixel antialiasing. So those images are comparing subpixel antialiasing and greyscale antialiasing.
Opinions between the two are mixed:
I’m one of the people that hate greyscale antialiasing (I find it makes text look fuzzier) and prefer subpixel antialiasing (which does a much better job of preserving the intended shape of the font glyphs).
On the other side, there are people who hate subpixel antialiasing (because they can see the colour fringes on the glyphs) and prefer greyscale antialiasing. Here’s an example from 1897235235 lower down in the thread:
“This is actually better for me. I wrote to Steve Jobs a long time ago and asked him if he could switch to gray-scale font smoothing because the tricks they use with colours don't actually work with people like me who are red green colour blind. The end result was that text on apple products looked terrible to me and I couldn't use any of their products until the retina displays came out. In windows, you can use grayscale font smoothing easily.
Anyway, he replied "we don't have any plans to do this" or something like that. Turns out I won in the end.”
Now if antialiasing was actually fully disabled, the difference would be extremely obvious, to say the least.
I wonder if this is part of the reason why I can't really tell the difference in the dark mode pictures, at least without zooming in all the way. I guess the colors from subpixel antialiasing would be less noticeable when overlaid or transitioning from dark to light rather than from light to dark?
> the tricks they use with colours don't actually work with people like me who are red green colour blind.
Oh crud, this never occurred to me. It doesn't seem to matter how many times I remind myself that red/green doesn't work for everyone, I still find myself forgetting about it all the time.
I can see a difference between pairs of each screen shot. I'm pretty confident though that I won't be able to tell when I don't have these nice comparison images, or care about the difference in practice.
Yeah, I put them both at normal zoom level and realized that they're both the same, except the retina is much higher pixel density so the text looks a lot bigger when it is set at 1x zoom.
>My pet theory is that macOS is going to pull in a bunch of iOS code and iOS has never had subpixel AA.
Some Apple engineer said on reddit that it's because subpixel AA is not so useful in Retinas and HiDPI, but slows down processing and complicates pipelines.
So it's part of the move to Metal and increasing graphics performance, even if it means external lo-res monitors will suffer.
> but slows down processing and complicates pipelines.
Rendering in monochrome is also faster than true 32-bit color, but we use 32-bit color because it provides a better experience to the user who is the ultimate consumer of the graphics pipeline.
>Rendering in monochrome is also faster than true 32-bit color, but we use 32-bit color because it provides a better experience to the user who is the ultimate consumer of the graphics pipeline.
It's almost as if it's a trade-off, and monochrome so laughably doesn't cut it that it's a totally contrived counter-example. Almost.
I actually use Nocturne pretty frequently to turn my screen monochrome if I am doing stuff outside of iTerm or my text editor — I find the use of color unmotivated and distracting in most programs, and especially websites
Hear, hear for Nocturne! I use it at night, switching to monochrome red night mode, and brightness down to the last setting. Sometimes in the morning I forget and think the screen isn't working.
I just downloaded Nocturne and it's not working for me on High Sierra. Is the most recent version really from 2009? May I ask what system you're using?
Those with older external monitors might take issue with that.
I could replace my old monitors, but they still work well with good colour and brightness, so it's not exactly environmentally considerate, or even slightly necessary.
Adding colour has added very little to UX aside from true-colour icons, the UX itself is essentially the same. Colour is used as a theme on top of a UI perfectly recognisable, and much the same as, mono and 4 colour interfaces of the 80s and 90s.
Screen rotation is also a thing on desktop. My Dell has three external monitors, side by side, all rotated into portrait; use a decent window manager and you can have a 3×2 grid of windows that easily leads to Perfect Window Placement™ just by using Move Window to Top (Bottom) Half. Gnome and pals know how to handle both horizontal and vertical RGB subpixel arrangements and automatically switch configuration as you rotate the screen.
Mobiles and tablets have such high resolution monitors nowadays they probably run without subpixel AA at all…
“Monitors will suffer” is a metonym, there’s no need to correct it, it was already correct. Metonymy is common in casual speech but less so in formal writing.
Thank you for the link, it is very interesting. Having read the Wikipedia article, I think that metonym here is a bit too heavyweight for the simple thing expressed in the comment.
Edit: personally I value simple and precise language. Constructs such as those mentioned in the Wikipedia article may convey similar meaning but the fact that they exist means that they allow for some variation in meaning and color which is unnecessary in this case.
You value precise and simple language, but other people value other things like clarity and brevity. It’s a tradeoff. Making something precise can mean adding extra words which sometimes, paradoxically makes it less clear and more difficult to understand. Even in extremely formal contexts like mathematical papers, it's inappropriate to be completely precise because it gets in the way of communicating ideas. And if we strongly preferred simple language, we would use https://simple.wikipedia.org/ instead of https://en.wikipedia.org/
The original statement is clear from context, since the literal meaning is semantically impossible (monitors are incapable of suffering).
>Thank you for the link, it is very interesting. Having read the Wikipedia article, I think that metonym here is a bit too heavyweight for the simple thing expressed in the comment.
Metonym is just a linguistic term. Such terms are not constrained to describe high uses of language by great masters of writing. They are merely names for specific constructs or linguistic phenomena.
A drunken sailor swearing at someone at 3am could be using a metonym just as easily as Wallace Stevens.
> I released a build of iTerm2 last year that accidentally disabled subpixel AA and everyone flipped out. People definitely notice.
On a Retina display at least I find that the most striking difference is that most text appears to have a lighter weight. Maybe that's the difference people are perceiving, rather than color fringing or whatever?
It's simply a different setting for stem thickening; the subpixel AA modes added a lot of it. It would definitely be possible to set the same stem thickening for grayscale rendering, and on hi-res monitors it would look pretty similar.
They give a justification for it in the video linked elsewhere in the comments. They say "it works better on a wider variety of displays" in reference to the gray-scale approach. I am guessing they are probably switching to OLED across all of their products.
ex-MacOS SWE here. Subpixel antialiasing is obnoxious to implement. It requires threading physical pixel geometry up through multiple graphics layers, geometry which is screen-dependent (think multi-monitor). It multiplies your glyph caches: glyph * subpixel offset. It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing. There's tons of ways to fall off of the subpixel antialiased quality path, and there's weird graphical artifacts when switching from static to animated text, or the other way. What a pain!
Nevertheless there's no denying that subpixel-AA text looks better on 1x displays. Everyone notices when it's not working, and macOS will look worse without it (on 1x displays).
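To illustrate the cache-multiplication point, here's a hypothetical Python sketch (the names and fields are mine, not Apple's actual code) of what a glyph cache key ends up looking like once subpixel AA is in play, plus the extra inputs the compositor needs at blend time:

```python
from dataclasses import dataclass

# With plain greyscale AA a cached glyph is a single alpha mask,
# reusable on any background and any display.
@dataclass(frozen=True)
class GreyscaleGlyphKey:
    font_id: int
    glyph_id: int
    point_size: float
    subpixel_x: int          # horizontal offset, quantized (e.g. quarter pixels)

# With subpixel AA the cached bitmap also depends on the physical pixel
# geometry of the target screen, so the cache fragments per display...
@dataclass(frozen=True)
class SubpixelGlyphKey:
    font_id: int
    glyph_id: int
    point_size: float
    subpixel_x: int
    pixel_geometry: str      # "RGB", "BGR", vertical variants, per monitor

# ...and compositing a cached subpixel mask still needs the foreground and
# background colours at blend time (gamma ignored here for brevity):
def composite_subpixel(mask_rgb, fg, bg):
    return tuple(f * m + b * (1 - m) for m, f, b in zip(mask_rgb, fg, bg))
```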
>"Listen guys the code for this is a mess. Can you just buy the new Macbook?"
More like:
"Listen guys, the code for this is a mess AND slows your graphics down even when it's not needed. We already have had Retina displays for 6+ years already, and Hi-Dpi displays are the norm these days for third party too, so there's no reason for everybody to suffer from old tech, like there was no reason for everybody to have a floppy disk or CD-ROM on their laptop just because 5% had a real use for it. Part of the way it is with Apple, and has always been is that we move fast, and cut things off first. Until now, this exact tactic got us from near bankruptcy to being the #1 company on the planet. So, if you don't like it, don't let he door hit you on your way out".
> Hi-Dpi displays are the norm these days for third party
Are they? Nearly every monitor in my company's mainly mac based office is a 1080p dell. All the monitors I personally own are 1x.
I would even go so far as to say that the majority of people who want to buy a monitor for doing text based things (ie business) will buy a bog standard monitor.
It's taken a while partly because of cost and perhaps a bit more because of the horrible ways Windows and Linux deal with HiDPI. You wouldn't want heterogeneous DPIs or non-integer scales on those platforms. On Linux, heterogeneous DPI still seems very experimental and ugly. On Windows some apps are buggy and others are ugly when dealing with heterogeneous DPI. On Windows non-integer scaling does actually work, but it makes some apps size things horrifically. Needless to say, Microsoft's multiple approaches to DPI scaling have made a mess, and Linux never really had a unified way of dealing with it.
If you're on macOS, with the current price of HiDPI IPS displays, the time is right to grab just about anything. If you're on Windows or Linux, it's still a great time so long as you keep all monitors at the same DPI and probably an integer scale.
You're absolutely right, but I'd wager it's mostly because they haven't been exposed to 120Hz yet. The moment Apple introduces a 120Hz screen on their iPhones, people are going to want it everywhere. Much like HiDPI displays.
I absolutely love scrolling on 120Hz displays. It feels so much more natural when the letters aren't blurry as they move under your fingers. Indeed, the iPad Pros have the feature, but they aren't nearly as popular as iPhones. I tried on the Razer Phone, can't wait to have it on mine.
100-200 Hz displays are the next logical step for laptops and mobile phones.
Check out iPad Pro devices with 120 Hz display. It makes a big difference in readability of scrolling text (try to read a web page while it's scrolling), responsiveness and smoothness of motion.
These days even mobile phones can drive external monitors at 4k 60 Hz. I think it's reasonable to expect next gen MacBook Pros to be able to drive two 4k monitors at 120 Hz+.
And the majority of people (and businesses) won't care.
The ones that do already have "Hi-DPI terminal windows and really sharp text" as an almost-mandatory requirement.
Think about it. Sub-pixel text rendering (and AFAIK it's just text, not images w/ subpixel positioning? although that's an incredibly interesting idea for a hack!). Maybe subpixel vectors/lines?
Either you care about high-quality lines or you don't... what Apple's basically doing here is moving the cost to the hardware instead of the software, and the win is a simpler, faster, less energy-hungry device.
I'm not personally agreeing with Apple's decision, I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks (don't make me source my dates on retina displays). It's the norm for "executive" users (ie: software developers, graphics users, finance, book editors, etc.).
If you're happy w/ a bog-standard 1080p output device, then obviously you're not "in search of" the best quality text you can get.
However, thinking it through, I would _really_ like a "subpixel mode" for either the OS as a whole (kind of like FSAA?), or especially for certain windows. Maybe there's a way to do that as a hack?
>I'm not personally agreeing with Apple's decision, I'm responding to your disbelief that "hi-dpi is the norm". It's the norm for anyone who primarily uses an iPhone. It's the norm for anyone using modern MacBooks
That's Apple's market, not people holding on to 10-year-old computers. Not even those that still buy the MBA.
A 1x monitor is only acceptable for gaming in my opinion. I cannot believe people are working on $200 screens. I mean, you can also code on an 80-char terminal in theory, but why would you?
16:9 monitor is only acceptable for watching movies in my opinion. I cannot believe people are working on TV screens. I mean you can also code on a Commodore 64 in theory, but why would you?
Historically 16:10 monitors have had the same width as 16:9 ones (e.g. 1920×1200 vs 1920×1080) so there's no difference as far as having things side by side.
I think you're being sarcastic, but I honestly can't. I recently bought a new workstation with a $3k budget, and I regret going with a laptop because the budget wasn't enough for me to get an adequate laptop that will still perform well in 3-5 years and get nice monitors.
I was being sarcastic, but also not taking into account currency conversion rates (and possibly the availability of stuff in your country)
Regardless of that, if all you're doing is General Business™ then surely all you need to do is grab an all-in-one box, shove as much 3rd-party RAM in it as will fit, and then nab a couple of monitors and an unbranded mechanical keyboard off Amazon.
I did a lot of work on a £700 (~$900) Optiplex with 16GB of RAM, and that was more than capable of running Ubuntu on 3 monitors while being small enough to fit into a handbag.
> AND slows your graphics down even when it's not needed
So just let me turn it off. Or turn it off automatically when I'm scaling. Or leave it as it is because no one's ever cared about it. Removing it is the worst option of all.
> We already have had Retina displays for 6+ years already
This is 'Xbox One is online-only #DealWithIt' level sentiment. Fuck the consumer, right?
I really hope the market punishes you for this sentiment.
And P.S. Apple is strong because of the iPod and iPhone - macOS is a tiny part of their business and Apple is strong in spite of their shitting on Mac customers, not because of it.
They just sell it as a lower-tier option for developing markets and those entering the ecosystem. If they so decide they could drop support for it in a heartbeat and leave the users only running HS.
If the "update" also includes removing all the ports except USB-C, getting rid of MagSafe, and switching to an inferior keyboard design to save a few fractions of a millimeter, I don't see much to look forward to.
Seems more likely that Apple will simply discontinue the Air and that the MacBook represents their new vision of what laptops ought to be like.
Agreed. Discontinuing the Air seems much more likely. This is so aggravating to me. Air + retina display would have been the perfect laptop for years now, but they simply refused to make it. I hate many of the changes Apple has been making instead, and the new keyboard is an absolute deal-breaker. Don't know how I will ever willingly give Apple my money again...
Your general skepticism about other aspects of the coming revision of the MacBook Air may be warranted -- nevertheless a retina display would be a great improvement.
Further, if your prediction is correct, why wouldn't Apple already have canceled the Air? What you describe is already on the market: the escape-less MacBook.
I don't feel as skeptical or negative about Apple's plans for new laptops. We shall see, soon!
I do actually suspect the Air will disappear as an offer in due time.
When the Air was released, it had too many limitations, and too high a cost, to take over the macbook's role. Its primary market was people who really valued the mobility over all else. But as they iterated it closed the gap, until it overtook the macbook, and the macbook offer shrivelled up and died.
If you swap 'macbook' and 'Air' in that narrative, you see history repeating - today we're at the first step of that. "The new macbook" is too limited and too expensive to take the Air's place, but give them a few iterations, and I suspect the Air will become unnecessary in the line-up.
I still believe that a big problem MS had was their relentless commitment to backward compatibility. I agree at some point you have to dump last year's legacy, but Apple seem to want to dump last _week's_ legacy.
Promising/maintaining backwards compatibility is how you market and sustain Enterprise software contracts. For consumer products, it's a bonus if you still get updates after 2 years.
Apple still sells a whole line of non-Hi-DPI machines, so it's arguably a little early for them to kill this stuff off. They should drop that line or upgrade it to Hi-DPI, and then in 2-3 years kill off non-Hi-DPI support.
Meh. I see their point. Subpixel AA was about using software to simulate better hardware. But, at the end of the day you really do want the better hardware so that you can use your software budget to simulate an even better generation of hardware. Repeat ad-infinitum, modulo scaling limits.
Pretty sure this is what Alan Kay means when he says that people who are serious about software will make their own hardware.
The vast majority of Mac sales are machines with Retina displays, though.
A 1x monitor will be a cheap second monitor or something attached to a Mac Mini (please keep it alive Apple) or Mac Pro (one day it will come). From Apple's point of view I imagine all of these are just super small niches.
The software folks have a good reason to remove subpixel AA; it's a hack to get better horizontal spatial resolution out of cheap monitors. The users, however, have had no reason to buy the better, now reasonably-priced displays that would make subpixel AA unnecessary.
By removing subpixel AA at a time when HiDPI monitors are cheap, Apple is probably doing us a service. In a few years, hopefully everyone will be on a DPI that can render nice text. You may think it's unnecessary, but consider complex glyphs like Hanzi and Kanji, which really don't get enough mileage out of subpixel AA.
I know it's pretty irritating to suggest that forcing users to buy new hardware is doing them "a service." But really, I see no conflict of interest here. Apple themselves probably haven't shipped a 96 DPI display in half a decade, and they don't even sell their own external displays, so they don't stand to gain that much here other than software performance.
> In a few years, hopefully everyone will be on a DPI that can render nice text.
My 1x monitor can render text just fine using sub-pixel AA. It's a proven technique that has been around for ages and the text looks great on it.
> consider complex glyphs like Hanzi and Kanji, which really don't get enough mileage out of subpixel AA really.
If I start to become a heavy user of Hanzi and Kanji then I will consider purchasing a monitor that displays it better, but currently not an issue.
> Apple is probably doing us a service.
Requiring me to spend money on new hardware in order to avoid a regression in my experience is somehow doing me a favour? Really?
> so they don't stand to gain that much here other than software performance.
I don't understand the performance argument people keep bringing up. Subpixel AA has been around for ages and worked fine on much older hardware. Why is it suddenly a bottleneck that needs to be removed when my decade old hardware can still do it?
>Requiring me to spend money on new hardware in order to avoid a regression in my experience is somehow doing me a favour? Really?
No, of course not. If you phrase it like that, that is, ignore the point of removing it entirely and all of the benefits that come with it, of course it's not beneficial.
However, 96 DPI was never ideal. HiDPI won't just avoid a regression, it'll improve your experience, period. And with this new software change, you'll get a performance improvement, from better caching and easier GPU acceleration, and software engineers can delete thousands of lines of potentially buggy code made to work around the interaction of subpixel rendering with other graphics code.
Like, subpixel rendering is easy to understand in theory and that might delude one into thinking that it's just a minor annoyance, but it's an annoyance that permeates the stack. You have to special case the entire stack wherever text is concerned. Effectively, the text is being rendered at 3x the horizontal resolution of everything else. This brings in complication across the board. An example is that you pretty much have to do the alpha blending in software, and when you do it requires knowledge of the final composition. Doing alpha blending in software sounds easy... But it's only easy if you do it wrong. Dealing with gamma for example. Which makes life worse because you already have to do this for the other part of your graphics library, probably with a case for both GPU acceleration and software.
Basically, you'd have to render the whole screen at 3x horizontal resolution to remove the need for all of these compositing hacks. Nobody does this because subpixel AA was a hack from the get go.
In an era where battery life and TDP are actually important, finite resources, we want to remove these hacks.
P.S.: if you've never tried rendering large amounts of text on screen before, I highly recommend you try writing a piece of software that needs to do this. A few years ago I struggled to get 60fps rendering text for an IRC client, and of course caching the glyphs was horrendously complex thanks to subpixel AA. It would've been easy if I could've just used GPU compositing like all other graphics, but instead I resorted to hacks everywhere. Maybe today's computers don't need the hacks, but it's worth noting that yes, rendering fonts is resource intensive - you're just living on top of decades of optimizations and hacks to make it look like it's easy.
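For what it's worth, here's a rough Python sketch (my own simplification, not any shipping implementation) of the kind of software blending I mean: per-channel coverage, blended in linear light and converted back to sRGB, which is exactly why the destination pixel has to be known at composite time.

```python
# A rough sketch, assumptions mine, of gamma-aware per-channel blending.

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def blend_glyph_pixel(fg, bg, coverage):
    """fg, bg: sRGB colours in 0..1; coverage: per-channel (R, G, B) mask."""
    out = []
    for f, b, a in zip(fg, bg, coverage):
        fl, bl = srgb_to_linear(f), srgb_to_linear(b)   # blend in linear light
        out.append(linear_to_srgb(fl * a + bl * (1 - a)))
    return tuple(out)

# Black text over white, an edge pixel with different coverage per channel:
print(blend_glyph_pixel((0, 0, 0), (1, 1, 1), (0.9, 0.5, 0.1)))
```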
> No, of course not. If you phrase it like that, that is
But that's what it is, that's not just clever phrasing on my part.
> HiDPI won't just avoid a regression, it'll improve your experience, period.
I get that, but if I wanted to spend money on that improvement I would do that. The point is my choice is no longer between maintaining my current standard or upgrading, its between regressing my current standard or upgrading.
Unless you want to selflessly support some botnet you have to update your OS pretty soon after it's released. Certainly more often than you have to upgrade your screen.
I'm not sure if this was sarcastic or not, but I'd have to say pretty well. I have a 2008 iMac that's stuck on El Cap, and it still gets OS and Safari security updates.
Well, I'm not, and neither is Apple, because we are not the common case. I, too, have my Thinkpad docked all the time, connected to two external monitors that are 96 DPI. That does not mean I do not support the move past 96 DPI, I absolutely do despite the extra cost, because I do have HiDPI setups and after having one I have little interest in putting any more money into low DPI equipment.
The cheapest HiDPI monitor I would even consider buying is $2000. There are no ultrawide HiDPI monitors in existence at the moment.
Of course HiDPI is a regression in some ways. You can get far better non-retina monitors (low latency, higher refresh rate, wider range of aspect ratios) at any price point.
34” 3440x1440 is 110 dpi, which is 1x, not retina. To do retina at those sizes you’re looking at 8k. Only Dell has an 8k monitor, and it’s not ultrawide, is very expensive, and most of the Mac line-up probably can’t drive that many pixels.
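A quick sanity check on those numbers, as a trivial Python one-off (the helper is just mine; it's only diagonal-pixels divided by diagonal-inches):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    # pixels per inch = diagonal resolution / diagonal size
    return hypot(width_px, height_px) / diagonal_inches

print(ppi(3440, 1440, 34))   # ~110 ppi -> roughly 1x
print(ppi(6880, 2880, 34))   # ~219 ppi -> the 2x version, hence "you're looking at 8k"
```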
Of course, I don't have answers. I didn't say HiDPI displays that suit everyone's needs are available for cheap. However, the main problem is that everyday people have a lot of 1080p 24" monitors sitting around. There were probably fifty to a hundred at my old office. Because we didn't have any huge size requirements or uncommon aspect ratio needs, we were able to take advantage of HiDPI displays that cost on the order of $100. To me that's a price point the average person can afford. If it causes some folks to have to shell out a lot of money to stay on 16:10 or have a reasonably large monitor, that is a reasonable trade-off; most users don't need 'em, which is why it's difficult to find good 16:10 monitors in the first place, for example.
I don't see USB drives with USB-B port or headphones with jacks going away. I don't think that it'll be different with displays in the foreseeable future. macOS users are forced to buy HiDPI displays, but they are a tiny minority, and other people will continue to buy cheaper displays; it won't turn the industry. Microsoft could force everyone to upgrade by disabling anti-aliasing in a new Windows version, but I highly doubt that they would do that.
> Microsoft could force everyone to upgrade by disabling anti-aliasing in a new Windows version
Given how shoddy Win10's support for HiDPI is across applications at the moment (currently I have to choose between running my 4k laptop at full resolution and having a correctly-sized UI in all the applications I use), I seriously hope they don't do that.
Do you deal with mixed DPI, for example via external displays? I have very few issues with Windows 10 on 4K, so long as I don't mix DPIs. Maybe it's because we're using different software, though.
I see absolutely nothing wrong with USB-B or headphone jacks. I'm far from an Apple fanboy, running a Thinkpad with Linux and a Google Pixel 2 (and yes, I am pissed about not having a headphone jack.) I'm not just embracing whatever Apple says. But once I switched all of my machines to HiDPI, I saw the light. This, to me, is not a minor improvement in experience. It is a huge improvement in experience, and in my opinion essential for people who deal with languages that have a more dense script.
Japanese can be rendered acceptably with subpixel AA, but it is significantly worse, especially for complex kanji, than a HiDPI display. For Chinese where you don't have less dense character sets like kana, it's even worse.
> It’s less “the code for this is a mess”, and more “the code for this can’t be anything but a mess”.
Ok. But so what? As a customer that's not my problem, and Apple has a HUGE war chest to pay developers to maintain the mess. I have nothing but 1x displays externally and this makes Mojave a non-starter.
It's not like tech debt has no cost. This is engineering effort that can be spent in other places; it's lots of other tasks that become easier because this complication doesn't need to be supported. It's reasonable to argue that'll turn into real benefits in the software in other ways.
Personally, I think that is moving in the wrong direction. This is the reason text looks bad on transformed HTML elements in browsers, or when you click a button on Windows and it tilts slightly.
Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.
If we ever want to have a hope of computers looking like, e.g. this Microsoft fluent design mockup [1], which is clearly window-based, but the windows are tilted slightly and shaded giving it a physical appearance, we'd have to go in that direction. If we stay in the model where everything is a texture that gets composited, the only thing we can do is to use higher resolution / retina display.
Also, one thing I've never seen discussed are virtual displays inside a VR world. The VR headset has plenty of pixels, but you can't simulate a full HD screen at a few feet distance, let alone a retina screen.
> This is the reason text looks bad on transformed HTML elements in browsers, or when you click a button on Windows and it tilts slightly.
It's pretty common for regular AA to be borked in those situations, browser support for AA under transforms hasn't ever been awesome, and it's only been decent in the major browsers pretty recently.
> Text rendering should be aware of the (sub)pixels, even post-transform. While we're at it, this can apply to vector art, too.
Text & vector rendering with regular AA are currently aware of virtual sub-pixels, when they're not borked by transforms like in your examples. But making the browser rendering engine have to be aware of the order and orientation of your specific LCD hardware sub-pixels of your monitor would be very difficult for browsers to support. There is a reason subpixel AA has never worked everywhere.
When I first glanced at this article yesterday, I got the mistaken impression that Apple was turning off all AA, not the LCD specific ClearType style of subpixel AA. That would be a big deal, but that's not what's happening. The difference between regular AA and ClearType style AA does exist, but it's pretty small. I'm a graphics person, and I'm very much in favor of AA, but the outrage at losing subpixel AA feels overblown to me. It helps a little, but also has the downside of chromatic aberration. I'm honestly wondering if most of the hubbub here is people making the same mistake I did initially and thinking that they're losing all anti-aliasing?
> there's no denying that subpixel-AA text looks better on 1x displays.
I get that it's hard, but at the end of the day what users are going to notice is, when they dock their laptop at work, their display is going to look like fuzzy crap. Maybe most people won't notice? It's hard to tell without seeing a side-by-side comparison.
1) They could have had a Retina MacBook Air by now.
2) They could actually have a monitor line that is Retina+, with super great colour accuracy etc.
On the 1st point Apple doesn't seem to care, and on the 2nd, Apple should at least have a list of monitors that work best or are up to their "Standards", or have some on sale at the Apple Store.
These lapses in attention to detail across its ecosystem weren't as obvious when Steve was here. But recently they have started to show cracks everywhere.
Who is buying the MacBook Air right now anyway? The Pro is lighter, smaller, faster, and offers better performance for the money. They are very similarly priced; I am not sure why anyone would purchase the Air given the updated 13" MBP is available.
I've opted to use the MBA exclusively at work for years now. I have not found anything that beats the size and weight of it while still giving that level of power. The form factor is one of my top considerations, so a Pro isn't really something I need or want because the extra horsepower doesn't really add much for my needs.
I can dock it at my desk and hook it up to two huge displays and a bigger keyboard, and then not be burdened when traveling.
> I have not found anything that beats the size and weight of it while still giving that level of power
The current MacBook Pro is only .05 pounds heavier than the 13-inch MacBook Air, and noticeably thinner than the Air's thickest point. They are basically the same form factor at this point.
The 13" MacBook Pro with no touchbar is more or less the upgraded MacBook Air with Retina Display that everyone has been asking for for years.
I have to disagree. I bought both a 2017 air and a 2017 mbp 13” to compare, and ended up returning the MBP. The screen on the MBP is far and away better, which made it a painful choice. On the other hand, the air was superior for the keyboard, and also for noise/heat, and of course several hundred dollars in price. The MBP didn’t really feel any faster for the type of work I was doing on it.
Most annoying was that the MBP fan was kicking on in high gear consistently when I was doing nothing of note, while the air very rarely does so. It’s always cool and quiet, which I love.
If they just threw a retina screen on it, the current Air could be a hell of a computer for several more years.
I recently went from an Air to an MBP 13" (not out of choice, work requirement) and the keyboards are horrible on both. The only noteworthy thing about the MBP one to me is the reliability issues; comparing the actual typing quality is apples to oranges.
But what I have realized is that in reality it's very rare that you have to actually use a laptop keyboard. Maybe only like once a week do I not use a proper USB keyboard, e.g. when having to visit a client or something.
Mainly I've got multiple browser windows open with way too many tabs. I use a plugin to prevent them all from being actively loaded, but still have a ton active at any one time.
I also usually have one or more Excel windows open for data analysis, possibly Powerpoint or Keynote, and then some other lightweight stuff (Spotify, Sublime, etc.).
I also am most frequently using this across two 27" Thunderbolt displays (and if I'm desperate for a 3rd, I can open the MBA screen in addition).
I'm not a developer, so my needs may be less intensive than yours.
They still sell the 11-inch Air through education channels. You can buy one from, e.g., a university's store if you or someone you know is a student/staff/faculty/alum.
The non-Touch Bar MacBook Pro is £1249 vs the Air’s £949. That’s a big difference for a lot of people. People like my partner, who doesn’t care that it’s essentially 4-5 year old hardware with minor revisions.
I would never buy an Air myself because that screen is pure trash, but she genuinely doesn’t care.
Regular consumers are the majority of the market, around 80%, with nerds and professionals at about 10% each. Many of those consumers are price sensitive, and the $999 MBA is the entry-level machine Apple offers. And as of right now, it is ridiculously overpriced even by Apple standards.
I have a 2015 rMBP with an nVidia 750m and recently I switched from an Apple Cinema Display to a 4k screen. At any of the graphical modes between the full 2x retina and 1x, the whole OS gets extremely sluggish to the point of being unusable.
Some people from Jetbrains have looked into it and concluded it was the font rendering in IDEA bringing down the entire system.
Do you think this change can improve this? Because it certainly sounds like it.
Sorry for the rant, but given the price of the JetBrains Toolbox, it infuriates me that they've been giving us HDPI users the middle finger for a couple of years now.
First of all, switching to grayscale helps A LOT. Mojave does this system wide, but the JVM (or IntelliJ, I don't really remember) has its own font rendering system, so you need to change that in IntelliJ directly.
I was hurt terribly by this issue. My problems have been solved by three things:
- An IDEA update that reworks the status bar's indeterminate progress bar, which maxed my CPU just for a dumb animation
- Going from an iGPU to dGPU mac (you seem to have one, so congrats, it would be way worse with only your iGPU)
- Using an experimental JVM from this thread: https://youtrack.jetbrains.com/issue/IDEA-144261
That said, JetBrains messed up with their 2018.1 release again. I'm mostly using Android Studio, which lags behind IntelliJ's versions. 3.1 merged the stupid progress bar fix, but I'm not looking forward to when 3.2 hits stable, with 2018.1 merged in.
Bottom line is that multiple factors in JetBrains' custom OpenGL UI inefficiencies, combined with Oracle's JVM bugs/inefficiencies, take down the whole system, but it's ultimately not only a font smoothing issue.
JetBrains doesn't really care, or at least doesn't show that they care. They first said "we don't notice that", and then released some JVM patches in the comments of an issue.
Of course, Oracle also shares part of the blame: none of this would have happened had they kept Apple's JVM Quartz support.
> @Gani Taskynbekuly OH MAN! I knew I was having this issue way worse at some times than others, but I'd never made the connection to sleep mode. Since reading your comment I've just been shutting my laptop down instead of closing the lid and letting it sleep and the problem has all but gone away. I searched High Sierra sleep issue and this is a well-known issue with bug reports. This means this isn't a JetBrains issue, it's a mac issue. It is annoying to turn off graphics switching and sleep mode and have crappy battery life, but hey, I'll pay that price for a usable laptop. I owe you a drink of your choice!
Seems like a macOS issue. I heard even rumors that's not the only issue with that OS. ;-)
> I heard even rumors that's not the only issue with that OS. ;-)
Sigh. Could we please not fall into macOS trolling? I believe that it is a totally different issue than the one I'm suffering from.
I haven't had this issue with any other program. There is no excuse for IntelliJ when Xcode, VSCode, Sublime Text and even Eclipse perform 10 times better than IDEA does.
Even if I do a cold boot and open IntelliJ directly, I will still get unbearable slowness.
Sure, there might be a bug, but it doesn't affect me. If it was only a macOS bug, how would you explain that JetBrain's patched JDK works way better, and that 2018.1 is a downgrade?
My old laptop didn't even have a dGPU, so turning off graphics switching wasn't the problem. Many apps don't handle graphics switching correctly, and it's 100% their fault.
Also, do not confuse "top comment" with "latest comment".
Not so much hard as a hack. There is also no clear non-hackish way to implement sub-pixel rendering, even after 20 years of experience across at least three platforms.
Also since we no longer have consistent sub-pixel layouts, sub-pixel rendered text is no longer portable to another screen, such as you get with an image containing text on a web page.
Yes, but if it dramatically increases complexity for all the lower levels of the rendering stack, it ends up creating extra work for those OS/framework developers, requiring separate versions of the code targeting different displays which must be maintained separately with different sets of bugs in each, preventing them from making large-scale architectural changes they want to make, restricting some types of significant optimizations, etc.
Apple has a $200B war chest. If they can't afford to deal with this problem and instead choose to make my experience worse that says something about their priorities as a company.
Yeah but isn't it already a solved problem with 1x monitors? Who is implementing things multiple times for no reason? I'm glad in the future it will be easier for Apple because they just won't hire devs to do that part then, but does this imply that 1x monitors will simply look inferior no matter what is done with newer versions of OSX+?
If it's possible to realise those benefits while retaining subpixel antialiasing, it would entail a complete reimplementation.
On the other side of things, light-on-dark subpixel antialiasing has always been a little wonky, and Apple disables it in most dark UIs. Without significant changes, dark mode would've been greyscale either way.
This complexity is why we can't have windows that spread between 1x and 2x monitors since the introduction of Retina displays.
Ideally everything should be rendered (and cached) to the maximum resolution available (minimum trace widths considered) and then downsampled to the specific screens with subpixel data applied, but that's a lot of processing to do and battery life would suffer.
Eh? Before Yosemite was released in 2014, one window could span between a 1x and 2x monitor. I forget if it was always rendered at 1x, or if it was rendered at 2x and scaled for the 1x, but yeah it looked like crap on one or the other. Actually I think it switched depending on which monitor had more of the window?
But they only eliminated windows spanning across multiple monitors when they made spaces per-display.
It’s already a solved problem if you want to keep using the software you already wrote, and never do anything different. (Nobody is stopping people from running OS X 10.13 indefinitely on their 1x displays.)
If you want to rearchitect your whole text / vector graphics rendering pipeline to take proper advantage of GPUs, then you are going to be much happier with your life if you don’t have to double your work to support legacy hardware.
>It requires knowing your foreground and background colors at render time, which is an unnatural requirement when you want to do GPU-accelerated compositing
Most of the Apple hardware supports ARB_blend_func_extended or even EXT_framebuffer_fetch, which lets you implement subpixel blending with ease. Otherwise you just have to fall back to traditional alpha blending and render each glyph in three passes, one per color channel, which is not that hard either.
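For readers unfamiliar with dual-source blending: with ARB_blend_func_extended the shader emits both the text colour and a per-channel coverage mask, and the fixed-function blender mixes each channel independently. Here's a tiny Python sketch of the per-pixel math that a GL_SRC1_COLOR / GL_ONE_MINUS_SRC1_COLOR blend works out to (just the math, not actual GL code, and ignoring gamma):

```python
# dst.rgb = src0.rgb * src1.rgb + dst.rgb * (1 - src1.rgb), channel-wise.
# src0: text colour output by the shader; src1: per-channel coverage mask.
def dual_source_blend(src0, src1, dst):
    return tuple(c * m + d * (1 - m) for c, m, d in zip(src0, src1, dst))

# White text over a dark background, an edge pixel whose R/G/B coverage differs:
print(dual_source_blend((1.0, 1.0, 1.0), (0.2, 0.7, 1.0), (0.1, 0.1, 0.1)))
```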
If this was hard for Apple, imagine how much work it was for Microsoft, with their complete lack of control over display hardware. But they pulled it off.
Pulled it off, or they just don't care to streamline their code and get the best performance, but instead hold tight to all kinds of legacy crap for the sake of compatibility?
Holding tight to 'all that legacy crap' is what makes Windows the de facto standard for desktop.
Despite people trying to insist it's down to business practices from twenty years ago, the simple fact is that Microsoft dominated, and still dominates, because of backwards compatibility, a lesson that Apple has never learned (or never cared to learn) and why they will never have more than a few percent market share on the desktop.
And despite the gossip bloggers' constant nonsense, the desktop market is still very much alive and thriving.
>..but instead hold tight to all kinds of legacy crap for the sake of compatibility?
Yes, they pulled that *hit off... imagine all the combinations of components and drivers they needed to support. Way more of a challenge than a closed system like the Mac.
This change seems to be dumping on Steve Jobs' graduation speech where he talks about taking calligraphy classes and how it made him realize the importance of good font rendering.
If our LG 21x9 displays all look like crap when we dock our MacBook Pros, then it’s just one more justification for the Windows lovers (yes, they exist) to move to Surfaces. Adobe doesn’t care which we run.
If you aren't after a certain size or refresh rate, then 4K displays have gotten pretty affordable -- but if you want one with high refresh or a more exotic resolution things get quite expensive.
On my main computer I am still using a CRT, and some of the flat panels I own have pixels in funky patterns, so when enabling subpixel AA the image instead gains lots of random color fringes.
And you can't use rotated screens either. I've seen people doing that to read source code more easily, with multiple screens rotated, and rotated screens are used in some games; Ikaruga for Windows, for example, explicitly supports a rotated screen mode.
This and the document @gnachman posted has me intrigued. Are you saying that subpixel AA is not just like a multichannel mask? I would expect the amount of R/G/B foreground and background color for a given pixel to be the same regardless of what those colors actually are?
(note I haven't worked on subpixel AA for font rendering, but have for something different)
The amount of color you apply to each subpixel has a different level of intensity depending on how deeply into the pixel the shape goes, so you would need to blend with whatever is behind/in front of it. Also you can't just do something like write (128,0,0) to a pixel and leave it at that; you'll end up with a red pixel. It would need to be blended with what's already there.
I expect this would be the case with alpha compositing at least...
Afaict subpixel rendering is supersampling width with multiples of 3, then modulo via the horizontal coordinate into rgb triplets and divide by supersampling factor. I guess if kerning and position are also subpixel it might become more tricky to cache results efficiently.
> Subpixel antialiasing is obnoxious to implement.
I imagine 10 engineers that worked out antialiasing at Apple are handsomely compensated for their suffering.
I hook my mac to a cheap ultrawide. Will Apple give me 1k eur to get a new monitor with 4x pixel density so I can continue using my computer that I got from them only a few months ago or will my mac become obsolete? This is unworkable.
You sound very dramatic. You know, you could just not update the OS if it's that important to you.
As someone with a non-retina iMac I'm also affected, but I understand that non-retina displays are a niche category these days. Thinking about it, I guess the Macbook Air is the only computer Apple is selling that doesn't use a retina display.
I already thought it was a reasonable engineering decision, though, because of one other factor: the bar for what a screen should look like has gotten massively higher over the last several years.
On 1x displays, yes, subpixel antialiased text absolutely looks better... but it still looks like shit compared to any modern phone, any MacBook Pro or high-res Dell laptop from the past several years, to any 27" 5K display or 20" 4K display, etc.
Subpixel-AA used to be the way to get the best text rendering possible, for all normal computer users.
But now, it sounds like it requires maintaining some bug-prone, nightmarish architecture, just to get slightly-better-but-still-terrible text rendering. That probably isn't worth it.
It's also easier to implement non-gamma correct blending, but seeing as it looks terrible and uneven, Apple so far has not done that.
What's more, most technical people don't even understand that one, so good luck getting a non-Jobs type to see the importance.
Non-retina displays are going to stick around for years. Not everyone wants high DPI, I for one much prefer working on a giant non-Retina monitor instead, for more real estate instead of detail. I don't know what they put in the water in the Bay Area, but everyone seems to be moving towards a lowest common denominator approach.
This is a terrible decision, and if they don't fix it, it'll be just another reason to abandon the platform.
> It's also easier to implement non-gamma correct blending,
Gamma correct blending doesn’t require threading the information through multiple graphics layers, as the parent comment put it. The difficulty of a gamma correct blending implementation isn’t even remotely close to the difficulty of subpixel AA.
> so good luck getting a non-Jobs type to see the importance.
I’m not sure I understand what you mean. Mac was the first major PC platform that had configurable gamma correction and color calibration in the OS. It used to be known as the platform for designers, and more people using Mac cared about color and gamma for display and print than for any other platform.
> Not everyone wants high DPI... I prefer more real estate instead of detail
You actually don’t want high DPI even if you can have it on a large monitor? Or do you mean you prioritize monitor size over resolution?
Low DPI is not going to be a choice in the future... the near future. And once it’s gone, it’s never coming back.
> This is a terrible decision
Can you elaborate on how it affects you personally? Since you know about gamma, you do know the difference between LCD subpixel AA and regular AA, right? For some people, subpixel AA is blurrier than regular AA. For me, I see slight improvement, but it is by no means some sort of deal breaker. I’m not clear what the outrage is about, can you explain?
Subpixel-AA was not removed from macOS Mojave so the title is completely wrong anyway.
This was covered at WWDC. It uses grayscale-AA now which is much easier to hardware accelerate and gives a more consistent experience across different types of displays.
You are right. But I think parent comment understands the concepts, and was misled by the name, which to be fair, is a misleading name. The names "subpixel AA" and "grayscale AA" are both terrible names.
Grayscale AA isn't somehow gray or monochromatic, it is regular anti-aliasing, it works in color, and it works by averaging (virtual) sub-pixel samples. The name is trying to refer to how it's not making use of the physical LCD color subpixels.
Both names assume that the term "sub-pixel" is primarily referring to LCD hardware, rather than general sub-area portions of a pixel. From my (graphics) point of view, the LCD case is a specific and rare subset of what people mean by "subpixel", which is what makes these terms seem misleading and awkward.
I hope we are all somehow misunderstanding how terrible this is.
The idea that they would downgrade the display support so that non-retina monitors -- and let's be serious, that is nearly all monitors that people dock into at work or at home -- are going to look worse in Mojave, is almost too absurd to be true.
I want to see this for myself.
Does anyone know if you can simulate this now, without installing the beta? Or if you can install or somehow use the beta without nuking what you already have?
Edit: I am still on Sierra, because the buzz around High Sierra made it sound like it was a bad idea to upgrade. This is sounding more and more like Mojave is a no-go as well.
I fucking hated Lion when I got it on a new MacBook in 2012. I had been running Snow Leopard on my Hackintosh. I decided to buy a new Mac when I went overseas, and Lion removed Exposé and replaced it with that (still) terrible Mission Control garbage. I spent a few weeks trying to downgrade it to 10.6 and it was impossible.
Final Cut X was ... something else. Anyone who claims you just need to get used to it must not like having good tools. It's a terrible video editor and Apple killed off their only pro video tool.
Around that time I just ran Linux in a VM on my Mac and got into tiling window managers. I tried a bunch and liked a lot of them, but eventually settled on i3 (which I still use today). I ran OpenSUSE at work. Someone at work was throwing out an old IBM dual Xeon and that became my primary box, and I used my MacBook as a Windows gaming laptop, up until it was stolen:
but I still hate the hardware and it's more difficult to get Linux on a MacBook than any other x86 laptop I've owned, including two Dells, an HP and an MSI.
I don't want Apple to go away because we do need competition in the market, but their operating system and hardware seriously need to stop being a pile of poop.
I won't have anything bad said about FCPX. Yes, they botched the launch by releasing an unfinished, beta-quality first version. But every criticism of it was addressed within short order and it's solid as hell now.
The only reason why anyone continues to hate it today is because they're used to the terrible, horrible and stupid way that all the other NLEs work (including FCP6). It can be disconcerting to use FCPX because everything seems like a fancier version of iMovie, so it feels like it must not be for "pros" or must be missing lots of "pro" features.
But it's not. Terrible, horrible and stupid NLE design isn't a feature. FCPX is the first one to get it right, but the rusted on Premiere / Avid "professionals" refuse to see it.
You can simulate this by adding the `-webkit-font-smoothing: antialiased` CSS rule to any website when using Safari or Chrome. Lots of websites already do this though, so you can use the value `subpixel-antialiased` to revert the text to that mode instead.
To really appreciate the difference, I recommend turning on screen zoom with ctrl-scroll in the accessibility system prefs. You’ll see the effect of SPAA as colouring on the left and right edges of text.
The quality of sub-pixel AA depends A LOT on the physical parameters of the display. It's basically the OS saying "I know that pixels aren't actually square, and I'll use that knowledge to better digitize the font's curves." That worked fine when everything was CRTs or the same technology of LCD displays. However, the faults of sub-pixel AA show through pretty strongly on the variety of modern displays (OLED vs AMOLED vs. ...), and with the rotation capability of tablet displays. Apple is in the process of switching LED technologies across their products.
That said, if there's anyone in the position to exploit their knowledge of the ACTUAL physical display and its sub-pixel layout, it's Apple. I'd expect Windows or Android to drop sub-pixel AA... but iOS/macOS? I'm mystified.
Subpixel AA was never about CRTs; they do not have a consistent mapping between logical pixels and physical rgb holes in the shadow mask. It was always about LCDs
Having actually been in the graphics industry before LCDs were prevalent, we did sub-pixel rendering. It just wasn't built in as a standard feature in OS font rendering until LCDs made it necessary.
I think Windows has a feature to configure the subpixel arrangement, what about MacOS? Sure they know about the internal screen, but not the one you plug in.
Don't screens tell the OS about their physical characteristics? At least the native resolution and the physical screen size (or dpi) are available to the OS somehow. I'd guess the sub-pixel arrangement might also be.
That's EDID you're talking about and AFAIK, it doesn't support that. On my android now but if anyone wants to fiddle around with what their screen provides, on Linux there is read-edid and parse-edid to query and interpret the raw data directly. It's not terribly much since it's 256 bytes max iirc.
I do not think this is an accurate way to simulate the issue reported. This simulation will look worse than the actual issue, which is not present on 4K+ displays.
OS X has never done gamma correction properly with subpixel antialiasing. Instead, it “dilates” the shapes slightly, but this is actually wrong. I suppose half the problem is getting used to the new, correct, line weights. This is more problematic on retina displays because the dilation affected the weight far more than the gamma deviations would’ve at high resolution.
This is the software equivalent to moving exclusively to USB-C. 18 months after the 2016 MacBook Pro, the vast majority of hardware currently in use still requires a dongle, and 18 months from Mojave's release, the vast majority of external monitors in use will still look blurry without subpixel AA.
Someone at Apple really seems to believe in pain as a motivator. And it may work in the short term, but in the long term, it always drives customers to other platforms.
> Someone at Apple really seems to believe in pain as a motivator.
I think they simply believe they're a large enough player to make such calls unilaterally, and that the rest of the industry will follow. And I fear they might be right - such decisions enter into the calculations of more "commoditized" vendors like peripheral manufacturers when they design the next iterations of their products.
I mean, that's why removing the headphone jack is such a big deal even for people not using iPhones. Not just because we sympathize with iPhone owners - but because Apple is big enough to make this stupidity a new standard everywhere.
I recently switched to the Manjaro Deepin operating system. Everything looks nice and quite polished once inside the OS. I have a great pull-down terminal, I can snap windows side by side, and almost everything is configurable without digging into configuration files. It's a rolling release, so you don't get stuck on a particular version of a package until the next major release; you get the latest stable release of all software monthly or sooner using the built-in package manager.
Most of my development tools I install using the package manager: git, docker, docker-compose, virtualbox, openjdk, node, go, python (don't use pip; use pacman to install Python packages), sdkman for Kotlin, Gradle and Maven, Visual Studio Code, Postman, and DBeaver as a GUI for SQL, Cassandra, Mongo, Bigtable, Redis and Neo4j. IntelliJ and Sublime I download and install directly. WPS Office is better than Microsoft Office in my opinion. There's also the AUR repository of community-built packages, which can be enabled in the package manager settings. Everything is really stable.

There's a built-in screen recorder and screenshots with annotations, and you can map your keys to work like a Mac or a Windows computer, even on a Mac. The latest GIMP is actually really good (not yet available on macOS). There's also Krita for image editing, but it doesn't handle large PSD files well. Overall the development experience is much less frustrating, and Docker runs natively, which means everything is much faster. I've had a lot of luck running music production DAWs and VSTs and games under Wine. There's a great guide on the techonia website about how to dual boot Manjaro Deepin safely on a MacBook Pro.

The best part is I can have this environment on any computer, so I can have the latest Intel processor if I want, and the OS won't make my company-issued external monitor look like crap. Unfortunately Apple runs more like a fashion company than a technology company, making people feel compelled to buy the latest product.
Attempting to take this idea seriously, I just catalogued the apps I have open right now that I know/suspect have no good equivalent on Linux:
* Safari - might seem unnecessary, but it's a lot of the web
* Discord - I hear they have a Linux version in alpha, but...alpha
* iTerm - best there is after years, why the hell can't Linux win at this?!
* 1Password - guess I use the CLI? ugh.
* iTunes - largely used for watching movies
* Messages - This is surprisingly nice to have on the desktop
* Microsoft Outlook - guess you are forced to use the web client
* Xcode - heh
Stuff I didn't expect to see but was pleasantly surprised to see:
* Spotify
* Hipchat
* Zoom
* Skype
I could live without Discord, Messages, Safari, iTunes and Outlook if I got motivated (Outlook I have to have, but the web client is halfway decent). That leaves Xcode, iTerm and 1Password as dealbreakers. We know one of those isn't going to change!
I'm of course not including the apps I use less often but really like when I need them, like Lightroom, Photoshop, Illustrator, Keynote, Excel/Numbers, OmniGraffle, Sketch, and Things.
I think Linux is safe from me, except as the system I have next to my daily driver, and which I use for Windows (when I need to look at something there) and dual boot for Tensorflow.
Safari - either Google Chrome or Chromium (depending on how open you like things) can keep pace with Safari in terms of general usability and extension ecosystem. At worst I think you'd find the experience on par, but when I owned a Mac, I found Google Chrome to be more performant than Safari on the regular, and its Linux build is just as quick. A lot of Linux folks recommend Firefox as well but I still am not convinced it beats Chrome in terms of out-of-the-box magical "Just Works" factor.
Discord - The client may be alpha, but this works just fine honestly. I've never had a problem. This is largely because the desktop app is Electron based, so it's running on a web browser anyway; it's very, very cross platform friendly. Slack too, if that's your thing.
iTerm - There are MANY good terminal clients for Linux. I personally use Terminator, which I find has a good balance of power features and stability. I particularly enjoy its terminal broadcasting implementation, but I'm in the unusual position of working on many parallel servers in tandem during my day job, so this feature is very important to me.
iTunes - If it's just for movie watching, I've found VLC to be perfectly serviceable. MPlayer is also quite popular and there are many frontends.
Messages, Outlook - Here you've got a fair point. Outlook in particular is a pain point; there are workarounds to get it working in Thunderbird and Evolution, but they're just that - workarounds. Anything beyond basic email will need the web app; fortunately the web app isn't _terrible_, but yeah. Fair complaint.
Xcode - If your goal is to build Mac / iOS apps, there is no substitute thanks to Apple's EULA. For everything else, pick your poison; there are more code editors and IDEs on Linux than one can count, many of them excellent. Personally I'm happy with Sublime Text (paid, worth every penny) and a Terminator window, but I hear VSCode is also excellent, which is odd considering that's a Microsoft endeavor. (Now if we could just get them to port Outlook...)
> Google Chrome or Chromium (depending on how open you like things) can keep pace with Safari in terms of general usability and extension ecosystem
Google Chrome's extension ecosystem is undoubtedly far, far ahead of what Safari has. As for usability, and…
> I found Google Chrome to be more performant than Safari on the regular
People use Safari because it integrates so well with macOS, is performant, and doesn't kill resources (CPU, RAM, battery, you name it). No other browser comes close, even on other platforms. Apple's just spent so much time optimizing their browser that nobody else can match it (maybe Edge on Windows?).
> There are MANY good terminal clients for Linux.
iTerm just has everything and the kitchen sink. Like, it has some flaws, but it just does so many things that I haven't seen any other emulator do. Plus the author is really smart (he's at the top of the comments currently, if you want to check his stuff out)
> Xcode
Xcode can be surprisingly nice for C/C++ development, when it decides it wants to work.
Also peep Tilix. It's like the GTK answer to Konsole: super configurable yet accessible, with a sharp GUI. I think it's arguably better than Konsole, except it doesn't scale as well as Konsole does in Plasma (Plasma scaling is wretched).
That depends on the environment. I assume you're talking about Outlook as a frontend for an enterprise Exchange setup?
If your mail server admin configures IMAP and SMTP correctly, it's a breeze to get it set up (you will need SSL though). Use "DOMAIN\user.name" as username together with your AD password (and for team/shared mailboxes, use DOMAIN\user.name\mailbox.name, where mailbox.name is the cn or sAMAccountName attribute of the mailbox AD entry). Thunderbird can handle everyday emailing as well as responding to calendar events that way; I'm not sure about integrating "real" calendar stuff. Auto-completion of mail addresses must be done separately in Thunderbird, you'll need the AD structure information (root DN, plus the DN of your AD account so you can log in).
What you'll miss is forwarding rules access, but that can be done via webmail if the need arises.
Their Safari comment implies the need isn't due to a preference for Chrome, but rather for front-end testing. So yes, Chrome might make a fine daily driver, but those of us who have to occasionally code for the web need access to all of these browsers.
The idea of swapping away from macOS seems more enticing by the day, given the development directions of macOS (this is just one of the many nails; the main one is that Apple is likely going to merge macOS with iOS [or the other way around]).
Discord works perfectly fine in a web browser. IIRC the platform-specific clients are just running a web browser themselves, so I'd rather use the PWA.
As for iTerm, you don't need that when you use i3 but perhaps you can specify the features you need.
1Password, I recommend Bitwarden. Cheaper, open source, you can even run an open source backend.
Messages, no idea why you want that. I only use WhatsApp for IM, and even that is just a web app.
For development I recommend Sublime Text. You can use it for free, though I did buy it (as someone else commented; worth every penny).
Skype for Linux has actually fallen behind the other ports. Though I don't use Skype, I've heard complaints about that.
For passwords I use Enpass. It stores and encrypts passwords in your own personal cloud storage account, e.g. Dropbox or Google Drive. The desktop version is free; the mobile version has a one-off fee to buy the app.
For office I use WPS Office (Word/Excel/PowerPoint equivalents). It has tabs for documents and I have never had document compatibility issues (like I have had with LibreOffice). I believe Office 2013 runs under Wine too if you really need Office.
I prefer the web clients for email anyway.
I use wavebox to integrate web based chat and email with the operating system for notifications.
Safari uses WebKit, so it's very similar to the engine Chrome uses - though Google forked it (into Blink) at some point.
Deepin Terminal is an amazing terminal.
Can't help you with Xcode unless you switch programming languages :)
It would be great if Adobe released Linux versions of their products.
I don't do much video editing but I think there are good options on Linux.
Try it as a challenge - if you use all the above apps, I think you might be pleasantly surprised by how far Linux has come. Most of these apps work on Mac as well, so you may find a few good new tools.
I recently switched my primary machine to Linux and was pleasantly surprised to learn that 1Password has a web version (which does require a subscription) and the 1Password X extension for Chrome and Firefox to go with that.
I made the switch to Linux (Netrunner Rolling with KDE) in the last year and have not looked back. It is definitely not for everyone as some things just have no good replacements, but it suits all my needs.
I do a ton of things in the browser, love the availability of a good Terminal at my fingertips (same as macOS), develop webapps and love the simple availability of a full local web server stack to develop on and so on. I use LibreOffice for my very few "Office" needs. I use Firefox as my primary browser (moved away from Chrome to Firefox before I made the switch to Linux). I use Visual Studio Code as my primary editor. I use Krita for my very few photo editing needs - simple stuff really like cropping or color picking. That's about it. Everything else happens in the Browser or on the command line.
As Netrunner is a derivative of Manjaro and thus Arch it's very simple to access a plethora of packages from their AUR system. It's like Brew on steroids.
Actually Apple is definitely big enough to force change—they've done it multiple times in the past. Apple helped push the original USB standard into common use with the original iMac. Obviously it would have happened eventually, but the iMac made the transition occur much faster than it would have otherwise.
The problem with the USB-C standard is that USB-A now extends far beyond the personal computer industry. It's now a power-adapter standard for mobile phones and all sorts of gizmos, gadgets and accessories for the home and car. USB-A isn't going anywhere for a long time.
> Apple helped push the original USB standard into common use with the original iMac.
I don't think that's true. USB became ubiquitous because literally every PC included it after 1998. Yes, it might have helped that USB also worked with Macs, unlike previous connectors.
The USB port did start appearing on some PCs before the iMac—largely because Intel put it in its chipsets—but it was practically abandonware with precious few peripherals prior to the iMac. When the flood of peripherals did arrive, many of them were styled with translucent cables and casings to match the iMac, even when sold for Windows PCs.
> Actually Apple is definitely big enough to force change—they've done it multiple times in the past.
In many cases it's arguable whether Apple forced a change or it was a case of Apple "skating where the puck is going" and doing something that was inevitably going to happen given time.
I conceded as much in the paragraph you quoted from. What Apple did is force the market to move faster than it would have otherwise.[0]
Similarly, if Tesla never existed the electric car would still have been an inevitability. Perhaps it would have taken another decade to emerge. Perhaps two decades. But it would have eventually. On electric propulsion, Tesla is most definitely just "skating where the puck is going."
In both cases—Tesla and Apple—they're not just skating where the puck is going, they're also in control of the puck, even if only a little bit.
[0] And thinking about it further, I shouldn't have conceded the assumption that USB's dominance was inevitable. It seems that way in hindsight but was it really? It could have fizzled like Firewire. It is entirely possible that new incremental standards—perhaps an ultra-fast nine pin serial port protocol—could have filled the gap before another connector emerged to dominate.
I'd argue it is both. Apple isn't going to convince an industry to move to a closed standard, however they may accelerate the move to new open standards. For instance, moving away from floppy disks to compact disk media, and moving from serial/parallel/PS2 ports to USB.
Often these technologies are stuck until there is a first big adopter. Now that Google, Apple's Macs, and most of Android are behind USB-C, for instance, there will be a lot more accessories that use it. Right now IMHO it is being held back mostly by cost - micro-USB and an A to micro-B cable are still the cheapest parts.
If you want to use USB 3.1 Gen 2 speeds, you pretty much have to use Type-C. The only other alternative is a full USB Type-A connector or the awkward Micro-B connector (which is rare).
I'm not sure. As more devices and peripherals have gained USB-C, lots of horror stories of chargers frying devices and other catastrophic failures have shown up. This isn't my wheelhouse, but just from the reading I've done, it sounds like USB-C is a difficult-to-implement standard and that peripheral makers are more interested in getting something out the door quickly and cheaply. With the amount of power that USB-C can drive, this is risky.
If the end result is that you can't tell if plugging into your friend's charger is going to blow up your phone (or video game console), I think USB-C is going to fall apart.
Cheap chargers and cables that aren't compliant are not unique to USB-C, and they're why we have things like UL. There are a lot of people using quite frankly dangerous USB-A chargers for their cell phones because they found them obnoxiously cheap.
With macOS, I believe, they still know they're the best. As much as I'd want to move away from Apple because of their recent maneuvers, there's nothing that comes close to macOS - it just works great. Other Unix-like operating systems will see more appreciation from users once they catch up on their GUIs. Once a BSD or Linux (I'd really want it to be a BSD!) gets a great UI, I'm switching.
GNOME is great, I personally like it as much as macOS. Try running it on Fedora. The latest Ubuntu also ships GNOME by default but I haven't tried it and I'm not sure if it's a tweaked version (if it's tweaked it's surely for the worse).
I don't see how they could "know they're the best" when Windows is the best. If you want the CLI tools from 1979 that come with your Mac, you can easily add them to Windows, the best and most widely used desktop operating system on the planet.
Seriously, there's nothing that comes close to Windows - it just works great and the manufacturer doesn't pull the rug out from under your feet every 5 years like Apple does. If there were something better I'd switch to that too, but there's not.
And even worse than outdated CLI -- the keyboard is a tool from like two hundred years ago (barely changed from movable type tech). Can't believe all these computers still use such old technology. Just because it works is a terrible reason for not updating it.
Hmmm, well I'd say that beyond a very superficial comparison with "keyboards" from 200 years ago (if there even were such a thing), modern keyboards have changed a lot and for good reason.
I'm sure glad that my ancestors decided to move out of their caves.
Microsoft has been having similar font rendering issues for years, so I guess Apple felt left out of this club. They spent all this awesome work & research creating ClearType and then when Vista(?) happened they suddenly decided screw everything, let's switch back to grayscale antialiasing for no reason... now everything on the taskbar (or seemingly anything DWM-rendered) is grayscale-smoothed, which makes me want to tear my eyes out.
> let's switch back to grayscale antialiasing for no reason...
The actual reason is that switching to hardware-accelerated font compositing means that you lose out on subpixel antialiasing, because of complications related to blending. The classic way of doing subpixel text rendering means you need three distinct alpha channels, and most hardware only lets you supply just one. There are ways of hacking around this with dual-source blending, but such a feature was not supported at the time of Vista. In Windows 8 and up, DirectWrite can now support hardware-accelerated subpixel AA, and the OS does so by default.
> The actual reason is that switching to hardware-accelerated font compositing means that you lose out on subpixel antialiasing
But what was this crazy obsession with hardware-acceleration of every rendered character and its grandmother? While you're busy Flippin' 3D I get it, you can temporarily render aliased Wingdings if that makes your life easier, but at least for Pete's sake I just want to enjoy not tearing my eyes out the other 99.99999% of the time. If there was anything slow or poorly implemented in XP it honestly wasn't the taskbar font rendering.
(Not angry at you; just annoyed at the situation they caused.)
> DirectWrite can now support hardware-accelerated subpixel AA, and the OS does so by default.
I can't substantiate my comment with academic sources, but here's my memory from the past decade: the people designing interface guidelines for mobile operating systems decided that carefully designing static visual assets for UIs was a waste of time (with similar technical-complexity reasoning, given the number of different display resolutions, pixels per inch and aspect ratios).
Market context: Microsoft yearned to implement the Metro UI they'd been researching for a long time (Zune was its public appearance) — and Vista kind of soured people on Aero by osmosis; third-party Android applications always had consistency issues, so Google wanted to mandate something that would be generic enough to work (and be easily implemented) for all apps but also look distinctly specific to Android; Apple wanted to signal some shift to upscale high fashion (parallel to "everything is black glossy glass and aluminum" in its industrial design) and understood fashion-minded upscale as a mix of modernist typography with neon bright contemporary color schemes.
To make up for the boring visual aesthetic that often resulted, they recommended adding animations throughout apps. Most of these animations aren't very helpful, but they keep the user distracted for a moment.
I chose my words carefully: font compositing, not rendering. Font rendering still happens in software, but in order to composite it correctly into the surface requires DSB. I (hopefully) shouldn't have to explain the advantages of using hardware acceleration for app rendering in general.
I'm aware of Slug (and other GPU font rendering techniques). Slug, however, doesn't have subpixel AA (it doesn't make sense for it to, since its main goal is fitting in a 3D scene), and requires a moderately expensive (read: hard to do realtime) up-front bake process which converts the font into a pair of textures.
> then when Vista(?) happened they suddenly decided screw everything
What? ClearType shipped back in XP, was turned on by default in Vista, and was refined in Win 7.
Win 8 is when they started to remove it for various applications (UWP and some others) due to its poor performance on rotatable devices (mainly tablets).
I am actually quite happy with USB-C: charging has become easier since I have ports on both sides of my MacBook. Dongles are necessary; however, they have been necessary for as long as I can remember with Macs. Of course, a native HDMI port would still be great …
MacBook Pro user of 8 years here. Aside from Ethernet/DVI at an office, not once have I used a dongle, and tbh those two dongles aren't all that bad because I would just leave them at my desk.
I love my MagSafe charger + 2 Thunderbolt + 2 USB + HDMI ports and can't imagine life with fewer.
There's one MAJOR difference though - you'd have to deliberately buy a USB-C-only Mac to enter the dongle world. This change will be pushed to everyone with a software update, and we'll be constantly nagged to update with popups if we don't.
I use a USB-C only macbook pro and I absolutely love it, I would not want to go back to a laptop with a lot of IO.
The main reason is that when I have to move somewhere, it is only one thing to remove (and then add back in) compared to the 5 minute shuffle I used to do.
There have only been one or two situations in a year where it caused a slight annoyance.
That is nonsensical, nobody is putting a gun to your head to use all the ports. Having them available is definitely better than not. And if you prefer dongles, keep using them.
I have 4 USB-C ports on my MacBook. Of those, one is used for a USB-C to DisplayPort cable to connect my 4K monitor and another is used for a dongle, so I don't know what this guy is talking about. To be honest, I would love to have one regular USB port so I don't have to carry the adapter or dongles everywhere.
Or any GNU/Linux distribution, which actually offers a superior developer experience in my opinion.
People complaining about their Macs type posts constantly top HN. Usually the comments are filled with further moaning about how things have been so bad for so long with Apple etc., usually accompanied by threats of moving to another platform.
"Someone at Apple really seems to believe in pain as a motivator. And it may work in the short term, but in the long term, it always drives customers to other platforms."
Interestingly, it drove me to buy three maxed-out 11" MBAs before they were discontinued.
So I have not left the Apple computer platform but I won't be buying another laptop until I burn through these three - which could be as late as 2022-2023 ...
As for fonts/aliasing - I just won't upgrade past Sierra (or whatever ... I use Mavericks now ...)
One guy's Dell monitor doesn't - subjectively - render input from a beta OS as nicely as it did the non-beta, and suddenly everybody is bringing out the pitchforks.
If you read through the comments, you'll find this link buried, which shows a developer showcasing the difference in the new AA. Basically font-smoothing is grayscale-only in Mojave.
If I take a screenshot of some text and zoom in, I can see the colours, just like in the video. If I zoom back to normal size and change the saturation to zero (only greyscale) I cannot see a change in sharpness of the text on my 2009 macbook pro.
>Basically font-smoothing is grayscale-only in Mojave.
This is actually better for me. I wrote to Steve Jobs a long time ago and asked him if he could switch to gray-scale font smoothing because the tricks they use with colours don't actually work with people like me who are red green colour blind. The end result was that text on apple products looked terrible to me and I couldn't use any of their products until the retina displays came out. In windows, you can use grayscale font smoothing easily.
Anyway, he replied "we don't have any plans to do this" or something like that. Turns out I won in the end.
Very strange, this has been configurable in System Preferences > General (currently "Use LCD font smoothing when available", previously there were more choices) for ages.
Yes I remember back in Leopard it wasn't just a checkbox, but a drop down menu allowing you turn it off completely or choose "Light", "Medium" or "Strong." I've always found the Strong setting the best for me and have kept it ever since. (There's a `defaults write` command to set this value still, but the effect is a lot less noticeable on Retina.)
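For reference, here's a sketch of the defaults command I mean, assuming the key is still `AppleFontSmoothing` (0 turns smoothing off; 1-3 roughly map to the old Light/Medium/Strong):

```sh
# Set the font smoothing strength globally; log out and back in
# (or relaunch apps) for it to take effect.
defaults write -g AppleFontSmoothing -int 3

# Remove the override and fall back to the system default.
defaults delete -g AppleFontSmoothing
```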
For most people who don't like it, it makes the text look like you are watching one of those 3d movies without the glasses. It looks all blurry with red and blue fringing.
Subpixel AA only works when you know exactly what the output pixel layout is, when you aren't being scaled by some other technology and when you aren't showing the same thing on disparate displays, and it requires you to adapt whenever the screen rotates, which is common on tablet-style devices. Also, some people (like me) can see the discoloration it imbues. So I'll be glad to see it go.
(Attempts to disable it tended to get various results in various operating systems, but rarely consistently disabled it)
Really? Somehow that 'only' has included every LCD monitor I've used for more than a decade. Windows AA is more aggressive, but it is still a huge improvement over the jagged over-hinted font rendering you get without ClearType.
No AA makes sense for DPI greater than about 300, but for anything less, with visible pixels, which is probably most non-4K monitors, I would bet most people prefer AA on. In general, people always prefer AA for 3D and all sorts of other rendering, at lower DPI.
So, this is a typical Apple decision to abandon anyone who doesn't have the latest device from them.
Personally I prefer to disable antialiasing and, if possible, use bitmap fonts designed for clarity. Sadly almost nothing properly supports bitmap fonts these days (assuming it supports them in the first place - I think only GDI and X11 core do, but those are increasingly ignored by programs), and even when you can find support, finding good bitmap fonts is hard because it looks like most bitmap fonts are converted from outline fonts - but with jaggies.
As for what I consider good bitmap fonts: on Windows, MS Sans (the one you could find in Win9x; it was removed in later versions of Windows and aliased to MS Sans Serif, which is an outline font), plain Courier and Fixedsys. On Linux I like the console font that you get when booting Debian (not sure what it is called) as well as the Terminus fonts. On X11 I like the helv* fonts (usually registered as adobe helvetica - honestly, why is font registration on Xorg such a clusterfuck?) and some of the Fixed fonts (the smaller ones mainly; the larger are a bit too jaggy). On Mac... pretty much all of the original Macintosh fonts. I also like the "WarpSans" font that OS/2 (and eComStation) has; it is very compact and I like compact fonts (although I'm not a fan of a few details, like the w being a bit too jaggy).
> but it is still a huge improvement over the jagged over-hinted font rendering you get without ClearType.
There is no font smoothing at all (the jagged fonts), and then there is grayscale font smoothing, and then there is coloured font smoothing. Apple is switching from coloured to gray-scale, which like OP, is great for me personally as I can see all the colours in the other approach.
They did, but it was inconsistent [1]. Microsoft had the same problem [2]. In both cases, poorly-documented APIs meant many applications still used subpixel antialiasing.
PDFs look very blurry and dirty since High Sierra (Mojave too), in Preview.app and in Safari. You can compare by opening a PDF in Chrome (it looks sharp in Chrome).
https://imgur.com/a/TRpk1Oi
Is the first image for real? Like, that's not a temporary "hold on a second I'm working here" rendering that's replaced after a split second with a proper version?
So are you saying the High Sierra PDF rendering catastrophe is not a bug but a precursor of this „feature“?
If that is true and the blurred text is going to be the new normal, I do not see how any professional user looking at type all day long can stay on their platform.
Every time I open a PDF I think about installing El Capitan again. The one thing that stops me is picture-in-picture mode; I use it heavily on YouTube and it is only in Sierra and later.
Scaling is horrible on MacOS since it basically bumps up the 1x resolution to the next integer multiplier and then does a per-frame raster resize leading to performance drops, blurry edges and other render artifacts.
Meaning that your only reasonable option was to run a 1x monitor without scaling. Now they killed that option too.
Basically what they did, if your primary work machine was a MacBook hooked to a 27" display — is tell you "Buy an iMac and use Continuity".
I mean, I'm all up for removing subpixel antialiasing at the point where everyone is running a 220+ppi monitor. However, we're not at that point yet.
This is a horrible move that may cause me to drop the Mac as a development platform, since I'm looking at text for 8+ hours a day, and I'm not planning on shelling out 3k€ for an iMac or 2k€+ if they get back into the display business, just to fix something that they broke.
I use a display in the 'red zone' described in the article (a Dell P2415) daily paired to a retina MBP; there are no issues coming from not being an even multiple, it's the sharpest monitor I've ever had.
I recently worked on a 4k 24" monitor (~180ppi) on a 2017 MBP.
The text was either too small when rendered at 1x, or the framerate dropped to a crawl when using a scaling option. Everything was rendering at under 60 fps.
Plus I could clearly see the rendering artifacts when scrolling, mentioned in the article.
Enlarge the page using browser zoom. Subpixel AA will make the text look red on one edge and cyan on the other. Without it, the text is perfectly greyscale. This is visible even with extensive bilinear filtering.
You most certainly can. The effect is applied when shading the text, so it does appear in screenshots, as would any other in-software AA technique (AFAIK).
You can test this yourself in the browser. For instance, toggle subpixel AA using the CSS -webkit-font-smoothing property in Safari. There is a clearly visible difference that you can screenshot. On some websites with a light-on-dark theme, disabling subpixel AA with CSS was an important part of improving visual consistency between browsers. It's also important to control it explicitly when working with animations that change text scale or rotation.
It’s easier to see with an enlarged screenshot, but you can totally see the difference at 1x.
I don't know what you can toggle with CSS but I'm pretty sure you can't replicate 'thing rendering a font with knowledge of your physical display' with 'bitmap made of pixels and some CSS'.
The “knowledge” is applied when the text is rendered to the bitmap, not when the bitmap is pushed to the display. You’d be right if the effect happened quite late in the rendering process. It doesn’t — it happens really early.
Screenshots aren't very useful for meets-the-eye visual quality comparisons of subpixel AA unless your display has the same pixel arrangement and resolution as the display the screenshot was rendered on. Even then, there's probably other issues.
I agree that they work for technical comparisons of rendered output.
Maybe I'm confused about something then but you seem to be telling me the subpixel AA is positionally independent. I can take the subpixel AA-ed bitmap and plop it in a page, open it on some other browser and OS and be certain everything is the same down to each coloured subpixel. You'd think every, say, vertical line would have the same colour artifacting, if that were the case. But they don't.
They do? They're just coloured pixels. When you open a bitmap image on different PCs and screens and display it at 1x wrt the monitor's pixels, you would expect every pixel to have the same colour across every PC/screen, no?
The pixels will be the same on every PC/screen. The perceived sharpness/blurriness won't be–the whole point of the "color fringing" is that the subpixels of any particular display have an arrangement that can be exploited to add additional sharpness: https://upload.wikimedia.org/wikipedia/commons/0/0f/Antialia...
If the display you're viewing the image on has a different subpixel layout than the one that the image was rendered for, it won't look sharper.
Okay, this might be an unpopular opinion but I've always actually preferred plain AA (grayscale), not subpixel AA. That's especially on a low-res screen. I discovered this preference back in the flip phones days when a certain J2ME reading app offered a choice between grayscale AA and subpixel AA. I immediately noticed that for subpixel AA, there is a lot of red and blue even when the text is black on white. Those colors just make me nauseated. Grayscale AA makes the text a bit more blurry, but I can actually read that text for hours without feeling nauseated.
When I was using Ubuntu, I was offered four choices for font rendering. The subpixel rendering choice also looked terrible to me.
On macOS Safari (maybe Chrome too) exposes a CSS property called -webkit-font-smoothing. Many websites set this to antialiased, including many built by me, because I truly think this antialias mode looks the best.
In addition to the MacBook Air and normal non-4K external displays, it seems that this will also affect projectors. Presenting from macOS Mojave will therefore just make your slides/Keynote deck look worse and less professional. Moreover, the projector is not something you can upgrade; it is usually preinstalled in whatever venue you give your presentation in, and totally outside your control.
Many projectors don't use RGB striped subpixel grids anyway. They use color wheels or multiple chips to achieve full coverage, and in that case you want grayscale antialiasing.
And on top of usually not using RGB stripes, they also often use software keystone correction, which means that you don't know the native pixel grid anyway.
So there's no disadvantage for projectors, not at all.
Apple said during WWDC that the change was made to support a wider variety of display technologies and scaling modes. It’s an odd choice considering they still sell the non-retina MacBook Air.
Forcing app developers to live with the same lowest-common-denominator aesthetics that most users see, will make them more likely to raise the bar for the aesthetics on lowest-common-denominator devices.
It's the same reason that video game developers shouldn't be allowed to have powerful GPUs. They need to be testing how the game looks (and works) for most people, not for a group of people only slightly larger than themselves.
I don't think most game shops do. Their devs are typically comparing rendering constantly on mid-range Nvidia and AMD cards, as well as Intel integrated graphics. Indie titles have to do so because they know their games go for $9~$15 and need to look consistent, or at least decent, on a lot of off-the-shelf laptops. Triple-A titles can afford huge testing teams that take on various Intel/AMD/Nvidia cards and report consistency issues.
Sure, if the title has a huge AMD or Nvidia logo when it starts, it's going to use a lot of specific functionality from that card - whatever those shops pay them specifically to advertise. But devs need to ensure titles at least look okay and run at at least 60fps to reach the largest sales audience.
Do you think most developers are using a non-retina screen? Compared to the general population I bet they skew very heavily towards newer, retina machines. Which means they'll barely notice anything. This is just plain lowering the bar, not raising the average.
I'd say most developers are docking with an external screen, and there are very few external screens that are retina. So yeah, I'd say most developers are using a non-retina screen.
4k screens mostly run at 60hz, making them terrible for anything but watching movies. I don't like coding on a screen where I can see the refresh as I scroll.
I get my devs whatever they want for hardware, within reason. $500 or $1000 monitors? No questions asked. The price of hardware that will last years is nothing compared to the value of engineer happiness and productivity, especially when compared to how costly good engineers are in the USA and Canada.
So get your own monitor. To be sure, I understand that many employees make the decision that they will not pay for equipment themselves. However, it's still a decision.
Hmm, I can't reply to Symbiote for some reason. But yes, 2560x1440 screens and other wide variants are really preferred by gamers or people who want high refresh rates. 4K is limited to 60Hz, unless you use DisplayPort 1.4 and some very new monitors.
There are quite a few users who prefer the wider format screens for a variety of reasons.
That's a silly question. More stuff fits in more pixels. Even if you have to compensate for readability by scaling fonts larger to make text the same size as it was on the lower dpi screen, the end result isn't actually equivalent to the lower dpi. It still works out that the higher dpi screen is on average more functional due to more things fitting on screen. Especially on dual/triple 27" external monitors on my desk. They are large and I am close to them, and a mere 1440 vertical pixels would be tolerable but crap when better is available. I usually have 4 to 10 terminal windows open where I want to see as much code and documentation and log output as possible, not a movie.
Because "retina" can imply pixel doubling, making it more like an extra-crisp 1920x1080. The typical 1440p screen is nice and big, and replacing it with such a setup at the same physical size would needlessly cut down on what you can fit. If you could trust non-integer scaling to work properly then 4K would resoundingly beat 1440, but that's a pipe dream right now.
You can get the best of both with a 5K-6K screen, but that's currently a very expensive price point not on most people's radar.
These are cheap 27" screens though, which are not ideal for macOS, which is optimized for running at either 100% or 200% zoom. It's a shame that 4K screens at 24" and 5K screens at 27" are so hard to find. Apple uses these panels in their iMac line, but refuses to sell them as standalone displays.
One thing that I noticed when I had to switch from a Linux desktop to an MBP is how much worse text anti-aliasing is. The crispness just isn't there, especially at smaller font sizes.
Certainly the MBP has a high-res screen and can live without it, but it's highly reflective, and anti-glare external monitors provided by most employers are mere FHD, and most fonts become unpleasant and hard to read without anti-aliasing on them.
Maybe anti-aliasing was just hard to implement nicely in the OS X graphics stack, they struggled with it, and decided to remove it.
I suspect that most of Apple's "proper computer" sales are laptops where they put a retina display anyway, and if you shell out $2k for a laptop, you can afford to spend $300 for a 4K external display.
In other words, they (correctly) decided that theirs is the premium segment, and that segment can afford to switch to retina-only hardware.
>I suspect that most of Apple's "proper computer" sales are laptops where they put a retina display anyway, and if you shell out $2k for a laptop, you can afford to spend $300 for a 4K external display.
My old Windows computer is dying and I am thinking about getting a new Mac. It seems the choice is a $2200 iMac, or a $2700 MBP + a $1000 4K 21.5-inch monitor.
LG is one of the very few companies that sells 4K 21.5-inch and 5K 27-inch monitors. Even 4K 24-inch monitors, which I am guessing you are referring to, are quite rare. They are also not really approved by Apple. I mean, a 4K 24-inch will make everything look too big, I assume, and 4K at 27 inches isn't retina quality.
So what exactly are we supposed to do? There are no external monitors to buy. Apple doesn't sell any, and the 3rd party solutions are few and far between. Maybe they want everyone to buy an imac and a mbp...
Fortunately, there are quite a few Windows-oriented machines with very decent specs now; some of them have an explicit Linux option, too.
21-22" 4k monitors are indeed pretty rare; 23-24" 4k are more widespread. I would gladly buy a 15-17" 3k / 4k display, like another laptop screen, but they seem to be absent from the market for some reason, despite the wealth of ready-made LCD panels for them.
The reason being that it tends to want to push pre-rendered bitmaps around, and sub-pixel anti-aliased pre-rendered bitmaps don't work well under rotation...
Signed up for HN (again) just to vent about this terrible news. This is a change that will be hard for me to live with, and I will probably end up selling all my Apple gear.
This is the same change/downgrade that occurs if you go to System Preferences-->General-->Use LCD Font Smoothing When Available.
I did this and the fonts on my 5K iMac display looked horrible. Just atrocious. My plan is to not upgrade to Mojave, and then within a year or so sell all my Apple gear and move back to Linux.
I don't understand Apple's thinking, but I believe a lot of people will do the same.
No they won’t. People don’t notice this stuff. I pointed out the weird font smoothing issue in Finder to my best mate – a professional photographer – and even then she barely had any idea what I was on about.
It does affect Retina devices. That's the whole point. I made the same change that Apple is going to make and the fonts looked far worse (almost unreadable). You can observe the same if you have an Apple device by going to: System Preferences-->General-->Use LCD Font Smoothing When Available.
Apple is removing sub-pixel anti-aliasing for all devices, not just non-Retina ones.
Maybe this person's eyesight is better or worse than yours, or perhaps just different. You might be surprised what some people are bothered by, or trained to spot, that others aren't...
Thanks, that's much better. I think it shows that Mojave renders the fonts clearly better: check out for instance 'll' in 'ullamco'. Generally, almost identical pictures though.
I've been on the beta for a couple days (using a 2016 12" MacBook) and I haven't noticed anything. There's a visible difference when I disable LCD font smoothing in settings, but not much changes besides fonts appearing somewhat thinner.
I installed Mojave beta on my MacBook Pro. It is the same thing. The fonts look identical to toggling off the setting "Use LCD font smoothing when available" in High Sierra. And by the same thing, I mean they look very bad in Mojave.
Mojave beta installed, and the fonts look equally atrocious. They are blurry, indistinct...and just bad. One of the reasons I bought a 5K iMac (three of them, actually) was to have great fonts. I am beyond angry that I will now have to sell them.
Can you post some screenshots? I find it hard to believe that fonts on a retina display can be blurry and indistinct when the antialiasing text weight is almost the same.
Those screenshots were both in Waterfox. I also tried in Safari (didn't take any shots, though) and the fonts looked identically bad in that browser as well.
The normal UI's fonts are also worse -- about the same as the browser screenshots. Fonts are indistinct and blurry, and very light. All the result of greyscale aliasing only, I presume.
Either Apple disabled "LCD font smoothing" for retina displays, or there is a bug somewhere, because in Mojave on a non-retina screen "LCD font smoothing" has the same weight as before, and on your screen it clearly doesn't.
I hope it is a bug. If the weight were the same, it'd at least perhaps be tolerable on a Retina screen. I know the technical reasons, but I just don't understand why Apple can't leave it alone for desktop systems.
They don't remove font smoothing; they switch it from subpixel to grayscale only. Technically subpixel should produce better results than grayscale, but IMO the subpixel antialiasing in macOS looks horrible, so you might not even see the difference - it looks exactly as horrible as before.
I'd be more outraged if Windows removed ClearType, because their subpixel font antialiasing actually works and is configurable.
Greyscale anti-aliasing is far inferior to sub-pixel. The fonts look much worse.
BTW, I did install Mojave beta and the fonts look just terrible even on a Retina display. Shockingly bad. I can't believe Apple is doing this. I just bought a new 5K iMac a few months ago. I wish it were still in the return period...but in other news, I am now selling all of my Apple equipment as it's pointless to have it. Without the great fonts I purchased it for, it's so much junk to me.
Subpixel font rendering was a cool hack back in the day, but it’s no match for high dpi monitors. Personally I could never get past the color fringing artefacts. Bill Hill himself talked about it as an interim measure to get people reading on screen.
I just installed the Mojave beta on my Macbook Pro. The fonts are just as blurry as if I toggle off "Use LCD font smoothing when available" in High Sierra.
So the contention that it won't be the same in Mojave is disproven. Even on Retina displays, disabling sub-pixel anti-aliasing leads to blurry, indistinct fonts.
I've been using Linux for 2 decades, and this year I moved to macOS, with the newest MacBook and a 5K iMac. Everything pretty much just works, and it is much prettier by default than I ever got my Linux installations to be.
Given my experiences, I have full trust that the MacOS engineers will do good stuff with Mojave too, at least eventually. I still like Desktop Linux, though, and sincerely wish it will be as good some day. It most definitely is not today.
Am I misreading this thread, or was this one person's complaint on reddit about beta software (with a single commenter saying they've seen the same behavior)?
Are there any more cases, or could this be a bug? Cause it seems like a bug to me.
I'm not surprised, Apple loves removing configurability and thinks "one-size-fits-all". It would be far better to make the font rendering settings more configurable instead, but sadly this trend is not limited to only Apple.
...and I'm saying this as one of those for whom any form of antialiasing is blurry, subpixel being worse (my eyes water and feel dizzy after prolonged exposure), preferring AA off and pixel fonts with sharp, high-contrast edges instead. Let people have the font rendering they want.
Seems like a further shift from user friendliness towards consumer friendliness. That is, alienating core and power user in favour of larger consumer masses.
So what's the solution here? Buy a new monitor if we want to use up-to-date versions of macOS?
Anyone who uses Apple's FCPX or Logic knows we'll have to update to Mojave because eventually we'll try to open the latest version of said software and be told "You must be running Mojave to open this"
I'd really ditch Apple because of stunts like this, but I really love Logic and FCPX so I'm stuck with it.
PenTile, RGBW Quad, etc -- basically everything except plain old RGB stripes. Of all the possible display layouts, SPAA really only works with one (or it has to be re-designed for each, which realistically will never happen).
The stats I've seen are that most Mac users, as of this year, are on @2x displays.
I so knew it! I even put a reminder in my calendar some time ago (not knowing what the next update holds) to not upgrade until I know for sure that my experience won't get worse.
Ok, so what are options in terms of external displays? I guess this will make my Dell 1080p monitor much less appealing... I planned to jump to 4k 24inch anyway... ¯\_(ツ)_/¯
Wow. The one biggest advantage my previous MBP had over my current Arch Linux + i3 setup on a Carbon X1 is the natural support for HiDPI, as I connect my laptop to external monitors all the time (the only other one is the built-in dictionary). Now that Apple is going further and further in this direction, I definitely don't see myself returning to the Mac any time soon, or indeed ever.
As for "lately", in the sense of "in the last 10 years": it was always great. Freetype has always had great rendering, on par with and often superior to both ClearType and OS X to my eyes.
I attribute the bad perceived performance of freetype to the lack of good and/or commercial fonts.
The default configuration in most distributions is decent, but with a little tuning everything can be changed to your taste. I've run with grayscale AA since the beginning because I find the color fringing annoying.
I also used to like the bytecode hinter, but in recent years, with my laptop having 120+ dpi, I find the freetype autohinter to be actually superior, as it better preserves the letterforms and provides sharper results even without subpixel AA.
The settings can also be tailored per-font, and apply system-wide, with the exception of some stupid QML and Electron apps.
Thanks for that detailed response. I last used Linux full-time as my main machine in 2009 or so. It has indeed been a while. Good to know that it has improved since that time.
I'll be that guy and say that I don't like sub-pixel anti-aliasing—I find the color fringes distracting. I turn it off wherever possible. That said, people seem to like it so it's a shame the option is going away.
I don't know how you can justify making devices so expensive. It's almost like buying Apple calls out for social status. That brand will always remind of class warfare.
Its’s not a bug. It was announced as a change at WWDC. It’s believed to be a consequence of the move to merge iOS code into macOS. iOS doesn’t support sub-pixel antialiasing.
Apple strikes again... they destroyed the ability to use two external displays with a recent update, and now they mess up single-screen external displays too.
I have a MacBook Air connected to a 2560x1440 screen and the fonts look good. I was looking forward to Mojave's dark mode.
Now I'm afraid to install Mojave. I understand that Apple wants to simplify their code, but they can't just remove things before having an alternative solution to a problem that exists.
Font rendering is really super important... I really hope they reconsider.
But the reason I’m sad is that this wonderful hack was short lived: https://docs.google.com/document/d/1vfBq6vg409Zky-IQ7ne-Yy7o...
My pet theory is that macOS is going to pull in a bunch of iOS code and iOS has never had subpixel AA.