Apple Turns Its Back on Customers and Nvidia with MacOS Mojave (forbes.com/sites/marcochiappetta)
101 points by redial on Dec 13, 2018 | 77 comments

Headline blaming Apple and then, in the 8th paragraph,

> At this point, it’s somewhat unclear exactly as to why NVIDIA GPU support isn’t present in Mojave

I also wasn’t clear on that from the article. Is there a pending driver update that NVIDIA is waiting on Apple to approve?

It sounds like the current driver is bad and Apple is a convenient scapegoat to blame for the issues.

Since there's nobody else responsible for releasing Apple's closed source OS, what makes you think Apple is not to blame on this?

Perhaps Nvidia are writing drivers that don't conform to the guidelines Apple provide for approval? Maybe they're trying to pull a Logitech and include lots of data gathering that Apple object to. Or they're ignoring things like events from power management. Or their drivers are just shitty Mojave citizens. Or they're trying to force a Mojave equivalent of GeForce Experience to be installed with their drivers.

If Nvidia are being dicks in the face of reasonable requests, why would that be Apple's fault?

> If Nvidia are being dicks in the face of reasonable requests, why would that be Apple's fault?

Because Apple sold the hardware /with/ the software, and now they completely broke that (<5 year old) hardware?

Point of clarification, the article does not mention and I have no reason to suspect that Apple broke hardware they themselves sold. In fact the article does seem to point out there is support for specific Nvidia cards that Apple sold or approved. Also I find it rather impossible to believe that Nvidia couldn't release something that would restore this ability. Would a user need to disable some security feature temporarily to be able to install it? Maybe, but that's the price you pay for unsupported hardware.

Apple got burned hard [0] by Nvidia and swore off them back around 2009. And Linus Torvalds also called them out back in 2013ish, IIRC. Nvidia is not a "good" company. People have been running things unsupported, and now that Apple closes that hole they are all up in arms?

[0] https://gizmodo.com/5061605/apple-confirms-failing-nvidia-gr...

> I have no reason to suspect that Apple broke hardware they themselves sold

In fact my own 15" MBP late 2013 has a GFX750, which is no longer supported according to Apple Support [1].

The worst part for me is that this didn't stop Apple from pushing the update, so running Mojave with an external monitor is hardly possible now.

> Nvidia is not a "good" company.

Agreed. Neither is Apple.

[1] - https://support.apple.com/en-us/HT208898

> In fact my own 15" MBP late 2013 has a GFX750, which is no longer supported according to Apple Support [1]

That support article doesn't mention the GFX 750 and I can't find any record of Apple selling a MBP with a GFX 750...


I believe what you meant to say is you have a "NVIDIA GeForce GT 750M with 2GB of GDDR5 memory and automatic graphics switching" [0], which it appears does not support Metal.

[0] https://support.apple.com/kb/sp690?locale=en_US

Metal's supported on that card. Works on a GT650M too, from a 2012 rMBP.

You're correct about the card, sorry about that. Got confused I suppose. Indeed I meant to say I have the GT 750.

I should have added this in my edit but I'll put it here:

I stand corrected. I DO think it's wrong for Apple to drop support for something they shipped, especially since it's just over 4 years old. I thought this issue was limited to people who had Mac Pros or Hackintoshes and put an unsupported card in. I'll admit I've only used 13" MBPs for the last 10+ years until my most recent MBP, so dedicated graphics cards weren't in my wheelhouse. I honestly thought they stopped ALL Nvidia cards back in 2009ish.

As a sibling comment pointed out, it seems it does appear to be supported? I have the same machine:

  Chipset Model:	NVIDIA GeForce GT 750M
  Type:	GPU
  Bus:	PCIe
  PCIe Lane Width:	x8
  VRAM (Dynamic, Max):	2048 MB
  Vendor:	NVIDIA (0x10de)
  Device ID:	0x0fe9
  Revision ID:	0x00a2
  ROM Revision:	3776
  Automatic Graphics Switching:	Supported
  gMux Version:	4.0.8 [3.2.8]
  Metal:	Supported, feature set macOS GPUFamily1 v4
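For anyone who wants to run the same check, the listing above comes from `system_profiler SPDisplaysDataType`. A minimal sketch of the check follows; since that command only exists on macOS, a saved sample of its output (taken from this thread) stands in so the grep itself works anywhere:

```shell
# On a real Mac you would use:  system_profiler SPDisplaysDataType
# Here a saved sample of that output stands in for portability.
sample='Chipset Model: NVIDIA GeForce GT 750M
Metal: Supported, feature set macOS GPUFamily1 v4'

if printf '%s\n' "$sample" | grep -q 'Metal:[[:space:]]*Supported'; then
  echo "Metal: supported"
else
  echo "Metal: not supported"
fi
```

On a live machine, just pipe the real command into the same grep.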

You are aware that third party hardware vendors have been writing drivers for closed sourced operating systems for well over 30 years aren’t you?

Anyone requiring special hardware for their work is a fool to use Apple computers.

Anyone requiring special hardware for their work is a fool to upgrade their computer or OS without a very good reason and a lot of very thorough testing.

Ah, the sweet, sweet sound of victim blaming

Seriously, it's Apple. They must have known.

How should anyone have known, when previous experience was the exact opposite behavior?

Apple has a strong and well known history of vendor lock-in and control.

These people are discovering, and hopefully coming to terms that they never owned any Apple device. The device was always a slave to Apple. It follows the orders of its master, not the foolish human who thought they owned it.

And its master has chosen not to have external nvidia devices work anymore. The lock in will continue. The walled garden will continue to encircle.

It helps to spread the word. Too many people stay in an abusive relationship for too long. Those people need support, not blaming.

Calling him or her a victim is ridiculous.

I don't think there is anything inherently wrong with that.


Though it's relatively simple to start working on Apple hardware, then train on something like Lambda Labs hardware.

I've never been at the forefront of upgrading my Macbook, and that seems like a wise decision now. Still, increasingly poor versions of OS X present a clear end of the line for my 2011 Macbook Pro.

Some time ago I was forced to upgrade to Sierra, because LinkedIn's website stopped working in Chrome. (I don't use LinkedIn much because it's awful, but a lot of clients find me there.) Turns out LinkedIn felt my version of Chrome was too old to support. But why was Chrome too old? Doesn't it update automatically? Yes, as long as the new version supports your OS, and apparently Chrome had stopped supporting Lion quite some time before. So I had to upgrade, and although I would have preferred to upgrade to Mavericks, Apple only offered the option to upgrade to the latest version: Sierra.

If Mojave is such a no-go, upgrading to the version just before Mojave may not be possible, so I might be stuck on Sierra until my Macbook collapses, slowly watching websites drop support.

Obviously my next machine is not going to be Apple. I'm probably going to get a ThinkPad with some version of Linux if I can find a nice one.

Owner of a late 2013 15" MBP here. I've never been eager to upgrade to new macOS versions either.

> Some time ago I was forced to upgrade to Sierra

Until about a month ago I ran Sierra, which worked fine and in 2 years didn't crash my MBP once.

Then I had to compile a few iOS apps for work, and since Xcode was outdated on Sierra, I had to upgrade to High Sierra. High Sierra kept crashing on me. Several coworkers had the same experience and suggested upgrading to Mojave.

Meanwhile Apple also kept pushing the Mojave upgrade via an OS notification shown a few times a day. So I sadly ran the "upgrade", only to find the system much less stable than before. Now I see why: sadly my GT 750M isn't supported.

For me this is the end of the line with Macs. Newer MBPs have broken keyboards that Apple refuses to fix, and are really expensive for the dated hardware they come with. And apparently you cannot even expect a >$3500 MBP to outlive 5 years, because Apple breaks it with their software patches.

I'll have to buy a new laptop because I cannot even connect a proper external monitor anymore. Obviously it won't be an Apple product. I'm thinking of going System76.

Thanks for the warning. I'll stick to Sierra as long as I can.

It sucks, though, that these crappy updates make us reluctant to keep our system up to date. It shouldn't be like this. New versions should be better, not worse. And it should be possible to roll back a bad upgrade.

As an aside: the 2011 machines at this point have good support for Linux. You should be able to eke out a bit more time with your machine and still receive software updates and fixes, if that's the route you wish to take.

New OS doesn't yet have drivers for some cards, got it.

"some cards" meaning the vast majority of discrete cards on the market.

Cards that Apple never formally supported and people knew that when they bought them. This is equivalent to hackintosh people getting worked up over Apple not supporting their MB/RAM/HD/PCI Card/etc.

No, more like the new OS does not support any cards except ones specifically branded for it. Abandon ship.

This isn't news. Apple has always done this.

How come external GPUs worked fine on High Sierra (there was even fanfare about eGPUs being supported, with an "official" eGPU)? Is that all gone now?

I saw the writing on the wall and made a cute little mini pc in a cooler master elite 110 instead of staying in the egpu space

Only AMD graphics cards have worked natively since High Sierra. The problem was that people wanted Nvidia support; there were web drivers from Nvidia for HS, but there is no Mojave version yet, IIRC.

Apple has only ever supported AMD graphics cards natively in macOS, and they never said Nvidia GPUs would be supported out of the box, nor do any of their docs say so. They have a strict list of supported eGPU boxes and GPUs for each box (https://support.apple.com/en-us/HT208544).

eGPU support in macOS was crappy at the start, even for AMD cards, but it has been steadily improving since then. The problem is that Apple takes forever: Mojave only just added the option to use the eGPU for specific apps.

Nvidia drivers may be added to macOS when Apple releases the redesigned Mac Pro later next year (though I doubt it, given Apple's recent track record), as they'll need to support various cards. Right now there is no Apple device with an Nvidia card, so no drivers either.

On the other hand, Apple may just do the silly thing of releasing Mac Pro with no chance of changing GPUs. If this happens, there will not be any chance of Nvidia drivers for a long time.

Apple hasn't quite tightened the noose of lock-in on OSX as much as they have on iOS.

In a walled ecosystem you are a sharecropper not the owner.

Well, as others have stated, there is nothing stopping Nvidia or anyone else from writing their own driver.

> Apple fully controls drivers for Mac OS.

How so? Can you just not write graphics drivers for macOS?

It's more nuanced than that. Nvidia can write the drivers but releasing them without the blessing of Apple blows away any chance of them getting that sweet GPU contract back.

Nvidia doesn't have much of a real world application outside of GPUs in enclosures to keep the drivers alive. The reason to keep drivers ready is the potential for a very large contract from Apple.

Word on the street is that Nvidia didn't get much value ($) out of working with Apple, so perhaps both parties aren't super interested.

Same story with Nvidia and gaming consoles - last few gens of consoles have not used Nvidia chips and Nvidia doesn't see it as a big loss. The margins must be too low.

Nvidia seems to aim for higher margin products these days with scientific computing/data center/deep learning/hardcore gaming.

>Nvidia seems to aim for higher margin products these days with scientific computing/data center/deep learning/hardcore gaming.

Which is a problem. No (real) games on the Mac, no deep learning / scientific computing on the Mac.

Mac is now left with Programming, and Video / Graphics Editing.

Maybe Apple's strategy for the Mac is to milk it for as long as possible. They don't see it as a platform for growth (despite having plenty of room).

It seems to me like Apple has been on the road to slowly abandoning the Mac for quite some time. Newer Macbooks are unpopular, updates to OS X have problems, and on the whole, the platform just doesn't get much love from Apple. They want us all on iOS.

I'll be moving to Linux instead.

I think Apple's only ally is Adobe. If you want to do pro design or video you need Adobe. Adobe will never release on Linux, so if you don't want to use a privacy-killing cloud OS like Windows 10, you are left with the Mac.

Hackintosh is the route many people are taking for those reasons.

The “newer MacBooks are unpopular” thing is just a meme in the tech bubble. While Apple will stop reporting unit sales next quarter, they just posted last quarter's numbers. There is no indication that they are “unpopular”.

Maybe they still sell, but I know of nobody who is enthusiastic about them. People used to love them.

Why should we pay more credence to the anecdotal “people you know” instead of the reported sales volumes?

That is sales volume, but 100M active devices is way slower growth than most predicted. Apple took more than 2 years to add 10M active devices on ~40M+ units sold during that time, which means 30M were either replacements or there is quite a high churn rate away from the Mac. It is possibly the only reason Apple suddenly came up with the iMac Pro, the new MacBook Retina, etc. It wasn't the sales that mattered, it was users leaving the platform. By the previous trend of a nearly consistent 20M units per year, the Mac should have at least 120M active devices by now, if not more.

Why does Apple care about active users for the Mac and not sales volume?

Active users for iOS devices makes sense. Apple gets recurring revenue from iOS users. Hardly anyone uses the Mac App Store or buys iCloud storage just for the Mac and OS upgrades are free.

You will need a user base for software distribution. Would you rather design your software for 10M or 100M Mac users? If you are changing to a services product oriented only around your own devices, the more users the better.

It depends on the software...

I wouldn't care about the overall number of users, just the number of users in the market that I'm targeting. The creative market didn't abandon Apple when the overall Mac market was in dire straits, because the market they cared about still had a lot of Mac users.

From what I can see, the consumer market for software is basically dead. Is anyone making serious money on non game personal computer software besides Adobe and Microsoft?

The smaller Mac only software companies are going by the "thousand true fans" strategy. Stay small and get a small loyal customer base.

It's a little funny how stable the desktop ecosystem has been for 10 (or maybe 30?) years.

Mac gaming still sucks; EE/ME/enterprise engineers use Windows. Web devs and artists use Macs. Academics and scientists use Linux.

These ecosystems are so stable now that I don't see it changing.

Granted, it could change, but they would have to really try... And Apple has shown decreasing vision for the Mac platform over time.

> Mac is now left with Programming, and Video / Graphics Editing.

You forgot Audio, which is HUGE on the Mac.

“Same story with Nvidia and gaming consoles - last few gens of consoles have not used Nvidia chips and Nvidia doesn't see it as a big loss. The margins must be too low.“

The Nintendo Switch uses a NVIDIA Tegra, and NVIDIA put a decent amount of work into selling it - the NVIDIA Shield TV console is basically a tech demo for the thing after all.

Is Apple still pissed about defective Nvidia 8600M GT laptop GPUs?

I find that to be an unsubstantiated excuse. There is nothing stopping Nvidia from writing its own driver for macOS.

Does NVIDIA actually have a contract with Apple? I was under the impression that Apple silently tolerated them, and that they just released drivers on their website for eGPUs.

> Can you just not write graphics drivers for macOS?

So this is seriously what you suggest owners of an older MBP (official Apple hardware, just over 4 years old) to do?

Since both Apple's drivers and NVidia's drivers are completely closed source, I'd say it's hardly possible to write a working driver (w/ hardware acceleration) for it.

I’m quoting an NVIDIA engineer here; I’m asking why they can’t write their own driver instead of waiting for Apple. (Though, I’m sure that people in the Hackintosh community would not shy away from stepping up if asked…)

Since it's hardware sold by Apple I would assume that Apple figures that out with NVIDIA and collaborates. Apparently I'm wrong and Apple just doesn't give a crap as long as they can sell new products.

For a hobby computer it can be interesting to write your own driver. For a professional laptop not so much :)

You can write them, but you'll need to get them signed by Apple to be able to deploy them in a sensible way. Whether Apple will sign drivers meant to run macOS on non-Apple hardware remains to be seen.

Again, is this something specific for graphics drivers? Because I can write a driver today, sign it with my developer certificate, and distribute it to others without Apple’s approval; if I have blanket preapproval my users can install it without disabling SIP as well.

A standard developer certificate won’t help you for drivers (except for drivers that purely run in user space, such as FUSE filesystems).

Anything that is a kext (kernel extension) requires a special kernel signing developer certificate so macOS will allow users to install it without disabling SIP. Apple is extremely conservative in handing out those kext certificates, and even if they grant you one, they will impose super harsh restrictions on what you can do with them.
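To make the distinction concrete, here is an illustrative sketch. The bundle identifier and plist fragment are hypothetical; on a Mac the real file would live inside a `.kext` bundle, e.g. under `/Library/Extensions`. The point is that anything whose `CFBundlePackageType` is `KEXT` loads into kernel space, and that is the class of binary for which Apple restricts signing certificates:

```shell
# Hypothetical Info.plist fragment for a third-party driver bundle.
cat > /tmp/sample-kext-info.plist <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<plist version="1.0">
<dict>
    <key>CFBundleIdentifier</key>
    <string>com.example.fakegpu</string>
    <key>CFBundlePackageType</key>
    <string>KEXT</string>
</dict>
</plist>
EOF

# Package type KEXT means kernel space: a plain Developer ID
# certificate is not enough, you need Apple's kext-signing
# entitlement (or the user must disable SIP).
if grep -q '<string>KEXT</string>' /tmp/sample-kext-info.plist; then
  echo "kernel extension: needs a kext-enabled certificate"
fi
```

A user-space driver (FUSE-style) would have an ordinary bundle type and would not hit this restriction.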

I found this out the hard way a month ago when I updated my hackintosh from 10.13 to 10.14.

Never assume, even several weeks after a macOS release, that working Nvidia drivers will be available!

You live on the edge... I am on a legit mac and I'm still on HS for at least another month or two just to be safe.

Why you would use an Apple computer in actual "production pipelines" that generate revenue, need upgrading, and use external hardware, is beyond me.

In that article we get to read about entire firms using rendering pipelines that are now useless. While that is a terrible blunder by Apple, I really would ask how the responsible parties thought it a good idea to rely on an ecosystem that they have zero control over and that should have been considered "supported" only in an unofficial sense, no matter what Apple says. Heck, the upgrade even breaks older Apple built machines.

Macs and Apple machines are only production machines "as is". And that means they are only made to be interface/user machines. They don't scale, they don't upgrade and they don't work with external hardware. All of Apple's decisions - the walled garden, the lack of connectivity and the upgrade policy - make this ABUNDANTLY CLEAR.

If Apple technology is a node in a pipeline that isn't entirely Apple (or, even then), and those things can not be replaced by other machines immediately, or kept upgrade&update free, then it's your fault.

Who the hell upgrades their entire production system to new OS releases without thoroughly testing beforehand? Especially if you're running completely unsupported hardware configurations.

That's just extreme ineptitude at best, grossly negligent at worst.

You just don't do that no matter the OS/hardware vendor - How many people have run afoul of Microsoft releasing broken patches into the wild? If you have mission-critical systems, you test everything in isolation first.

I'm really unhappy that Homebrew twists users' arms to update to the latest version of OSX. I'm glad that I waited a year before I updated to High Sierra, seeing as how there were APFS file corruption bugs, and a default root login bug!

We need to stop being so enamored with Apple, and treat them with the same skepticism as Microsoft.

Several people in my office have been bitten by the Mojave bug and are now regretting it. You should wait at least six months to update OSX.
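One small mitigation, assuming a reasonably recent Homebrew: the documented `HOMEBREW_NO_AUTO_UPDATE` variable stops `brew` from running an implicit `brew update` before every install, so you at least choose when to take new versions. (It won't bring back bottles for OS releases Homebrew has already dropped.)

```shell
# Opt out of Homebrew's implicit `brew update` on every command.
export HOMEBREW_NO_AUTO_UPDATE=1

# Sanity check before running brew:
echo "HOMEBREW_NO_AUTO_UPDATE=${HOMEBREW_NO_AUTO_UPDATE}"
```

Put the export in your shell profile to make it stick across sessions.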

I'm not sure what you're talking about. I used Homebrew quite successfully this past summer on 10.7.5 - the only problem was that a few formulae weren't available for it. I've since upgraded that machine to El Capitan and Homebrew still works great and I haven't found anything that doesn't work, including CUDA.

I mean, seriously. When you have to daisy-chain a curious mix of Thunderbolt 3 aftermarket appliances, splitters and adapters to your Mac Pro to "set up your pipeline", because Apple has openly, transparently and clearly demonstrated and communicated its peripheral policy, it should have become clear that what you are doing is VOLATILE.

It's because developer/designer time is worth much more than whatever hardware costs. If you have a great developer on staff and they say they do their best work on OS X, you give them that. If you don't someone else will.

I agree with you if "pipeline" has the narrower meaning I think you're using. In a more general sense, though, I use a Mac all the time in my software dev "pipeline", but I'm much more careful these days to strenuously avoid using anything on the Mac that I couldn't take with me to the next platform (which won't be iOS) once Apple courageously removes enough usefulness from new Macs that I give up and move on.

I bet that this is due to Nvidia pushing their agenda with CUDA, as opposed to Apple championing either Metal or OpenCL.

OMG, I can not use a Mercedes Engine in my BMW

Apple 100% does not care about its creative class anymore. I love my 2013 MacBook Air and Pro. It's over though. If you need to get things done there are plenty of Linux options.

Yes, all of the trackpads suck.

EDIT: Please downvote if that expresses your feels, but if you've got a new MBP or Mac Pro and feel supported by Apple I'd love to know why and what you use it for. I miss being able to buy a solid computer from them that I knew would be my workhorse for 2+ years and have a long life after.

If you are a software developer, you have no excuse for not buying a Linux-compatible machine and using it as your daily driver. It really isn't that hard. I "downgraded" from a MacBook Pro 2017 non-touchbar because I realised I fundamentally disagree with Apple and the direction of our monoculture. Ironically, I bought a second-hand Lenovo ThinkPad X1 Carbon gen 3, originally an IBM line, the very company Apple painted as the 1984 dictator. How times have changed. I'm nearly completely free of Apple's ecosystem, thank god.

> If you are a software developer, you have no excuse not buying a Linux compatible machine and using it as your daily driver.

False. I've tried it and it doesn't compare to OS X. Also, this is SOOOO rich in a thread about Nvidia (which SUCKS on Linux). You think you are in driver hell on Mac? Oh boy, strap yourself in.

> Im nearly completely free of Apples ecosystem, thank god.

Enjoy your "freedom", I'll enjoy getting real work done without futzing with something that "Pretty much works (tm)" but has some kind of gotcha. I'm sure the developers here at my work who use a Linux desktop would tell you "It's great, I love it" but somehow I'm the lone developer who doesn't have display manager crashes, complete rebuilds needed, and graphics driver hell. Yeah, I think I'll stick to my "imprisonment".

I mean, unless you're developing for iOS - then you don't really have a choice.

