> At this point, it’s somewhat unclear exactly as to why NVIDIA GPU support isn’t present in Mojave
It sounds like the current driver is bad and Apple is a convenient scapegoat to blame for the issues.
If Nvidia are being dicks in the face of reasonable requests, why would that be Apple's fault?
Because Apple sold the hardware /with/ the software, and now they completely broke that (<5 year old) hardware?
Apple got burned hard by Nvidia and swore off them back around 2009. And Linus Torvalds also called them out back in 2013ish IIRC. Nvidia is not a "good" company. Now people have been running things unsupported, and now that Apple closes that hole they're all up in arms?
In fact my own 15" MBP late 2013 has a GFX750, which is no longer supported according to Apple Support.
The worst part for me is that this didn't stop Apple from pushing the update, so running Mojave with an external monitor is hardly possible now.
> Nvidia is not a "good" company.
Agreed. Neither is Apple.
 - https://support.apple.com/en-us/HT208898
That support article doesn't mention the GFX 750 and I can't find any record of Apple selling a MBP with a GFX 750...
I believe what you meant to say is you have a "NVIDIA GeForce GT 750M with 2GB of GDDR5 memory and automatic graphics switching"  which it appears does not support metal.
I stand corrected. I DO think it's wrong for Apple to drop support for something they shipped, especially since it's just over 4 years old. I thought this issue was limited to people who had Mac Pros or Hackintoshes and put an unsupported card in them. I'll admit I've only used 13" MBPs for the last 10+ years until my most recent MBP, so dedicated graphics cards weren't in my wheelhouse. I honestly thought they stopped using ALL Nvidia cards back in 2009ish.
Chipset Model: NVIDIA GeForce GT 750M
PCIe Lane Width: x8
VRAM (Dynamic, Max): 2048 MB
Vendor: NVIDIA (0x10de)
Device ID: 0x0fe9
Revision ID: 0x00a2
ROM Revision: 3776
Automatic Graphics Switching: Supported
gMux Version: 4.0.8 [3.2.8]
Metal: Supported, feature set macOS GPUFamily1 v4
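For reference, the listing above is the GPU section of System Information; the same data can be pulled from the terminal (a sketch, assuming a stock macOS install with the standard `system_profiler` tool):

```shell
# Dump the graphics/display section of System Information,
# then filter for just the model and Metal-support lines,
# which is a quick way to check whether a Mac's GPU reports Metal support.
system_profiler SPDisplaysDataType | grep -E 'Chipset Model|Metal'
```

Note that "Metal: Supported" in this output doesn't by itself tell you which macOS releases will keep supporting the card.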
These people are discovering, and hopefully coming to terms that they never owned any Apple device. The device was always a slave to Apple. It follows the orders of its master, not the foolish human who thought they owned it.
And its master has chosen not to let external Nvidia devices work anymore. The lock-in will continue. The walled garden will continue to encircle.
Though it's relatively simple to start work on Apple hardware and then train on something like Lambda Labs hardware.
Some time ago I was forced to upgrade to Sierra, because LinkedIn's website stopped working in Chrome. (I don't use LinkedIn much because it's awful, but a lot of clients find me there.) Turns out LinkedIn felt my version of Chrome was too old to support. But why was Chrome too old? Doesn't it update automatically? Yes, as long as the new version supports your OS, and apparently Chrome had stopped supporting Lion quite some time before. So I had to upgrade, and although I would have preferred to upgrade to Mavericks, Apple only offered the option to upgrade to the latest version: Sierra.
If Mojave is such a no-go, upgrading to the version just before Mojave may not be possible, so I might be stuck on Sierra until my Macbook collapses, slowly watching websites drop support.
Obviously my next machine is not going to be Apple. I'm probably going to get a ThinkPad with some version of Linux if I can find a nice one.
> Some time ago I was forced to upgrade to Sierra
Until about a month ago I ran Sierra - which worked fine and in 2 years didn't crash my MBP once.
Then I had to compile a few iOS apps for work, and since Xcode was outdated on Sierra, I had to upgrade to High Sierra. High Sierra kept crashing on me. Several coworkers also had this experience and suggested upgrading to Mojave.
Meanwhile Apple also kept pushing the Mojave upgrade via an OS notification they showed a few times a day. So I sadly ran the "upgrade", only to find the system was much less stable than before. Now I see why. Sadly my GFX-750 isn't supported.
For me this is the end of the line with Macs. Newer MBPs have a broken keyboard that Apple refuses to fix, and are really expensive relative to the old hardware they come with. And apparently you cannot even expect a >$3500 MBP to outlive 5 years, because Apple breaks it with their software patches.
I'll have to buy a new laptop because I cannot even connect a proper external monitor anymore. Obviously it won't be an Apple product. I'm thinking of going with System76.
It sucks, though, that these crappy updates make us reluctant to keep our system up to date. It shouldn't be like this. New versions should be better, not worse. And it should be possible to roll back a bad upgrade.
I saw the writing on the wall and built a cute little mini PC in a Cooler Master Elite 110 instead of staying in the eGPU space.
Apple has only ever supported AMD graphics cards natively in macOS, and they never said Nvidia GPUs would be supported out of the box, nor do any of their docs say so. They have a strict list of supported eGPU boxes and GPUs for each box (https://support.apple.com/en-us/HT208544).
eGPU support in macOS was crappy at the start even for AMD cards, but it's been steadily improving since then. The problem is Apple takes forever to improve their support; Mojave only added the option to use an eGPU with certain apps.
Nvidia drivers might be added to macOS when they release the redesigned Mac Pro late next year (though I doubt it, given Apple's recent track record), as they'll need to support various cards. Right now there is no Apple device with an Nvidia card, so no drivers either.
On the other hand, Apple may just do the silly thing of releasing Mac Pro with no chance of changing GPUs. If this happens, there will not be any chance of Nvidia drivers for a long time.
How so? Can you just not write graphics drivers for macOS?
Nvidia doesn't have much real-world motivation, outside of GPUs in eGPU enclosures, to keep the drivers alive. The reason to keep drivers ready is the potential for a very large contract from Apple.
Same story with Nvidia and gaming consoles - last few gens of consoles have not used Nvidia chips and Nvidia doesn't see it as a big loss. The margins must be too low.
Nvidia seems to aim for higher margin products these days with scientific computing/data center/deep learning/hardcore gaming.
Which is a problem. No (Real) Games on Mac, No Deep Learning/ Scientific Computing on Mac.
Mac is now left with Programming, and Video / Graphics Editing.
Maybe Apple's strategy for the Mac is to milk it for as long as possible. They don't see it as a platform for growth (despite having plenty of room).
I'll be moving to Linux instead.
Hackintosh is the way many people are going for those reasons.
Active users for iOS devices makes sense. Apple gets recurring revenue from iOS users. Hardly anyone uses the Mac App Store or buys iCloud storage just for the Mac and OS upgrades are free.
I wouldn't care about the overall number of users, just the number of users in the market that I'm targeting. The creative market didn't abandon Apple when the overall Mac market was in dire straits, because the market they cared about still had a lot of Mac users.
From what I can see, the consumer market for software is basically dead. Is anyone making serious money on non game personal computer software besides Adobe and Microsoft?
The smaller Mac only software companies are going by the "thousand true fans" strategy. Stay small and get a small loyal customer base.
Mac gaming still sucks, and EE/ME/enterprise engineers use Windows. Web devs and artists use Macs. Academics and scientists use Linux.
These ecosystems are so stable now that I don't see it changing.
Granted, it could change, but they would have to really try... And Apple has shown decreasing vision for the Mac platform over time.
You forgot Audio, which is HUGE on the Mac.
The Nintendo Switch uses a NVIDIA Tegra, and NVIDIA put a decent amount of work into selling it - the NVIDIA Shield TV console is basically a tech demo for the thing after all.
So this is seriously what you suggest owners of an older MBP (official Apple hardware, just over 4 years old) to do?
Since both Apple's drivers and NVidia's drivers are completely closed source, I'd say it's hardly possible to write a working driver (w/ hardware acceleration) for it.
For a hobby computer it can be interesting to write your own driver. For a professional laptop not so much :)
Anything that is a kext (kernel extension) requires a special kernel-signing developer certificate before macOS will allow users to install it without disabling SIP. Apple is extremely conservative in handing out those kext certificates, and even if they grant you one, they impose super harsh restrictions on what you can do with it.
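You can check both halves of this from the terminal (a sketch using the standard macOS tools `csrutil` and `kextstat`; the exact bundle name of Nvidia's web driver kext will vary by driver version):

```shell
# Check whether System Integrity Protection is enabled;
# while it is, macOS will refuse to load unsigned kexts.
csrutil status

# List currently loaded kernel extensions and look for any Nvidia kexts
# (e.g. the web driver, if one is installed and loaded).
kextstat | grep -i nvidia
```

If SIP is enabled and the second command prints nothing, no third-party Nvidia driver is in play and you're on whatever Apple ships.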
Never assume, even several weeks after a macOS release, that working Nvidia drivers will be available!
In that article we get to read about entire firms using rendering pipelines that are now useless. While that is a terrible blunder by Apple, I really would ask how the responsible parties thought it a good idea to rely on an ecosystem that they have zero control over and that should have been considered "supported" only in an unofficial sense, no matter what Apple says. Heck, the upgrade even breaks older Apple built machines.
Macs and Apple machines are only production machines "as is". And that means they are only made to be interface/user machines. They don't scale, they don't upgrade, and they don't work with external hardware. All of Apple's decisions - the walled garden, the lack of connectivity, and the upgrade policy - make this ABUNDANTLY CLEAR.
If Apple technology is a node in a pipeline that isn't entirely Apple (or, even then), and those machines cannot be replaced by others immediately or kept upgrade- and update-free, then it's your fault.
That's just extreme ineptitude at best, grossly negligent at worst.
You just don't do that no matter the OS/hardware vendor - How many people have run afoul of Microsoft releasing broken patches into the wild? If you have mission-critical systems, you test everything in isolation first.
We need to stop being so enamored with Apple, and treat them with the same skepticism as Microsoft.
Several people in my office have been bitten by the Mojave bug and are now regretting it. You should wait at least six months before updating macOS.
Yes, all of the trackpads suck.
EDIT: Please down vote if that expresses your feels, but if you've got a new MBP or Mac Pro and feel supported by Apple I'd love to know why and what you use it for. I miss being able to buy a solid computer from them that I knew would be my workhorse for 2+ years and have a long life after.
False. I've tried it and it doesn't compare to OS X. Also this SOOOO rich in a thread about Nvidia (which SUCKS on linux). You think you are in driver hell on Mac? Oh boy, strap yourself in.
> Im nearly completely free of Apples ecosystem, thank god.
Enjoy your "freedom", I'll enjoy getting real work done without futzing with something that "Pretty much works (tm)" but has some kind of gotcha. I'm sure the developers here at my work who use a Linux desktop would tell you "It's great, I love it" but somehow I'm the lone developer who doesn't have display manager crashes, complete rebuilds needed, and graphics driver hell. Yeah, I think I'll stick to my "imprisonment".