MacOS finally gains external GPU support (techcrunch.com)
402 points by harshgupta on March 30, 2018 | 192 comments



If you are interested in external GPUs, there's an amazing website and community at https://egpu.io/ . You'll find guides, explanations, etc. for both Mac and PC.

There's also an eGPU subreddit and I wrote a sticky on U series chips, PCIe lanes and Thunderbolt 3 https://www.reddit.com/r/eGPU/comments/7vb0gg/u_series_chips... which is more interesting for PCs than Mac.


I've heard that external GPUs have too much latency to be that useful for a lot of applications.

What I'd like to see is a Mac with a nice big fat Nvidia GPU. Why can't I buy a Mac with a 1080 ti? My suspicion is that Apple wants to get into the GPU game themselves, and bring some of the expertise from their mobile GPU team to laptops and desktops.

That'd be great, if they actually released something competitive. So far, they haven't, and Mac users, who would love to pay for better technology, are left out in the cold as a result.


>My suspicion is that Apple wants to get into the GPU game themselves

My suspicion is that they barely even want GPUs, which they seem to see as annoying sources of heat and noise, getting in the way of their ultimate vision of an iMac so thin you could use it as a kitchen knife.

(I wish I were kidding about this, but I am not.)


Their investment in Metal and their own mobile GPU hardware suggests to me that they do, and that they want to own the technology from top to bottom.

https://developer.apple.com/metal/

https://techcrunch.com/2017/09/12/the-new-iphone-8-has-a-cus...


Are mobile GPU efforts relevant to MacOS and desktop GPU work?


They will be if they start putting them in Macs.


I think we agree on that. I'm certain they will. But that also agrees with my initial statement, no?

Apple wants thin and light. Mobile is both of those things, and the thought of designing a system that can cope with the 200W+ draw of desktop GPU parts doesn't even factor into their thinking.

They're tossing a bone to the users who need actual heavy-duty GPU power by talking about the ability to use external cards, but (in my opinion) considering the cost and performance issues that come with that approach, it is more an admission that they truly don't care about that segment of the market than an actual solution.


https://www.apple.com/imac-pro/specs/ — released just a few months ago.

The AMD Vega 64 has got to push a few hundred watts.

Microsoft has the Surface Book 2 with GTX 1060 and they must be looking at that as competition for the high end MacBook Pro.


I did a little poking around out of curiosity.

According to: https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-...

Vega 64 has a board power rating of 295W.

According to the iMac Pro tech specs from the link you provided, it has an idle draw of 64W and a peak draw of 370W for the entire system.

Unless I'm mistaken, that seems to imply that there has to be a good deal of throttling between the CPU & GPU to come in at that power budget. (The specs also note an additional 50W(!) of potential draw for fans alone if the iMac is operated in a warm environment.)

Am I mistaken?

Not arguing, just hashing out my understanding. Your point regarding the Surface Book 2 and the 1060 makes sense. It does look like I was technically incorrect about them not engineering to a 200+ watt GPU, although I do wonder what sustained draw it can handle thermally compared to a full desktop part, and how much that would affect performance.


> Vega 64 has a board power rating of 295W.

John from Mantiz reported 650W peak. Surely only for milliseconds but still. https://egpu.io/forums/implementation-guides/2014-mac-mini-v... it is absolutely not in the interests of an eGPU chassis manufacturer to report figures which hinder their own sales so I believe him.

Similarly, someone with an awful lot of AMD knowledge (AMD employee?) on the egpu.io forums pretty much begged people not to try the Vega 64 in a Sonnet 550 hinting at similar problems https://egpu.io/forums/thunderbolt-enclosures/sonnet-says-th... and saying a solution is in the works.

Sonnet http://www.sonnettech.com/support/kb/kb.php?cat=524&expand=_... says the Vega 64 is only supported in the 650 box -- which was just released pretty much because of the necessity to support the Vega 64. Something is rotten in Denmark: a 650W power supply supports up to a 375W card (up to 8-pin + 8-pin power connectors) plus provides an additional 100W of peak power.


Since 2003, they've always made their best hardware solutions as mobile devices. Desktops have been almost exclusively geared toward niche use cases, i.e. the Mac Pro and iMac Pro. It seems more than a little odd to me. They don't really offer a seamless transition from desktop to laptop to handheld. Even the old iPods felt more naturally integrated into their desktop usage than current devices do.

I wish they offered a full-size, liquid-cooled (but with fans) high-end/Nvidia GPU desktop. A system that's not afraid to be noisy, because their users aren't seeking that. Then sync up the less hardware-intensive tasks with their mobile devices. But maybe that desktop market died in their eyes.


I think the market for liquid-cooled desktops is pretty niche itself. It just happens to be a niche that Apple is not interested in. Whereas for the majority of people, a regular iMac is just fine.


My suspicion is that they barely even want devs (given the new "Pro" line) -- they absolutely don't want gamers using their pretty laptops.

(I wish I were kidding about this, but I am not.)

Ok... so we had a bunch of the new MBPs at work and kept having keyboard issues (like 50% of the team complained about the keyboards and had keys that stuck within the first 3 months). I brought one machine in to the Apple Store and was scolded for eating while using the computer. "Hey Guys, I'm sorry that humans weren't in your target audience focus groups -- the new butterfly keys are amazing!" Wasn't even my machine... just doing a favor for a friend. Ugh.


They want devs that care about Apple systems, Objective-C and Swift devs, not devs that use it as pretty UNIX.

That was only when they needed to survive.


I'm pretty sure they only tolerate developers given the poor state of the MacBook keyboard design.


What professional developer codes full-time on a laptop keyboard and screen? The ones at work are all on dual-27" displays and external keyboard and mouse.

I personally disagree with making engineering compromises on the human interface to make things a few mm thinner, but the results speak for themselves. People keep buying them because Apple has succeeded in the number one goal of branding - their products are instantly recognizable as luxury goods. So if you're a pro developer who doesn't care about the branding, you just buy a different laptop that has the attributes you do care about. If developers abandoning the Macbook Pro because of nerd rage about low key travel and the ESC key cause sales to plummet, I'm betting Apple will change the keyboard. I'm also betting that the sales won't plummet.


I have a Dell U3415W w/ a second 24" 1080 for my desktop, and to be honest, I'm often more focused and comfortable working on my 13" MBP in a comfortable chair instead.

Xcode is far better on the desktop, but for terminal + vim + browser type work, the laptop screen is perfectly reasonable.

As for using OSX, I've just gotten comfortable with it. I never have to fiddle anymore. I always wait to upgrade OSes and it generally just stays out of my way. Windows is annoying and I haven't found a comparably fuss-free Linux laptop yet (and I'm actively looking, because I won't be able to keep avoiding that stupid touchbar).


I thought that eGPUs work through stuff like thunderbolt, where you're basically using a PCI Express port, so the slowness would only come from the length of cables relative to an internal GPU?

Or is this a misunderstanding on my part?


Slowness is also due to Thunderbolt being PCIe x4 as opposed to x16 - x4 has a quarter of the bandwidth. Games usually don't max out x16, however, so you'd lose about 25%.
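
Rough numbers, assuming PCIe 3.0 (~985 MB/s per lane after 128b/130b encoding):

     x4:  4 x 985 MB/s ≈  3.9 GB/s
    x16: 16 x 985 MB/s ≈ 15.8 GB/s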


I thought for sure you'd be wrong, but nope!

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_...


My theory would be that most of the transfer is about assets, and games are more memory-constrained than bandwidth-constrained in that space.


I think it’s the other way around. Assets are loaded at initialization anyhow.


Used to be the case but most open-world games stream in assets nowadays and you can see the lower resolution assets fade out as the higher fidelity assets finish loading on the fly.

In games that want to load fast, loading a save game might drop you into a pretty low-fidelity world and then start loading in the higher-res assets.


Unfortunately I don't remember where I read this, but I once read that Apple refuses to use Nvidia GPUs because Nvidia doesn't give Apple access to the driver source code. AMD, on the other hand, does.


Latency is not the issue, as Thunderbolt 3 is a direct PCIe connection. The only bottleneck might be bandwidth; usually you get a 10-20% performance loss with an eGPU, but with a 1080 Ti that's still plenty of performance.


'Direct' PCI-E connections can still be pretty slow. It's way more like a network than a traditional bus.

The fail0verflow guys bridged PCI-E over a 115200 baud UART and it still worked.


> My suspicion is that Apple wants to get into the GPU game themselves, and bring some of the expertise from their mobile GPU team to laptops and desktops.

Eh, desktop and mobile GPUs are radically different beasts. My thought is that since they write their own drivers, and they obviously don't have the resources they once had in the MacOS team (see all of the crap High Sierra went through), they simply can't support more than a few SKUs.


Another possible reason is battery life, which would hardly be over 16 hours if they chose discrete NVIDIA graphics cards.


> My suspicion is that Apple wants to get into the GPU game themselves

They said the upcoming Mac Pro will be "modular", which I think is a clear sign it will have external GPUs, or certainly will lean heavily in the direction of encouraging users with a heavy GPU workload in that direction.


There's some kind of animosity between Apple and Nvidia which is why you only see AMD hardware in Macs


It's a very helpful community. I used the instructions to add a 750ti to my wife's Mac Mini. It was painless, and the 3D modeling software that used to crash her computer runs great.


For the sake of the archives: if the 750 Ti is a good level for your application (which it often is), then Lenovo has a really, really small Thunderbolt Graphics Dock containing a GTX 1050. Reports of success on Macs vary. Read the relevant subreddit wiki https://www.reddit.com/r/thinkpad/wiki/intro for a nice discount on it.


This is speculation, but I’m betting that we will see a GPU-backed external display (sort of an iMac minus the Mac) sometime in the next year. Apple never builds out support like this unless they plan on having a product to sell.

And they seem to generally "dislike" putting discrete GPUs in laptops, going all the way back to the Titanium G4 days.


It could happen, but from Apple's own announcements it seems like they were worried that the Mac was increasingly marginalized as a platform for AR/VR development, which Tim Cook constantly tells us (for AR at least...) is something he's super excited about for the future of computing. Not arguing this is the sole reason, but it definitely seems like a feature to cover high end edge cases like this for the time being.

Much of Apple's own discussions on the eGPU feature at WWDC/the marketing blurb for their own eGPU devkit refer to VR a lot too. Apple added VR support to Final Cut Pro at roughly the same time as well.

> And they seem to generally "dislike" putting discrete GPUs in laptops, going all the way back to the Titanium G4 days.

Not so sure I'd agree with this - every single Titanium PowerBook model ever made had a discrete GPU. Even the iBooks had dedicated GPUs. It wasn't until the adoption of Intel processors that integrated GPUs were even an option for Apple, really.


>> it seems like they were worried that the Mac was increasingly marginalized as a platform for AR/VR development

As well as ML development, which is a pain w/o a local CUDA-compliant GPU. Yes, you could do it on the cloud, but sometimes it is easier to do ad-hoc stuff locally without having to spin up GPU machines and remember to spin them down.


Ding ding ding.

I use OS X for all software development except CUDA at the moment. This is huge.


Isn't lambda/cloud functions/serverless a suitable alternative for that pain point?


>It could happen, but from Apple's own announcements it seems like they were worried that the Mac was increasingly marginalized as a platform for AR/VR development, which Tim Cook constantly tells us (for AR at least...) is something he's super excited about for the future of computing.

Because it is. High end AR/VR is exclusively DirectX Wintel 64 based. There's pretty much no exception except for the toy implementation of things like Samsung Gear. Google abandoned Tango, and Daydream is on life support. Apple's ecosystem will remain irrelevant until they release their own hardware.


For software sure. VR video content production is still a thing as well though, and plenty of video creators use Apple platforms. I imagine it was no accident FCP gained VR support shortly after the eGPU support announcement.


The iPhone is their AR hardware.


And where do you develop and compile apps for those iPhones? ;)


It's very pricey, but Apple sells an 18-Core 2.3GHz Intel Xeon W in an iMac with a 27" 5K display and a Vega 64 GPU, and up to 128GB of 2666MHz DDR4 ECC memory.

So while that is not a cheap option, Apple does sell powerful hardware. That's a decent AR dev machine, surely.


On a 15” screen with cruddy video card options. And apparently it’s on your lap too.


Or on a 27in 5k display iMac.


In the PowerPC days, they didn't really even have the option of an integrated GPU though. Even Intel's GPU that was integrated into the north bridge didn't come along until '98. And it was more of a cheap option for desktop fleets than a power-conscious chip for laptops.


oops, you are correct


I'm skeptical of relying on eGPUs for high end VR. TB3 can easily choke the GPU and cause stuttering/framedrops, and missing any frames in VR is a recipe for nausea.

https://www.pcper.com/reviews/Graphics-Cards/External-Graphi...


If Tim Cook is actually concerned about the Mac being marginalized for AR/VR, he'd get a lot more bang for his buck out of not treating OpenGL and Vulkan like the red-headed stepchildren on macOS.


All middleware that matters in the VR space also supports Metal.


"... it seems like they were worried that the Mac was increasingly marginalized as a platform for AR/VR development ..."

If, by "worried", you mean: they went out of their way to smite and thrash the organic core of their customer base, ignored their most basic requests, and deleted five years from their product lifecycle (nobody used the trashcan mac pro) ... then yes, I suppose they might be worried.

Maybe they noticed how many copies of High Sierra are running on firmware-modded 2009/2010 mac pros ?


Nah, I think this is one of the best solutions as is. I have a MacBook Pro 13 that I carry around with me for software development. I also have a basic AMD 2600G setup with a GTX 650 for gaming. I only game at home or do deep learning. These Thunderbolt systems do exactly what I want: get rid of the need for an extra computer. I am thinking of selling that system and then buying one of the recommended setups.


This. My hope when I saw the Surface Pro 3 was that the middling eGPU (Thunderbolt 2) thing going on with MacBook Pros would eventually make its way to the Surface Pro line, where the Surface Pro essentially becomes a hub and could be the tablet/gaming-console-PC/desktop I had hoped for.

At this point, I'm just waiting on an updated MacBook Pro 15in (ready to buy in again, but too far into the current MBP product cycle. Might as well wait it out) and that Sonnet box and calling it a day.


I bought a dev kit when they became available, hoping to move my gaming from my 10-year-old PC to my 15" MBPr. What I figured out was that, despite the graphics performance being great, I was still hampered by the fact that the AAA games available on the Mac are still just Windows ports. Civ V and VI ran smoothly when scrolling around the screen, even at high detail and resolution, but actually performing actions, and waiting for the computer to process the AI players' turns, was abysmal.

Against everything I hold right and proper, I installed Windows 10 with Boot Camp. I only ever got the external display to work once, and couldn't repeat it.

I wound up just moving the RX580 to my aging PC as an upgrade.

I REALLY want to live in this world, but it's going to take natively-developed games to get there. But hey, they're coming. I'm buying them, and, often, not even playing them; just trying to send the message.


> Apple never builds out support like this unless they plan on having a product to sell.

Developers have been starting to leave the platform because there aren't any MBPs that can be used to create VR content. While I agree that they will likely release something, their hand was also forced here.


Not all developers care about VR, and Apple OSes were never very relevant for CAD or gaming development anyway, other than for those targeting iOS devices.

For those devices, even current hardware is good enough. There is no point being able to produce VR with a level of detail that an iPhone is not able to cope with.


> Apple OSes were never very relevant for CAD

“Development of ArchiCAD started in 1982 for the original Apple Macintosh.”

https://en.wikipedia.org/wiki/ArchiCAD


Relevant in terms of money and market share, not in terms of what started on the Mac.

Excel was also released initially on the Mac.


And well before that, VisiCalc was a big selling point for the Apple ][ back in the day.


> Developers have been starting to leave the platform because there aren't any MBPs that can be used to create VR content.

Developers have also been leaving the platform because they can’t buy a Mac laptop with physical function keys without making other performance sacrifices.


In many ways, the display is a more sensible place for it. This could be a good thing as long as they make the GPU component replaceable. There is such a big difference between the expected lifespan of an upmarket display and that of a performant GPU.


> they seem to generally "dislike" putting discrete GPUs in laptops, going all the way back to the Titanium G4 days.

The TiBook G4 had a discrete GPU. Not having one wasn't even an option until Intel's integrated doodads got good enough.


> I’m betting that we will see a GPU-backed external display

Does iOS show any signs of supporting external GPUs?


It would be great! My question is: Is it possible to have a GPU-backed external display and charge the laptop simultaneously with one wire?


Yes. I do exactly this today with a 1080ti in an Akitio Node Pro. Unfortunately NVIDIA cards still aren’t supported properly though, so my laptop crashes if I pull the thunderbolt cable out. I’m hoping this stuff gets ironed out over the next year or two - it’s a great solution otherwise.


Since that's the case today with eGPU Thunderbolt 3 docks, I would assume so.


PowerPC Macs never had "integrated" GPUs. If they didn't have a discrete chip they just had a dumb frame buffer.


While I find your prediction very plausible I could also imagine them just wanting to support 3rd party accessories.


Or an Augmented Reality / VR headset with AirPods support.


Aren't they completely out of the stand-alone display business now?


Not for long, probably thanks in part to the total shitshow that was/is LG's TB3 display.

"As part of doing a new Mac Pro — it is, by definition, a modular system — we will be doing a pro display as well. Now you won’t see any of those products this year [2017]; we’re in the process of that. We think it’s really important to create something great for our pro customers who want a Mac Pro modular system, and that’ll take longer than this year to do."

https://daringfireball.net/2017/04/the_mac_pro_lives


I've been making the same guess for years now.

Ever since hwtools (supplier of ExpressCard/mini PCIe breakout boards) said they were not allowed to release a Thunderbolt version, it was clear to me that this generation of the DMA-exposing peripheral interface was finally meant for external graphics processing. And who, oh who could be the prime customer?

... Who seems to have the slowest product pipeline in the world, blocking technology adoption for years?

Not too many candidates :)


Apple has a pretty decent summary of the feature in this support article: https://support.apple.com/en-us/HT208544

Seems to be limited to AMD GPUs at this time.


And limited to Macbook Pro. Damn, I was hoping they would include support for Macbooks.


Not exactly, here are the specifics:

eGPUs are supported on MacBook Pro notebooks released in 2016 and later, iMac computers introduced in 2017 and later, and iMac Pro.


MacBooks don't have a TB3 port.


Which is a shame. This is the only reason my wife still uses old 11" Mac Air. She has an Apple Thunderbolt screen which she doesn't want to replace and is unusable on the Macbook. Pros are just too big for her, so she keeps on using the old Air.


The next version of the MacBook will likely support Thunderbolt. Current Intel chipsets require a rather large (and power-hungry) chip for Thunderbolt (MBPs actually have two of them!), but it'll be integrated soon; they'll get it for 'free'.


The current MacBook is a similar size with a bigger screen, but wow, the dongle life is a horror.


It's not only a horror, it's idiotic and completely avoidable.

Some people say it's an upgrade, but I don't see why they couldn't upgrade and keep the other ports around, since people need those daily.

For sure I've used my 89-euro dongle a lot more than I've used the USB-C port (which is never), until I got sane again after 8 years of getting overcharged by Apple and switched to Microsoft hardware running Xfce Linux (which are both wonderful).


What dongle is €89!?


The one I bought with the laptop so that I could use my external monitor and plug in my cell phone.


This one is $69 and has USB C, HDMI, and USB A: https://www.apple.com/shop/product/MJ1K2AM/A/usb-c-digital-a.... Does it not fit your needs?


Why does it matter, and why is it so important to you? Does it change anything?

It was the only adapter I could buy at Juice in Florence, Italy (where I bought the laptop) that had HDMI at the time.

The laptop was EUR 1799 ($2,150), the dongle was EUR 89.

After 2-3 months the keyboard started failing. It was replaced by Apple (along with the battery, which they found out was faulty), and it started failing again (missed keystrokes, registering twice) after a short while. They replaced it _again_, then after 3 days the logic board died, taking all data with it. Apple replaced it, I sold that piece of shit of a laptop for EUR 1,000, and the guy called me after 2-3 days to say the logic board had died again and Juice (no Apple Store in his area) would give him a new one.

That was my about 5th and last Apple laptop.

I read lots of reports of these things happening to lots of people, but not getting a lot of attention anywhere for some reason, despite the fact that those things cost $2000 and you should absolutely be able to type for more than 2-3 months without replacing the keyboard.


> Why does it matter, and why is it so important to you? Does it change anything?

Yes. Your original point was something along the lines of you being overcharged by Apple because you now had to buy an expensive dongle. I showed that the expensive dongle you were talking about wasn’t actually as expensive as it seemed, which weakens your argument somewhat.

Regarding the keyboard on your laptop, I've heard plenty of anecdotes of it just stopping working on people, so I'll concede that it probably has some sort of inherent issue there. You seem to be unlucky with an extreme case of this.


When the third failure/warranty claim happens, you have a right to refuse the repair, return the item and ask for the money back (the same amount that is on the bill). So by selling to the third party, you took the loss.


And even the "third try" rule is the most lenient case for the seller here. Firstly, for mass-produced devices (not with BTO models!) you generally have the right to demand a new replacement instead of waiting weeks for the repair. And the "three tries" is actually only an upper limit (and AFAIK only explicitly stated in a few countries) - if you agree to a repair you can demand a reasonable deadline, which trumps any number of tries. So, no, you don't have to accept "repair -> 2 weeks -> failed -> another 2 weeks".

(All this under the assumption this happens in the first 6 months, when the law assumes a fault at the time of sale as a default and consumer rights are strongest)


Is there an online resource where this type of "good to know" EU pro-consumer workflow is explained in an easy-to-digest way?


In Europe..? I wasn't aware of that. I mentioned returning it to Apple and they said they could give you a new one at most, but only after multiple repairs for the same problem had failed.


Yes, in Europe. It happened to my brother, but with Asus (so they didn't try to talk him out of it). After the third warranty claim, he took the money and got a ThinkPad.


See my other post. They were bullshitting you (happens too often, sadly). Was this Apple themselves or a third-party seller?


Add a typical European VAT and the price is similar.


€89 is around $110. Is VAT really 60%?


I guess I meant that was an equivalent price for electronics.

Around 20% is normal. The USD has slipped further than I thought compared to the Euro, and prices are usually a bit higher here. There are various explanations:

* Higher staff wages

* Government-mandated warranty

* Willingness to pay more

The same product is €79 in Germany: https://www.apple.com/de/shop/product/MJ1K2ZM/A/usb%E2%80%91... (or ~87€ in Denmark!)


Jeeze, I regularly see these things left in conference rooms at my work. I could make a killing in the German resale market.


Presumably, reselling these would require you to apply VAT to your products…


You don't pay VAT twice. If anything, only on the margin.

Of course, if VAT is 60% like you seem to think... That would be a lot! ;-)



This is all you got from my comment..?


> Your original point was something along the lines of you being overcharged by Apple because you now had to buy an expensive dongle. I showed that the expensive dongle you were talking about wasn’t actually as expensive as it seemed, which weakens your argument somewhat.


It also works on 2017 iMacs and the iMac Pro, i.e. Macs with Thunderbolt 3.


And iMac + iMac Pro.


Any idea why this is limited to AMD GPUs?


Because years ago NVIDIA screwed up the thermal solution in their laptop chips so they’d get so hot they’d unsolder themselves inside MacBook Pros, and then gave someone important at Apple the middle finger instead of helping to deal with the issue. Ever since then Apple does not use NVIDIA GPUs.


Apple has used Nvidia GPUs many times since then. Whatever the reasons for Apple's vendor-hopping for GPUs, this isn't it.


Apple have used nVidia stuff a few times since the 8000 series solder screw-ups.

The 2010, 2012, 2013 + 2014 MacBook Pros, to name a few instances.


I would love to hear more details about this.


The fiasco was widely called "Bad Bumps" due to the issue being related to the solder used for the bumps on the underside of their chips.

It was a huge deal affecting major computer vendors like Apple, Dell, HP etc at the time, with failure rates on laptops with these chips being ridiculously high. I've also long wondered if this played a part in whatever reason Apple have for avoiding Nvidia GPUs for so long.

> https://semiaccurate.com/2010/07/11/why-nvidias-chips-are-de...

> http://www.tomshardware.com/news/NVIDIA-GPU-Chipset-Graphics...

Not usually a huge fan of theinquirer's tabloid style, but they did a pretty good job reporting on the issue back then.

> https://www.theinquirer.net/inquirer/news/1047022/apple-note...


This was pretty well publicized circa 2008 when the events took place.


2012 and 2013 15" Macbook Pros used NVidia. I think Apple likes to alternate every generation or two, if only to keep their options open.


2010 MBP does, for sure. I need to disable that one, and use the Intel GPU instead. Using the Nvidia GPU causes random reboots.


Or maybe 2012. It’s been a while.


macOS doesn't currently have Nvidia drivers from this generation (from what I remember). It supports a handful of mobile cards and no more, which is the extent to which Nvidia has been implemented in Apple computers as of late.

Meanwhile, the past few generations of graphics gear in Apple computers has been a pretty wide range of AMD stuff. Vega, Polaris, FirePro have all been represented in the last 5 or so years. All they had to do was modify the driver to be able to read from the TB3 PCIe device rather than just an internal one.


... but Nvidia has their own drivers for even the most recent graphics cards that work perfectly fine. Writing this from an OSX setup with a GTX 1080.


Perfectly? The latest drivers lag like crazy


The drivers were great for a GTX 650Ti in 10.8 and 10.9. They worked fine for games like Borderlands 2 and Arkham Asylum.

The drivers are terrible for a GTX 1060 in 10.12. After running for a while, even basic desktop operations bog down to the point of being unusable. I haven't tried 10.13 yet.


This is unfortunately a known issue, at least in the community if not by nVidia. The current workaround is to patch an earlier known-good driver version to work on whatever release of High Sierra you're on. I'm currently running 387.10.10.10.25.106 on 10.13.3 without lag.

https://github.com/Benjamin-Dobell/nvidia-update


Interesting that some people have no lags. I have also used that script, but stutters are still present.


I might still be using Sierra after hearing about all the issues with High Sierra. I don't want to spend a few days configuring my Hackintosh again!


I take it this is a hackintosh? Are you able to use it ok without signing into any iCloud services, or have you found a way around that?


AFAIK you were already able to use external GPUs via 3rd party thunderbolt enclosures, it just wasn't officially supported.


iCloud works perfectly fine on Hackintoshes.


I logged into one on Sierra. Every iCloud service I use got locked down over the next 20 minutes and it wasn’t fun getting them back. Turns out I’m rather locked into the Apple ecosystem. Somehow Apple detect hardware configuration and check it on login. The mess it left wasn’t pretty.


I can confirm that Sierra and El Capitan do have Nvidia drivers for the Pascal series.


Are you sure? I have a gtx 970 working on the maxwell drivers on el cap but no luck on pascal.


Apple hasn't released a product with an NVIDIA GPU for years. Also, driver support for NVIDIA GPUs on MacOS is not very good compared to AMD GPU driver support on MacOS.


Nvidia driver support is fine, generally speaking. It's just that Apple chooses not to bundle drivers OOTB in the OS, but it's a trivial install from Nvidia's website. You can drop a brand new 1080 Ti into an old Mac Pro today and it will work great. Hackintosh users have also taken advantage of this for ages.


In fact, Nvidia cards are the only ones that are recommended for hackintoshes for OOB compatibility [1]. IME on Mac OS 10.12, getting an Nvidia GTX 1060 to behave properly has been much easier than an AMD RX 480.

[1] https://www.tonymacx86.com/buyersguide/march/2018/#Graphics_....


That's definitely not true since at least summer last year (the release date of WhateverGreen [1]). The drivers for Nvidia cards are currently pretty bad and are only getting worse (they used to work acceptably on Sierra). For example, Pascal-based Nvidia GPUs on High Sierra often get something like 50% of the performance they get on Windows, and they also cause the system to stutter. There are also many other problems with using them. For instance, you cannot update the system (even small updates) until Nvidia releases new drivers, which sometimes takes a few days.

While with AMD, the drivers for RX460/470/480/560/570/580/Vega are already built-in and you only need to correctly inject them. You just need to use one kext (kind of a driver in Mac) called WhateverGreen (to be fully correct you also need Lilu, but you almost always have it anyway on your Hackintosh). It works without issues and is generally preferred among Hackintosh community.

Another thing is that tonymacx builds are generally bad if you want a stable and easy to build Hackintosh (at least as easy as it can get). They are being sponsored and their choice of hardware is strongly biased.

[1] https://github.com/vit9696/WhateverGreen


I hadn't heard that criticism of tonymacx86 before. Could I ask which site (or sites) you recommend as less biased / more straightforward builds than tonymacx?


Wow I had no idea that tool existed. Getting hardware acceleration working was the main reason I replaced my R9-390x with a GTX1070.


Because AMD GPUs are the only discrete GPUs Apple supports on current models?


Is it true that they disabled support for TB2 eGPU and in doing so broke many people's current setups (on pre-2016 hardware)?


There are reports from 2 weeks ago that it was removed from the 10.13.4 beta.[0] I haven't seen reports about the status in the release.

[0]: https://appleinsider.com/articles/18/03/14/thunderbolt-thund...


Did anyone file a bug about this in Radar?


I've been using an eGPU for a few months now and it's fantastic. It's a bit annoying to deal with all of the workarounds for libraries like pytorch/tensorflow so you can use the latest versions, but other than that it's great.


Can you point to a blog post or any other link describing a setup for deep learning?


So I didn't really follow any blog post or anything; there are a lot of gists about setting it up, but they go out of date fairly quickly and are usually specific to the author's setup.

I would checkout: https://egpu.io/forums/mac-setup/wip-nvidia-egpu-support-for...

A lot of people have put in a lot of work to make it as easy as possible to set up. I would just make sure you set things up one at a time and don't immediately jump to trying to get TF or pytorch to work right after installing the drivers / following any sort of guide. Verify your CUDA installation first by building the sample programs and running them. (see http://docs.nvidia.com/cuda/cuda-installation-guide-mac-os-x...)
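
For a quick sanity check once the drivers and CUDA look good, something like this (just a sketch, assuming PyTorch was installed with CUDA support) should see the card:

    import torch

    # Should print True with the eGPU attached and the drivers loaded,
    # then the name of the card sitting in the enclosure.
    print(torch.cuda.is_available())
    if torch.cuda.is_available():
        print(torch.cuda.get_device_name(0))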

Other than that maybe checkout: https://gist.github.com/jganzabal/8e59e3b0f59642dd0b5f2e4de0... https://gist.github.com/smitshilu/53cf9ff0fd6cdb64cca69a7e28...

The main thing is just getting the GPU drivers setup, after that installing tensorflow requires some modifications to the source (relatively trivial find and replace), and I don't think you have to do anything special for the latest version of pytorch.

Other tips:

1. Make sure you're using a Thunderbolt 3 cable with whatever eGPU you have; other cables will not work even if they're USB-C. (USB 3.1 != Thunderbolt 3 != USB-C.) Read about the differences.

2. I would recommend the Akitio Node enclosure, as it seems like most people use it and the community is really small already, so if you aren't using it, it might be more difficult to debug issues; but I wouldn't say you couldn't use something else.

3. You'll want to have a docker container that has the CPU version of tensorflow/whatever so you can use that when you don't have the GPU readily available, as tensorflow won't work if you installed it with GPU support and your GPU isn't there.

4. If you're trying to use nvidia-docker, I'm not sure if anyone has gotten that to work on Mac because device binding isn't supported on Mac for Docker. You might be able to get this to work by modifying the Docker VM, but I'm not sure.


Thanks for the detailed response :)


No problem (:


CUDA support is pretty critical for a lot of media production. I'm thankful that official support is here. The downside is that a Mac Pro looks like a rat's nest of cables with external video I/O cards, 10GbE cards, and eGPUs. It's a much larger physical footprint than a tower, with a lot more technical troubleshooting, like making sure a crappy Thunderbolt 2 cable is seated correctly.


NVIDIA cards and thus CUDA are still not officially supported.


I have an older MacBook with an nVidia card specifically for CUDA development. I get my drivers from nVidia. Could this work the same way?


Yes, external NVIDIA GPUs already work this way and will continue to do so. This announcement is about official support which only applies to a few specific configurations. The only GPUs supported are Polaris/Vega and the only Macs supported are 2016+ MBPs, 2017+ iMacs, and the iMac Pro.

https://support.apple.com/en-us/HT208544


Can one not use a docking station and hide all the stuff?


So nVidia cards are indeed supported?


Apple doesn't want to release a mid-priced ($2000?) desktop for content creators, as that would cannibalize the market of the $8000 Mac Pro and iMac Pro.

Now they seem to be realizing they're losing that segment of the market to the PC side of things, and that people in that segment are often thought leaders. If content creators have PCs, content usually ends up working best on a PC.


Don't think anyone who moved would be wanting to move back just because Apple now supports a GPU dongle.


On the MacOS install page for TensorFlow, we have this goodie:

Note: As of version 1.2, TensorFlow no longer provides GPU support on macOS.

Will eGPU support help change that?

https://www.tensorflow.org/install/install_mac


Probably not, since Nvidia eGPUs are not currently supported by Apple.

Although, it is possible to compile TensorFlow from source with CUDA support for macOS. You have to make a few tweaks to the source code and symlink some libraries.
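
Once such a build is installed, a quick way to confirm it actually sees the card (a sketch, assuming a TF 1.x source build):

    from tensorflow.python.client import device_lib

    # A working CUDA-enabled build should list at least one device
    # with device_type == "GPU".
    print([d.name for d in device_lib.list_local_devices()
           if d.device_type == "GPU"])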


Can anyone comment on whether this will be useful for training neural networks?


So I've been using an eGPU for about 3 months now and it is amazing. This isn't officially supported, so you end up having to do a lot of workarounds to get things like tensorflow/pytorch working.

It doesn't let you hot-plug your eGPU into your computer (you have to restart for it to work), but for officially supported eGPUs you're now able to do that.

I've trained various models using a Titan XP and it's so awesome to be able to maintain the portability of your laptop and still get all of that power. A large benefit is also not having to move training data around between servers and other machines if you have it on an external drive or just on your laptop.

After you get everything up and running there isn't that much maintenance or anything you have to do regularly to keep it working. The speedup is incredible and it was definitely worth it for me personally, however it's not a walk in the park to get it set up initially.


Neat. Have you tried it with any games?


I've tried it with an external monitor which works great but I haven't tried it with any games. From looking at the eGPU forums it seems like people aren't running into that many problems with it.


IMO it's more cost-effective and stable to just buy a Ryzen 2400G system (the same cost as a good eGPU case), put Linux on it and use it for training at full speed. TB3 has 4x lower throughput than PCIe x16, so if you need to transfer a lot of data (any meaningful non-toy model these days), it's better to stick with an internal GPU. Not to mention Linux is the first-class/preferred platform for many frameworks like TensorFlow etc.


I say for these sorts of applications, why bother with a Mac at all? Just go with Ryzen, or wait for Ryzen 2 (out this month), or go full Threadripper or Epyc / Rome.


No. Officially, this doesn't support NVIDIA cards yet, which are a requirement for any modern DL framework.

There are beta NVIDIA drivers (http://www.nvidia.com/download/driverResults.aspx/131833/en-...) for macOS, but your mileage will vary.


These aren’t beta drivers. They’ve been released for quite some time and are updated with (almost) every macOS release. However I’d say they’re pretty barebones as Nvidia doesn’t seem to care too much about the Mac at this point.


The notes say "Includes BETA support for iMac and MacBook Pro systems with NVIDIA graphics." (specifically, Pascal drivers)

The reason for that is Apple hasn't shipped an iMac/MacBook Pro with an NVIDIA card in years, which is especially annoying for DL work.


In practice they work just fine, it's just very annoying to initially set them up and you have to do some patching to get the latest version of tensorflow working with it. I use one every day and I haven't run into any problems while running models with both pytorch and tensorflow.


There is a project working on this: https://github.com/plaidml/plaidml


Yes, it will allow you to use a much more powerful graphics card, as long as your training isn't limited by bus bandwidth.


Only if the software you have supports it. For example, TensorFlow hasn't supported GPUs on OS X since version 1.2.


I have an Akitio Node - a Thunderbolt 3 eGPU enclosure which says it accepts full-size cards and works with High Sierra with AMD GPUs. (https://www.akitio.com/expansion/node)

What is a good budget GPU for someone who just wants to experiment with GPUs, e.g. OpenCL or some machine learning or deep learning?

If I didn't have a budget what would a better card be? :-)


There currently really are no good cards to get from AMD because of the inflated prices caused by miners. I've heard that Nvidia GPUs work just fine with the Nvidia drivers installed. I'd totally recommend AMD any time of the year, but you cannot get their cards at MSRP anywhere.

AMD has the following models (mid-range, kinda high-end): RX 540-580 and RX Vega 56 and 64. The RX 580 has an MSRP of around $200, but you can only get them for $400+. Same problem for the other models.

Nvidia has the following ones (mid-range, high-end): GTX 1050-1080, plus Ti models meaning a boost in power. You might be able to pick up a GTX 1060 for $200-300, which is still more than the $200 MSRP, but that's what you can get.


A GTX 1060 would still be useful for someone who wants to learn more about GPU programming, practice and start working on ideas they might have.


I have a crazy question: why don't we put the big GPUs in the monitor instead of putting them in the laptop? It is OK to have a small, low-performance GPU in the laptop like the one I have in my MBP, but instead of buying an external GPU I would much rather buy a large screen with a GPU in it. Maybe it is just me.


Unfortunately, while that's a pretty neat solution, it's been the case over the past 10-15 years that the GPU market has moved at a much faster pace than the display market.

So the main thing that would worry me is ending up with a monitor that stays capable far longer than the GPU that's integrated within it. Like an inverse Mac Pro - where currently the internals are sound for the majority of use cases, with the exception of the GPUs, which are quite poor performers at that price point.

Now we seem to be in pretty good shape to accomplish such a solution - the technologies are there, with Thunderbolt 3 offering enough performance to accommodate a high-end GPU without too much performance loss.


And if you have 2 screens?


What is the point of this if it doesn't support any GPUs with an OSX driver? I was under the impression that PCI-E was supported by TB3 - the little squabble between Nvidia and Apple needs to stop.


As others have mentioned in this thread, Nvidia provides drivers for their graphics cards on macOS.


PSA: If you are using a DisplayLink-based device, this update will break your setup. They have a beta driver on their site that will re-enable clone mode, but not extended desktops, etc. Throughout the 10.13.4 beta, the rumor was that changes to support eGPUs broke something they relied on.

Edit: 1.0.13.4 is not a version of macOS.


Very cool! Does anyone know if you can do SLI within any external enclosures with Mac or is that a pipe dream given there is no official driver?


I've always been curious about this. How can I try it out if I already own a GPU?


You will need to buy a TB3 eGPU enclosure. These range anywhere from $200 to $500 and typically come without a GPU. There are some more expensive options (like the Aorus Gaming Box with a GTX 1070/1080 or RX 580) that already come with a GPU installed.

Here is a nice comparison of the eGPU enclosures available today: https://egpu.io/external-gpu-buyers-guide-2018/

My personal recommendations are the Akitio Node Standard/Pro or Sonnet Breakaway Box 350/550/650. These companies are reputable companies in the Thunderbolt hardware realm.

Note that I am not affiliated with any of the companies mentioned above.


You'd have to buy an eGPU case, along with a GPU. Apple recommends Sonnet case with AMD GPUs [0].

However, as seen here [1], people have managed to use other GPU cases, with NVidia cards.

0: https://support.apple.com/en-us/HT208544

1: https://egpu.io/build-guides/


As others mentioned, you will need a case. You will, however, also need a device with TB3.


You would need an external Thunderbolt 3 enclosure for your GPU


Mac VR, here we come. About time; waiting this long will go down as one of their bigger errors.


Are you able to use an external GPU to drive a MacBook Pro's display?


Do you mean loopback mode?


Yes, thank you, now I know the correct term.


Is there something like this available for Linux and Windows PCs?


Yes, Mac is the last to support it


Yeah, for a while now on Windows PCs.


Previously


"How to downgrade your 1080Ti to 1060 by paying additional $400"


Why would this downgrade?


The OP is being a little disingenuous, because I'm not sure Nvidia is supported anyway, but you do lose some performance (not enough to go from a 1080 Ti to a 1060). Something like 5-10%, depending on whether you send the video signal to an external monitor or back to your laptop's screen.


More like 20-30% on high end cards. But it's also very workload-dependent.

https://egpu.io/forums/mac-setup/pcie-slot-dgpu-vs-thunderbo...


For gaming. Also don't forget the annoying latency. But if you need to transfer to/from GPU memory all the time, like with large Deep Learning training datasets, you downgrade your GPU significantly; that's why I wrote 1080Ti -> 1060. You already have high-end PCIe GPUs starved and waiting on memory transfers in Deep Learning all the time.

You could have observed something similar with e.g. SATA 1/2/3 SSDs and M.2 PCIe. For normal workloads like booting the OS, each new generation performed slightly better. But once you went into processing 4K RAW video, only M.2 PCIe was usable.
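
If you want to see how much the bus hurts your own workload, here's a rough host-to-device copy probe (a sketch, assuming PyTorch with a working CUDA setup; absolute numbers will vary with enclosure and cable):

    import time
    import torch

    x = torch.randn(256, 1024, 1024)   # ~1 GiB of float32 data on the host
    _ = torch.randn(1).cuda()          # warm up the CUDA context first
    torch.cuda.synchronize()

    start = time.time()
    y = x.cuda()                       # copy to the (e)GPU over the bus
    torch.cuda.synchronize()
    print("%.2f GB/s host->device" % (x.numel() * 4 / (time.time() - start) / 1e9))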


Heh, why make another Mac Pro when they can sell an eGPU dongle?


I wanted external GPU support while traveling to be able to do deep learning on the road. Too bad this is AMD-only support.



