
MacOS finally gains external GPU support - harshgupta
https://techcrunch.com/2018/03/30/macos-finally-gains-external-gpu-support/
======
chx
There's an amazing website and community for anyone interested in external
GPUs at [https://egpu.io/](https://egpu.io/). For both Mac and PC you will
find guides, explanations, etc.

There's also an eGPU subreddit, and I wrote a sticky on U-series chips, PCIe
lanes, and Thunderbolt 3
[https://www.reddit.com/r/eGPU/comments/7vb0gg/u_series_chips...](https://www.reddit.com/r/eGPU/comments/7vb0gg/u_series_chips_pcie_lanes_and_thunderbolt_3/)
which is more relevant to PCs than Macs.

~~~
doomlaser
I've heard that external GPUs have too much latency to be that useful for a
lot of applications.

What I'd like to see is a Mac with a nice big fat Nvidia GPU. Why can't I buy
a Mac with a 1080 Ti? My suspicion is that Apple wants to get into the GPU
game themselves and bring some of the expertise from their mobile GPU team to
laptops and desktops.

That'd be great, if they actually released something competitive. So far, they
haven't, and Mac users, who would love to pay for better technology, are left
out in the cold as a result.

~~~
Baeocystin
>My suspicion is that Apple wants to get into the GPU game themselves

My suspicion is that they barely even want GPUs, which they seem to see as
annoying sources of heat and noise getting in the way of their ultimate
vision of an iMac so thin you could use it as a kitchen knife.

(I wish I were kidding about this, but I am not.)

~~~
doomlaser
Their investment in Metal and their own mobile GPU hardware suggests to me
that they do, and that they want to own the technology from top to bottom.

[https://developer.apple.com/metal/](https://developer.apple.com/metal/)

[https://techcrunch.com/2017/09/12/the-new-iphone-8-has-a-cus...](https://techcrunch.com/2017/09/12/the-new-iphone-8-has-a-custom-gpu-designed-by-apple-with-its-new-a11-bionic-chip/)

~~~
Baeocystin
Are mobile GPU efforts relevant to macOS and desktop GPU work?

~~~
doomlaser
They will be if they start putting them in Macs.

~~~
Baeocystin
I think we agree on that. I'm certain they will. But that also agrees with my
initial statement, no?

Apple wants thin and light. Mobile parts are both, and the thought of
designing a system that can cope with the 200W+ draw of desktop GPU parts
doesn't even factor into their thinking.

They're tossing a bone to the users who need actual heavy-duty GPU power by
talking about the ability to use external cards, but (in my opinion)
considering the cost and performance issues that come with that approach, it
is more an admission that they truly don't care about that segment of the
market than an actual solution.

~~~
doomlaser
[https://www.apple.com/imac-pro/specs/](https://www.apple.com/imac-pro/specs/)
— released just a few months ago.

The AMD Vega 64 has got to push a few hundred watts.

Microsoft has the Surface Book 2 with a GTX 1060, and they must be looking at
that as competition for the high-end MacBook Pro.

~~~
Baeocystin
I did a little poking around out of curiosity.

According to: [https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-...](https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/20)

Vega 64 has a board power rating of 295W.

According to the iMac Pro tech specs from the link you provided, it has an
idle draw of 64W and a peak draw of 370W for the entire system.

Unless I'm mistaken, that seems to imply that there has to be a good deal of
throttling between the CPU and GPU to come in at that power budget. (The specs
also note an additional 50W(!) of potential draw for the fans alone if the
iMac is operated in a warm environment.)
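
To put numbers on it, here's a quick back-of-the-envelope sketch (the 370W
and 295W figures are from the links above; the Xeon W TDP is my assumption):

    # iMac Pro power budget, back of the envelope
    SYSTEM_PEAK_W = 370   # max system draw, per Apple's tech specs
    VEGA64_BOARD_W = 295  # Vega 64 board power, per AnandTech
    XEON_W_TDP = 140      # assumed TDP for the iMac Pro's Xeon W

    headroom = SYSTEM_PEAK_W - VEGA64_BOARD_W
    print(f"Left for CPU, RAM, SSD, fans: {headroom} W")    # 75 W

    shortfall = XEON_W_TDP + VEGA64_BOARD_W - SYSTEM_PEAK_W
    print(f"Shortfall at full CPU+GPU load: {shortfall} W")  # 65 W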

Am I mistaken?

Not arguing, just hashing out my understanding. Your point regarding the
Surface Book 2 and the 1060 makes sense. It does look like I was technically
incorrect about them not engineering to a 200+ watt GPU, although I do wonder
what sustained draw it can handle thermally compared to a full desktop part,
and how much that would affect performance.

~~~
chx
> Vega 64 has a board power rating of 295W.

John from Mantiz reported a 650W peak. Surely only for milliseconds, but still.
[https://egpu.io/forums/implementation-guides/2014-mac-mini-v...](https://egpu.io/forums/implementation-guides/2014-mac-mini-vega-rx-64-akitio-node-apple-tb3-tb2-adapter-macoswin10-rolfl/#post-18787)
It is absolutely not in the interest of an eGPU chassis manufacturer to report
figures which hinder their own sales, so I believe him.

Similarly, someone with an awful lot of AMD knowledge (an AMD employee?) on the
egpu.io forums pretty much begged people not to try the Vega 64 in a Sonnet
550, hinting at similar problems
[https://egpu.io/forums/thunderbolt-enclosures/sonnet-says-th...](https://egpu.io/forums/thunderbolt-enclosures/sonnet-says-that-350550-will-not-support-vega/#post-22333)
and saying a solution was in the works.

Sonnet
[http://www.sonnettech.com/support/kb/kb.php?cat=524&expand=_...](http://www.sonnettech.com/support/kb/kb.php?cat=524&expand=_a2&action=b914#b914)
says the Vega 64 is only supported in the 650 box, which was just released
pretty much out of the necessity to support the Vega 64. Something is rotten
in Denmark: a 650W power supply to support a card rated at up to 375W (75W
from the slot plus two 8-pin connectors at 150W each), plus an additional
100W of peak power.

------
Eric_WVGG
This is speculation, but I’m betting that we will see a GPU-backed external
display (sort of an iMac minus the Mac) sometime in the next year. Apple never
builds out support like this unless they plan on having a product to sell.

And they seem to generally "dislike" putting discrete GPUs in laptops, going
all the way back to the Titanium G4 days.

~~~
giobox
It could happen, but from Apple's own announcements it seems like they were
worried that the Mac was increasingly marginalized as a platform for AR/VR
development, which Tim Cook constantly tells us (for AR at least...) is
something he's super excited about for the future of computing. I'm not
arguing this is the sole reason, but it definitely seems like a feature meant
to cover high-end edge cases like this for the time being.

Much of Apple's own discussions on the eGPU feature at WWDC/the marketing
blurb for their own eGPU devkit refer to VR a lot too. Apple added VR support
to Final Cut Pro at roughly the same time as well.

> And they seem to generally "dislike" putting discrete GPUs in laptops, going
> all the way back to the Titanium G4 days.

Not so sure I'd agree with this - every single Titanium PowerBook model ever
made had a discrete GPU. Even the iBooks had dedicated GPUs. It wasn't until
the adoption of Intel processors that integrated GPUs were even really an
option for Apple.

~~~
TuringNYC
>> it seems like they were worried that the Mac was increasingly marginalized
as a platform for AR/VR development

As well as ML development, which is a pain without a local CUDA-capable GPU.
Yes, you could do it in the cloud, but sometimes it is easier to do ad-hoc
stuff locally without having to spin up GPU machines and remember to spin them
down.

~~~
15155
Ding ding ding.

I use OS X for all software development _except_ CUDA at the moment. This is
huge.

------
ovao
Apple has a pretty decent summary of the feature in this support article:
[https://support.apple.com/en-us/HT208544](https://support.apple.com/en-us/HT208544)

Seems to be limited to AMD GPUs at this time.

~~~
simongray
And limited to the MacBook Pro. Damn, I was hoping they would include support
for MacBooks.

~~~
navidfarhadi
MacBooks don't have a TB3 port.

~~~
tomca32
Which is a shame. This is the only reason my wife still uses her old 11"
MacBook Air. She has an Apple Thunderbolt Display which she doesn't want to
replace and which is unusable with the MacBook. The Pros are just too big for
her, so she keeps on using the old Air.

~~~
lostlogin
The current MacBook is a similar size with a bigger screen, but wow, is the
dongle life a horror.

~~~
nkkollaw
It's not only a horror, it's idiotic and completely avoidable.

Some people say it's an upgrade, but I don't see why they couldn't upgrade and
keep the other ports around, since people need those daily.

I've certainly used my 89-euro dongle a lot more than I've used the USB-C port
itself (which is to say, never). Then I got sane again after 8 years of being
overcharged by Apple and switched to Microsoft hardware running Xfce on Linux
(both of which are wonderful).

~~~
saagarjha
What dongle is €89!?

~~~
nkkollaw
The one I bought with the laptop so that I could use my external monitor and
plug in my cell phone.

~~~
saagarjha
This one is $69 and has USB C, HDMI, and USB A:
[https://www.apple.com/shop/product/MJ1K2AM/A/usb-c-digital-a...](https://www.apple.com/shop/product/MJ1K2AM/A/usb-c-digital-av-multiport-adapter).
Does it not fit your needs?

~~~
nkkollaw
Why does it matter, and why is it so important to you? Does it change
anything?

It was the only adapter I could buy at Juice in Florence, Italy (where I
bought the laptop) that had HDMI at the time.

The laptop was EUR 1,799 ($2,150); the dongle was EUR 89.

After 2-3 months the keyboard started failing. It was replaced by Apple, along
with the battery, which they found was also faulty, and it started failing
again (missed keystrokes, keys registering twice) after a short while. They
replaced it _again_, and then after 3 days the logic board died, taking all my
data with it. Apple replaced it, I sold that piece of shit of a laptop for
EUR 1,000, and the buyer called me after 2-3 days to say the logic board had
died again and Juice (no Apple Store in his area) would give him a new one.

That was about my fifth, and last, Apple laptop.

I've read lots of reports of these things happening to lots of people, but for
some reason they don't get much attention anywhere, despite the fact that
these things cost $2,000 and you should absolutely be able to type for more
than 2-3 months without replacing the keyboard.

~~~
vetinari
When the third failure/warranty claim happens, you have the right to refuse
the repair, return the item, and ask for the money back (the same amount that
is on the bill). So by selling to a third party, you took the loss.

~~~
nkkollaw
In Europe..? I wasn't aware of that. I mentioned returning it to Apple and
they said they could give me a new one at most, but only after multiple
repairs for the same problem had failed.

~~~
vetinari
Yes, in Europe. It happened to my brother, but with Asus (so they didn't try
to talk him out of it). After the third warranty claim, he took the money and
got a ThinkPad.

------
mediocrejoker
Is it true that they disabled support for TB2 eGPUs, and in doing so broke
many people's current setups (on pre-2016 hardware)?

~~~
grzm
There are reports from 2 weeks ago that it was removed from the 10.13.4
beta.[0] I haven't seen reports about the status in the release.

[0]: [https://appleinsider.com/articles/18/03/14/thunderbolt-thund...](https://appleinsider.com/articles/18/03/14/thunderbolt-thunderbolt-2-egpu-compatibility-purged-from-macos-high-sierra-beta-5)

------
michaelgreen
I've been using an eGPU for a few months now and it's fantastic. It's a bit
annoying to deal with all of the workarounds for libraries like
pytorch/tensorflow so you can use the latest version, but other than that it's
great.

~~~
nafizh
Can you point to any blog post or any other link of a setup description for
deep learning?

~~~
michaelgreen
I didn't really follow any blog post or anything; there are a lot of gists
about setting it up, but they go out of date fairly quickly and are usually
specific to the author's setup.

I would check out:
[https://egpu.io/forums/mac-setup/wip-nvidia-egpu-support-for...](https://egpu.io/forums/mac-setup/wip-nvidia-egpu-support-for-high-sierra/)

A lot of people have put in a lot of work to make it as easy as possible to
set up. I would just make sure you set things up one at a time and don't
immediately jump to trying to get TF or pytorch to work right after installing
the drivers or following any sort of guide. Verify your CUDA installation
first by building the sample programs and running them (see
[http://docs.nvidia.com/cuda/cuda-installation-guide-mac-os-x...](http://docs.nvidia.com/cuda/cuda-installation-guide-mac-os-x/index.html)).
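
Once the drivers and CUDA toolkit check out, a quick sanity check from Python
confirms the frameworks can actually see the card. A minimal sketch, assuming
a CUDA-enabled TensorFlow 1.x build and/or PyTorch are already installed:

    # Check that the eGPU is visible to the frameworks
    # (skip whichever one you don't have installed).
    try:
        from tensorflow.python.client import device_lib
        gpus = [d.name for d in device_lib.list_local_devices()
                if d.device_type == "GPU"]
        print("TensorFlow sees GPUs:", gpus or "none")
    except ImportError:
        print("TensorFlow not installed")

    try:
        import torch
        if torch.cuda.is_available():
            print("PyTorch sees:", torch.cuda.get_device_name(0))
        else:
            print("PyTorch: CUDA not available")
    except ImportError:
        print("PyTorch not installed")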

Other than that, maybe check out:
[https://gist.github.com/jganzabal/8e59e3b0f59642dd0b5f2e4de0...](https://gist.github.com/jganzabal/8e59e3b0f59642dd0b5f2e4de03c7687)
[https://gist.github.com/smitshilu/53cf9ff0fd6cdb64cca69a7e28...](https://gist.github.com/smitshilu/53cf9ff0fd6cdb64cca69a7e2827ed0f)

The main thing is just getting the GPU drivers set up. After that, installing
tensorflow requires some modifications to the source (a relatively trivial
find and replace), and I don't think you have to do anything special for the
latest version of pytorch.

Other tips:

1. Make sure you're using a Thunderbolt 3 cable with whatever eGPU you have;
other cables will not work even if they're USB-C. (USB 3.1 != Thunderbolt 3 !=
USB-C.) Read about the differences.

2. I would recommend the Akitio Node enclosure, as it seems like most people
use it and the community is really small already, so if you use something else
it might be more difficult to debug issues. But I wouldn't say you couldn't
use anything else.

3. You'll want a docker container with the CPU version of tensorflow/whatever,
so you can use that when you don't have the GPU readily available; tensorflow
won't work if you installed it with GPU support and the GPU isn't there.

4. If you're trying to use nvidia-docker, I'm not sure anyone has gotten that
to work on Mac, because device binding isn't supported by Docker for Mac. You
might be able to get it to work by modifying the Docker VM, but I'm not sure.

~~~
nafizh
Thanks for the detailed response :)

~~~
michaelgreen
No problem (:

------
imagetic
CUDA support is pretty critical for a lot of media production, so I'm thankful
that official support is here. The downside is that a Mac Pro looks like a
rat's nest of cables with external video I/O cards, 10GbE cards, and eGPUs.
It's a much larger physical footprint than a tower, with a lot more technical
troubleshooting that comes down to making sure a crappy Thunderbolt 2 cable is
seated correctly.

~~~
opencl
NVIDIA cards and thus CUDA are still not officially supported.

~~~
waynecochran
I have an older MacBook with an Nvidia card specifically for CUDA development.
I get my drivers from Nvidia. Could this work the same way?

~~~
opencl
Yes, external NVIDIA GPUs already work this way and will continue to do so.
This announcement is about official support, which only applies to a few
specific configurations: the only GPUs supported are Polaris/Vega, and the
only Macs supported are 2016+ MBPs, 2017+ iMacs, and the iMac Pro.

[https://support.apple.com/en-us/HT208544](https://support.apple.com/en-us/HT208544)

------
nikanj
Apple doesn't want to release a mid-priced ($2,000?) desktop for content
creators, as that would cannibalize the market for the $8,000 Mac Pro and iMac
Pro.

Now they seem to be realizing that they're losing that segment of the market
to the PC side of things, and that people in that segment are often thought
leaders. If content creators have PCs, content usually ends up working best
on PCs.

~~~
whywhywhywhy
I don't think anyone who moved would want to move back just because Apple now
supports a GPU dongle.

------
euroclydon
On the macOS install page for TensorFlow, we have this goodie:

Note: As of version 1.2, TensorFlow no longer provides GPU support on macOS.

Will eGPU support help change that?

[https://www.tensorflow.org/install/install_mac](https://www.tensorflow.org/install/install_mac)

~~~
steve_musk
Probably not, since Nvidia eGPUs are not currently supported by Apple.

That said, it is possible to compile TensorFlow from source with CUDA support
on macOS. You have to make a few tweaks to the source code and symlink some
libraries.
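
If you go that route, it's worth verifying that the wheel you built actually
has GPU support compiled in. A minimal sketch using the TF 1.x test helpers:

    # Verify a from-source TensorFlow build was compiled with CUDA
    # and can reach the GPU at runtime.
    import tensorflow as tf

    print("Built with CUDA:", tf.test.is_built_with_cuda())
    print("GPU available:", tf.test.is_gpu_available())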

------
libdjml
Can anyone comment on whether this will be useful for training neural
networks?

~~~
michaelgreen
So I've been using an eGPU for about 3 months now and it is amazing. This
isn't officially supported, so you end up having to do a lot of workarounds to
get things like tensorflow/pytorch working.

My setup doesn't let you hot-plug the eGPU into your computer (you have to
restart for it to work), but with officially supported eGPUs you're now able
to do that.

I've trained various models using a Titan XP and it's so awesome to be able to
maintain the portability of your laptop and still get all of that power. A
large benefit is also not having to move training data around between servers
and other machines if you have it on an external drive or just on your laptop.

After you get everything up and running there isn't that much maintenance or
anything you have to do regularly to keep it working. The speedup is
incredible and it was definitely worth it for me personally, however it's not
a walk in the park to get it set up initially.

~~~
dmix
Neat. Have you tried it with any games?

~~~
michaelgreen
I've tried it with an external monitor, which works great, but I haven't tried
it with any games. From looking at the eGPU forums, it seems like people
aren't running into many problems with it.

------
jason_slack
I have an Akitio Node, a Thunderbolt 3 eGPU enclosure which says it accepts
full-size cards and works with High Sierra with AMD GPUs.
([https://www.akitio.com/expansion/node](https://www.akitio.com/expansion/node))

What is a good budget GPU for someone who just wants to experiment with GPU
programming, like OpenCL, or some machine/deep learning?

And if I didn't have a budget, what would a better card be? :-)

~~~
Improvotter
There really are no good cards to get from AMD right now because of the
inflated prices caused by miners. I've heard that Nvidia GPUs work just fine
with the Nvidia drivers installed. I'd totally recommend AMD any time of the
year, but you cannot get their cards at MSRP anywhere.

AMD has the following models (mid-range to kinda high-end): RX 540-580 and RX
Vega 56 and 64. The RX 580 has an MSRP of around $200, but you can only get
them for $400+. The same problem applies to the other models.

Nvidia has the following (mid-range to high-end): GTX 1050-1080, plus Ti
variants that add a boost in power. You might be able to pick up a GTX 1060
for $200-300, which is still more than the $200 MSRP, but that's what you can
get.

~~~
jason_slack
A GTX 1060 would still be useful for someone who wants to learn more about GPU
programming, practice and start working on ideas they might have.

------
StreamBright
I have a crazy question: why don't we put the big GPUs in the monitor instead
of putting them in the laptop? It is OK to have a small, low-performance GPU
in the laptop, like the one I have in my MBP, but instead of buying an
external GPU I would pretty much buy a large screen with a GPU in it. Maybe it
is just me.

~~~
Tsiklon
Unfortunately, while that's a pretty tidy solution, it's been the case over
the past 10-15 years that the GPU market has moved at a much faster pace than
the display market.

So the main thing that would worry me is having a monitor that remains
capable far longer than the GPU integrated within it. Like an inverse Mac Pro -
where currently the internals are sound for the majority of use cases, with
the exception of the GPUs, which are quite poor performers at that price point.

That said, we now seem to be in pretty good shape to accomplish such a
solution: the technologies are there, with Thunderbolt 3 offering enough
bandwidth to accommodate a high-end GPU without too much performance loss.
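
For a rough sense of scale, here's a sketch comparing the PCIe bandwidth a TB3
link tunnels against a desktop slot (the per-lane figure is approximate, and
TB3 also shares its 40 Gb/s with DisplayPort and other traffic):

    # Rough bandwidth comparison: TB3 eGPU link vs. desktop PCIe slot
    PCIE3_PER_LANE_GBPS = 0.985  # ~GB/s per PCIe 3.0 lane, after encoding

    tb3_egpu = 4 * PCIE3_PER_LANE_GBPS      # TB3 tunnels PCIe 3.0 x4
    desktop_x16 = 16 * PCIE3_PER_LANE_GBPS  # full-length desktop slot

    print(f"TB3 eGPU link: ~{tb3_egpu:.1f} GB/s")     # ~3.9 GB/s
    print(f"PCIe 3.0 x16:  ~{desktop_x16:.1f} GB/s")  # ~15.8 GB/s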

------
bischofs
What is the point of this if it doesn't support any GPUs with a macOS driver?
I was under the impression that PCIe was supported over TB3. The little
squabble between Nvidia and Apple needs to stop.

~~~
saagarjha
As others have mentioned in this thread, Nvidia provides drivers for their
graphics cards on macOS.

------
adamcw
PSA: If you are using a DisplayLink-based device, this update will break your
setup. They have a beta driver on their site that will re-enable clone mode,
but not extended desktops, etc. Through the 10.13.4 beta, the rumor was that
the changes to support eGPUs broke something they relied on.


------
godzillabrennus
Very cool! Does anyone know if you can do SLI within any external enclosures
on a Mac, or is that a pipe dream given there's no official driver?

------
sharkenstein
I've always been curious about this. How can I try it out if I already own a
GPU?

~~~
navidfarhadi
You will need to buy a TB3 eGPU enclosure. These range anywhere from $200 to
$500 and typically come without a GPU. There are some more expensive options
(like the Aorus Gaming Box with a GTX 1070/1080 or RX 580) that already come
with a GPU installed.

Here is a nice comparison of the eGPU enclosures available today:
[https://egpu.io/external-gpu-buyers-guide-2018/](https://egpu.io/external-gpu-buyers-guide-2018/)

My personal recommendations are the Akitio Node Standard/Pro or the Sonnet
Breakaway Box 350/550/650. Both are reputable companies in the Thunderbolt
hardware realm.

Note that I am not affiliated with any of the companies mentioned above.

------
garyvee_
Mac VR, here we come. About time; this was one of their bigger errors.

------
atommclain
Are you able to use an external GPU to drive a MacBook Pro's display?

~~~
lloeki
Do you mean loopback mode?

~~~
atommclain
Yes, thank you, now I know the correct term.

------
qwerty456127
Is there something like this available for Linux and Windows PCs?

~~~
asdsa5325
Yes. The Mac is the last to support it.

------
china
Previously

------
bitL
"How to downgrade your 1080Ti to 1060 by paying additional $400"

~~~
chrisper
Why would this be a downgrade?

~~~
wishinghand
The OP is being a little disingenuous, because I'm not sure Nvidia is
supported anyway, but you do lose some performance (though not enough to go
from a 1080 Ti to a 1060). Something like 5-10%, depending on whether you send
the video signal to an external monitor or back to your laptop's screen.

~~~
opencl
More like 20-30% on high end cards. But it's also very workload-dependent.

[https://egpu.io/forums/mac-setup/pcie-slot-dgpu-vs-thunderbo...](https://egpu.io/forums/mac-setup/pcie-slot-dgpu-vs-thunderbolt-3-egpu-internal-display-test/)

~~~
bitL
For gaming. Also don't forget the annoying latency. But if you need to
transfer to/from GPU memory all the time, as with large deep learning training
datasets, you downgrade your GPU significantly; that's why I wrote 1080Ti ->
1060. High-end PCIe GPUs are already starved, waiting on memory transfers, in
deep learning all the time.

You could observe something similar with e.g. SATA 1/2/3 SSDs and M.2 PCIe.
For normal workloads each new generation performed slightly better, i.e.
booting the OS etc. But once you went into processing 4K RAW video, only M.2
PCIe was usable.
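
If you want to see how much the link matters for your own workload, you can
measure host-to-device bandwidth directly. A rough sketch, assuming PyTorch
with a working CUDA setup; run it on a desktop slot and over TB3 and compare:

    # Rough host->device transfer bandwidth measurement.
    import time
    import torch

    x = torch.randn(1024, 1024, 256)  # ~1 GiB of float32 on the host
    torch.cuda.synchronize()
    t0 = time.time()
    y = x.cuda()                      # copy to the GPU over PCIe/TB3
    torch.cuda.synchronize()
    elapsed = time.time() - t0

    gib = x.numel() * 4 / 2**30
    print(f"host->device: {gib / elapsed:.2f} GiB/s")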

------
digi_owl
Heh, why make another Mac Pro when they can sell an eGPU dongle?

------
panoply2
I wanted external GPU support while traveling so I could do deep learning on
the road. Too bad this only supports AMD.

