Cheap and Painless eGPU Thrills on a 2013 MacBook Pro (archagon.net)
238 points by archagon on Jan 2, 2017 | 107 comments



This is interesting academically, but I wouldn't buy a hacked-together TB2 enclosure.

Thunderbolt 3 eGPU enclosures are available now and are fully supported by Intel and Windows. They started at $500 with the Razer Core, but new models like the AKiTiO Node and Powercolor Devil Box are coming in at $299, shipping Q1 2017. More will likely be announced at CES this week.

These TB3 enclosures have 400W+ power supplies, better cooling, and are roomy enough to fit large enthusiast GPUs like the Nvidia GTX 1080. You can plug a TB3 device into a TB2 laptop with a simple adapter. Apple sells theirs for $29; Chinese versions will be cheaper.

Depending on what features the manufacturer decides to add to their device, TB3 eGPU enclosures can also charge your laptop, offer USB3 ports, gigabit ethernet, audio jacks, etc. The $299 models tend to be very bare-bones but there's no doubt that competition will lead to cheaper prices and more features over time. Again, watch for more at CES this week.

Current list of TB3 eGPU enclosures:

AKiTiO Node ($300) https://www.akitio.com/expansion/node

Asus ROG XG Station 2 (price unknown) https://www.asus.com/Graphics-Cards-Accessory/ROG-XG-STATION...

BizonBOX 3 ($650, probably because it says it works with Macs) https://bizon-tech.com/us/bizonbox3-egpu.html/

Powercolor Devil Box ($300. Ugly, though.) http://www.powercolor.com/us/products_DevilBox_features.asp?...

Razer Core ($500) http://www.razerzone.com/gaming-systems/razer-blade-stealth


I spent a while trying to convince myself I could get away with a Mac Pro (Trashcan) hooked up to an external GPU in a TB2 enclosure to do VR and deep learning. After fighting for hours (days?) with how the PCI lanes were allocated on Windows, I finally got output to an external display, but it still ran below VR spec according to the various benchmarks. Switching back and forth between Windows and Mac always required a few power cycles, because Windows kept thinking the hardware had been reconfigured.

I returned the whole thing and bought a beefy gaming PC, which now has an absurd amount of power in it (a 1080 + 2x970... I had forgotten how much fun it is to continually upgrade a machine). For power on the go, I have a Razer Blade with a 1060 in it. It's a shame, because I prefer to develop on a Mac, but I need modern hardware ¯\_(ツ)_/¯.


One thing to bear in mind if you need raw power is that the CPU is still going to be lacking in a notebook. The notebook-class 6700HQ is clocked at a maximum of 3.5GHz vs 4.2GHz for the desktop-class 6700K (and much more if you overclock it), and it's probably not going to run at that speed for long due to thermal throttling.

I was considering getting the new MBP, but instead bought a used workstation. It has a Xeon E5-2670 (8 cores/16 threads) and 32GB of RAM; I stuck in an RX 480 and now have a pretty good working and gaming machine for ~€800. Some games are still poorly optimised for multiple cores, though (e.g. Arma 3), so single-core performance is still king for gaming.


CPU performance is a small factor for gaming. GPU is vastly more important. Most games are console ports, and both the Xbox One and Playstation 4 have extremely slow AMD Jaguar cores, comparable to slow Intel Atoms.


> For power on the go, I have a Razer Blade with a 1060 in it.

There are now alternatives for those who need power on the go.[0] Well, for number-crunching power on the go, anyway. Won't help for gaming and 3D modeling.

[0] https://news.ycombinator.com/item?id=12610804


> Won't help for gaming and 3D modeling.

So, entirely irrelevant to the situation described by the parent poster?


No, because the parent poster also mentioned deep learning, which is number crunching.


I would have certainly preferred to do this, but it would have involved a $2000 upgrade first! (Well, barring the TB3-to-TB2 adapter approach, but I've seen almost no builds doing this.) Also, from what I've seen, there's a lot of uncertainty regarding eGPU support with TB3; many people have reported problems on TechInferno, etc., and even AKiTiO is unclear about Mac support. (If you check the specs for the Node, you'll see that Macs are listed as unsupported. Does this mean OSX-only or BootCamp as well? Who knows?) Might be worth waiting a year for things to shake out.

Also, I was optimizing in part for cheapness. Truly great gaming cards cost $300 and up, and $400 total was about as much as I was willing to pay for an upgrade.


TB3 is supposed to be fully backwards compatible with TB2; it's just a different physical connector. That's why even Apple only charges $29 for the adapter.

Waiting a year is way too much caution for me. I wouldn't pre-order anything, but once other people post that it works, I'd feel safe proceeding.


These adapters support hooking TB2 devices to a TB3 host, but not the other way around.

Intel initially announced that TB3 devices would be compatible with TB2 hosts, but it looks like they quietly dropped this feature. It's _really_ hard to find a reference on this though.


This guy[0] seems to say it works with the Razer Core.

https://www.reddit.com/r/apple/comments/5l3y0a/egpu_my_exper...


Looks like I had the wrong impression. Thanks for the link!


I thought the same thing as you, but it seems that at least some TB3 devices work with TB2 hosts, e.g. the LG 5K display: https://support.apple.com/en-us/HT207448


> Might be worth waiting a year for things to shake out.

Some of us have already waited a year, and nothing changed. It's been Akitio + PSU or nothing for quite a while now.

Made the jump to PC myself instead of hanging on in hope that something will come out with blessed support for Macs.


> If you check the specs for the Node, you'll see that Macs are listed as unsupported. Does this mean OSX-only or BootCamp as well? Who knows?

BootCamp is fine. Watch this review by Linus for more details: https://youtu.be/JirCwapScUs?t=5m54s


> You can plug a TB3 device into a TB2 laptop with a simple adapter. Apple sells theirs for $29; Chinese versions will be cheaper.

Apple's adapter has turned out not to work on Linux. It might not work with BootCamp either. Seems to require a special driver:

https://bugzilla.kernel.org/show_bug.cgi?id=189731


That's what it feels like on the bleeding edge! Again, I wouldn't suggest being the first guy on the internet to try it.

If the Apple adapter doesn't work, there are a bunch of others on Amazon, although they are a bit more expensive today. Prices will come down. But if you have a TB2 plug on your laptop, you probably have a Mac anyway.


> This is interesting academically, but I wouldn't buy a hacked-together TB2 enclosure.

Huh? The article is about a setup built using an AKiTiO PCIe box. You even mentioned that manufacturer as an alternative. I guess it's not officially "for eGPU use", but that's just a question of power, and a small SMPS brick is fine for that.


He used an AKiTiO PCIe box that supports Thunderbolt 2. The model I posted is TB3, which Intel and Windows both support for eGPUs. It also has a much more powerful power supply and can fit enthusiast-level video cards, rather than relying on PCIe slot power capped at 75W.


TB3 enclosures are also more expensive (and the affordable ones aren't readily available yet), offer minimal performance gains over TB2[1], and have limited support on macOS.

1 - https://youtu.be/vvAB3U5umug?t=187


Totally agree; I would completely discount the performance difference. TB2 is plenty of bandwidth for an eGPU.

$300 is affordable, and prices will come down further.

If you want MacOS support, only that BizonBOX /says/ it will work. But I betcha they all will. I certainly wouldn't be the guinea pig, though-- wait for other people to try it out and post their experiences.


Bizon box is just a pre-prepared Akitio TB2.


Except it isn't manufactured by Akitio, and isn't Thunderbolt 2, and has a much more powerful PSU, and can fit an 11" video card, and... wait, how are they the same?


Up until the most recent version it used an Akitio PCIe-to-TB2 board, and prior models used a modified Akitio case. The BizonBox 2 also had a maximum card size (because it's an Akitio TB2), which is why they give you the option to "remove front panel" for "long GPUs", something they seem to have fixed in the latest version 3 by scaling up an almost 1:1 reproduction of the Akitio TB2 case internals (the flanges, screw holes, and mounting points are still in very similar locations, from what I can see).

The "much more powerful PSU" you speak of appears to be a Dell DA-2 laptop power supply, which isn't that powerful compared to, say, an 800W ATX PSU.

Tell me that this isn't an Akitio TB2 with holes drilled in the side: https://bizon-tech.com/us/bizonbox2-egpu.html

I've built a couple of these from Akitio and Sonnet enclosures (the first one took ages to get working, but other than that it saved me from the huge disappointment of the most recent MBPr).


The previous version of the BizonBox does look a great deal like a modded Akitio TB2. But I wasn't talking about the TB2 BizonBox 2!

The BizonBox 3 does indeed come with a 200W laptop PSU. But that is plenty for a GTX 1080 with a 180W TDP.


Yep, I'll have to take your word for it regarding the BB3, as I'm not too familiar with it. Going forward I'd really like Bizon to make better-quality, better-priced products and add more competition to the eGPU space. I'd love it if the general direction of 'everyman' PC gaming were eGPUs.


[deleted]


That's what people theorize, sure. Thing is, nobody can say what Intel actually charges, because they won't allow their licensees to disclose that information. Many have said TB3 fees are much lower than TB2, just to toss another rumor on the pile.


> You can plug a TB3 device into a TB2 laptop with a simple adapter.

I don't know of any such adapter. Do they exist? The Apple one allows you to use TB2 devices with your new TB3 laptop, not the other way.


From Apple's Thunderbolt 3 (USB-C) to Thunderbolt 2 Adapter page http://www.apple.com/shop/product/MMEL2AM/A/thunderbolt-3-us...

    As a bidirectional adapter, it can also connect new 
    Thunderbolt 3 devices to a Mac with a Thunderbolt or 
    Thunderbolt 2 port and macOS Sierra.


But it's impossible to even plug it in: it has a male USB-C plug and a female Mini DisplayPort-sized port.


Assuming the Thunderbolt 3 device has a type-C female on it, you could simply plug the TB3 connector on the dongle into the device, and use a TB2 male to male cable to connect it to the laptop.


I never would have thought to use the dongle that way, but you're right: it does fit when connected at the device end like that.


There is no need for a 400-watt PSU for new-generation Nvidia cards.

Also, all the features you mentioned are already available in the form of docks, which one might already have.


I don't understand: this is $300 for a box with a port adapter and a $35 power supply inside it?


On the one hand, this is really cool. But, on the other hand, that's nearly $400 to use a $140 video card with a MacBook that's running Windows and basically behaving as a desktop computer.

It wouldn't be that much more expensive to just build a cheap Windows desktop with the same video card - especially when simplicity and low fuss are stated goals.

("Gaming computers" can be incredibly cheap. http://www.logicalincrements.com/ starts at $169, and the tier with a GTX 1050 Ti is $528. Also, given the 20% performance penalty from TB2, you could probably drop down a tier or two and still get the same result.)


Sure, but there are a few additional factors. First, $400 is the "lazy price". I had some credit card points to spend so I didn't want to waste too much time looking for deals. But if you wait for rebates, buy used, or get the less powerful non-Ti 1050, you could probably get close to $300. Second, I already had a very powerful CPU in my machine (as well as 16GB of RAM) and I didn't want to downgrade. Third, I move around constantly and prefer to keep my belongings to a bare minimum, so a compact eGPU box was far more compelling than even the smallest gaming PC case.


Fair enough. I'm glad you're happy with it - it's just not the route I would choose :-)

Nice write-up either way.


I've put a lot of thought and research into this already, and you basically summarized my thoughts on the matter. Building a gaming desktop is a better answer today.

But in a year, when you can buy a commodity Chinese-made TB3 enclosure for $200 that-- with a SINGLE CABLE-- powers your laptop, offers a bunch of extra USB3 ports, ethernet, audio jacks, and can also take a high-end GPU for gaming, well, that's a much more attractive proposition.


In a year he won't be able to use the extra GPU juice.


Along these lines, I was also curious about the quote

"the kind of CPU I could buy for cheap would be comically underpowered compared to the i7 4850HQ I already had in front of me"

Specifically: will the high-end i7 in a MBP run at Full Turbo indefinitely, or does it throttle down hard after a couple of minutes at full load to avoid overheating, like the one in my Dell Precision does?

I'm suspecting the latter, due to the basic physics of heat dissipation from a high-TDP chip. In this case, a sub-$200 Haswell i5 will run circles around it.


I've had throttling issues in the past (though mostly on the dGPU side of things). Dusting out the case stopped the problem immediately. I also imagine that heat will be far less of an issue with the dGPU sitting idle.


I had to make the same decision in June, and I built a Windows desktop instead of using an external GPU.

However, I hope that I won't have to do that again in the future. I'd rather have just one computer for the sake of simplicity. Hopefully external GPUs will become mainstream in a few years.


Yeah, an i5-6600K is a bit over $200 right now, and Ryzen is just around the corner. If a TB3 enclosure were, say, $100 (then you'd add an appropriate ATX PSU) and there were reliable, tested cables (and OSX drivers), I'd be all over this. eGPUs have been around for a few years now and have yet to take off. Hopefully TB3 will remedy this, but I'm not holding my breath. I think a lot of the blame is on Intel, and hopefully with TB3 they'll now fully support eGPUs.


In case anyone's wondering about doing it with the new 2016 Macbook Pros, there's a video about it here:

https://www.youtube.com/watch?v=vvAB3U5umug&t=1s


This is really neat! I'm in a similar situation (Late 2013 Retina MacBook Pro 13") and have been doing some research on deep learning and neural networks. Has anyone had success using eGPUs with things like CUDA and CUDNN? Is TB2's lower bandwidth a bottleneck that would keep you from using a really beefy eGPU for these purposes?


When using a single GPU, as is the case here, you will not be able to saturate a TB2 slot in any reasonable deep learning workload. For multiple GPUs, sure, PCIe performance matters, but for single GPUs it is not a bottleneck for any practical scenario I can think of.

Right now, I am working with large volumetric datasets (think 100+GB) and even then I am only seeing ~200MB/s peak transfers, which are well within the capabilities of TB2. In my experience, large datasets are bottlenecked by the hard drive, which is not a problem for modern rMBPs.

Edit: I am using a 2.5GB/s NVMe SSD, so the 200MB/s is the raw bandwidth used when mass-evaluating batches on the GPU. For training, I am seeing around 70-80MB/s sustained.
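
If you want to sanity-check whether the TB2 link would actually be a bottleneck for your own workload, a rough way is to just time host-to-GPU copies. A minimal PyTorch sketch (assuming a CUDA build of PyTorch that can see the eGPU; the buffer size and iteration count are arbitrary):

    import time
    import torch

    assert torch.cuda.is_available()

    # 1 GiB of pinned host memory gives a cleaner upper bound on transfer rate
    buf = torch.empty(256 * 1024 * 1024, dtype=torch.float32).pin_memory()

    torch.cuda.synchronize()
    start = time.time()
    for _ in range(10):
        dev = buf.to("cuda", non_blocking=True)  # host -> device copy over the TB2 link
    torch.cuda.synchronize()
    elapsed = time.time() - start

    total_gib = 10 * buf.numel() * buf.element_size() / 2**30
    print(f"host -> GPU: {total_gib / elapsed:.2f} GiB/s")

Compare whatever number that prints (roughly 1-1.5 GB/s is typical over TB2) against the transfer rates your training loop actually needs, like the ~200MB/s figure above.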


> Is TB2's lower bandwidth a bottleneck that would keep you from using a really beefy eGPU for these purposes?

What limits you is the power input and cooling capabilities of your TB2 box. A 1080 has a 180W TDP (before O/C), so you need to feed it up to 180W and dissipate that much heat; a Titan X increases that to 250W. TFA went with a 1050 because it only has a 75W TDP, which could be achieved with their enclosure (by swapping out the power brick; the out-of-the-box version only handles a 30W TDP).

After that, it's mostly a matter of feeding the GPU enough stuff to do: giving it a small number of expensive tasks will utilise the limited bandwidth better than the reverse, which is why the efficiency of an eGPU (TB2, but also TB3) increases as you increase rendering quality and the like (compared to internal), e.g. you might get 70% of internal performance at low settings and 90% at high.


The performance penalty for paging in data from system RAM is already high enough that most games are highly optimized to keep the necessary data in GPU RAM. Paging in random bits of data is more latency-sensitive than bandwidth-constrained. I'm not surprised that the penalty is ~25%.

Using the eGPU to drive the internal panel means copying the frame buffer back to the internal GPU which is where the latency and performance penalty come from; I'm not sure how it is implemented, it might actually be copying twice (once to system RAM, then again to the discrete GPU RAM). Having more lanes over the TB connection would probably help here, though it might not eliminate the latency.
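
For a rough sense of scale, here's the back-of-the-envelope math for shipping finished frames back over the link (panel resolution, frame rate, and overhead are assumed round numbers, not measurements):

    # Extra traffic when the eGPU has to drive the internal panel (rough estimate)
    width, height = 2880, 1800   # 15" rMBP panel (assumed)
    bytes_per_pixel = 4          # RGBA8, uncompressed
    fps = 60

    copyback_gbps = width * height * bytes_per_pixel * fps * 8 / 1e9
    tb2_usable_gbps = 16         # ~20 Gbps link minus protocol overhead (rough)

    print(f"frame copy-back: {copyback_gbps:.1f} Gb/s "
          f"of ~{tb2_usable_gbps} Gb/s usable on TB2")
    # ~10 Gb/s, i.e. well over half the link spent just moving finished frames

So even before latency enters the picture, the copy-back competes with the traffic the game itself needs, which is consistent with the bigger penalty people see when driving the internal display.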


TB2 is akin to using one of those x16 slots that's wired to x4 (or worse, since if you're using a Gen3 x16 GPU, you'd be running at Gen2 x4, which would be an eightfold reduction).

It'll depend on your workload, but generally, it is going to be visibly slower than the "baseline" performance for any high-utilization workload.
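
The raw numbers behind that comparison, for anyone who wants to check the arithmetic (per-lane rates are the published PCIe figures after encoding overhead; everything else is rounding):

    # Approximate usable bandwidth per PCIe lane, after encoding overhead
    GEN2_PER_LANE_GBS = 0.5    # 5 GT/s with 8b/10b encoding
    GEN3_PER_LANE_GBS = 0.985  # 8 GT/s with 128b/130b encoding

    gen3_x16 = 16 * GEN3_PER_LANE_GBS  # ~15.8 GB/s, what the card was designed for
    gen2_x4 = 4 * GEN2_PER_LANE_GBS    # ~2.0 GB/s, roughly what TB2 exposes

    print(f"Gen3 x16: {gen3_x16:.1f} GB/s, Gen2 x4: {gen2_x4:.1f} GB/s, "
          f"ratio: {gen3_x16 / gen2_x4:.1f}x")  # ~7.9x, i.e. the "eightfold" figure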


Typically x8 can be used with near-zero performance loss in many/most workloads, and with Gen3, x4 should be fine again.


Thinking about it more, in addition to my other comment re: power input: above 75W, graphics cards don't draw power from just the PCIe slot (which is limited to, you guessed it, 75W), so you end up needing a "PC" PSU with 6-pin and/or 8-pin PCIe power connectors to provide additional power to the GPU (which you'll still have to dissipate).
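
To put numbers on it, a quick power-budget check along those lines (the brick wattage and board overhead are assumptions; the TDPs are Nvidia's published figures):

    # Which cards fit a TB2 enclosure that has no 6/8-pin PCIe power connectors?
    SLOT_LIMIT_W = 75        # max a card may draw from the PCIe slot itself
    brick_w = 120            # e.g. a swapped-in 12V/10A adapter (assumption)
    board_overhead_w = 15    # enclosure electronics, rough guess

    budget = min(SLOT_LIMIT_W, brick_w - board_overhead_w)

    for card, tdp in [("GTX 1050", 75), ("GTX 1060", 120), ("GTX 1080", 180)]:
        verdict = "fits" if tdp <= budget else "needs external PCIe power"
        print(f"{card}: {tdp}W TDP -> {verdict}")

Which is why TFA topped out at a 1050: it's the fastest current card that stays inside the 75W slot budget.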


The 13" would benefit a lot more than TFA I suppose, since then we'd upgrade from Intel GPU to "serious" GPU, and not just dGPU to eGPU?


There is a mini PCI-E (as well as NGFF and ExpressCard) solution for every kind of laptop, as long as you are willing to fiddle around with your notebook a bit (hardware-wise):

http://www.banggood.com/de/Mini-PCI-E-Version-V8_0-EXP-GDC-B...

Mine is on the way, so for now I cannot comment on the stability of the system, but I expect it to be better and easier to set up than Thunderbolt, because PCI-E has been on the market longer. Sure, the Thunderbolt solution is a closed one-cable thing that looks way cleaner, but I've read reviews from e.g. Asus ROG eGPU users with unstable systems. I absolutely don't want to badmouth the Thunderbolt solution, I would love to use it, but my system does not have a port. Also, the TB dock is super expensive.


I wish there was a "target mode" for desktop PCs. Basically a Thunderbolt3 port on the motherboard, which when connected would expose the internal PCI slots over TB3.

I don't know if this is even possible, but if so, it would be a fantastic solution. Keep a monster desktop PC for gaming, then shut it down and connect your MacBook and get a powerful computer for work.


Won't this be a matter of writing drivers?


And you'd need a motherboard with Intel's top-of-the-line chipset (Z170) for TB support. Lots of people choose to skimp on the extra $30 or so, as the perception at the moment is that it's only necessary for overclocking.


Most Z170 motherboards don't have a TB3 port either, as it's an added cost that most people frankly don't give a fig about. In PC motherboards, USB-C has almost total market penetration; TB3 almost zero.

TB3 is much more prevalent on PC laptops, but still nowhere near common.


Regarding the AC adapter: if you check that Amazon page, it says it's by Mini-Box, and Mini-Box itself sells the right adapter: http://www.mini-box.com/12v-10A-AC-DC-Power-Adapter


That one looks a bit more legit than the CD120100A I got. Perhaps I will send it back and buy this one instead!


Fantastic write-up. I've been meaning to attempt this very thing, having found the same Thunderbolt enclosure and wondered why nobody had used a "low-end" GTX rather than fussing with ATX PSU wiring.

Not that I'm a stranger to hacking a PSU, but my MacBook is not a RepRap.


Eh, because he still had to hunt down a kinda sketchy Chinese PSU from Amazon and hope that it could deliver the rated wattage, spent twice as much on the enclosure as the GPU, etc., for a ~3.5x upgrade. I'm guessing that for most people considering this, the few hundred extra to get as much performance as possible out of it makes sense. Rigging up an ATX PSU isn't too hard, and you could put it all in an SFF desktop case if you care about it looking nice. Apparently TB3 enclosures are coming which address all of these shortcomings except price.


Wow, what a well-explained blog post, and it's pretty impressive that it doesn't need more hacking to get it working! eGPUs are a really exciting future!


An alternative would be the Wolfe GPU.[1]

I really wish we had an Apple "firing on all cylinders" that released commercial products like this.

As a business I'm sure the current product line generates the right numbers in someone's spreadsheet but to me it's just so ...dull.

For the first time since 2006 I haven't updated to the latest macOS and have no plans to. I've seen nothing compelling in it compared to El Capitan.

I wish Apple had better OpenGL or Vulkan (for cross-platform games) or eGPU support for internal laptop displays. Instead we have Metal. So far I've not seen an example of an AAA macOS game that actually supports it.

Then again I wish Apple would add an undo feature for the auto"correct" in iOS but obviously they've got their own agenda. :)

[1] https://thewolfe.io


The way you undo autocorrect in iOS is to backspace once. A black suggestion tip will appear with your original spelling, and tapping it will revert the change.


Eh, the Wolfe looks like vaporware at this point. They cancelled their Kickstarter, and their explanation was vague and unconvincing. Their website hasn't been fully updated to reflect the cancelled Kickstarter. I don't see a release timeline anywhere, and there's no mention of how their software works at all (but they do allude to a software key, even though they imply it's plug and play?).


That's sad news.

Hopefully the other TB3 enclosures will come along at a reasonable price.


I really wish laptops would expose the screen to act as a monitor. I'd love an external GPU, but I don't want an external monitor.


There have been so many times in my "IT guy" history I would have loved to use my laptop's screen as a monitor for another computer. It's kind of crazy to have to track down a monitor just to see if a headless desktop is booting properly so I can remotely access it from the laptop that's sitting right next to it.

And my current MacBook has a retina screen, compared to the 1440x900 screen on my desktop. I'd LOVE to use my desktop's power on my Macbook's screen.


Non-hacky Thunderbolt 3 eGPUs like the Razer Core do offer a loopback mode where the video feeds back to the laptop's screen over the TB3 cable.

The extra traffic on the wire does incur a small performance penalty over an external monitor though.


A few Sony Vaio models had that feature circa 2011 (HDMI in to display on the 16" WXGA (?) 1080p screen). I was very close to getting one.


Word of warning for anyone thinking of doing this with a Dell XPS 13/15: the 2016 models (at least) only support 16 Gbps of PCIe over the Thunderbolt 3 link.

http://www.notebookcheck.net/Design-flaw-in-XPS-9350-9550-ma...


Interesting-- I was not aware of that!

That's the same bandwidth as TB2, which will add another 10% performance penalty versus a native 16-lane PCIe gen3 slot, for a total of 20% performance lost. Still completely workable.


Nor was I until I started reading around this topic earlier. It's certainly still workable but just worth keeping in mind before anyone springs extra £££ for TB3 and the latest crazy-powerful GPU.


If GPU manufacturers could make a standalone eGPU that connects to a PC or laptop via Thunderbolt 3 (USB-C), that'd be pretty neat, especially if they could make it less than $100 more expensive than the PC version of the GPU. I can see a large portion of people who don't want to spend $300+ on an external GPU box just to be able to upgrade the $200 card inside.


< $100 might be tough.

You've got to have a decently made chassis to accommodate and cool a full-sized card and a meaty power supply for the latest and greatest GPUs. And that's before you add any docking functionality for extra ports.

The cheapest thing I've seen to date is the Alienware Amp which sells for around $150 but uses Alienware's proprietary port, which IIRC, is a bit slower than TB3.


I haven't seen them tested side by side, but both TB3 and the Alienware Amplifier actually offer four lanes of PCIe Gen3, so they should be pretty much identical.

As for the lower-end pricing, it's very feasible to fit a laptop-oriented MXM GPU module in a lower power envelope. The GTX 1060 mobile variant only draws 80 watts, and the GTX 1050 mobile sips power at 50 watts. They do obviously need active cooling, but not a LOT of active cooling.

Note that the 10-series GPUs are not cut-down versions like previous generations: the 1050 Ti mobile is just as fast as the 1050 Ti desktop card at the same clocks; they bin the chips so they can work with less power.

It's very reasonable to imagine a sealed laptop dock (as opposed to a full enclosure) with a 50 watt 1050ti inside selling for $300. Comin' soon!


Another poster mentioned the Wolfe earlier in the thread, which has everything I want in an eGPU box: https://news.ycombinator.com/item?id=13304774

The price is a bit steep for the hardware it offers, but it looks super portable and just powerful enough for my use cases.

Too bad it's apparently become vaporware.


Does anyone know if the GPU can then be passed thru to VBox or Fusion?


No, this cannot be done. In theory, it should be possible with xhyve in the future, but at the moment no PCIe or TB passthrough is possible, because of the lack of support on the hypervisor side on desktop operating systems.


What about these setups makes them more complicated to support than simple PCIe passthrough? Because PCIe passthrough is definitely a thing you can do currently in several virtualization implementations.

(Unless you mean you can't do it while also using it as a GPU in the desktop, which is true.)


If it works in Mac OS, sure, although of course the performance won't be as good. The author seems to only be running Windows for these tests.


I think you could fit an entire gaming PC in that little box. There are low-profile gaming cards.


Yeah, though I imagine it would have to be a pretty weak gaming PC due to cooling issues.


Yep, you're not going to cram a proper gaming PC into 160 in³. AFAIK the smallest case designed for gaming GPUs is the Dan A4 at 433 in³.

https://www.dan-cases.com/dana4.php


There's also the NFC S4 Mini, which only supports half-length GPUs (Gigabyte makes a good 1070). It's 4.3L, or 262 in³.

Both of these options are certainly smaller than a macbook plus an enclosure though.


The S4 Mini is news to me, it looks very nice. It does use an external power brick though, unlike the A4 which is all internal.

Zotac just announced a mini GTX 1080, so you could get some amazing performance density in the S4.

http://www.anandtech.com/show/10947/zotac-announces-geforce-...


The S4 Mini does look really awesome! It almost makes me want to build another desktop!

However, having 1 computer that does everything still has a lot of value for me over having to context switch between a desktop and laptop and constantly worry about keeping everything in sync like I do right now. So my next machine is still going to have to be a laptop + eGPU combo.



200W can be handled by two fans. Components today are very efficient.


One of the things I imagined and researched for Thunderbolt was this. Unfortunately it's still quite expensive, purely due to supply and demand.

I hope Apple will offer an external GPU to use with their low-powered laptop GPUs.


I am very curious whether eGPUs would work with macOS instead of Windows. I am a video editor with Final Cut as my main tool, and if I could offload to an external GPU, I would immediately do it.

Anyone knows if this works?



Oh damn! That's impressive. I will definitely do some more research on this!

Having a TB3 core with graphics power at home that I could just plug in when I need the power, and otherwise carrying a small, lightweight MacBook with me, sounds too good to be true. If it's daisy-chainable, I could imagine quite a nice dock setup with an external display that could turn a small MBP into a workhorse whenever needed. I'm intrigued.


No, Apple deliberately blacklists eGPUs in macOS. It'll work out of the box only on Windows.


Dang. Any idea about potential ongoing hack projects to make this work?


I'm incredibly happy running a 2008 Mac Pro with a GTX 960, 32GB of RAM, and an Intel i6700 :) It's unfortunate that I also had to replace the case to fit the Samsung 950 PRO in there, but wow, Sierra is screaming fast and TensorFlow runs perfectly.


[deleted]


Nobody knows what Intel's licensing fees are, because they won't allow licensees to disclose them. You're repeating rumors as fact.


I understand the curiosity aspect of "can/could I do this?", but from a practical standpoint, why? A gaming desktop is more practical, cheaper, easier to improve and change, etc. And if your problem is mobility, a gaming laptop is probably cheaper to start off with, and it definitely is once you factor in the external GPU.


My use case is admittedly a bit esoteric but: a) I need a MacBook for work, b) I only have room for a single computer in my life, c) I already have a pretty powerful computer with a TB2 port. Plus, it's nice to keep things relatively simple.


I have a mid-2015 MBP, and have it connected to a 4K monitor. Looks beautiful, but there's some obvious jerking going on from pushing so many pixels.

An eGPU would help solve this issue completely, and I could even go for dual 4K monitors (or more).


Nice hack, but wow, that's evidence of violence against a mouse (second photo). Call PETA...


This is really amazing! I will be keeping an eye on this tech, as I do a bit of gaming on my mbp. It seems obvious that the prices will continue to fall on these components.


Related question: is it possible yet to connect an external GPU to a Raspberry Pi (where bandwidth is not the bottleneck)? If not, why not?


I would imagine you can, but nothing gaming-grade, because you're doing it over USB 2.0, which is even worse on the Pi than it is on a PC.
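
Some rough arithmetic on why (all figures are ballpark assumptions):

    # Why a USB 2.0-attached GPU on a Pi is a non-starter for gaming (ballpark figures)
    usb2_mbs = 480 / 8 * 0.7                 # 480 Mb/s line rate, ~70% usable -> ~42 MB/s
    pcie_gen2_x4_mbs = 2000                  # roughly what even TB2 exposes to an eGPU
    frame_mbs = 1920 * 1080 * 4 * 60 / 1e6   # one uncompressed 1080p60 stream, ~500 MB/s

    print(f"USB 2.0: ~{usb2_mbs:.0f} MB/s vs ~{pcie_gen2_x4_mbs} MB/s over TB2; "
          f"a single uncompressed 1080p60 stream alone is ~{frame_mbs:.0f} MB/s")

That's ~40 MB/s to feed a device whose desktop interconnect moves gigabytes per second, which is why USB 2.0 "GPUs" are really compressed-framebuffer display adapters, not something you can game on.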


Isn't the main cost of eGPU enclosures the Thunderbolt-to-PCIe board? Why can't Intel sell them more cheaply?


Anyone here used a similar rig for machine learning?


This is great. Thanks!



