There's also an eGPU subreddit, and I wrote a sticky on U-series chips, PCIe lanes and Thunderbolt 3 https://www.reddit.com/r/eGPU/comments/7vb0gg/u_series_chips... which is more relevant to PCs than to Macs.
What I'd like to see is a Mac with a nice big fat Nvidia GPU. Why can't I buy a Mac with a 1080 Ti? My suspicion is that Apple wants to get into the GPU game themselves, and bring some of the expertise from their mobile GPU team to laptops and desktops.
That'd be great, if they actually released something competitive. So far, they haven't, and Mac users, who would love to pay for better technology, are left out in the cold as a result.
My suspicion is that they barely even want GPUs, which they seem to see as annoying sources of heat and noise getting in the way of their ultimate vision: an iMac so thin you could use it as a kitchen knife.
(I wish I were kidding about this, but I am not.)
Apple wants thin and light. Mobile parts are both, and the thought of designing a system that can cope with the 200W+ draw of desktop GPU parts doesn't even factor into their thinking.
They're tossing a bone to the users who need actual heavy-duty GPU power by talking up the ability to use external cards, but (in my opinion), given the cost and performance issues that come with that approach, it's more an admission that they truly don't care about that segment of the market than an actual solution.
The AMD Vega 64 has got to push a few hundred watts.
Microsoft has the Surface Book 2 with a GTX 1060, and they must be looking at that as competition for the high-end MacBook Pro.
According to: https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-...
Vega 64 has a board power rating of 295W.
According to the iMac Pro tech specs from the link you provided, it has an idle draw of 64W and a peak draw of 370W for the entire system.
Unless I'm mistaken, that seems to imply that there has to be a good deal of throttling between the CPU and GPU to come in at that power budget. (The specs also note an additional 50W(!) of potential draw for the fans alone if the iMac is operated in a warm environment.)
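Back-of-the-envelope (the Xeon TDP and the "everything else" figure below are my own assumptions, not Apple's numbers):

    # Rough iMac Pro power budget, using the figures from this thread.
    system_peak_w  = 370   # Apple's stated max for the whole system
    vega64_board_w = 295   # Anandtech's board rating for the desktop card
    xeon_w_tdp     = 140   # assumed TDP for the Xeon W options

    # Both parts flat out would already blow the budget, before the
    # 5K panel, RAM, SSD and fans are counted:
    print(vega64_board_w + xeon_w_tdp)        # 435 > 370

    # If the CPU hits its TDP and everything else takes ~80W, the GPU
    # is left with roughly half its desktop board power:
    print(system_peak_w - xeon_w_tdp - 80)    # 150W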
Am I mistaken?
Not arguing, just hashing out my understanding. Your point regarding the Surface Book 2 and the 1060 makes sense. It does look like I was technically incorrect about them not engineering to a 200+ watt GPU, although I do wonder what sustained draw it can handle thermally compared to a full desktop part, and how much that would affect performance.
John from Mantiz reported 650W peak. Surely only for milliseconds, but still. https://egpu.io/forums/implementation-guides/2014-mac-mini-v... It is absolutely not in the interests of an eGPU chassis manufacturer to report figures which hinder their own sales, so I believe him.
Similarly, someone with an awful lot of AMD knowledge (an AMD employee?) on the egpu.io forums pretty much begged people not to try the Vega 64 in a Sonnet 550, hinting at similar problems https://egpu.io/forums/thunderbolt-enclosures/sonnet-says-th... and saying a solution is in the works.
Sonnet http://www.sonnettech.com/support/kb/kb.php?cat=524&expand=_... says the Vega 64 is only supported in the 650 box -- which was just released pretty much out of necessity to support the Vega 64. Something is rotten in the state of Denmark: a 650W power supply supports a card of up to 375W (up to 8-pin + 8-pin power connectors) plus provides an additional 100W of peak power.
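One guess at where the rest of the 650W goes (the 87W upstream-charging figure is how I remember Sonnet's marketing for the 650 box; treat this whole breakdown as speculation on my part):

    # Speculative accounting for the Sonnet 650's power budget.
    psu_w               = 650
    card_sustained_w    = 375   # slot + 8-pin + 8-pin
    card_peak_extra_w   = 100   # Sonnet's quoted peak allowance
    upstream_charging_w = 87    # TB3 power delivery back to the laptop

    print(psu_w - card_sustained_w - card_peak_extra_w - upstream_charging_w)
    # => 88W left for the TB3 controller, fans and conversion losses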
I wish they offered a full-size, liquid-cooled (but fans included) high-end/Nvidia-GPU desktop. A system that's not afraid to be noisy, because its users aren't seeking silence. Then sync up the less hardware-intensive tasks with their mobile devices. But maybe that desktop market died in their eyes.
Ok... so we had a bunch of the new MBPs at work and kept having keyboard issues (like 50% of the team complained about the keyboards and had keys that stuck within the first 3 months). I brought one machine in to the Apple Store and was scolded for eating while using the computer. "Hey Guys, I'm sorry that humans weren't in your target audience focus groups -- the new butterfly keys are amazing!" Wasn't even my machine... just doing a favor for a friend. Ugh.
That was only when they needed to survive.
I personally disagree with making engineering compromises on the human interface to make things a few mm thinner, but the results speak for themselves. People keep buying them because Apple has succeeded in the number one goal of branding: their products are instantly recognizable as luxury goods. So if you're a pro developer who doesn't care about the branding, you just buy a different laptop that has the attributes you do care about. If developers abandoning the MacBook Pro out of nerd rage over shallow key travel and the ESC key causes sales to plummet, I'm betting Apple will change the keyboard. I'm also betting that sales won't plummet.
Xcode is far better on the desktop, but for terminal + vim + browser type work, the laptop screen is perfectly reasonable.
As for using OS X, I've just gotten comfortable with it. I never have to fiddle anymore. I always wait to upgrade OSes, and it generally just stays out of my way. Windows is annoying, and I haven't found a comparably fuss-free Linux laptop yet (and I'm actively looking, because I won't be able to keep avoiding that stupid Touch Bar).
Or is this a misunderstanding on my part?
In games that want to load fast, loading a save game might drop you into a pretty low-fidelity world and then start streaming in the higher-res assets.
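A minimal sketch of that pattern (function and asset names are made up for illustration): the world spawns with cheap placeholders, and a background thread swaps in the full-quality versions as they finish loading.

    import threading, time

    def load_low_res(name):    # fast: small placeholder asset
        return f"{name}@low"

    def load_high_res(name):   # slow: full-resolution version
        time.sleep(1.0)        # stands in for disk I/O + decompression
        return f"{name}@high"

    class Asset:
        def __init__(self, name):
            self.name = name
            self.data = load_low_res(name)   # player sees *something* now

    def stream_in(assets):
        for a in assets:                     # background upgrade pass
            a.data = load_high_res(a.name)

    world = [Asset(n) for n in ("terrain", "npc", "skybox")]
    threading.Thread(target=stream_in, args=(world,), daemon=True).start()
    # The game loop can start right away; fidelity improves over the
    # next few seconds as the thread replaces each placeholder.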
The fail0verflow guys bridged PCI-E over a 115200 baud UART and it still worked.
Eh, desktop and mobile GPUs are radically different beasts. My thought is that since they write their own drivers, and they obviously don't have the resources they once had on the macOS team (see all of the crap High Sierra went through), they simply can't support more than a few SKUs.
They said the upcoming Mac Pro will be "modular", which I think is a clear sign it will have external GPUs, or at the very least will lean heavily toward steering users with heavy GPU workloads in that direction.
And they seem to generally "dislike" putting discrete GPUs in laptops, going all the way back to the Titanium G4 days.
Much of Apple's own discussion of the eGPU feature at WWDC, and the marketing blurb for their own eGPU devkit, refers to VR a lot too. Apple added VR support to Final Cut Pro at roughly the same time as well.
> And they seem to generally "dislike" putting discrete GPUs in laptops, going all the way back to the Titanium G4 days.
Not so sure I'd agree with this -- every single Titanium PowerBook model ever made had a discrete GPU. Even the iBooks had dedicated GPUs. It wasn't until the adoption of Intel processors that integrated GPUs were even an option for Apple, really.
As well as ML development, which is a pain w/o a local CUDA-compliant GPU. Yes, you could do it on the cloud, but sometimes it is easier to do ad-hoc stuff locally without having to spin up GPU machines and remember to spin them down.
I use OS X for all software development except CUDA at the moment. This is huge.
Because it is. High-end AR/VR is exclusively DirectX Wintel 64 based. There's pretty much no exception apart from toy implementations like the Samsung Gear. Google abandoned Tango, and Daydream is on life support. Apple's ecosystem will remain irrelevant until they release their own hardware.
So while that is not a cheap option, Apple does sell powerful hardware. That's a decent AR dev machine, surely.
If, by "worried", you mean: they went out of their way to smite and thrash the organic core of their customer base, ignored their most basic requests, and deleted five years from their product lifecycle (nobody used the trashcan mac pro) ... then yes, I suppose they might be worried.
Maybe they noticed how many copies of High Sierra are running on firmware-modded 2009/2010 Mac Pros?
At this point, I'm just waiting on an updated MacBook Pro 15in (ready to buy in again, but too far into the current MBP product cycle. Might as well wait it out) and that Sonnet box and calling it a day.
Against everything I hold right and proper, I installed Windows 10 with Boot Camp. I only ever got the external display to work once, and couldn't repeat it.
I wound up just moving the RX580 to my aging PC as an upgrade.
I REALLY want to live in this world, but it's going to take natively-developed games to get there. But hey, they're coming. I'm buying them, and, often, not even playing them; just trying to send the message.
Developers have been starting to leave the platform because there aren't any MBPs that can be used to create VR content. While I agree that they will likely release something, their hand was also forced here.
For those devices, even current hardware is good enough. There is no point in producing VR with a level of detail that an iPhone can't cope with.
“Development of ArchiCAD started in 1982 for the original Apple Macintosh.”
Excel was also released initially on the Mac.
Developers have also been leaving the platform because they can’t buy a Mac laptop with physical function keys without making other performance sacrifices.
The TiBook G4 had a discrete GPU. Not having one wasn't even an option until Intel's integrated doodads got good enough.
Does iOS show any signs of supporting external GPUs?
"As part of doing a new Mac Pro — it is, by definition, a modular system — we will be doing a pro display as well. Now you won’t see any of those products this year ; we’re in the process of that. We think it’s really important to create something great for our pro customers who want a Mac Pro modular system, and that’ll take longer than this year to do."
Ever since hwtools (supplier of ExpressCard/mini PCIe breakout boards) said they were not allowed to release a Thunderbolt version, it was clear to me that this generation of the DMA-exposing peripheral interface was finally meant for external graphics processing. And who, oh who, could be the prime customer?
... Who seems to have the slowest product pipeline in the world, blocking technology adoption for years?
Not too many candidates :)
Seems to be limited to AMD GPUs at this time.
eGPUs are supported on MacBook Pro notebooks released in 2016 and later, iMac computers introduced in 2017 and later, and the iMac Pro.
Some people say it's an upgrade, but I don't see why they couldn't upgrade and keep the other ports around, since people need those daily.
For sure I've used my 89-euro dongle a lot more than I've used the USB-C port (which is never), until I got sane again after 8 years of getting overcharged by Apple and switched to Microsoft hardware running Xfce Linux (which are both wonderful).
It was the only adapter I could buy at Juice in Florence, Italy (where I bought the laptop) that had HDMI at the time.
The laptop was EUR 1799 ($2,150), the dongle was EUR 89.
After 2-3 months the keyboard started failing. It was replaced by Apple--along with the battery, which they found was faulty--and it started failing again (missed keystrokes, keys registering twice) after a short while. They replaced it _again_, then after 3 days the logic board died, taking all data with it. Apple replaced it, I sold that piece of shit of a laptop for EUR 1,000, and the guy called me after 2-3 days to say the logic board had died again and Juice (no Apple Store in his area) would give him a new one.
That was about my 5th, and last, Apple laptop.
I read lots of reports of these things happening to lots of people, but not getting much attention anywhere for some reason, despite the fact that these machines cost $2,000 and you should absolutely be able to type for more than 2-3 months without replacing the keyboard.
Yes. Your original point was something along the lines of you being overcharged by Apple because you now had to buy an expensive dongle. I showed that the expensive dongle you were talking about wasn’t actually as expensive as it seemed, which weakens your argument somewhat.
Regarding the keyboard on your laptop, I’ve heard plenty of anecdotes of it just stopping working on people, so I’ll concede that it probably has some sort of inherent issue. You seem to have been unlucky with an extreme case of this.
(All this under the assumption this happens in the first 6 months, when the law assumes a fault at the time of sale as a default and consumer rights are strongest)
Around 20% is normal. The USD has slipped further against the Euro than I thought, and prices are usually a bit higher here. There are various explanations:
* Higher staff wages
* Government-mandated warranty
* Willingness to pay more
The same product is €79 in Germany: https://www.apple.com/de/shop/product/MJ1K2ZM/A/usb%E2%80%91... (or ~87€ in Denmark!)
Of course, if VAT is 60% like you seem to think... That would be a lot! ;-)
Here it is. Right from the mother-ship.
2010, 2012, 2013 + 2014 MacBook Pros, to name a few instances
It was a huge deal affecting major computer vendors like Apple, Dell, HP etc. at the time, with failure rates on laptops with these chips being ridiculously high. I've also long wondered whether this played a part in whatever reason Apple has for avoiding Nvidia GPUs for so long.
I'm not usually a huge fan of The Inquirer's tabloid style, but they did a pretty good job reporting on the issue back then.
Meanwhile, the graphics gear in the past few generations of Apple computers has covered a pretty wide range of AMD parts: Vega, Polaris and FirePro have all been represented in the last 5 or so years. All they had to do was modify the driver to read from the TB3 PCIe device rather than just an internal one.
The drivers are terrible for a GTX 1060 in 10.12. After running for a while, even basic desktop operations bog down to the point of being unusable. I haven't tried 10.13 yet.
With AMD, by contrast, the drivers for the RX 460/470/480/560/570/580/Vega are already built in, and you only need to inject them correctly. You just need one kext (a kind of driver on the Mac) called WhateverGreen (to be fully correct you also need Lilu, but you almost always have that on your Hackintosh anyway). It works without issues and is generally preferred in the Hackintosh community.
Another thing is that tonymacx86 builds are generally bad if you want a stable and easy-to-build Hackintosh (at least as easy as it can get). They are sponsored, and their choice of hardware is strongly biased.
I would check out:
A lot of people have put in a lot of work to make it as easy as possible to set up. I would just make sure you set things up one at a time and don't immediately jump to trying to get TF or PyTorch to work after installing the drivers / following any sort of guide. Verify your CUDA installation first by building the sample programs and running them. (See http://docs.nvidia.com/cuda/cuda-installation-guide-mac-os-x...)
Other than that, maybe check out: https://gist.github.com/jganzabal/8e59e3b0f59642dd0b5f2e4de0...
The main thing is just getting the GPU drivers set up; after that, installing TensorFlow requires some modifications to the source (a relatively trivial find-and-replace), and I don't think you have to do anything special for the latest version of PyTorch.
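Once the drivers and the patched build are in place, a quick way to confirm the GPU is actually visible (a sketch against the TF 1.x API; device names will vary):

    # List the devices TensorFlow can see; the eGPU should show up
    # as a GPU device if CUDA/cuDNN are installed correctly.
    from tensorflow.python.client import device_lib

    devices = device_lib.list_local_devices()
    print([d.name for d in devices])
    # e.g. ['/device:CPU:0', '/device:GPU:0'] with the eGPU attached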
1. Make sure you're using a Thunderbolt 3 cable with whatever eGPU you have; other cables will not work even if they're USB-C. (USB 3.1 != Thunderbolt 3 != USB-C.) Read up on the differences.
2. I would recommend the Akitio Node enclosure, as it seems like most people use it, and the community is really small already, so debugging issues might be harder if you use something else -- but I wouldn't say you couldn't.
3. You'll want a Docker container with the CPU version of TensorFlow (or whatever you use) so you can fall back to it when the GPU isn't readily available, since TensorFlow won't work if you installed it with GPU support and your GPU isn't there (see the sketch after this list).
4. If you're trying to use nvidia-docker, I'm not sure anyone has gotten that to work on the Mac, because device binding isn't supported in Docker for Mac. You might be able to get it to work by modifying the Docker VM, but I'm not sure.
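On point 3, when the GPU build itself still imports (CUDA libraries present, card unplugged), you can also pick the device at runtime instead of switching containers. A sketch, assuming a TF 1.x install:

    import tensorflow as tf

    # Fall back to the CPU when the eGPU isn't attached.
    device = '/device:GPU:0' if tf.test.is_gpu_available() else '/device:CPU:0'

    with tf.device(device):
        a = tf.random_normal([1000, 1000])
        b = tf.matmul(a, a)

    # allow_soft_placement lets TF reassign ops if the device vanishes.
    with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
        sess.run(b)
        print('ran on', device)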
Now they seem to be realizing they're losing that segment of the market to the PC side of things, and that people in that segment are often thought leaders. If content creators have PCs, content usually ends up working best on a PC.
Note: As of version 1.2, TensorFlow no longer provides GPU support on macOS.
Will eGPU support help change that?
It is possible, though, to compile TensorFlow from source with CUDA support for macOS. You have to make a few tweaks to the source code and symlink some libraries.
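After such a build, one quick check that CUDA actually got compiled in (a sketch; this is the TF 1.x test API):

    import tensorflow as tf
    print(tf.test.is_built_with_cuda())   # True for a successful CUDA build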
It doesn't let you hot plug your eGPU into your computer (you have to restart for it to work) but for officially supported eGPUs you're now able to do that.
I've trained various models using a Titan XP and it's so awesome to be able to maintain the portability of your laptop and still get all of that power. A large benefit is also not having to move training data around between servers and other machines if you have it on an external drive or just on your laptop.
After you get everything up and running there isn't that much maintenance or anything you have to do regularly to keep it working. The speedup is incredible and it was definitely worth it for me personally, however it's not a walk in the park to get it set up initially.
There are beta NVIDIA drivers (http://www.nvidia.com/download/driverResults.aspx/131833/en-...) for macOS, but your mileage will vary.
The reason for that is Apple hasn't shipped an iMac or MacBook Pro with an NVIDIA card in years, which is especially annoying for DL work.
What is a good budget GPU for someone who just wants to experiment with GPU work like OpenCL or some machine or deep learning?
If I didn't have a budget what would a better card be? :-)
AMD has the following models (mid-range to kinda high-end): RX 540-580 and RX Vega 56 and 64. The RX 580 has an MSRP of around $200, but you can only get them for $400+. Same problem for the other models.
Nvidia has the following (mid-range to high-end): GTX 1050-1080, with the Ti models meaning a boost in power. You might be able to pick up a GTX 1060 for $200-300, which is still more than the $200 MSRP, but that's what you can get.
So the main thing that would worry me is having a monitor that stays capable far longer than the GPU that’s integrated within it. Like an inverse Mac Pro, where currently the internals are sound for the majority of use cases, with the exception of the GPUs, which are quite poor performers at that price point.
Now we seem to be in pretty good shape to accomplish such a solution: the technologies are there, with Thunderbolt 3 now offering enough bandwidth to accommodate a high-end GPU without too much performance loss.
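Rough numbers behind "not too much performance loss" (the encoding overhead is the only firm figure here; treating TB3's PCIe tunnelling as x4 Gen3 is a simplification):

    # Compare a desktop x16 slot with a TB3 eGPU link, back of the envelope.
    pcie3_lane_gbs = 8 * 128 / 130 / 8   # ~0.985 GB/s usable per Gen3 lane
    x16_gbs = 16 * pcie3_lane_gbs        # ~15.8 GB/s for a desktop slot
    tb3_gbs = 4 * pcie3_lane_gbs         # TB3 tunnels roughly x4 Gen3 (~3.9 GB/s)

    print(f"x16: {x16_gbs:.1f} GB/s, TB3: {tb3_gbs:.1f} GB/s "
          f"(~{tb3_gbs / x16_gbs:.0%} of a desktop slot)")

The gap mostly hurts workloads that stream lots of data over the link; for work that stays in VRAM, the real-world loss is much smaller than the raw 75% difference suggests.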
Edit: 126.96.36.199 is not a version of macOS.
Here is a nice comparison of the eGPU enclosures available today: https://egpu.io/external-gpu-buyers-guide-2018/
My personal recommendations are the Akitio Node Standard/Pro or the Sonnet Breakaway Box 350/550/650. Both are reputable companies in the Thunderbolt hardware realm.
Note that I am not affiliated with any of the companies mentioned above.
However, as seen here, people have managed to use other GPU cases with Nvidia cards.
You could have observed something similar with e.g. SATA 1/2/3 SSDs and M.2 PCIe. For normal workloads, each new generation performed slightly better, e.g. booting the OS. But once you went into processing 4K RAW video, only M.2 PCIe was usable.
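To put numbers on that (frame size, bit depth and frame rate are my assumptions for a generic 4K RAW format):

    # Why one 4K RAW stream already crowds SATA but not M.2 PCIe.
    width, height = 4096, 2160
    bits_per_pixel = 12     # assumed RAW bit depth
    fps = 24

    mb_per_s = width * height * bits_per_pixel * fps / 8 / 1e6
    print(f"{mb_per_s:.0f} MB/s per stream")   # ~319 MB/s

    # SATA 3 tops out around ~550 MB/s usable, so a second stream or
    # scrubbing overhead blows the budget; a PCIe 3.0 x4 M.2 drive
    # (~3,500 MB/s) has plenty of headroom.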