It's sad to say, but this is the only way to get a decently powerful desktop Mac right now (one that's not using server grade Xenon chips; I don't need ECC).
One thing that's unfortunate about a Hackintosh, though I think it's about to change (maybe it already has), is being stuck with an R9 280X due to driver issues in OS X (i.e. there are no drivers for newer GPUs). I think you can now use, or soon will be able to use, Pascal-based GPUs, which would be great for deep learning/CUDA. It's a shame Apple has been so relentless in their quest to ship inferior GPUs in Macs.
The biggest beef I have with this Touch Bar MBP I've got is that the GPU is pretty crap compared to Nvidia's offerings, and I think it's somewhere Apple could dominate. Sure, it'd take partnering with GPU manufacturers, innovation, and money in the space, but that's exactly what the Mac needs right now and Apple is fully capable of it... or, god forbid, sacrificing a small bit of portability for the power.
I really want to love the Mac, but outside of iOS development I don't have much of a reason to run it these days. Windows + Linux subsystem is pretty great, or just booting directly into Linux.
To be honest, developing on native Linux lately has been wonderful: it's incredibly fast, driver support is pretty damn good, and the desktop is attractive enough.
I even have Thunderbolt 3 ports on my desktop, which have been good; I just wish VNC clients were faster so I could pull in my MacBook Pro's screen over the network. The latency, even over a Thunderbolt 3 network link, is pretty atrocious.
Edit: Just so I don't spam another comment: as the nice folks below have pointed out, Pascal is supported. Upvoted you all because I'm so happy. Gonna get my hackintosh back in action tomorrow.
Edit 2: I'd also like to say NVMe is mind-blowingly fast on the desktop I've got, and it's something sorely lacking from Apple's lineup, adding another downside to the current Mac desktops.
Edit 3: Cleared up the note about Thunderbolt 3 and VNC.
> Windows + Linux subsystem is pretty great, or just booting directly into Linux.
I built this new computer recently and gave Windows 10 and Ubuntu 16.04 both a shot. I decided I didn't want to relearn how to be productive with them and turned it into a hackintosh, and I'm very happy with it so far.
> They are supported! I'm writing this comment on a hackintosh running a GTX 1060. Nvidia released Pascal-compatible Mac drivers 2 weeks ago (April 11).
Same! I was waiting for the Pascal drivers to take the 1080 Ti plunge, and I'm super happy I did (literally installed it 48 hours ago, upgrading from a 980). I have a 6700K "Mac" that flips into a top-of-the-line gaming PC when I'm in the mood for that.
There's the base guide at tonymacx86 [0], which got me 95% of the way there. Then it was a matter of learning about specific things (like installing and enabling the Nvidia drivers, etc.).
It doesn't take a ton of time. I did keep great notes on the whole process in hopes that I'd have time to write an extensive guide, but I still haven't quite found the time to write it.
Try running Instruments on it, or opening an iBook. Won't work. There are issues (driver code-signing related, I believe) that appear when trying to run apps that use Metal.
Not sure I agree that the overhead of running Windows plus the Linux subsystem is greater than that of keeping a Hackintosh running. I ran a Hackintosh for several years, but finally got tired of the update process: waiting for the right video drivers to be released, maintaining all that context, and then still having some quirks every now and then.
The only solution for me would be running OS X on my own desktop hardware. I'd gladly pay Apple a nice premium for this.
I get that, and I haven't done any updates yet, but from the research I've done I don't expect much trouble until the next major version of macOS. We'll see how it stands up over time.
It's actually really unfortunate that this is the only way to get power/adaptability.
I had a hackintosh for a couple of years, and while it ran great, I was always nervous that the next OS upgrade would render it useless. I got a MacBook Pro and retired it (put Linux on it and sold it to a friend).
Recently I went to a projection mapping seminar (it's kinda like DJing, but using video loops); everyone who attended had older MacBook Pros (with HDMI to drive the projectors). The production machines they use to do the projections live are all Microsoft Windows machines, as that's the only way to get enough video horsepower to run the multiple displays.
The artists running the seminar said they couldn't recommend Macs anymore, except for some backend asset creation. Since most software is Mac/Windows right now it's not a big deal, but I worry that if Mac demand goes south, the software won't be ported.
I'm hoping Linux starts becoming a more viable solution for content creation. It seems like it's heading that way.
I've been VJing professionally for ~12 years with some friends, and we've always exclusively used Windows machines because our internal testing showed that Macs were rarely powerful enough to handle high-quality multi-screen projections properly. For the same price as a top-of-the-line Mac, we were able to build a PC with roughly 2x the specs (and always a better GPU). We did, however, rely on Macs for QuickTime encoding (usually PhotoJPG for the best quality/size ratio), because even our fastest Windows boxes couldn't keep up with our slowest Mac (our conspiracy theory is that Apple intentionally crippled QuickTime encoding on Windows). Over the last couple of years we've switched purely to content production, so we let an external company do the projections + mapping, and we don't care what they use because it's mostly just video playback.
Do you have a writeup on your content production pipeline? I'm terribly interested. I used to use VVVV for sound-reactive video that I would project at parties and small DJ shows around town, but then I spent 8 years living out of a suitcase and couldn't keep up with the hardware demands.
We started out in a similar way, doing parties and shows, barely surviving. Then HD came along when all our hardware was PAL (720x576), i.e. useless. We couldn't afford to buy replacements for our MX50 (never mind that such a beast didn't, and probably still doesn't, exist), EDIROL V-4s, reliable DVD players, projectors, multi-channel MPEG decoder cards, etc. We have several rooms full of PAL hardware we will probably never touch again, and which nobody would buy except for nostalgic purposes.
I continually experimented with automating sound-reactive video, but it was unpredictable and difficult to get a pleasing, "fluid" result that didn't cause seizures, so we always did it by hand using hardware such as the EDIROL V-4 or the MX50. It was also a lot more fun to punch buttons and slam sliders to the beat of the music ;)
Do you mean current or previous pipeline? Unfortunately most projects are somewhat unique and require at least some variation, if not a complete overhaul in workflow, depending on what the customer wants. I never got a chance to use VVVV in a commercial capacity, although there is an upcoming project where we are seriously considering it. Generally we used custom in-house software (e.g. for 360 projections using 2x 4-channel MPEG-2 decoder cards) or VJ software such as ArKaos and Resolume. Like I said before, these days it's purely content production, so the pipeline is relatively similar to a game studio. We still encode everything into various QT formats and we rent hardware for shows because as you said, it's hard to keep up with ever-changing hardware demands.
If you are interested in our projects, check out http://www.motionlab.at (go to the old site for now; the new site is a WIP).
The seizure-inducing aspects are my favorite part! One of the DJs once complained that every time the video came on people would stop dancing and stare at the screen with dazed expressions. :)
Given that you're mostly doing synchronized video on the projectors, I'm curious: if you did this setup today, would you still attempt to use a multi-GPU setup? Or would you prefer something like a bunch of cheap video-decoding-capable chip PCs attached to each projector/display and then networked to a computer that 1. sent (MIDI?) control cues and 2. acted as a NAS for the devices to stream from?
We have worked with both, but I would vote for the former: there are fewer points of failure, and you don't need an IT person to make sure every node is running smoothly. We built the latter using multiple 4-channel MPEG-2 streaming cards + MIDI racks and used it for all sorts of events (the biggest was a 360-degree one back in 2006: http://www.motionlab.at/projects/content-production/urban-ar...), but there were always problems, and I often had to rip hardware/software apart to figure out what was wrong. That's why I ended up writing a minimalistic tool for controlling the cards; the vendor's included software wasn't meant for our type of work, and thankfully there was a decent C++ API available. Workflow-wise it's also much easier/faster to render one big video than a bunch of smaller videos, not to mention you can check the final output on one screen. We wired up 8 mini-screens in the rack to preview the streams, but they were quite small, making it difficult to determine if the footage was OK.
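If you're wondering what the control-cue side of a rig like that looks like, here's a rough sketch using Python's mido library (the channel/program/note numbers are made up for illustration; our real tool was C++ against the vendor's API):

    import mido

    # Open the default MIDI output; in a rig like ours this would go to
    # the decoder-card controllers (port names depend on your interface).
    out = mido.open_output()

    # Fire a cue: program_change selects a clip bank, note_on triggers
    # playback. The numbers here are purely illustrative.
    out.send(mido.Message('program_change', channel=0, program=12))
    out.send(mido.Message('note_on', channel=0, note=60, velocity=127))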
I'm using a hackintosh for mapping/VJing, as the off-the-shelf hardware isn't at all practical and I want to keep using OS X-only software.
Honestly it's a nightmare.
I want as many video outputs as possible, enough power to run multiple layers of video, run game engines at the same time, transcode a video in a pinch, and more.
Switching to Windows is probably the only option going forward.
I'd also be curious about numbers from people who DO run ECC: how many times has it saved them? Some things really do need it (financial transactions come to mind). That said, it should be much easier to get ECC in consumer hardware. Major props to AMD for their recent chips that allow it. Hopefully Intel follows suit.
I run ECC everywhere possible. I know of two instances when it mattered (detected a failing chip), and suspect it was correcting single-bit errors for a while before that in those cases. I've also resurrected someone's laptop by determining it failed due to bad RAM and replacing that. (Easy enough diagnosis - intermittent, random-seeming hard-lockups and corrupted data on disk.)
ECC also helps defend against Rowhammer and similar attacks, if that matters to you.
Note that it isn't enough for AMD to restore ECC support in consumer kit; you also need motherboard support, and the MB makers are also complicit in raising ECC costs.
The problem is that it's unsupported, so getting a board where the BIOS can enable it and where you can count on it working is a bit of a crapshoot. I had the same problem ~8 years ago when a friend and I built new desktops. You've got to do a lot of manual reading, forum reading, and review reading before buying a board, or buy it locally from a place with a good return policy. This is quite different from server-grade kit, where ECC is fully supported.
Also, it's quite depressing when you need to change the motherboard, especially if you're obsessed with cable management and have invested so much time in ensuring great airflow.
I had a K6-2 with ECC back in the late '90s. So, yes.
But for whatever reason, AMD never seems to put ECC in the bullet lists for why you should buy their parts. I guess, as someone else mentioned, it's because the motherboard manufacturers don't enable/qualify it, even though it's likely just a matter of firmware tweaks ever since the DRAM controller moved on-chip. I got it working on a cheap Phenom II/Gigabyte motherboard (IIRC) some time ago as well. In that case I don't think the motherboard even advertised it, but I had some unbuffered ECC DIMMs lying around, plugged them in, and they worked. Of course, the only real indication besides the machine booting that it was actually working was a kernel blurb about it during boot. I don't think I got the EDAC reporting to give me soft error rates at the time.
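These days it's easier to check; Linux exposes the EDAC counters through sysfs, so something like this shows whether corrections are actually happening (assuming the EDAC driver registered your memory controller as mc0):

    from pathlib import Path

    # EDAC exposes per-memory-controller error counts under sysfs.
    mc = Path('/sys/devices/system/edac/mc/mc0')

    if mc.exists():
        ce = int((mc / 'ce_count').read_text())  # corrected (single-bit) errors
        ue = int((mc / 'ue_count').read_text())  # uncorrected errors
        print(f'corrected: {ce}, uncorrected: {ue}')
    else:
        print('no EDAC controller found (driver not loaded, or no ECC)')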
I don't have the numbers handy, but here's a basic explanation: given the amount of RAM we all run today, the probability of having a RAM error is surprisingly high. IIRC it's at least once per year.
Google at the time was buying memory chips that had failed manufacturer QA, stuffing them onto DIMMs themselves, and then running whatever seemed to pass.
That number was consistent on-the-ground pre-launch and post-launch (with the exception of a short period of higher error instances due to a solar flare).
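Back-of-envelope, the once-per-year figure is plausible even with conservative assumptions. FIT rates (failures per 10^9 device-hours) vary enormously between studies, but even at a low-end 1 FIT per Mbit (field studies such as Google's report far higher), a 16 GB machine sees roughly one error a year:

    # Rough expected soft-error rate. The FIT figure is an assumption;
    # published estimates range from under 1 to tens of thousands per Mbit.
    fit_per_mbit = 1.0            # failures per 10^9 hours, per Mbit
    ram_mbit = 16 * 8 * 1024      # 16 GB expressed in Mbit
    hours_per_year = 24 * 365

    errors_per_year = ram_mbit * fit_per_mbit * hours_per_year / 1e9
    print(f'{errors_per_year:.2f} expected errors/year')  # ~1.15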
There are lots of alternatives to VNC that are much more efficient and, depending on your connection, can even stream HD video. Unfortunately, many of the alternatives are poorly documented.
RDP will give better results and there are tons of implementations.
NoMachine is probably the best documented and supported. It is proprietary, however, though they have a free version. I believe it can do streaming video.
Spice is poorly documented and seems to be unmaintained; however, it is very efficient and can stream video, if somewhat jankily.
Have a look at XRDP with libfxcodec for streaming video (it needs lots of bandwidth), or without libfxcodec for decent low-bandwidth use.
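For what it's worth, once xrdp is running on the Linux box, connecting from a client is the easy part; e.g. wrapping FreeRDP (flags from memory, so check xfreerdp's help on your version; the host here is hypothetical):

    import subprocess

    # Connect to an xrdp server with FreeRDP. /gfx requests the RDP8
    # graphics pipeline, which is where the codec support matters.
    subprocess.run([
        'xfreerdp',
        '/v:192.168.1.50:3389',  # hypothetical host:port
        '/u:me',
        '/size:1920x1080',
        '/gfx',
    ])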
I've done some research into streaming video over a remote desktop for a project before, and I've never heard of libfxcodec. Can you go into more detail about it? I can't find anything about it.
Mac OS "as it's now called" is a compelling offering because of third party commercial software support including Microsoft and Adobe. Linux is great for some developers who work with stacks that live entirely in Linux. There are still some things though that it doesn't stack up well against like the software available for Windows or Mac.
As someone who does front-end graphic work to mock up apps before building them as part of my workflow, I couldn't see myself switching off macOS unless Microsoft drops the registry, replaces NTFS, stops building in spyware, disables the forced updates, and finally goes with one UI/UX for everything.
> I couldn't see myself switching off macOS unless Microsoft drops the registry, replaces NTFS, stops building in spyware, disables the forced updates, and finally goes with one UI/UX for everything.
And for all intents and purposes you can pretend the registry doesn't exist, at this point.
I've opened Regedit in the past year, but only because I work with a piece of benighted software that supports a weird but useful feature, designed fifteen years ago when doing unfathomable things with the registry was just what people did.
These days, per the XDG spec, most apps put things in ~/.config/, either as a single file or in their own subdirectory. It's easy to navigate and search, and apps can use whatever config format they want.
I'd argue that's preferable to a complicated hierarchy of obscurely named keys.
It might not look better, but I can copy all my settings from computer to computer, unlike with the registry.
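For apps that follow the spec, finding the config dir is trivial, which is rather the point; roughly:

    import os
    from pathlib import Path

    def xdg_config_dir(app: str) -> Path:
        # XDG Base Directory spec: honor $XDG_CONFIG_HOME and fall back
        # to ~/.config. 'app' is whatever subdirectory the app uses
        # ('myapp' below is just a placeholder).
        base = os.environ.get('XDG_CONFIG_HOME',
                              os.path.expanduser('~/.config'))
        return Path(base) / app

    print(xdg_config_dir('myapp'))  # e.g. /home/you/.config/myapp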
Also, the registry is used for a lot of stuff besides configs. So my experience on Windows is that you have to run the installers (=slow), rather than simply copying all the files to clone a system.
Plus, it tends to fragment all across the disk, or something like that, and your Windows system inevitably slows down over time. Maybe SSDs eliminate that problem; I (fortunately) haven't had to use Windows in quite a few years.
It's straightforward to copy the per-user software settings from one registry to another.
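E.g. with the built-in reg tool (the key path below is a made-up example):

    import subprocess

    # Export one app's per-user settings to a .reg file; on the other
    # machine, `reg import settings.reg` merges it back in.
    subprocess.run([
        'reg', 'export',
        r'HKCU\Software\SomeVendor\SomeApp',  # hypothetical key
        'settings.reg', '/y',
    ], check=True)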
You can't copy full software installs like that, but it's not the fault of the registry. Installing typical software means affecting dozens of system settings all in different places. You can't easily copy linux software along with every setting it affects either. You're best off either reinstalling or copying the entire drive, both of which work on Windows.
I haven't done any VNC client comparison, and don't use it regularly anymore, but JollysFastVNC[1] was originally written to be a much faster VNC client than the others that were available. Still works on macOS 10.12.
That remains to be seen. I, for example, use an MBP with an eGPU setup (a Sonnet Echo Express-III with a Pascal GPU inside). I put the MBP in my docking station and I have a decently powerful desktop Mac. I love it; I hope that's the future.
The penalty is about 25% for my setup, but I'm still using a TB2 MBP with an external SSD on the same TB port. I think that's fair, and of course with TB3 the performance penalty should be smaller.
But there are, unfortunately, other things to consider. In my case the Echo Express enclosure is not made for graphics cards, and you have to modify it to make one fit (done in about an hour if you're really careful). The wait for Pascal drivers was sad. Other than that, the only thing that comes to mind is that you cannot remove the MBP from the docking station without shutting it down first. (I have read that sleep should also work, but it does not for me.)
Edit: Whether things work really depends on the exact setup. I have just seen the video posted in another comment (https://www.youtube.com/watch?v=yho3rCNfzGE) and it looks like I have just been really lucky with my setup.
"I'd also like to say NVMe is mind blowingly fast on the desktop I've got and something sorely lacking from Apple's offering adding another downside to the current Mac desktops."
I see that folks are putting NVMe cards into their older Mac Pros:
I would suggest this card; it should work, though note it is SATA-based as opposed to NVMe. I have installed a few of these on several older systems (X58-based and others) and saw a very large performance bump, even versus SSDs connected via an aftermarket PCIe SATA-3 controller (running at 500-550 MB/s). YMMV if attempting to clone an existing drive; I have had this problem with add-in PCIe cards, older motherboards, and various operating systems. A fresh install to one of these drives will likely be easier.
Xenon is a gas, and is also a chip used in the Xbox 360 [1]. The server chip line from Intel is called Xeon.
Sorry to be a name nazi, but some people may not be aware of the difference. It's hard to tell online whether people didn't notice the error or don't know they've made one.