One unfortunate thing about a Hackintosh, though I think it's about to change (and maybe already has), is being stuck with an R9 280x due to driver issues in OS X (i.e. there were no drivers for newer GPUs). I think you can now use, or soon will be able to use, Pascal-based GPUs, which would be great for deep learning/CUDA. It's a shame Apple has been relentless in its quest to ship inferior GPUs in Macs.
The biggest beef I have with this Touch Bar MBP I've got is that the GPU is pretty crap compared to Nvidia's offerings, and I think it's an area where Apple could dominate. Sure, it would take partnering with GPU manufacturers, innovation, and money in the space, but that's exactly what the Mac needs right now and Apple is fully capable of it... or, god forbid, sacrificing a small bit of portability for the power.
I really want to love the Mac but outside of iOS development I don't have much of a reason to run it these days. Windows + Linux subsystem is pretty great, or just booting directly into Linux.
To be honest, developing natively on Linux lately has been wonderful: it's incredibly fast, driver support is pretty damn good, and the desktop is attractive enough.
I even have Thunderbolt 3 ports on my desktop, which have been good; I just wish VNC clients were faster so I could pull in my MacBook Pro's screen over the network. The latency, even over Thunderbolt 3, is pretty atrocious.
Edit: Just so I don't spam another comment: as the nice folks below have pointed out, Pascal is supported. Upvoted you all because I'm so happy. Gonna get my hackintosh back in action tomorrow.
Edit 2: I'd also like to say NVMe is mind-blowingly fast on the desktop I've got and something sorely lacking from Apple's offerings, adding another downside to the current Mac desktops.
Edit 3: Cleared up note about thunderbolt 3 and VNC.
They are supported! I'm writing this comment on a hackintosh running a GTX 1060. Nvidia released Pascal-compatible Mac drivers 2 weeks ago (April 11).
Interesting but unverified side story, the drivers were apparently released because of an email from a redditor: https://www.reddit.com/r/apple/comments/65hpc8/funny_story_a...
> Windows + Linux subsystem is pretty great, or just booting directly into Linux.
I built this new computer recently and gave Windows 10 and Ubuntu 16.04 both a shot. I decided I didn't want to relearn how to be productive with them and turned it into a hackintosh, and I'm very happy with it so far.
Same! I was waiting for the pascal drivers to take the 1080 Ti plunge, and I am super happy I did (literally installed it 48h ago, upgrading from a 980). I have a 6700k "mac" that flips to a top of the line gaming pc if I am in the mood for that.
It doesn't take a ton of time. I kept great notes on the whole process in hopes I'd have time to write an extensive guide, but I still haven't quite found the time to write it.
2018 is still a long way from here and who knows how that new Mac Pro will be.
The only solution to me would be running OS X on my own desktop hardware. I'd gladly pay Apple a nice premium for this.
Recently I went to a projection-mapping seminar (it's kind of like DJing but using video loops); everyone who attended had older MacBook Pros (with HDMI to power the projectors). The production machines they use to run the projections live are all Microsoft Windows machines, as it's the only way to get enough video horsepower to drive multiple displays.
The artists running the seminar said they couldn't recommend Macs anymore, except for some backend asset creation. Since most software is Mac/Windows right now it's not a big deal, but I worry that if Mac demand goes south, the software won't be ported.
I'm hoping Linux starts becoming a more viable solution for content creation. It seems to be heading that way.
Edit: low res demo from 2008: https://vimeo.com/2577056
I continually experimented with automating sound-reactive video, but it was unpredictable and difficult to get a pleasing, "fluid" result which didn't cause seizures, so we always did it by hand using hardware such as EDIROL V-4 or the MX50. It was also a lot more fun to punch buttons and slam sliders to the beat of the music ;)
Do you mean current or previous pipeline? Unfortunately most projects are somewhat unique and require at least some variation, if not a complete overhaul in workflow, depending on what the customer wants. I never got a chance to use VVVV in a commercial capacity, although there is an upcoming project where we are seriously considering it. Generally we used custom in-house software (e.g. for 360 projections using 2x 4-channel MPEG-2 decoder cards) or VJ software such as ArKaos and Resolume. Like I said before, these days it's purely content production, so the pipeline is relatively similar to a game studio. We still encode everything into various QT formats and we rent hardware for shows because as you said, it's hard to keep up with ever-changing hardware demands.
If you are interested in our projects, check out http://www.motionlab.at (go the old site for now, the new site is WIP).
The V4s still sell quite well for some reason, some of the other analog gear is still in demand for circuit bending.
If the projectors are decent they can still hire pretty well too.
Keijiro Takahashi at Unity Japan has a great github repository full of relevant libraries.
I'm using a hackintosh for mapping/VJing, as the off-the-shelf hardware isn't at all practical and I want to keep using OSX-only software.
Honestly it's a nightmare.
I want as many video outputs as possible, and enough power to run multiple layers of video, run game engines at the same time, transcode a video in a pinch, and more.
Switching to windows is probably the only option going forward.
We all need ECC. It should be a standard feature by now.
ECC also thwarts Rowhammer and similar attacks, if that matters to you.
Note that it isn't enough for AMD to restore ECC support in consumer kit; you also need motherboard support, and the MB makers are also complicit in raising ECC costs.
But for whatever reason, AMD never seems to put ECC in the bullet lists for why you should buy their parts. I guess, as someone else mentioned, it's because the motherboard manufacturers don't enable/qualify it, even though it's likely just a matter of firmware tweaks ever since the DRAM controller moved on-chip. I got it working on a cheap Phenom II/Gigabyte motherboard (IIRC) some time ago as well. In that case I don't think the motherboard even advertised it, but I had some unbuffered ECC DIMMs lying around, plugged them in, and they worked. Of course, besides the machine booting, the only real indication that it was actually working was a kernel blurb about it during boot. I don't think I got the EDAC reporting to give me soft error rates at the time.
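For what it's worth, checking whether EDAC reporting is live on a Linux box doesn't take more than reading sysfs; a minimal sketch, assuming the standard EDAC layout (on machines without ECC/EDAC it just returns an empty dict):

```python
# Read per-memory-controller corrected (ce) and uncorrected (ue) ECC
# error counters from the Linux EDAC sysfs interface. Returns an empty
# dict on machines that expose no EDAC memory controllers.
from glob import glob

def read_edac_counts(root="/sys/devices/system/edac/mc"):
    counts = {}
    paths = glob(root + "/mc*/ce_count") + glob(root + "/mc*/ue_count")
    for path in sorted(paths):
        name = "/".join(path.split("/")[-2:])   # e.g. "mc0/ce_count"
        with open(path) as f:
            counts[name] = int(f.read().strip())
    return counts

if __name__ == "__main__":
    counts = read_edac_counts()
    if counts:
        for name, value in sorted(counts.items()):
            print(name, value)
    else:
        print("no EDAC memory controllers exposed")
```

A nonzero `ce_count` means ECC is actually catching (and correcting) bit flips, which is the confirmation the kernel boot blurb alone doesn't give you.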
RDP will give better results and there are tons of implementations.
NoMachine is probably the best documented and supported. It is proprietary, however, though they have a free version. I believe it can do streaming video.
Spice is poorly documented and seems unmaintained; however, it is very efficient and can stream video, if jankily.
Have a look at XRDP with libfxcodec for streaming video (it needs lots of bandwidth); otherwise, without libfxcodec, it's decent at low bandwidth.
Shortest Googlewhack(-ish) I've ever accidentally stumbled upon myself, for sure!
It would be a nice protocol if there were functional clients, but I fear that ship has sailed.
As someone who does front-end graphic work to mock up apps before building them as part of my workflow, I couldn't see myself switching off Mac OS unless Microsoft drops the registry, replaces NTFS, stops building in the spyware, disables the forced updates, and finally goes with one UI/UX for everything.
Why do you care about the registry and NTFS?
A desktop developer could care, but a web one?
I've opened up Regedit in the past year, but only because I work with a piece of benighted software that supports a weird but useful feature designed fifteen years ago, when doing unfathomable things with the registry was something people did.
I'd argue that's preferable to a complicated hierarchy of obscurely named keys.
Also, the registry is used for a lot of stuff besides configs. So my experience on Windows is that you have to run the installers (=slow), rather than simply copying all the files to clone a system.
Plus, it tends to fragment all across the disk, or something like that, and your Windows system inevitably slows down over time. Maybe SSDs eliminate that problem; I (fortunately) haven't had to use Windows in quite a few years.
You can't copy full software installs like that, but it's not the fault of the registry. Installing typical software means affecting dozens of system settings all in different places. You can't easily copy linux software along with every setting it affects either. You're best off either reinstalling or copying the entire drive, both of which work on Windows.
No, that's not true. I did once, in 2015, to make a game written in 1998 work.
I think we can lay this one to rest.
Copying putty settings from one host to another? That's a regedit.
(Semi-) permanently disabling live scans from Windows Defender? That's a regedit.
One of the first things I install on any new system. Gotta have my Caps Lock->Escape.
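The Caps Lock→Escape remap lives in a single binary registry value ("Scancode Map" under `HKLM\SYSTEM\CurrentControlSet\Control\Keyboard Layout`). A sketch of how that blob is built, per the commonly documented format (`scancode_map` is just an illustrative helper name):

```python
# Build the REG_BINARY "Scancode Map" value that makes Caps Lock send
# Escape on Windows. Layout: two zero header dwords, an entry count
# (mappings + 1 for the terminator), one dword per mapping as
# little-endian (new scancode, old scancode) words, then a null
# terminator. Esc = 0x01, Caps Lock = 0x3A.
import struct

def scancode_map(mappings):
    blob = struct.pack("<II", 0, 0)               # version + flags headers
    blob += struct.pack("<I", len(mappings) + 1)  # count incl. terminator
    for new, old in mappings:
        blob += struct.pack("<HH", new, old)
    blob += struct.pack("<I", 0)                  # null terminator
    return blob

caps_to_esc = scancode_map([(0x01, 0x3A)])
print(caps_to_esc.hex())
```

Written to that registry value (e.g. via a .reg file or `reg add`), it takes effect after a reboot.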
That means you can do a Hackintosh build with their latest video cards (GTX1080 or Titan) with full driver support.
That remains to be seen. I, for example, use an MBP with an eGPU setup (Sonnet Echo Express III with a Pascal GPU inside). I put the MBP in my docking station and I have a decently powerful desktop Mac. I love it, and I hope that's the future.
Any real world issues or hiccups, or has it been as good as you make it sound?
But there are unfortunately other things to consider. In my case the Echo Express enclosure is not made for graphics cards, and you have to modify it to make it work; done in about an hour if you are really careful. The wait for Pascal drivers was sad. Other than that, the only other thing that comes to mind is that you cannot remove the MBP from the docking station before shutting it down. (I have read that sleep should also work, but it does not for me.)
Edit: If things work really depends on the exact setup. I have just seen the video posted in another comment (https://www.youtube.com/watch?v=yho3rCNfzGE) and it looks like I have just been really lucky with my setup.
You can use them right now but the driver is still in beta. See https://arstechnica.com/apple/2017/04/nvidia-releases-beta-g...
I see that folks are putting NVMe cards into their older Mac Pros:
Is there a recipe that allows one to boot from an NVMe add-in card on an older Mac Pro? (In my case, an early 2009.)
I would suggest this card; it should work. It is SATA-based as opposed to NVMe. I have installed a few of these in several older systems (X58-based and others) and saw a very large performance bump, even versus SSDs connected via an aftermarket PCIe SATA-3 controller (running at 500-550 MB/s). YMMV if attempting to clone an existing drive; I have had this problem with add-in PCIe cards, older motherboards, and various operating systems. A fresh install to one of these drives will likely be easier.
Sorry to be a name nazi, but some people may not be aware of the difference. It's hard to tell online whether people don't notice the error or don't know they've made one.
...but you might not like the results under MacOS
(Edited for typos)
Mine is a dual boot setup on an i7-6700k on Z170, 64GB RAM, and a very old and well-supported Radeon 5770 GPU. I will admit mine sits mostly in Windows with a number of Linux VMs running for development, which is why I maxed out the RAM.
Previous build was an LGA775 build. More specifically, an LGA771 hack. Again, ancient Gigabyte P35DS3L motherboard, 6GB RAM, and a modified Xeon X5470. It was extremely stable. I may bring it back, but I used the SSDs and PSU in my new build.
What can you do with Hackintosh that you still can't with a Linux distro?
Edit: Possibly a bit late to ask, but I should have asked if you had ever seriously tried Linux and if so what your experiences were like?
I moved to OS X because I want a desktop that actually works. I have several Linux VMs - one per project - which I use mostly from the command line.
At the moment, OS X and iOS are the least annoying desktop and mobile operating systems. Note that I said least annoying. They all suck in different ways and I wouldn't use OS X as a server. I also think Apple has no idea how to do online services; for example I don't use iCloud for anything. Still, the operating systems beat the alternatives.
Only problem is, you have to use OS X for a few months to realize how much invisible polish has been put in. I first bought a MacBook, and it took me a year or more to decide to switch my desktop to a Hackintosh as well.
Notice that I had downloaded these floppies through a 55.6 kbps dialup line (!!) from the Slackware ftp server (!!!)
The great thing about the Slackware distribution was that it was separated into "subsystems", each one more or less standalone and able to be downloaded and installed separately. So there was the "base" subsystem, which came on N floppies (I can't remember, probably around 6) and contained the Linux kernel, libraries, bash, and a couple of other useful utilities; download and install it, and you could start using Linux! Then there were various extra subsystems, for example the "development" subsystem that contained gcc, the "X" subsystem (with the FVWM window manager, IIRC), etc. This made it really well suited to downloading over a dialup line.
I can remember various great stories from my installation and configuration process: becoming an expert in how disk partitioning works; configuring the modem by giving it AT commands (to dial my ISP's number I needed to issue the ATX3DT command followed by the number); making my (ISA) sound card work by booting into Windows (95) so that its IRQ and memory address were configured properly, and only then (cold) rebooting into Linux so it would work (!); using autoconf/make/gcc to compile stuff (this is actually still needed in 2017); configuring X by editing text files and playing with my monitor's resolution and refresh rate; etc.
Happy days !
Also, Slackware had a nice installation guide:
Finally, some configurations were explained in the corresponding man pages of each utility.
Luckily I had ethernet and a cable modem - Lynx to the rescue!!! A lot of the linux forums were very 'text' friendly in those days - I dread to think what it would be like nowadays trying to navigate a forum from the cli.
I left slackware about 5 years ago for Xubuntu, mainly because I needed to use my laptop for work and not just having fun tweaking and learning.
The whole "apt-get into it!" thing just really works for me now. I think the final straw was trying, at the last minute, to compile a video editing suite (Cinelerra?) on Slack, getting frustrated, and then finding out it was much quicker to nuke the HDD and install Ubuntu Studio; an hour or so later I was happily video editing.
Now nostalgia is knocking at the door so maybe I will make a new partition and see how slackware has developed in the last few years...
Being from that time period myself (I was 18) and having installed Slackware as well as RedHat (and even Caldera... anyone remember that?), there was plenty of documentation.
However, unlike the parent poster, I got a bunch of Linux distributions by either mail ordering or going to conferences. In fact I went to a Linux conference in Atlanta circa 1997 and met Linus himself... but more importantly picked up some distribution CDs/disks.
It was the good ole days for sure.
Actually, come to think of it, Ubuntu really hit the CD nail on the head with ShipIt; those of you with 28.8 modems will know this feeling. I still have my 7.04 CD.
And of course there were computer magazines that offered Linux distros in their CDs!
I use Linux ALL DAY EVERY DAY at work, but IMO the desktop environments still suck, and the Mac is my desktop.
The Linux desktop experience has come a long way since then. Try Ubuntu 16.04.
I assume you're not including Windows Phone in that, because it's surely less annoying than either iOS or Android (apart from the lack of apps, which I consider a blessing in disguise).
Unfortunately I am cursed with a long memory and I keep grudges forever, and I'm old enough to remember the old Microsoft with the patent threats against Linux and the illegal monopoly abuse practices.
Yes, LibreOffice on Ubuntu or Fedora is good enough technically, and there are the same problems when exchanging documents with Office for Mac as there are with LibreOffice.
However, deflecting blame works. When exchanging files with someone, if something is broken in LO, it is automatically LO's fault. When something is broken in Office for Mac, it is an uh-oh moment; since it is obviously the latest product from the same company, the effort to resolve the issue is much more constructive.
Another reason is that both Mail.app and Outlook are usable as Exchange clients (yes, I'm stuck with that, with IMAP disabled there :( ). Evolution is a trainwreck. There is a plugin for Thunderbird, but it is a no-go, as it requires a yearly subscription. A one-time price would be fine, but there's no way I'm going to pay yearly to use Exchange mail in Thunderbird.
This saved my bacon in a similar situation back in the day: http://davmail.sourceforge.net
It's all about Mac OS. It lets me get things done without dealing with silly things.
Let's quickly list what I sorely miss when switching to my Linux machine:
* Multiple monitor support: connect any number of monitors, any time, and have them Just Work. HiDPI or not, doesn't matter. The OS even remembers which of your windows were placed on which monitor at which size, and will do its best to move them when you connect a monitor. No other OS even comes close.
* Consistent keybindings (Emacs-style) in all windows and dialogs. Control-a always gets me to the beginning of the line, whether in a text editor or in a file-open dialog box.
* Reasonably consistent keybindings in apps. Can expect Cmd-q to quit every app.
* Flawlessly working suspend/resume on laptops.
* Full-screen any app and it works fine.
* Apps like LaunchBar (I think quicksilver used to be a free alternative).
* Spotlight, which finds everything.
* Ability to remap any key to whatever I want and have it work everywhere.
* Drag & drop everywhere. And if you laugh at this, consider that it's coming from a command-line guy with 25 years of experience with computers. The way the Mac does drag & drop is faster and more convenient than fiddling with the command line. For example, did you know you can drop a file or a directory from anywhere into any file/open dialog box?
* Apps like Simplenote, Bear, Ulysses: excellent tools for specific purposes.
* A predictable, ubiquitous, working clipboard.
For me, switching over to my Ubuntu machine is an exercise in frustration. I can't redefine my keys, there is nothing like TextExpander, and multiple monitors just don't work unless the stars align just right, you have all the monitors in just the right order at boot time, and you'd better not mix HiDPI with normal. Drag and drop is nonexistent. Copy/paste is a free-for-all where each app does things differently and you have multiple clipboards (Ctrl-V vs. middle mouse click).
Basically, in order to get things done, I'd much rather work on a Mac.
A side note: to really understand why the Mac is so good, you have to work on it for a while, with someone showing you things (like dragging files into file/open dialogs). People seem to think this is about superficial things like aesthetics, or Adobe software. It's not.
The problem was, they needed to put a LOT of polish in to reach OS X levels of usability, and they didn't look like they were doing that; they had some nebulous dreams of making a unified mobile/desktop interface instead.
There isn't anything in particular I'd suggest to them, because they need improvements just about everywhere. Say, is the network manager actually usable these days? When I switched it was easier to just disable it and move along.
And the main elephant in the Linux room right now is systemd. Low level Linux is basically being taken over by an incompetent developer who doesn't understand the Unix philosophy of software design and can't deliver working software anyway. He's the author of the abomination called pulseaudio - back then it was required to uninstall/disable it to have sound again. Now, software written by him is becoming a dependency of almost everything - he is trying to replace all the Linux infrastructure layer with his ideas. Sorry, I see no reason to take a look at Linux on the desktop again.
Systemd in Linux has the same role, as Launchd in macOS. Some aspects of systemd were inspired by launchd and SMF.
Interestingly, it is considered a good, useful thing that contributes to the polished experience when it comes to macOS; but for some reason, when someone builds the equivalent for Linux, it is suddenly a bad thing, just because it is different from the past, even though it moves the polish and experience of Linux to a higher level?
Unix philosophy aside, Poettering can't be trusted to provide working software...
If you have such problems that your machine crashes and loses filesystem data, just redirect it to syslogd and move on.
Before Ubuntu, if you expected to get a Linux build of a typical popular Windows program/game you would get laughed at. Now it's virtually expected. That's not failure.
systemd is not as much a matter of controversy as people want it to be. If it was, you would see people like Linus rejecting it, but they haven't because it's not a major factor right now.
You've just made one of the errors discussed in http://uselessd.darknedgy.net/ProSystemdAntiSystemd/ .
Really, consistency is something macOS holds over other desktop operating systems in a lot of ways, and that holds true even for third-party apps on the platform. After using macOS for many years, it is jarring to use other operating systems where most developers never gave much thought to, or saw any value in, getting on the same page and following a set of conventions.
A dialog pops up with 2 choices: Stop or Replace. Replace will delete everything in the destination folder before copying over the source. If you hold the Option key while drag&dropping, you can get one of multiple different pop-up dialogs, depending on the circumstances... and the results are not very predictable.
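For anyone used to merge-style copies, the distinction Finder's dialog is drawing can be sketched like this (illustrative names; runs in a scratch temp directory):

```python
# Contrast Finder-style "Replace" (destination wiped before copying)
# with the merge behavior users coming from other systems often expect.
import shutil
import tempfile
from pathlib import Path

def replace_copy(src: Path, dst: Path) -> None:
    """Finder's Replace: everything in dst is deleted, then src is copied."""
    if dst.exists():
        shutil.rmtree(dst)
    shutil.copytree(src, dst)

def merge_copy(src: Path, dst: Path) -> None:
    """Merge: files already in dst survive (Python 3.8+)."""
    shutil.copytree(src, dst, dirs_exist_ok=True)

root = Path(tempfile.mkdtemp())
src, dst = root / "src", root / "dst"
src.mkdir()
dst.mkdir()
(src / "new.txt").write_text("new")
(dst / "old.txt").write_text("old")

merge_copy(src, dst)
print(sorted(p.name for p in dst.iterdir()))   # ['new.txt', 'old.txt']

replace_copy(src, dst)
print(sorted(p.name for p in dst.iterdir()))   # ['new.txt']
```

The surprise for switchers is that Finder's folder drop behaves like `replace_copy`, silently discarding `old.txt`.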
We use Ubuntu at work and I was VERY worried about this. But AutoKey is actually good enough. The UI is not quite as slick, but the rest of it is a little better. The exception is the little pop-up form-fill things, which are cool but which I found I did not really miss. https://github.com/autokey-py3/autokey
I'm describing this experience because it's a common one: it's symptomatic of the "Linux desktop" experience and is the reason why I would much rather use Mac OS. It's not that I can't make things work -- I probably could, but I want to spend my time on other things.
The 2 major issues for me with OS X were:
1: Selecting icons to copy. I don't know if this has been fixed, but if you have a file browser window open and you click then shift-click to select a range of files, it would always just select the rectangle (graphically) between the two click points, not run left to right through the window and select everything in between (I don't know if I am explaining this clearly).
To select the range you wanted, you would have to either switch to list view or Cmd-click any files that got left out. I really thought it was a bug until I found this forum post https://forums.macrumors.com/threads/shift-doesnt-select-ran... where they defend the behavior.
2: A sillier one: when you plug in an HDMI monitor, you can only control the volume on the monitor itself. And as with the previous issue, I was surprised at the responses I found in the forums. Complete fingers-in-ears lalalalala: "It's better this way, and impossible the way you say, even though Linux and Windows can do it that way."
Anyway, for my personal use case, Linux trumps both Windows and Apple, and, in opposition to most comments I see on this thread, I run Win7 and OS X in VMs if I need specific software (SketchUp on Windows and nothing really on OS X; I just have the VM set up in case I need something).
You mentioned this twice, and now I'm intrigued. What exactly do you mean by this?
Similarly drag a folder into a Save dialog and it will switch to that folder.
Drag a file onto the "Open..." button on a web page to skip the open dialog entirely.
Drag a file into Terminal to paste that full path into the terminal...
And the much-decried HFS+ file system had a feature I hope they will keep in APFS: when you moved or renamed a file, most of the time even third-party apps managed to figure out its new location when looking for it, instead of just displaying a broken-path dialog.
And this replacement only occurred if the resource was not present at the search path. Meaning that moving the file and replacing it with a similarly named one resulted in the similar file being used, while just moving it resulted in the new location being suggested.
I don't know what the behavior is as of today, and maybe by now Linux also provides this reliably, but the first time I saw it was even pre-OSX, I believe. And maybe there is a simple trick, but it looked like black magic to 20-year-old me.
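The trick above boils down to tracking a file by an identifier rather than a path. A rough sketch of the idea using inode numbers (purely illustrative: HFS+ aliases used catalog node IDs and extra metadata, not inodes):

```python
# Path-independent file tracking in the spirit of HFS+ aliases:
# remember the file's inode, and if the remembered path no longer
# matches, rescan to find where the file went.
import tempfile
from pathlib import Path

class FileRef:
    """Remember a file by path *and* identity; heal the path if it breaks."""

    def __init__(self, path):
        self.path = Path(path)
        self.ino = self.path.stat().st_ino

    def resolve(self, search_root):
        # Prefer the stored path (as HFS+ did), and only fall back to
        # an identity search if it is gone or points at a different file.
        if self.path.exists() and self.path.stat().st_ino == self.ino:
            return self.path
        for p in Path(search_root).rglob("*"):
            if p.is_file() and p.stat().st_ino == self.ino:
                self.path = p
                return p
        return None

root = Path(tempfile.mkdtemp())
f = root / "report.txt"
f.write_text("hello")
ref = FileRef(f)

(root / "archive").mkdir()
f.rename(root / "archive" / "renamed.txt")   # move + rename the file

print(ref.resolve(root))                     # still finds it at its new home
```

The path-first check also reproduces the behavior described above: a similarly named file dropped at the old location has a different identity, so it is not silently substituted for the original.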
That was the case (and created a huge debate) back in the old Carbon vs Cocoa days
Infamous tech note 2034:
Get to the Open dialog, activate App A, click-hold on the little file icon next to the file name on the top bar of App A and drag it onto the Open dialog over at B. Suddenly, the dialog has the appropriate file selected, wherever it happens to reside on your file system. Magic!
Edit: literally everyone beat me to it.
Personally I prefer MacOS over Linux because it just works from the UX point of view and I can run Linux on it pretty easily with Vagrant.
That said, I have 4 hard drives, with 3 OS's in my main computer. So, I can boot into whatever, depending on my needs.
1: Lightroom/Photoshop. I'm a hobbyist photographer, and the tools just aren't as good in Linux. Gimp, Darktable etc are interesting, but they just aren't up to par yet with Adobe's offerings. And basically everyone in photography seems to be using Lightroom + Photoshop.
2: iMessage - being able to have my SMS and iMessage messages on the desktop (and synced back and forth from my iphone) is just way too useful every single day. Yes there are apps that provide similar functionality, but the beauty of iMessage is that nobody has to use an app. They just send texts like normal. Maybe there is a solution to this that works in Linux but I haven't seen it.
The rest might do it just for the heck of it too: the slick UX, and the benefit of doing ML/DL from one single OS and machine instead of context switching between a Linux machine and a macOS laptop for the above reason.
I've tried using it more. I recall using Ubuntu full time for about 2 months a couple of years ago.
But I'm a designer and use various CAD apps which are mostly available in Windows, some in MacOS and none in Ubuntu. I also recall breaking a lot of things when tinkering with Ubuntu.
So while I constantly have this drive to try Linux and an open-source workflow, I just never manage to make it work. I break things too easily, and many trivial tasks are too much of a "hassle" (like getting proper CUDA support to use in Blender), meaning the chances of breaking things while tinkering are higher.
It's slightly amusing to me that by far and away the best terminal emulator that I've ever encountered is OSX-only, and it's embedded in my workflow enough that it would be a real pain to do without. I was going to note that FinalTerm inspired some of those features but was now dead, but going by  it appears it may be resurrected.
I'm pretty happy with my laptop, which has the high-end MacBook specs (besides the display) and costs less than 50% of it.
I run Xubuntu on it and it basically has everything I need but Xcode.
- Apps that I've grown used to that are macOS only like 1Password, Photoshop, and Quiver
- Integration with iMessage - I really enjoy being able to send messages from my laptop
- Avoiding weird edge cases where shit just plain doesn't work on Linux. I have a lot of issues with a HiDPI display on Linux that I never have to deal with on Mac.
I know Linux is about configurability, and that's why I'm running Elementary OS at home, but these are the reasons why I would consider a Hackintosh now.
Some may have equivalents (but I'm sure nothing as polished) or will run under Wine but I have bad experiences with both.
There are also workflows that take advantage of macOS features (e.g. location service) that wouldn't work on Linux I guess.
I used Visual C++ long ago, in a past life when I used to develop on Windows, and it was pretty good. Played around with more recent versions of it (2010 - 2015) and boy has it gone downhill! Very slow and clunky compared to what I remember.
Replace "git" with every other package....
Linux is so amateur hour with all plainly stupid decisions, I'd rather pay good money for proper unix.
Git today could in many ways be seen as a fundamental component of many tools. Many packaging and build tools use it to fetch data (like Homebrew, plugin systems for many text editors, etc.).
Also, the git version of your distribution IS relevant because other packages depend on it. For example, on my Ubuntu system the git package is a dependency of over 170 other packages. If you could install a newer version, a lot of these other packages might break.
The recommended way of installing git on macOS is via Apple's git variant by installing Xcode, which also requires root privileges, btw.
I do believe that Linux-based desktop OSes should separate the base system from user software, kind of like the *BSDs have been doing; actually, I'd really like distros to embrace something like Homebrew, where packages are installed per-user.
You're kidding me, right? https://git.archlinux.org/svntogit/community.git/log/trunk?h...
There are sometimes weeks before node is updated.
Updating PostgreSQL from 9.6.1 to 9.6.2 took 4 months.
And right now .net core build is failing on my archlinux test server.
If you want the latest, use macOS with brew.
There's only so much of the hardware rat race worth keeping up with anymore at the desktop level. I just pray that external GPUs will become supported on MacOS. That's the final piece for many people, I think.
I believe there's only so much of the hardware rat race worth keeping up with at all. Assuming most of Hacker News is coding, we're spending a lot of time in text editors; even if you are constantly recompiling code, is anything CPU-wise giving you that much of a boost over anything past Haswell, or over the PCIe/NVMe drives in the current MBP?
However, my work machines are a 2014 MacBook Pro 15 with a GTX 960 eGPU / TB2 when I need mobile hashcat, a T440s, and a Surface Pro, all pretty old hardware. 90% of the time I don't notice RAM pressure or that I am CPU bound, even when running a few VMs (say, Kali + Win10 on my MBP, or a few Win10 images in Hyper-V on the T440.) When I need more power than that, I can usually just rent it out of AWS/Azure.
I think sometimes the hardware game, especially on the Apple front, seems to be about keeping up the cycle and the appearance of getting the new hotness. Yes, the exhilaration of having the next big thing is great, but functionally I'm inclined to believe that having the old thing is just fine for 99% of users, and probably 80% of HN readers.
A couple of months ago I dusted off an Acer Aspire One netbook (released in 2010, Atom N450 processor, 2GiB RAM) to install OpenBSD 6.0, after a few years with Linux and Mac OS X on other systems. Surprisingly it is my primary machine now.
Today I wanted to see if there were even smaller netbooks with usable keyboards around and came across this video comparison of the Sony Vaio P and Fujitsu UH900: https://www.youtube.com/watch?v=szbfvV4vwEI
Before buying a new machine recently, I wanted to know what benefit a faster processor would offer for my specific usage pattern, so I repeatedly recompiled a representative Android project at different CPU speeds on my existing (overclockable) machine.
The machine needed about 15% less time to recompile at 4.1 GHz than it did at 3.4 GHz.
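For what it's worth, perfect frequency scaling would predict a slightly bigger win than that. A quick back-of-the-envelope check (assuming, unrealistically, that the build is purely CPU-bound):

```python
# Illustrative check: if compile time scaled perfectly with clock speed,
# time would be proportional to 1 / frequency.
base_ghz, boosted_ghz = 3.4, 4.1

# Ideal time reduction under perfect frequency scaling.
ideal_reduction = 1 - base_ghz / boosted_ghz
print(f"ideal: {ideal_reduction:.0%} less time")  # roughly 17%

# Measured reduction from the recompile experiment: about 15%,
# i.e. the build is close to, but not entirely, CPU-bound.
measured_reduction = 0.15
print(f"measured: {measured_reduction:.0%} less time")
```

So the build is nearly, but not quite, frequency-bound; presumably I/O and any single-threaded build steps eat the rest.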
The problem for work use is that the iMac isn't designed for 100% CPU usage for extended periods.
I work from home as a consultant, so I have both problems :) Thus, hackintosh...
My personal data point of Moore's Law slowing down: my 8 year old Core 2 Quad is still half as fast as a modern i5. For it to still be usable would have been impossible in previous decades. With an SSD and modern video card I still game on it.
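To put a rough number on that slowdown: if single-thread performance still doubled every ~2 years (a loose Moore's-law-style cadence, assumed here for illustration, not a measurement), an 8-year-old chip should be far more than 2x behind:

```python
# Illustrative only: assumes a hypothetical 2-year performance-doubling
# cadence, i.e. the "previous decades" scenario, not today's reality.
years = 8
doubling_period_years = 2

expected_gap = 2 ** (years / doubling_period_years)
observed_gap = 2  # "half as fast as a modern i5"

print(f"expected gap at the old cadence: {expected_gap:.0f}x")
print(f"observed gap: ~{observed_gap}x")
```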
It was great. Basically everything worked, and it never crashed.
Well, almost. Sometimes sound would stop working until a reboot. Sometimes wifi would not work after waking from sleep. I couldn't upgrade Mac OS. The keyboard was garbage, as was the screen. Ironically, a friend tripped over the cord and smashed the screen; if it were a real Mac, MagSafe would have saved it. (I replaced the screen for $35, which was nice.)
All in all it was great, and afterwards I saved and saved and bought a real MacBook Air 13 in 2012. I could not be happier. I have used it for tens of thousands of hours and it's still going strong, taking a lot of abuse in West Africa.
At the end of the day, a Hackintosh just approaches the Apple experience, but I don't think it will ever get there.
Did you leave it on until it got to 0-5% battery and shut off? When I do that, it takes me about 10 minutes to start actually using the thing again, it's so throttled, regardless of the fact that it's now plugged into an outlet. Even restarting it at that point doesn't help much. I have to just sit there waiting for it to hit 8% or 10% or whatever it hits to stop throttling everything.
More often than not when I plug in my two monitors one won't come on until I do it again. Or maybe the third time.
Then about 30% of the time when I close it and unplug everything and remove my headphones the thing is still running and my music blasts through the office.
No manufacturer is infallible but this is the kind of stuff I would've quickly replaced my laptop and gone to another brand over if I wasn't so tied into OSX.
If I go to Activity Monitor, find the "Android File Transfer Agent" process, and close it, the camera starts working again.
My 2012 MacBook Pro would just refuse to wake up sometimes.
My 2016 MacBook Pro is sometimes slow to wake from sleep, but no major issues yet.
My 2007 Mac Pro would cause a kernel panic when putting it sleep.
The hackintosh can be a good way to taste hardware diversity (even if the article doesn't go very far on the hardware side). Once you do, you realise how locked in you are by Apple.
There is one thing still missing: something like iTerm 2. I am currently running terminator on Linux via X11. Having a nice native Windows terminal emulator would be a big plus. There is ConEmu, but even after a lot of tinkering I was unable to get proper color and mouse support working.
MacOS is a very nice system for developers, and I am very sad to move away from it. I wrote about it here: https://medium.com/the-missing-bit/leaving-macos-part-1-moti...
Q&A with special guest Rich Turner, Senior Program Manager, Bash on Windows and Windows Console.
He talks about future bash/Linux support.
I switch between all three systems over the course of a day, but in terms of (A) system consistency, and (B) well-designed (third-party) applications, Mac still wins.
I tried setting up macOS through QEMU/KVM (using these instructions) and it installed pretty much fine. I don't have a spare graphics card or monitor for VFIO, so I tried running it in a window, but the mouse and keyboard capturing was really finicky, so it was completely unusable.
I almost wish Apple would just quietly make this easier for folks, blessing it without explicitly blessing it.
It was about on par with the difficulty of installing Ubuntu nowadays.
"The spec I was going to go for was an 8-core model, with 32 gigs of RAM, a 1 TB flash drive, additional external storage, two D700s"
Unless I missed something.
Also, I've often wondered who needs a 700+ watt power supply? How many video cards, drives, CPUs, etc etc could that handle?
While I believe that power supply longevity is better than for hard disks, fans, and other mechanical components, power supplies are still prone to failure and to damage caused by line-voltage irregularities. That's part of the reason why serious servers have two redundant power supplies; it's not just for battery backup. Power supplies do fail; that's why they're FRUs (field-replaceable units) in datacenter equipment.
Warranties are more about getting people to buy stuff than they are about actual longevity. Very few consumers will bother to exercise a warranty on a power supply from a 10-year-old gaming rig, for example. Few people exercise warranties, period. But they do create a warm-fuzzy feeling for buying decisions, and if the margins are high enough for the manufacturer, it's not a big risk anyway.
If I can avoid that by not pushing the limits, it's worth doing.
That's why I always invest in a great PSU.
I could suspend that sign overhead with one extremely high-quality, aerospace-grade, serial-numbered M2 bolt or with six M4 bolts from the hardware store; financially, I'm better off with the hardware-store bolts.
Do you have any statistics or other sources on that? (honestly curious)
I'd like to see the numbers too.
The way I read English the "was going to go for was" part acts as a conclusive delimiter.
As in: I was going to go for an 8-core CPU but went with an i7 4770K instead.
The machine has two older graphics cards, 16 drives, an E5-1650, and a fairly power-hungry motherboard (X99-E WS). Also, lots of 200 mm fans - it is nearly silent at full load.
...but who gives a f? It's a desktop, no battery to drain. And of all the appliances in your home, the computers surely don't mean that much as a percentage of power usage. Yeah, other things are probably wrong too, but I think you picked the wrong detail to care about here :) Better to have a bigger power supply and be sure it'll handle whatever you might add to the machine in the future than to care about replacing it. And from my (and everyone else I talked to) experience, inefficient (power-wise) electronics tend to last way longer than efficient ones, so I prefer them (since my time wasted finding a replacement or a repair shop is worth more than the price of the extra power), even when it comes to fridges and washing machines.
(Commenting mostly because all this insane obsession with energy efficiency is getting on my nerves.)
It's the first time I can hear that oversized PSU leads to more power consumption - how come?
Additionally, many power supplies are worse at load regulation if their load is too low.
Additionally up to around 35% load the fan is off, passive cooling is enough.
It's probably neither here nor there; shame about the lack of a real curve, though. Could be crapola at 10%, who knows.
You aren't the only one to post something like this. Where are y'all getting this idea from? Efficiency is very flat in switching power supplies once load gets above about 10% of rating. Plus, computer power supplies have multiple voltage outputs, so in theory one output could be running near full load while another is basically idle. The power output rating is calculated by increasing load current until the voltage drops unacceptably, then calculating the power output at that point. A 750W power supply is just as efficient running at 250W as it is at 750W.
 Not that this happens much at all in practice, but it could.
It really isn't. Sure, the engineers who design the power supply do that test, but that is not at all what determines the advertised rating of the final product. At 100% of the advertised load, most computer power supplies are still delivering nominal voltage or slightly above, not 5% below as allowed by the ATX spec.
> Efficiency is very flat in switching power supplies once load gets above about 10% of rating. [...] A 750W power supply is just as efficient running at 250W as it is at 750W.
Yes, on either side of the peak efficiency, you'll have points of equal efficiency. And in the middle, you'll have a few percentage points higher efficiency. But more importantly, at the ~30W a typical desktop will actually be drawing most of the time, a power supply with a smaller rating will be substantially more efficient.
Buying 750W and larger power supplies just doesn't make sense for single-CPU, single-GPU systems. Power supplies with lower ratings already have plenty of headroom, both built into their rating and in the difference between 500W and what a real desktop actually uses on real workloads. To the extent that having excess capacity helps longevity, a 550W or 650W model is already well past the point of diminishing returns, and going up to 750W is pure vanity. If you want reliability, shop for PSUs that use high-quality fans and capacitors; don't just stupidly add an extra 30% on top of what's already more PSU than you really need.
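To make the low-load point concrete, here's a sketch with made-up efficiency numbers (not measured data for any real PSU) showing what an oversized unit can cost at a ~30W idle:

```python
# Efficiency figures below are hypothetical, chosen only to illustrate
# why very low load (as a fraction of rating) hurts efficiency.
def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """AC power pulled from the wall for a given DC load."""
    return dc_load_w / efficiency

idle_load_w = 30  # the "typical desktop most of the time" figure above

big_psu_eff = 0.70    # assumed: 30 W is only 4% of a 750 W unit's rating
small_psu_eff = 0.85  # assumed: 30 W is 10% of a 300 W unit's rating

print(f"750 W unit: {wall_draw(idle_load_w, big_psu_eff):.1f} W at the wall")
print(f"300 W unit: {wall_draw(idle_load_w, small_psu_eff):.1f} W at the wall")
```

Only a handful of watts either way, but it compounds on an always-on machine.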
I assumed power supply manufacturers would want to slap the peak power number on their supplies for marketing purposes. If they are actually being conservative, then you are correct.
> Yes, on either side of the peak efficiency, you'll have points of equal efficiency. And in the middle, you'll have a few percentage points higher efficiency.
My point is that efficiency is a plateau, not a "peak". Once in the plateau the fluctuations of efficiency from one load point to another are not significant. Below a certain minimum load and past the peak power "knee" is a different story, but that plateau is very large.
> But more importantly, at the ~30W a typical desktop will actually be drawing most of the time, a power supply with a smaller rating will be substantially more efficient.
I never disputed this. I disputed the nonsense that power supplies have a meaningful "peak" efficiency, and that it is a function of its rating. 30W is probably not enough of a base load for efficient operation of larger computer power supplies, but once that point is hit, it no longer matters what the actual load is.
I'm also not suggesting it is a good idea to waste money on a larger supply than you really need.
The fan is off if the load is below 30% which is neat if you are into silent builds.
~230 W = 30% load (760 W PSU)
~135 W = 30% load (450 W PSU)
The idle draw of such a system is below 100 W (ok, ok... max 100 W ;) ), so you won't hear any fan even with a smaller PSU.
This is something to start with, but I don't know why they didn't go with an NVMe SSD: Clover supports it and it's way faster. Overall, though, these are relatively good guides. I'd still make some minor adjustments, but that doesn't matter.
Motherboard: Asus Maximus VI Gene - 57 - 123
CPU: Intel Core i7 4770K 3.5 GHz Haswell - 70 - 80
RAM: Kingston HyperX Beast 4 × 8 GB DDR3 2400 MHz - 12
SSD: Kingston HyperX 3K 480 GB (×2) (Striped) - 1 - 4
HDD: Western Digital WD Black 3 TB (×2) (Mirrored) - 16 - 20
GPU: Radeon R9 280X - 15 - 257
Cooling: Corsair H80i - 4
Fans: Noctua NF-S12A (×2) & Noctua NF-F12 (×2) - 4
Sound: Bowers & Wilkins MM-1 - 12
Other: Bluetooth LE & Wi-Fi PCIe Module - 2
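Reading the paired figures on each line above as idle/load draw in watts (my assumption; single-figure lines taken as constant draw), the totals suggest how much PSU this build actually needs:

```python
# Assumption: the paired numbers in the parts list are idle/load watts;
# components listed with one number are treated as constant draw.
components_w = {
    "Motherboard (Asus Maximus VI Gene)": (57, 123),
    "CPU (Core i7 4770K)": (70, 80),
    "RAM (4x8 GB DDR3)": (12, 12),
    "SSDs (2x Kingston HyperX 3K)": (1, 4),
    "HDDs (2x WD Black 3 TB)": (16, 20),
    "GPU (Radeon R9 280X)": (15, 257),
    "Cooling (Corsair H80i)": (4, 4),
    "Fans (4x Noctua)": (4, 4),
    "Sound (B&W MM-1)": (12, 12),
    "Bluetooth/Wi-Fi PCIe module": (2, 2),
}

idle_total = sum(idle for idle, load in components_w.values())
load_total = sum(load for idle, load in components_w.values())
print(f"idle ~{idle_total} W, absolute full load ~{load_total} W")  # 193 / 518
```

If those readings are right, even a 650 W unit leaves real headroom at full load, which fits the point elsewhere in the thread about 750 W+ supplies being overkill for single-GPU builds.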
Actually, it has 8 logical cores and 4 physical cores. For all practical purposes, you have 8 cores at your disposal. I build workstations for 3D rendering and I have yet to see a tool that cares whether the cores are physical or logical; as long as you have enough RAM for each core, you can use each one to its fullest extent.
On average, a setup with 4 logical cores and 4 physical cores performs 15%-30% faster than a setup with just 4 physical cores.
Regardless, my point is that the article didn't specify whether he meant logical or physical cores; therefore, saying that the article is wrong is partially incorrect (and seems like nitpicking for the sake of nitpicking).
All Hackintoshes require the FakeSMC kext (to bypass DSMOS), that's it. Some systems require more kexts, for example, to have networking, etc.
Clover doesn't patch or install kexts, but it patches the ACPI tables (especially DSDT). Maybe that's what you mean?
FWIW I use a MBP 13 non-Retina mid-2012 with 16 GiB and two SSDs. Good enough and no weird compatibility issues.
Specs offhand: Asus Z170-A, Intel 6700K (overclocked to 4.2 GHz), 32 GB RAM, 1 TB SSD, Nvidia 1060 powering dual 4K screens
What amazed me was how easy it was compared to years ago the last time I tried.