Building a Hackintosh Pro (dancounsell.com)
350 points by milen on March 18, 2017 | 258 comments



I used to use a Hackintosh. It was a time-consuming thing, I feared OS updates, and I could never be sure about the reliability (you can get really weird problems sometimes). I then switched over to a Mac Pro (cheese grater) and was very happy with it. Unfortunately, since the trash can Mac Pro I've been left out in the cold, so I'm considering running a Hackintosh again...

Unfortunately, the information is still very fragmented and it takes a lot of time and effort to get one running. Actually building the thing is the easiest part; it's the booting, installation, and OS patches/fixes that eat up time.


It's quite easy to do now, but the community is incredibly annoying.

The software is usually on SourceForge or something even shadier, it's rarely actually OSS-licensed, and configuration/documentation/interfaces are a nightmare of incomprehensible incoherence. There cannot be a software community less professional. I've seen Minecraft plugins by 8-year-olds that are much better in these categories.


Android ROMs are the same way. Maybe something about the semi-illegality of it brings out the old DOS scene culture.


IMHO this has very little to do with legal status and much more to do with the work atmosphere in the community. You can see exactly the same problem in the console hacking / homebrew scene.

On one side there is a huge user base full of people who want to get something working with no effort and tend to demand things from developers, yet very few of them can provide even a proper bug report. On the other side there is a bunch of "experts", some of whom tend to leech off others' work, and an even smaller number of actually skilled developers.

So it's just stressful to participate in those projects. In the end, most of the enthusiasts who stick around either have a very strong personality or are selfish people looking for cheap popularity.

Also there are humble guys who silently work on their own thing, but they usually work alone or in very small groups and don't communicate with outsiders. And of course most of the time they never get credit for their work on BSP reverse engineering and fixes.


Back when I was in high school, I ported the game DOOM to the Nintendo DS with a friend on IRC, and afterwards watched dozens of "forks" with minor changes or repackaging show up trying to take credit for our work (sometimes not even mentioning us as the authors at all!).


Oh man, I just saw DOOM on a DS the other day. Was it a source port or a DOSBox implementation?


Source port using SDL and my friend Wintermute's libnds. There's no way the DS's ARM9 processor would've been able to handle running DOSBox. Just out of curiosity, where did you see it, if you don't mind me asking?

Man, I remember feeling disappointed people weren't making more FPS games for DS, so I decided DOOM would be fun to port. I even got the wifi multiplayer working.

EDIT: The biggest unsolved hurdle was getting the music to play (it was MIDI and the DS hardware didn't really have anything to play it back). Eventually someone got a hand-optimized OPL3 ASM player to work on ARM7, but I don't think anybody ever managed to connect it to DOOM running on the ARM9 processor using the FIFO. The DS had two CPUs (ARM7+ARM9), and the ARM9 was completely utilized.

I also helped out in a very minor capacity with DS Linux, but pepsiman did nearly all the work.


Thanks for your work. Yeah, I can definitely see some parallels between the homebrew/emulation/console scene and the Hackintosh or even jailbreak scene. It's getting better, with lots of tools being open source and information being more public (see the 3DS for an example), but there are also moves in the wrong direction (a closed-source Wii U emulator is on track to make a million dollars over a few years).


smea (aka smealum) did a great job with making the entire 3DS scene possible. One of the smartest guys I know.


Honestly the most annoying thing about that community is that you're expected to wade through hundreds of pages on forums, piecing together information, and people get furious because the answer to your question was on page 63 out of 124. The custom ROMs will list the very specific things that differentiate them, but there are hardly any guides with best practices, except maybe if you have the most popular phone out that year.


So basically the installability of Linux circa 2003, but with the bonus that security updates might brick my system?


So it's slightly better than trying to find an answer on Apple's support forums.

I'm kidding, but only a little. This actually sounds a lot like the state of the Minecraft ecosystem last time I checked, about two years ago.


There's no reason to say an Android ROM is illegal. Or whatever semi-illegal might be. Maybe it's like semi-falling-off-a-cliff? Semi-hit-by-a-train? I don't mean to mock.

I always guessed Android ROMs were such a misery in the community aspect, because they're very easy and relatively useless. All you need is a computer, which is easy enough, then some open and free software, and you're ready to go and compile your own ROM. The source code is often available, and you can change a color here and there and maybe change some of the Linux kernel parameters so you can say it's for better battery life, or more free RAM if you're into that.

That's the easy part. Now the relatively useless part: phones don't last long. Someone who is keen on tweaking their phone will probably be looking for a new one every year, or every other year. At best they last about three years. The changes to the software are minimal at best, and every custom ROM is practically the same. There is no marketability in these ROMs. There aren't millions of dollars of potential revenue to be gotten out of a server that has to run with 20 years of uptime, something a company can jump in and support, like you see with other uses of Linux. Red Hat supports for like 10 years? Android devices get discarded in two. Therefore there is little professional attention to third-party, open source distributions of Android.

Google develops it, companies alter it a little bit, then ship it. Hobbyists from all around the world take the open source aspects of this and maybe add some new stuff here and there. There's no need to expect professionalism from hobbyists.

(I find CyanogenMod, or LineageOS or something now, to be pleasant to use on a phone. It gives the option to run apps as root, and it comes with a terminal, and it tries to remain as open as it can.)


> I find CyanogenMod, or LineageOS or something now, to be pleasant to use on a phone. It gives the option to run apps as root, and it comes with a terminal, and it tries to remain as open as it can.

Absolutely. I especially like it because it gives me the ability to restrict network access for apps (I usually just deny background cellular access, but some apps I think have no business talking to the network even over WiFi). Moreover, the permission model is much better, which brings me to the reason I uninstalled Facebook Messenger: denying start-at-boot or any other permission to Facebook Messenger (or Facebook) on Lineage crashes the permission manager. WhatsApp and Instagram have no such issue. They will happily accept not being able to run at boot (I haven't looked into whether it actually works, but at least I can set the permission as I want).

I guess my number one request for a custom ROM would be the ability to say I don't want any app to run in the background/run at boot/use network connections unless I specifically whitelist the app for that purpose.


Depends on the device and the ROM creator. I've been using a OnePlus One with SultanXDA's excellent CyanogenMod-based ROM for over a year. Updated every month with security updates, and it works perfectly fine.

Similar good experience with Nexus 4 and NitrogenOS for my standby device. Android 7, works perfectly fine.


Wow, "security updates" from some random person making roms on the internet. Good luck buddy!


So run a few of the exploits that are posted against the phone yourself and check if they run? Although I wouldn't trust everything on XDA, a lot of the time you're a lot worse off with the manufacturer's code. Hell, there are at least 4 working exploits for my XT1572 and Motorola have shown no interest in updating it, even though it's less than 2 years old.


There's also turf war nonsense. The site that the OP mentions so much, from what I remember, plagiarises the software it offers.


Is "plagiarism" even the right word for re-hosting something in a more accessible central-directory format? They're not claiming it's theirs; they're just giving you a link you can actually click to directly download the thing, where the source is e.g. a phpBB forum that requires registration to enable downloads.


I don't mean stuff they rehost, if any. I mean their “own” tools (the ones with the site's name on them) are plagiarised.


> it's rarely actually OSS-licensed

It seems strange to worry about licensing in a community predicated on violating a term of an OS license.

Is this a problem because source code isn't available?


It's unclear whether a license can actually prevent you from running licensed software on the HW you like, if you buy said SW in the first place.


_When_ you're buying a license, you're not "buying said SW", and you will have to abide by the rules of the license.

(As to the _When_, that can be a point of contention, given that consumer software marketing material typically doesn't explicitly mention the license)


I was serious: you can't put any obligation you want in a SW license and hope it can be enforced against those who buy said license. And it has been debated whether a restriction on running consumer general-purpose software on specific hardware, especially when that SW can be obtained independently of any hardware, can be enforced or not.


Yes, I remember bringing up a software license with a lawyer I was talking to some years ago for a product I was selling - he laughed. I don't know if it's changed much since then - most of that stuff is untested in court AFAIK.


I mean, of course right. These are guys fiddling with settings, rebooting numerous times, they finally get it right, and they upload what they have then move on and start using the machine. An incredibly frustrating experience.

It isn't the same as some kid who writes a plugin and then uses it and keeps playing minecraft.


"I used to use a Hackintosh. It was a time-consuming thing, I feared OS updates, and I could never be sure about the reliability (you can get really weird problems sometimes. I then switched over to a Mac Pro (cheese grater) and was very happy with it. Unfortunately, since the trash can Mac Pro I'm left out in the cold, so I'm considering running a Hackintosh again..."

As the owner of an early-2009 octo-mac-pro, I am very interested in learning about how to max that system out to the gills.

For instance, I am under the impression that it is now possible to boot from a PCIe-based SSD card on that system.

Further, I believe it is possible to add a card that gives me USB3/SATA3 (although if I am booting from PCIe, I don't really care whether my four slow internal mass-storage drives are SATA2).

Finally, I think there is some apple-blessed video card that is quite a bit stronger than the original GT120 cards I have in my system.

I've seen bits and pieces here and there about maxing out a mac pro and bringing it very close to "modern" but it would be nice if it was all in one place somewhere...

FWIW, other than maxing out the ram, I really have no problem with the existing RAM options and the dual-cpus that I already have (which bring me to octo-core). Even 8 years later, the machine is not slowing me down at all in the cpu/ram department (and I only have 6 GB).


I have the same Early 2009 octa-core model and have done a few upgrades here and there and I have to say I find the system nearly as responsive as the new 2016 MacBook Pro I use at work.

The first thing I added was a CalDigit FASTA-6GU3 SATA3/USB3 card. This gives you two internal SATA3 connectors (bootable) which I'm using for my macOS boot SSD. It also has two eSATA ports and 2x USB 3.0 ports and only takes up a single PCIe slot. They've since come out with a new card that adds a USB 3.1 Type-C port. No drivers needed, it "just works".

Next was an NVIDIA GTX 970 card; this works perfectly in 10.11/10.12 once you've used the GT120 card to install the NVIDIA Web Drivers. You don't get the boot screen unless you get the card ROM flashed by macvidcards.com ($$), but I haven't found this too inconvenient, as I have left the GT120 in a spare PCIe slot, so if I need to use the boot selector for some reason I can just swap the mini-DisplayPort cable over. You can go up to a GTX 980 Ti, but you're limited to Maxwell cards as NVIDIA hasn't released macOS drivers for Pascal.

Finally I picked up 64GB of DDR3 1333 MHz ECC RAM on eBay, pulled from a server, for under $200. This was mainly to fulfill my dream of being able to say I have 64GB of RAM, but I run a lot of VMs for network simulations and it really helps to have the extra memory.

I've also flashed the SMC so the machine thinks it's a MacPro5,1 of the 2010+ generation. You'll need to do this if you want to upgrade the CPUs (I haven't yet), but it has the added benefit of letting 10.12 install without any hacks or modifications to the OS. If you're on 10.11 now you'll have to disable SIP temporarily to run the firmware update, but it was otherwise quick and painless. This is the best place I found with clear instructions: http://forum.netkas.org/index.php/topic,852.0.html

All in all this is the best Mac I've ever owned and it amazes me it works as well as it does being almost 8 years old now.


Great - many thanks - this is very helpful.

"Next was a NVIDIA GTX 970 card"

How many PCI slots does that card take up? I currently drive four monitors with my Mac Pro and the single-slot aspect of the GT120s makes that easy ...

"All in all this is the best Mac I've ever owned and it amazes me it works as well as it does being almost 8 years old now."

I know, right? I regularly have 20+ Chrome tabs open along with 1-2 VMs running in VMware Fusion and I have never once felt like the system was slow. My only issue is that I have only ever run Snow Leopard on it and now Chrome no longer has updates, so I need to move to ... Mavericks maybe? I feel like Mavericks is the most stable/sane OSX release since SL ...


> My only issue is that I have only ever run Snow Leopard on it and now Chrome no longer has updates, so I need to move to ... Mavericks maybe? I feel like Mavericks is the most stable/sane OSX release since SL ...

This will obviously depend on the hardware/software you use but Mavericks was the buggiest and worst OS X release on my rMBP. I was rebooting weekly due to weird graphics drivers glitches and inexplicable OS slowdowns. El Cap had a rough start, but was fine after a couple point releases. I've had no issues whatsoever with Sierra.


El Capitan is probably the best choice at this time; Sierra is an absolute train wreck:

http://tidbits.com/article/16966

EDIT: 10.12.3 doesn’t seem to be much better, sadly:

http://tidbits.com/article/17010


Sounds like it's only an "absolute train wreck" if your workflow depends on editing PDFs with annotation layers in software that uses PDFKit.

My interaction with PDFs starts with viewing a linked one in Safari and ends with "save as PDF to web receipts folder" and for that kind of usage there are absolutely no problems. I always thought that for anything half-advanced with PDFs (like filling out forms) you'd want to install Acrobat Reader anyway...


Filling out PDF forms is a pretty normal consumer task these days, I wouldn't call it even half-advanced.


FWIW, I have a Mid 2010 Mac Pro. I found it was getting slow. I tried putting a lot of memory in it, to little effect.

Then, I put this:

https://www.amazon.com/OWC-Accelsior-PCIe-SATA-Adapter/dp/B0...

And this:

SAMSUNG 850 PRO 2.5" 1TB SATA III 3-D Vertical Internal Solid State Drive (SSD) MZ-7KE1T0BW

In it as my system drive. The volume looks external (it's got that orange and white icon), but everything has worked perfectly for about 18 months. It boots just fine.

For me, the performance increase with this change was incredible. It's like a brand new computer. I keep my OS and all my installed applications on it, and all my data on traditional drives in the other four bays. It was completely worth it. Right now, I use it for very intensive workloads of various kinds, and never even think about replacing it.

Finally, you can find a lot of info online about whether you should enable TRIM or not. FWIW, I did do it, and it's been working fine for 18 months:

https://www.cnet.com/how-to/installing-ssd-on-mac-trim-mista...

I cannot remember at this point how I copied the data over. I do remember that whatever method I found on google worked quite quickly, and that there were no hassles with it. After doing some kind of copy operation, I booted to the new drive and never went back.

Hope this helps.


> I am under the impression that it is now possible to boot from a pci based SSD card on that system.

Yep, put a Samsung SM951 M.2 (AHCI version, not NVMe) on a Lycom DT-120 PCIe mounting card. Got two of those in a MP5,1 myself.


A while ago I paid the premium for the Mac Sapphire HD 7950 and it works great. It significantly speeds everything up and allows me to run 4K + another display on my cheese grater 2008 Pro.

I'm sure it's probably very easy to make a non-Mac video card work, but I can't be bothered.


I think things have improved in the Hackintosh community. I built one a few months ago and was up and running in a day.

Small issues since, but there's always a solution + guide within 15 minutes of searching (and it's usually just flipping a flag in Clover).
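
For the curious: Clover's config.plist is just an XML plist, so those flag flips are easy to script. A minimal sketch in Python, assuming the EFI partition is mounted at /Volumes/EFI and using nv_disable=1 purely as an example flag:

    import plistlib

    # Assumption: the EFI partition is already mounted (e.g. via diskutil mount)
    CONFIG = "/Volumes/EFI/EFI/CLOVER/config.plist"

    with open(CONFIG, "rb") as fp:
        config = plistlib.load(fp)

    # Clover keeps kernel boot arguments under Boot > Arguments.
    # Example: add nv_disable=1 to boot without the NVIDIA web driver.
    args = config.setdefault("Boot", {}).get("Arguments", "")
    if "nv_disable=1" not in args.split():
        config["Boot"]["Arguments"] = (args + " nv_disable=1").strip()

    with open(CONFIG, "wb") as fp:
        plistlib.dump(config, fp)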

When an update comes out, I just wait a week or so to make sure there are no issues and then update (no issues at all so far with that).

Having a ridiculously powerful computer running macOS at a great price has been such a joy.

If I'm talking to someone who is the kind of person that builds computers and is comfortable with a bit of command line & config, I always recommend they look into a Hackintosh, because a lot of the uncertainty has been removed thanks to the great community around it.


Does iMessage work without utilizing a real mac's serial number? I learned my lesson after I used the same serial to get it working on my hackintosh and then I got locked out on my real mac. Ended up resolving it after a few support messages.


Yes, it works with a spoofed serial.


It really depends on what type of hardware you get.


Yeah. I buy Macs because they "just work", so a Hackintosh completely destroys the point of even owning a Mac for me.


I always felt the same way. Even a huge cost premium matters very little to me since it's a machine I use professionally.

However I'm starting to give it serious consideration as Apple is signaling pretty hard that they don't give a shit about desktops any more, and MacOS is not seeing much development other than some sloppy seconds and half baked ports of iOS ideas.

I'm pretty heavily dependent on MacOS for my daily workflow these days, but the writing is on the wall that I've got to start thinking about Linux or even Windows over the next 5-10 years.

In that light, the benefit of the hackintosh is not only being able to get current hardware at a fair price, but also being able to hedge my bets and dual boot to Linux to get my feet wet.


As a data point, I was putting together a super cheap "build server" (for compiling stuff on) recently. Supermicro motherboard, and dual 6-core Xeon CPUs (E5645). Cost of the motherboard and CPUs (from eBay) was ~US$140. 32GB of ECC RAM to match was ~US$60. Everything else needed (power supply, etc.), I already had around.

Just to see if it would work, I hackintoshed it prior to installing FreeBSD. Apart from the network ports, everything else (Nvidia card, USB, etc) worked.

Network connectivity was fixable by just plugging in a USB3<->ethernet adapter (also had one handy), and voila, usable desktop.

https://www.amazon.co.uk/gp/product/B00XZQRMRU/

It wouldn't be surprising (to me), if newer generation Supermicro boards also worked. That would open up possibilities for lots more cpu grunt in an OSX box. :)


> MacOS is not seeing much development other than some sloppy seconds and half baked ports of iOS ideas

While I may not wholly disagree with this, what OS is doing much better? Linux is still slowly catching up to the macOS graphics compositor from 10 years ago, and Microsoft is busy integrating its own half-baked port of Windows Tablet Edition... True Windows does have some momentum with the new ubuntu subsystem, but that's still catching up when macOS is a real native UNIX from the ground up.

There's not much more I want out of my OS... a new filesystem to replace the 30-year-old-design HFS+ is already on the way, and while I'd love a more modern CLI userland, that isn't going to happen due to legal reasons, so I'm fine with Homebrew. They just need to fix SMB and clean up bugs and performance issues for me to be totally happy.


Except that Macs don't "just work" any more. At best they "just work" until the day that Apple decides to deprecate a feature or an application that you have come to rely on (for me it's iPhoto) and then it stops working. It doesn't crash, it just says, "Sorry, this application cannot run on this version of MacOS." But from the user's point of view the effect is the same.

[UPDATE] Case in point: one of the banking apps on my iPod just stopped working, insisting that I upgrade to the latest version. So I did, but the new version doesn't work either. It just hangs when I try to log in. Granted, this is an iPod, not a Mac, but the iOS philosophy of having Apple control the device is slowly creeping in to the Mac. Every day and every upgrade is a crap shoot in terms of whether or not what worked yesterday will still work today.


I think Macs "just working" has always been a myth. Ask anyone who gets paid to fix Macs.

Sure, I see fewer of them, but that's hardly unexpected considering people buy fewer of them (and they're less targeted by malware).

On the other hand, I hate dealing with them because they can suffer from the same "it should work but doesn't" crap that Windows machines do but without the critical mass of support via Google.

And unless you get your issue escalated, the Apple Geniuses are about as much use as a chocolate fireguard.


> I think Macs "just working" has always been a myth.

To a certain extent that is true. But there was a time during the Snow Leopard era when Macs were extraordinarily reliable compared to the then-contemporary competition, and even compared to anything before or since. Things have gone downhill since then because Apple decided to de-emphasize reliability in favor of other things like thinness, and the merging of desktop and mobile environments.


We had a (recent) iMac that just kept randomly crashing. Apple replaced most of the parts inside it, and it kept doing it. Eventually they gave us a new one. Kudos for that; random errors are a pain to diagnose and fix, but they did.


Indeed; I switched to Mac for 6 months after getting sick of trying to get things to work how I want them and be stable on Windows or Linux. It ended up being just as fiddly, and I needed to reboot at least once per week due to some kernel item chewing through increasing CPU. Back on Windows now; not perfect, but good enough for now.

My conjecture is that my hatred for any system will increase the more I use it, to the point of switching to another system, then the cycle repeats.


Agreed -- the new "productivity" applications are a disaster. If you shoot RAW, you have to either stick with an old version of Aperture, or switch to Lightroom (typical Adobe crap), because Photos doesn't do lens profiles. If you want to create a bound book, you can't use Pages, because it doesn't do inner and outer margins, only left and right. For now, it's less work for me to stay than to go, but Apple needs to pay more attention to their software.


The trick is to ignore OS updates after the initial setup, until you absolutely need a newer version or at least 6 months have passed and you can reserve a day for troubleshooting.

Of course, it helps if you have a macbook pro in addition to your desktop; you can check out the new stuff on the genuine apple machine, and maybe get away with postponing updates on your desktop even more.


It's _NOT_ a problem anymore. I just built a Hackintosh with components from Tonymacx86's guide. It was literally just a matter of assembling everything, creating a bootable USB with the Unibeast tool (which uses the Clover bootloader), and installing. And it just works. OS updates etc. are not an issue; I do them just like I would on my Macbook Pro.

Only thing is iMessage/iCloud. It takes a little while to set up correctly, as you have to "hack" the serial numbers etc., but once it works, it just works.


If you are trying to get OSX on an arbitrary piece of hardware, it could take quite a bit of effort and have any number of weird problems, as you suggest.

However, if you stick to known hardware (mainly this means specific motherboards with well-supported chipsets) it's quite straightforward and reliable. If you're starting out without existing hardware, just stick to the known MBs and you shouldn't have any issues. Very dependable hardware lists are available at tonymac and the osx86 project; just use those.

Hardware support has come a long, long way in the past few years.


As a Hackintosh user, I'd recommend an NCASE M1 v5 or the Shuttle SZ170R8 for a significantly better chassis.

With the NCASE, you can get a LGA 2011-v3 mini-ITX board and get as fast of a computer as you want. Though truthfully, I don't think there's much of a point.

Likewise, an old NVIDIA card for a Hackintosh? Also not much of a point. The web drivers are so bad. Who knows when Apple will ship another NVIDIA chip in a Mac?

If you need "CUDA stuff," use Linux or Windows. Software like Octane is so buggy and suffers from worse performance on Mac anyway. Final Cut and After Effects both support OpenCL acceleration. Besides, the RX 480 is $189.

If you're doing "VR stuff," well pity on you if you're developing for an Oculus on a Mac. The Vive doesn't, and probably never will, support Mac. Whatever VR solution Apple releases will obviously run great on the video cards they support, so that again strongly points to purchasing an AMD card at this time over your alternatives.

With regards to this specific build, a high DPI display will greatly improve the enjoyment of this computer. The Dell P2715Q is the best deal. Mac OS has such good HiDPI support compared to Windows (and especially Linux). Enjoy the features you pay for!

Truthfully, I'm hard pressed to see the point of a Hackintosh, and I own one.


A GTX 980 Ti is considered old now? Unless you're working with datasets where the 6GB is limiting, it is quite capable for both CUDA programming & gaming. I tried CUDA programming on Win10 and it is an exercise in frustration, and I find macOS a more pleasant environment than Linux, partly because of the software library and partly because it's what I'm personally used to.


It's quite capable, but if you need more than 6GB, the other capabilities are essentially irrelevant. Lack of support for the newer NVIDIA cards is increasingly becoming a major issue for many people.


Problem is, you can either spend the same and do more, or spend less and do the same, if you have support for a newer generation of cards. Not to mention that if you're into silent computers (not being able to hear the fans when gaming because of the game noise doesn't qualify as silent in my book) newer nvidia cards are much better.


What also makes me mad is the attitude of "oh, so you need CUDA? You are a very niche user then, and ought to suffer!" This at a time when everybody and his dog is into ML, and having a decent GPU is a must unless you want to pay Amazon $2.50 per hour to train your models...


CUDA stuff is awful on Windows (try setting up Tensorflow/Keras there — nothing but overwhelming frustration). So I have to use Ubuntu and miss my happy days with macOS.


I think that's exaggerating it a bit.

I managed to install Tensorflow with CUDA on Windows just by following their official installation guides.

I also managed to compile Tensorflow with GPU support from source, also by just following the official guide.

Sure, it's a bit more work than "sudo apt install".


It was 5 months ago; perhaps things have changed since then. :)


I installed Tensorflow 1.0.1 for Windows for the first time like, 3 weeks ago. I've never installed it before on any system and I never even had Python3 on this machine. I had to use Windows because it has my new GTX 1080.

Short story: worked great, running in about 5 minutes. (With VSCode, I even get autocomplete out-of-the-box on the TF libraries, imports, etc.)

I think it was much different before 1.0. Also, building on Windows is a little more difficult. But just installing it really isn't. And Keras was far more annoying because it wants scipy, which wants BLAS, and I don't use Anaconda. I haven't gotten around to that one yet...


> And Keras was far more annoying because it wants scipy, which wants BLAS, and I don't use Anaconda. I haven't gotten around to that one yet...

Christoph Gohlke to the rescue: http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy

I also don't think that the numpy installed via pip is using mkl, so his version of numpy should be faster too: http://www.lfd.uci.edu/~gohlke/pythonlibs/#numpy
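
(If you're not sure what your current numpy build is linked against, it can tell you directly; no MKL entries in the output means you're on the plain pip build:)

    import numpy

    # Prints the BLAS/LAPACK libraries this build was compiled against;
    # Gohlke's wheels list MKL here, the default pip wheel does not.
    numpy.show_config()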


I have the GTX980 and the NVIDIA driver for macOS works fine. It's just a pip install tensorflow-gpu nowadays.
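
If you want to confirm the driver and CUDA toolchain are actually being picked up, a quick sanity check (TF 1.x-era API):

    import tensorflow as tf
    from tensorflow.python.client import device_lib

    # Should include a /device:GPU:0 entry when tensorflow-gpu sees the card
    print([d.name for d in device_lib.list_local_devices()])

    # Or run a tiny op with device placement logging turned on
    with tf.device('/gpu:0'):
        c = tf.constant([1.0, 2.0]) + tf.constant([3.0, 4.0])
    with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as sess:
        print(sess.run(c))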


I'm in the same boat this author is. It's disappointing to see the desktop market lagging so far behind. For all of the 68k/PPC years, we could at least say "different architecture" and Intel hadn't caught up. Now we're all Intel, and it's Apple that isn't keeping up. It's frustrating, and I'm not necessarily a "Pro user" anymore at all. I have a current generation Mac Mini, and it's over 2 years old now. What I can buy from the Apple Store right now for the same price is what I got 2 years ago.

I wish I knew what Apple was thinking. One would have thought that the "iPhone halo effect" was something that they would have wanted to give momentum to. Instead people are looking at Windows units again.


> I wish I knew what Apple was thinking.

It's literally incomprehensible. Just make a really good fast desktop computer. They're one of the largest companies in the world now with essentially unlimited financial resources, and decades of core competency in making high end desktop computers. So like why don't they just fucking do it.


> So like why don't they just fucking do it.

Google shovels billions into these absurd projects that never go anywhere and nobody blinks because it's Google. They announce projects and then cancel them just before they hit store shelves, or worse, after they've sold some and then decide to arbitrarily brick them. It's just Google being Google. Nobody cares.

Apple removes a single connector on their product and everyone goes apeshit. "Apple's doomed!" "Apple profits are going to suffer! Sell!" Doesn't matter that their device sold in record numbers. It's about perception.

Apple has to tread carefully. They took unbelievable amounts of shit on the Cube, a computer that in retrospect was just one stepping-stone on the path towards the very successful Mini, and on other "failures". They're allergic to public humiliation like that, and a desktop computer that sells only in low volumes is a failure.


Google treats their real products pretty well (such as the Google Pixel); it's just that Google has a lot of "beta" products NOT actually meant for consumers.

And concerning Apple, their connector choices have been really stupid for the past 10 years; only MagSafe was nice.


Google will kick their "real product" to the curb pretty fast. How many people bought a flagship Android phone only to find updates dried up after two years? How many people committed to various Google services only to have them abruptly shut down?

Apple might have stubbornly stuck to their 30-pin connector for way too long, but at least people who bought accessories for it weren't frozen out when the latest incarnation of USB Mini/Micro/Whatever caught on.

"Really stupid" is shit like the Nexus Q (https://en.wikipedia.org/wiki/Nexus_Q) discontinued before launch, or the half-assed effort to get Android vendors to update their software more often. It's Microsoft's utter failure to capitalize on any meaningful market share in the mobile space when they were a leader in mobile/PDA/tablet applications for years.


The problem is that they still have a pretty small product design department, and they're committed to not shipping "crap" products.

What I really don't understand is why they didn't foresee this, and just updated their cheese grater Macs. After all, it's way easier to simply update a motherboard to fit a faster CPU than it is to rearchitect the Garbage Can design every time there's a new CPU and GPU architecture out there.

The only thing I can think of is that they've either just downprioritized a lot to focus on $new_secret_product, or that they've developed a new Mac Pro that for some reason didn't pan out, so they needed to start over again.


They've invested heavily into new manufacturing processes since they designed the cheesegrater. Those processes can produce things like the garbage can easily enough, but they can't be made to produce the old cheesegrater parts as-is. They'd need to effectively redesign the cheesegrater at this point anyway, if they wanted to build more of (something that looks and works like) them using the suppliers+factories they have today.


The cheese grater design is one of my favourite designs for a tower ever and I've never owned a Mac.

My business partner has an old one that he uses because it has some old software he needs for design files.

When it eventually dies I'm converting it to a normal PC case and using it for my next build.

It's just a beautiful design.


>It's literally incomprehensible. Just make a really good fast desktop computer.

For the sake of argument, why? It's probably one of their worst selling products. Nobody defends the remaining iPod enthusiasts when they demand a new giant capacity iPod.

Maybe we just need to accept that Apple doesn't want to be in the desktop computer business for much longer?


Because they are a computer company.

Their inability to make a top of the line desktop is threatening my ability to justify buying their laptops or anything else from them. I run a company. I'm not going to mix and match operating systems and if I can't have high end desktops for heavy media work I'm going to switch everything eventually. I'm not abnormal.

People keep trying to come up with some 4D chess explanations for why this makes sense for the always wise and prescient Apple. It's much simpler to use Ockham's razor to conclude that they are making a shortsighted mistake.


>Because they are a computer company.

You saying this doesn't make it true. To a neutral observer, Apple looks like a technology company that makes consumer products, including sometimes computers.


> Apple Computer

Edit: From Wikipedia:

During his keynote speech at the Macworld Expo on January 9, 2007, Jobs announced that Apple Computer, Inc. would thereafter be known as "Apple Inc.", because the company had shifted its emphasis from computers to consumer electronics.[93][94] This event also saw the announcement of the iPhone and the Apple TV.


Obviously the shareholders want the business to focus on what's most profitable. The iPhone basically seems to be a license to print money for Apple.

Still, I think there are two arguments:

They are sitting on so much cash. There is obviously a cultural or organizational problem that produces too much friction to keep this product line up to date. I'm fairly certain many other corporations, if provided access to basically infinite cash, could figure out how to put new RAM in an existing product line every year.

There's huge value in having creative professionals and developers use your product. Keeping people in your ecosystem is important. This might be hard to quantify on the bottom line, but it seems like an obvious strategic mistake to give up this advantage.


>Keeping people in your ecosystem is important.

What if the ecosystem they want is iPads and iPhones?


It's not my saying it that makes it true, it's the fact that they made every single one of the 50 or so varying models of desktop and laptop computers that I have purchased over the last 16 years.


Every time someone brings up sales volumes, they don't seem to connect the shitty updates with correspondingly shitty sales. Of course the Mac Pro sells in low volumes; it's a frigging 4 year old computer now. Same with the mini. If Apple updated these on an annual basis (just the cpu/gpu etc) then maybe sales wouldn't look like shit.


And they clearly make substantial profits on the Mac line, so again why don't they just fucking do it?


Even if they didn't who cares? For example, every motor vehicle manufacturer has an unprofitable racing effort and generates interesting concept cars and special editions to fill out the line.

Apple can have a flagship line that's break even or worse to anchor the high end. It's literally inexplicable, and I think framing it in the rational actor model is a mistake, it's best explained as a fuckup.


I wish I could upvote this (and your other comments) 100 times. Amazing high quality machines will keep the design and dev community on Mac for years to come, churning out the apps that help to make the iPhone ecosystem work.

Bleeding power users to alternative systems creates vulnerability.

These arguments would be the same for Microsoft circa 2007. Why should Microsoft care if the designers, developers, and tastemakers hate their products? They've got so much lock-in that people will be forced to use their stuff forever. Wheee, we're invincible. We all know how this turned out.

I'm running up against low-memory issues on my 16GB MBP lately. I absolutely cannot responsibly update my Apple product right now because of the 16GB limitation on the MBP. It does matter!

I'm getting ready to bite the bullet and go back to Linux desktop.


My theory is that they strongly believe PCIe-over-Thunderbolt (now -over-USB-C) will obviate all need for internal expandability/upgradability, and they went all-in on a form factor (the garbage-can Mac) to take advantage of that... and then the market has stubbornly refused to cooperate by releasing Thunderbolt peripherals equivalent to internal components. So they're playing chicken with said market, probably under Intel's repeated promises to make it happen.


Which won't happen until Intel is more willing to loosen their grip on Thunderbolt. With Thunderbolt, just to get in the door you have to file an application to get into the development program (which they promise to respond to within three weeks), merely to obtain the datasheets needed to start designing. Conversely, with protocols like PCIe and USB, you can get the chips, the datasheets, and development boards with no signup process, just a few days' wait for the items to come in the mail. When you treat your customers like that, it's no surprise that the uptake on Thunderbolt has been tepid.


Not to mention you can only buy controllers from Intel, and we are talking $50+ a controller; it's totally insane.


When my machine isn't portable, I'd much rather have my upgrades mounted inside a generously-sized case than a tiny one surrounded by a rat's nest of cables.


You touch on what I miss about the old days from Apple, the "speed bumps". Apple used to update the Macs a bit more often than they do these days. A lot of the time the updates weren't much more than a slightly faster CPU and maybe a new graphics card. I don't expect completely new designs every year, but it boggles my mind why Apple won't put in better CPUs that are pin-compatible with their motherboards.


I think it boggles our minds because of cognitive dissonance. The answer is a George Carlin "they don't care about you" rant, but we don't want to think of Apple as being that crotchety; we'd rather believe they are merely distracted.


Just to provide a counterpoint, I recently did a new desktop build and installed Windows 10. It's not bad, and very different from the Windows 10 beta that I gave up on in frustration two years ago. With the Ubuntu subsystem I can do useful work right away. After turning off some of the annoyances (via the Services and Group Policy control panels), it really does a decent job of just working. You can download and install with a USB stick (no more stupid DVDs). It still demands a license key, but runs indefinitely without registration with a little watermark.

If you haven't built a desktop in the past few years, the performance boost from PCIe NVMe SSDs is great, and Intel i5-7600K (now retails for $200) can run at 4.5 GHz reliably and stay cool. I'm impressed.


I consider an operating system that randomly includes ads in its OS tools far from 'not bad'.


I came to a similar position recently. I bought a new laptop intending to run Ubuntu on it for personal development work. But the power management story on the newer Intel CPUs just doesn't compare with Windows 10. WSL is pretty damned good and I don't feel so bad using it. I'd objectively prefer to use Linux, but right now I just can't.


If you're relying on the Ubuntu subsystem to do "real work", why not just install Ubuntu?


Probably because of the myriad of things that have shaky hardware support on ubuntu.


Is there really a difference you can feel (not measure) between NVMe and SATA SSDs? I am just wondering.

I myself don't dare to go Windows because of malware. Linux is where it is.


For me, I didn't notice much of a difference in performance. Ostensibly, the benchmarks say that it is ~5x faster, but on a day-to-day basis, I barely notice this - I rarely transfer files larger than ~10GB.

However, what I did appreciate was the (in-the-case) logistics difference of installing an M.2 drive; one screw, no cables, almost as easy as a RAM upgrade. Not having to fuss with SATA was a pleasure, and reminded me of the switch from IDE to SATA (you just... plug it in? Where are the jumpers?).


SATA drives exist in M.2 format too.


This is both informative and interesting from another angle. If Dan Counsell, the well-known owner of Realmac Software, has to build a Hackintosh, then WTF is Apple doing?


Hasn't Tim Cook essentially confirmed that they will be refreshing the line soon? I guess, not to Mr. Counsell's satisfaction?

https://www.macrumors.com/2017/02/28/apple-ceo-tim-cook-pro-...

I get all of the arguments on how long it's been, though.


Catering to the majority (of its niche, which is non-Windows, easy to use, all-in-one, curated experience, upmarket PCs) and not to even further niches?


Macintosh is 6-7% of worldwide PC units. At an average sales price of $1200 a unit vs. average Windows PC at $500 a unit, Apple has at least 15% of world wide PC revenues. At 15-20% net margins vs. Windows PC makers average 2-3% margins, Apple's Mac profits are as much as, if not more, than all the Windows PC makers combined.

Over $20B in annual revenues with at least $3B in net profits, that's a hellofa niche. So the question is, why is Apple not updating/refreshing Mac lineups more often, and why don't they have a decent pro model?
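
Back-of-the-envelope, using the estimates above (these are my rough inputs, not official figures):

    # ~6.5% unit share at a $1200 ASP vs. ~$500 ASP for the rest of the market
    mac_share, mac_asp, mac_margin = 0.065, 1200.0, 0.175
    pc_share,  pc_asp,  pc_margin  = 0.935,  500.0, 0.025

    mac_rev, pc_rev = mac_share * mac_asp, pc_share * pc_asp
    print("Mac revenue share: %.1f%%" % (100 * mac_rev / (mac_rev + pc_rev)))  # ~14.3%

    mac_profit, pc_profit = mac_rev * mac_margin, pc_rev * pc_margin
    print("Mac profit share: %.1f%%" % (100 * mac_profit / (mac_profit + pc_profit)))  # ~53.9%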


Because they get to those numbers without a "decent pro model", so it's not like they feel they need one.

As for getting to even better numbers, well, they had decent pro models (and even servers) back in the day, and they (presumably) know that they don't sell that much.


Their niche will soon figure out there are some people (like OP) running the same software on much better hardware for much lower price. How much longer do you think they will keep buying original macs?


I buy an original Mac because I want a machine that does what it does without all the overhead of assembling it myself. If I spend just 2 hours of my time assembling something equal for 400 bucks less, I've gained nothing. And chances are it'll take a lot longer than those 2 hours. With a Mac, you click a button and receive a machine that is "pretty fuckin good", for 1700-2500 dollars. Do I know I could do better? Sure. Do I give a shit, though?

I could almost turn "could Macs be good value?" into a mandatory interview question, because those who say no don't understand how trading money for time works. Working with such fools is incredibly tedious and painful.


"i could almost turn "could macs be good value?" into a mandatory interview question, because those who say no don't understand how trading money for time works. working with such fools is incredibly tedious and painful."

I would advise against that, unless you are in the habit of interviewing people who value their time at $200+ per hour, such as in your above example. Otherwise, their answer may be coming from a slightly different mindset.


That's the whole point.


Decades?

Most people who buy Macs are not very price-sensitive, and they decidedly buy them so they don't have to personally build that "much better hardware".

So, this option does exist, but it's the exact opposite about what they want.

Assuming they even need the increased power in the first place, which most (except creatives pros in video and 3D) don't.


What happened to "Here's to the crazy ones" :(


It's an advertising slogan. Whatever happened to Coke wanting to teach the world to sing in perfect harmony, or to sexy women hitting on you when you put on Lynx?

Besides, being "the crazy ones" and being a cpu/gpu power hungry is orthogonal. Apple never sold the most powerful computers (except briefly and accidentally, when Motorola did something right).

Just friendly, easy-to-work-with computers -- which the "crazy ones" could get going with without needing an IT department. A "crazy one" can be e.g. a human rights activist or a novel graphic artist who just needs a simple computer to work with.


My concern with these Hackintosh systems is safety. Tools like Unibeast, Clover, and whatever else manipulate the OSX install image. It all seems to work, but then you type your credentials for the bank or work into a browser, and who knows if the OS is compromised.

Is it safe? That is the question.


Clover is fully open source. And it's not too complicated to follow along, so it's actually a fairly auditable piece of software.

I've been using hackintosh for a while, and it's almost a vanilla OSX with the exception of Clover and FakeSMC. I doubt there's any malware in there...


> I doubt there's any malware in there...

Famous last words. Unless the builds are verifiable, they're not trustworthy. Open source doesn't do you any good if you can't tell whether the binary you download and run does something weird.


I doubt you verify every bit of the binaries you download.

Open source gives you the possibility of knowing your binary is good, by compiling it yourself. I chose not to do so, because I couldn't really be bothered. But others who need this kind of security could, and I would hope they do.
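
Short of compiling it yourself, the bare minimum is checking the download against a checksum the developer publishes through a separate channel. A sketch (the file name and expected digest here are placeholders):

    import hashlib

    PATH = "Clover-installer.pkg"  # placeholder file name
    EXPECTED = "ab12..."           # placeholder: SHA-256 published by the developer

    h = hashlib.sha256()
    with open(PATH, "rb") as fp:
        for chunk in iter(lambda: fp.read(1 << 20), b""):
            h.update(chunk)

    print("OK" if h.hexdigest() == EXPECTED else "MISMATCH - do not install")

A hash hosted on the same page as the download only proves the file wasn't corrupted, though; it has to come from a source you already trust.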


Can this be answered by doing a security review of the Hackintosh installation?

Would it be enough to put the Hackintosh behind a network inspector, say Wireshark, to check whether anything nefarious or unexpected is going on?
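
Something like this would be the quick-and-dirty version of that check, sketched with scapy (needs root; Wireshark obviously gives you far more detail):

    from scapy.all import sniff, DNSQR

    def log_query(pkt):
        # Print every DNS lookup the machine makes
        if pkt.haslayer(DNSQR):
            print(pkt[DNSQR].qname.decode(errors="replace"))

    sniff(filter="udp port 53", prn=log_query, store=False)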


What if the traffic was masked - say certain "routine" DNS queries to kick off a request to get further commands to run? I wouldn't be worried about facile root kits, but higher quality attacks that are way harder to detect.


I started to build a Hackintosh using this kind of setup but I realized that I no longer love the OS (I already own a Macbook) and that there's no reason for such a hassle.


Yeah, I'm hitting this dilemma on the laptop front. It is hardware refresh time at work, and there is a concerning dent near the fan on my MBP, along with a light spot on the panel.

I despise MacOS's keyboard shortcuts. I just want a stable Linux laptop that is fast enough for emacs + web browsing and has an impeccable screen/keyboard/trackpad/battery.

My HW budget is high enough to buy a high end apple laptop and tiny (27"?) apple-ish display from LG.

If I move to an overkill-nice PC laptop, and also buy a 43" 4K (non-color-matched) monitor, I think I'll be able to buy a Ryzen cluster with the leftover cash.

I'm not sure if that will be faster or slower than my Xeon VM in the data center, or if I want to spend the time setting up distcc.


> I just want a stable Linux laptop that is fast enough for emacs + web browsing and has an impeccable screen/keyboard/trackpad/battery.

Isn't that a Dell XPS? https://arstechnica.com/gadgets/2016/06/the-xps-13-de-dell-c...


The dilemma is the following: do I flush a few thousand bucks of company money down the drain to get a middling, marginally productive setup, or do I deal with the PITA / risk of configuring my described linux setup?


Problem is, Apple doesn't have anything for power users or developers right now. I'm currently hackintoshing, but with every "dumb it down" decision (like remove the battery life time estimation), hardware mishap (LG monitors not working near wifi, come on) or new hardware announcement that still doesn't serve my demographic i wonder if i shouldn't just switch back to Linux.

I used to say that their laptops are fine, and it's only the desktop where i need to hackintosh, but ... touch typing hostile emoji keyboard?


I'm on Linux now, and people who like design and attention to detail can't really be happy using this... I don't expect a massive migration to Linux from OSX. People will go back to Windows and OSX in a few months.


For the record, I had been using Linux as my main OS for 10 years when I finally decided to switch to OS X about 4 years ago. I have no problem switching back if Apple stops offering me enough flexibility. After all, you don't need to recompile your kernel to install drivers any more :)


You don't need to deal with drivers period on MacOS. I use Linux every day as a server, and have tried a few times to use it as a desktop platform, but never last longer than a few hours. (I'm a graphics person)


My HP printer begs to differ. Wouldn't show up on my mac until I installed their garbage software. On my ubuntu desktop, it works right away without any fiddling. Graphics, on the other hand, are definitely pretty finicky, especially HiDPI


I agree, but honestly Windows 10 now has Ubuntu bash included. Windows is a much better choice for Mac users who want to migrate.

Ubuntu is missing Office 365, all Adobe Products and much more. Yes you can run it with Wine, but that's not that easy.


Can confirm. Ubuntu, supposedly the most user-friendly Linux desktop distro, can't even come close to getting it right.


Assuming you mean the GUI side, as a full time Linux user: Unity is infuriating. I run MATE and recommend trying Cinnamon (Mint), Gnome3 (Fedora), KDE (Kubuntu), or https://elementary.io/ for strong hints of apple flavor.


I'm honestly curious, what is it that you find infuriating with Unity? I hear this often, but never specific reasons. I'm perfectly OK with Unity, and I'm a tiling WM kinda guy (so it's far from ideal, but still perfectly usable). The few times I've tried to use Gnome 3, on the other hand, I just couldn't figure it out.


I'm not the one you asked the question to, but in my case it's the top bar. I can make global menus go away, but not move the top bar to the bottom. The sum of those issues is the reason I never considered buying a Mac since the very first one.

I also don't like the dock, but at least that can be made to auto-hide. I do like lenses, and I wish I could have them in the Gnome 2 look-alike DE I'm using on Gnome 3 right now (Ubuntu 16.04 with gnome-flashback and some customizations, including merging the top bar into the bottom one.)


I tried to use and love lenses, but it never really worked out for me.


* The Launcher, just like the MacOS Dock, is a waste of pixels. I don't need to see a 128x128px Firefox icon all day. It can be hidden and/or reduced to 32x32px, but bad defaults.

* Hidden menus. Can be changed, but bad defaults.

* Top Bar.

* Left aligned window controls. It's not an option to change anymore.

Con: All the UX I don't like about MacOS. Pro: At least Edit->Cut works in the file manager. YMMV 100%.

Gnome3: Has a "launcher", but it's hidden by default. I use the meta/win key and type what I want.


Agreed on Launcher somewhat. I hide it, but put most frequent applications there and launch them via shortcuts (Super+[1-9]). Instead of menus I use the HUD, and in general I try to use most Unity features via shortcuts and find the defaults in that department sensible.


Yeah, hints. I've been waiting years for a well-polished frontend, and I'm still waiting. Though as you mentioned, the likes of elementary.io are getting closer; I feel it'll still be a few years away.


AFAIK the only major remaining issue is font rendering without Infinality. What else is elementary.io missing?


I always felt this way about Windows -- the Linux subsystem is weak, and I won't be happy with it until Windows is developed on a UNIX kernel, i.e., never. So then I switch to some distro with Gnome3 because it's cool, and that's where the problems start.

HiDPI support has been garbage for at least 2 years. I'll get scaling and resolutions right for my XPS13 QHD, and then I'll plug in a 1080p monitor and it goes back to looking like crap. The sound goes out unless I cold boot twice. Then I need to use Wine to run Photoshop in a barely legible font size because HiDPI scaling won't work. God help you if you want to manually configure your display manager, as you'll end up with an unusable display and hours of SO on your hands.

That's not to mention the ton of visual and "feel" issues behind using even Pantheon/eOS, which feels like a fork of OS 10.5.

I recently built my own hackintosh and am quite happy with it. I do all my work on the Mac with integrated graphics (don't need a GPU, which is great because the Nvidia web drivers are indeed terrible) and switch to Windows for games. Couldn't be more pleased.


> Apple doesn't have anything for power users or developers right now.

I'm a power user and a developer and I find the 2016 MBPs to be the best laptops around. My only misgiving was the hefty price. The only "dongle" I needed was a single USB/USB-C adapter but I really don't have to use it most of the time; everything I have — phone, tablet, portable hard disk — works wirelessly.

The battery time (I think you mistyped "life time") estimation was dumb, never accurate or consistent, and people don't mind the lack of it on billions of other mobile devices anyway.

The battery issues with the 2016 MBPs were found to be on the software side and have since been resolved, and they have been reported to go for 18 (!) hours on a single charge. It certainly lasts me a whole day and then some of non-continuous use.

Re: the LG shielding mishap, you're blaming Apple for a completely different company's product, okay.

> still doesn't serve my demographic

Which demographic and what do you require?

> touch typing hostile emoji keyboard?

What?


First, you're defending the laptop, not the desktops. The laptops are still almost okay. Now for some point answers...

> The battery time (I think you mistyped "life time") estimation was dumb, never accurate or consistent, and people don't mind the lack of it on billions of other mobile devices anyway.

It was accurate as long as you kept doing the same task. For example, I knew how much longer a gaming session or a long build could last on battery. But this is laptop stuff and the original article and my whining is about desktops.

> Re: the LG shielding mishap, you're blaming Apple for a completely different company's product, okay.

A completely different company's product that is the only solution that makes use of the emoji mbpro's features, is sold and promoted a lot by Apple, and is thus endorsed by them on their web site.

> Which demographic and what do you require?

Independent dev working from home. Running two VMs simultaneously at times, and also using the same computer for entertainment, namely gaming. Working on compiled software, not scp-ing stuff to a server for web development, so I actually need CPU power.

I could almost use the trash can, except they want me to pay for two slow video cards that are for some reason labeled "pro". Put an Nvidia 1050 or better in it and I'll buy it, although I'll be a bit space-challenged; some source trees take a lot of space. I'll even pay extra for the Xeon and ECC RAM I don't really need, but give me a video card I can game on.

Before you ask, no, the iMac is not a dev's computer. It thermally throttles if you actually use the CPU, and makes laptop-like noise when compiling. No thank you. I've heard even the trash can throttles, and I blame the "pro" video cards.

> touch typing hostile emoji keyboard?

Exactly what you read. Please explain how you touch type on the touch screen that replaced the F keys, and how it is useful to me when I, like anyone who works a lot with the keyboard, have memorized every shortcut I use. It's mostly good for emoji in instant messaging, and I don't use that either.


Don't think of the touchbar as a keyboard-shortcut bar at all; the F-keys have basically been "killed", not changed. They're gone like FireWire. The new thing that is there displaced them, rather than being an upgrade to them; it just emulates them as a sop for the few people who might need them, while app developers take the hint and remove F-key bindings from their apps.

The touchbar is really there to be a little touchable display, like an iPad, or the bottom screen of a Nintendo DS. It's for virtual knobs, rather than virtual keys: it makes perfect sense once you use it to scrub or drag-select in iMovie or Final Cut Pro. It's basically a shrunk-down embedded platform for those "live controller" iPad apps (DJay Pro; that thing Bret Victor built in the dead fish video; etc.) to run on.


> First, you're defending the laptop, not the desktops. The laptops are still almost okay.

"Problem is, Apple doesn't have anything for power users or developers right now."

You complained about everything from Apple across the board, including blaming them for some other company. [1]

> Independent dev working from home.

Same here.

> Running two VMs simultaneously at times, and also using the same computer for entertainment, namely gaming.

Ditto, but just one VM; Windows, for some games. Everything I want to play, even very recent games like Paragon [2], runs just fine on the 15" 2016 MBP, in Boot Camp if not inside macOS.

> so I actually need CPU power.

Haven't experienced any throttling yet, myself.

> Exactly what you read. Please explain how you touch type on the touch screen that replaced the F keys, and how it is useful to me when I, like anyone who works a lot with the keyboard, have memorized every shortcut I use.

See [3]. The Touch Bar is always in your vision and I don't have to take my fingers off the keyboard to use it, as opposed to moving my hand to the mouse/trackpad to perform functions that I can now do from the Touch Bar.

"Looking" at it is no different than moving your eyeballs a nanometer to look at a menu bar or any other element on the screen.

Even if looking at something was such an arduous action, having to continuously refer to a secondary resource (be it your memory, in-app help, keybind settings, or online) to know what each F-number does with each modifier in each context of each app, is considerably more backwards.

[1] https://news.ycombinator.com/item?id=13902459

[2] https://www.epicgames.com/paragon

[3] https://news.ycombinator.com/item?id=13902923


> Re: the LG shielding mishap, you're blaming Apple for a completely different company's product, okay.

It's a monitor Apple effectively blessed as their monitor replacement. They sold it. So... yeah. Yeah, I do blame them for their lack of due diligence.

Why is this their problem?

Because, how am I supposed to rely on Apple to meet my needs if they are perfectly happy dropping products a business depends on?

> Which demographic and what do you require?

Not the OP, but none of the half-dozen developers/tech-oriented people I know who have switched to the latest laptops prefer them over the older ones. I've heard nothing good about them besides their being a bit lighter, and even that came from one person as a "the only thing better is that it's a bit lighter" preamble before listing complaints.

The complaints range from the touch bar to performance being far worse than older models. Running side by side, it's noticeable just by watching things run.

> What?

Are you not aware of the touch bar?


> Yeah, I do blame them for their lack of due diligence.

While simultaneously absolving yourself of having to do any research.

> Because, how am I supposed to rely on Apple to meet my needs

I often see people saying Apple owes it to its customers to make x, y, or z product, as if Apple's role in the ecosystem needs to be expanded further. Apple is a company, not a nanny; they don't owe you anything, and I feel we should keep their scope limited rather than expanded.


> It's a monitor Apple effectively blessed as their monitor replacement. They sold it. So... yeah. Yeah, I do blame them for their lack of due diligence.

Apple don't make the LG UltraFine Display. Apple offer many other third-party products through their stores, but it's the manufacturer's responsibility to support those.

> how am I supposed to rely on Apple to meet my needs if they are perfectly happy dropping products a business depends on?

Apple still make amazing 5120×2880, 10-bits-per-channel screens: The Retina iMacs. If your business depends on such screens, you'd get the iMacs, not the MacBooks, or get a MacBook with any of the many third-party 4K displays.

> none of the half-dozen developers/tech-oriented people I know who have switched to the latest laptops prefer them over the older ones.

Why?

> before listing complaints. The complaints range from the touch bar

Like? What complaints specifically?

> to performance being far worse than older models.

They have some of the fastest SSDs on the market [1], their batteries can go up to 18 hours [2], they have the fastest GPUs ever in a MacBook, and how many laptops do you know that have high-DPI Wide Color screens?

----

> Are you not aware of the touch bar?

I use it daily, and it's better than the archaic row of limited and cryptic FN keys that it replaces, especially once you've tweaked it a little. [3]

FN Keys:

• 12

• Have to remember what each number does in each app

• Different numbers remain "fixed" for different functions in different operating systems (e.g. F1 for Help, Alt+F4 for Quit.)

• Not customizable

• Cannot give at-a-glance status without taking up main screen space

----

Touch Bar:

• More than 12

• Not limited by physical space

• Context-sensitive and adaptive

• More types of controls than just buttons (e.g. sliders, color pickers)

• Customizable

• Can display at-a-glance status such as time, battery etc.

----

As for other companies that offer high-end laptops, take a look at people's experience with Razer for example. [4]

All of the over-a-dozen developers/tech-oriented people that -I- know prefer the 2016 MBPs to everything else. They've also outsold all other competitors in just a single month. [5]

[1] https://9to5mac.com/2016/11/01/2016-macbook-pro-ssd/

[2] https://www.engadget.com/2017/01/13/macbook-pro-battery-issu...

[3] https://alexw.me/2017/01/what-if-you-could-customize-your-ne...

[4] https://news.ycombinator.com/item?id=13785247

[5] http://fortune.com/2016/11/09/apple-macbook-pro-sales/


> Apple don't make the LG UltraFine Display.

But they did make monitors. And routers. And other things that people bought into. Because Apple "just works." But now it doesn't.

> Apple still make amazing 5120×2880, 10-bits-per-channel screens: The Retina iMacs. If your business depends on such screens, you'd get the iMacs

iMacs are not replacements for MBP + Apple's monitors.

> Why?

Because the previous models have features and capabilities that the new ones don't, and that's what they prefer.

> Like? What complaints specifically?

For example, Touch Bar buttons activating when the user didn't intend to touch them. In the latest case, a friend joked about how many times he's accidentally changed the background color of his terminal since moving to the latest MBP. That's just one example.

> They have some of the...

*shrug* I know what I see. Sorry, but running scripts side by side on an older MBP and the latest MBP, the older one won hands down.

> I use it daily

And you weren't clued into the GP's remark about the touch screen? You honestly couldn't figure it out?

> They've also outsold all other competitors in just a single month.

And just because others like something, that means it's good for me?


> For example, Touch Bar buttons activating when the user didn't intend to touch them. In the latest case, a friend joked about how many times he's accidentally changed the background color of his terminal since moving to the latest MBP. That's just one example.

I would just like to say that that's BS.

You have to be very deliberate to change the Terminal window color: It takes a tap on a specific button to bring up the color picker, then you have to lift your finger to the slider and move it, or keep your finger held on the Touch Bar for a second then slide it around without lifting it to choose from a palette.

It's not something that can be done accidentally, as it takes multiple specific actions in a row.

If you still keep fumbling it "so many times" then you can always customize the Touch Bar and just remove the offending buttons.


Another one that defends the laptops when the original article is about a Mac Pro replacement :)

> Can display at-a-glance status such as time, battery etc.

You look at the keyboard when you type? Not my demographic then; I type too much for that. It's probably fine for you.

> All of the over-a-dozen developers/tech-oriented people that -I- know prefer the 2016 MBPs to everything else. They've also outsold all other competitors in just a single month.

You don't know any developers working on compiled software. And the MBPs outsell everything not because they don't annoy people, but because there's almost no alternative. Perhaps the Dell Developer Edition laptops, but Apple would have to piss me off more than they have so far for me to try a Dell instead.

And again, the article - and my OP - are from people who need a powerful desktop for work.


> You don't know any developers working on compiled software.

I do. I expect your next response will impose qualifiers on "developers" and "compiled software." :)

> my OP - are from people who need a powerful desktop for work.

Your OP said:

> Problem is, Apple doesn't have anything for power users or developers right now.

> I used to say that their laptops are fine

My experience is that their laptops are still more than fine enough for power users and developers.


> My experience is that their laptops are still more than fine enough for power users and developers.

And my experience is that their laptops are not more than fine enough for power users and developers.

Now, why can't we both be right? Your needs are clearly met by the laptop, but others' aren't. Are you somehow impacted if people who used to like the products put out are now disappointed by them? That people who were looking to buy new machines spent their money elsewhere and had better results?

Do you feel that if these people are right, that somehow this negates your choice and needs? Why are you working so hard to dictate that their needs are unimportant and that your needs are superior?


Because it is baffling.

None of the complainers seem to be able to say exactly WHY the new MacBooks are bad.

• They complain about performance when they're the fastest MacBooks ever, surpassing many competitors and with a very good battery charge-to-performance ratio, not to mention the best display ever on a MacBook.

• They complain about needing "5 dongles" when you need at most 1 USB/USB-C adapter + sometimes a multiport hub or dock, for almost any use case.

• That one guy keeps calling it an emoji keyboard, when there aren't any emoji on the keyboard.

• They complain of not being able to touch-type when the physical keyboard is still there.

• They complain about noise when these are the quietest MacBook Pros ever, without being willing to show what they're comparing them with.

• They say Apple is losing favor with customers when the new MacBooks have outsold everything and Apple has continued to top rankings and stock prices.

So yeah. I have the thing in my hands, and I use it daily, and there really hasn't been a better MacBook before. I DO concede that they may be priced a bit too steeply.

Meanwhile I only see complaints from people who have clearly never even used the things! It's pretty obvious with their "emoji keyboard" and "dongles everywhere" hyperbole, like going back to the old "Micro$oft" and "Windoze" days.

So of course I have to wonder and ask: By WHAT metric are they bad? I can only chalk it down to a concentrated anti-PR effort, or some desperate brand envy (e.g. adamantly putting the blame for LG's monitors — that have now been fixed, by the way — on Apple.)


> I do. I expect your next response will impose qualifiers on "developers" and "compiled software."

> I used to say that their laptops are fine

> My experience is that their laptops are still more than fine enough for power users and developers.

Well, [part 1] I don't use Xcode, I touch type, [part 2] and a lot of the stuff I work on involves looong builds and/or going through a few VMs. I also HATE fan noise and I work from home in a quiet room. Part 1 disqualifies the emoji keyboard; part 2 disqualifies anything with laptop-like internals, namely the laptops and the iMacs. I hope that's enough qualifiers :)


Which apps do you use that make such heavy use of the function keys that any such difference is significant? I'm not sure anything I use on a daily basis uses the function keys. I use them so rarely that I usually forget I need to press the fn key first to actually use them, and then I always have to hunt for the fn key; its placement by the home/end keys on my keyboard isn't exactly conducive to frequent use.


Debugger keyboard shortcuts are the first thing that comes to mind. Esc is also used regularly.

I always bind my keyboard to use the fn# keys without pressing "fn", usually in the BIOS, so I only press "fn" for the other special features (brightness, media, etc., which laptop makers map arbitrarily to the fn# keys).
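
(On macOS, the equivalent is the "Use F1, F2, etc. keys as standard function keys" checkbox; the preference key commonly cited behind it, which I'd treat as an assumption, is:)

    # Commonly cited global default behind the Keyboard preference checkbox;
    # may require logging out and back in to take effect.
    defaults write -g com.apple.keyboard.fnState -bool true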


> I touch type ... Part 1 disqualifies the emoji keyboard

Why do you keep saying this? The Touch Bar replaces the FN keys, not the keyboard.

There are no emojis on the Touch Bar unless you specifically bring them up. You can modify the Touch Bar to only present you with the controls you need. [1]

> I also HATE fan noise and I work from home in a quiet room. ... disqualifies anything with laptop-like internals

The 2016 MBPs are some of the quietest laptops around [2]. Which computer do you use? Is it a desktop without any fans?

[1] https://www.boastr.net/bettertouchtool-touch-bar-customizati...

[2] https://www.reddit.com/r/macbookpro/comments/5ffhe4/2016_mbp...


> Why do you keep saying this? The Touch Bar replaces the FN keys, not the keyboard.

I use those keys. Without looking down.

> Which computer do you use? Is it a desktop without any fans?

Let's say I build my systems according to the advice on silentpcreview.com. Apple laptops at full load may be quiet for a laptop, but they have nothing on a custom-built desktop.


> I use those keys. Without looking down.

Unless you slide your finger across the keyboard to "count" which FN key is under your finger, you can still use FN keys on the Touch Bar, without looking down; they are in the same positions.

The Touch Bar is always within your peripheral vision when you're looking at the screen anyway [1] [2].

> Let's say I build my systems according to the advice on silentpcreview.com. Apple laptops at full load may be quiet for a laptop, but they have nothing on a custom-built desktop.

Well if you're willing to record/measure the noise of your current machine under your usual workload, I could try running the same tasks on my 15" 2016 MBP to compare.

Mind you, though, a desktop chassis is usually far away from you, unlike a laptop right under your hands, unless you use it with an external keyboard. So objective noise vs. perceived noise will differ and will have to be compensated for.

[1] http://i.imgur.com/En02SKA.jpg

[2] http://i.imgur.com/DIN2fCE.jpg


> Unless you slide your finger across the keyboard to "count" which FN key is under your finger, you can still use FN keys on the Touch Bar, without looking down; they are in the same positions.

If that was true, we could just use an iPad instead of a physical keyboard. But there's a reason people don't like typing on chiclet keyboards or glass, and some developers even go to great lengths of geekery to build custom mechanical keyboards with different types of switches.

Typing on glass just doesn't give you the same feedback you get from physical keys.

I'm not saying the Touch Bar is not an improvement for many people. Many people (even some developers or professionals) rarely use the F-keys. But there is a certain demographic which deeply cares about the keyboard, and there's no denying parts of this demographic overlapped with that of MBP users.


What do developers use the f keys for? I never use them while editing code. I guess maybe shortcuts in some IDEs? But you don't press those anywhere near as often as regular keys, and they're still there on the touch bar when you need them.


> they are in the same positions.

This is simply not true.


I built a crazy fast Hackintosh using the Intel 8-core 5960X CPU: 17,000 on Cinebench. However, I sold it a week ago.

Being an iOS developer on it really sucked, as I needed to upgrade OS X for Xcode, but the CPU wasn't supported on Sierra for 6 months.

Also I spent at least 2 weeks of work on it during the year I had it, so it was not worth it. But there are slower CPUs that are better supported. It was fun the days it worked, though :)


> Also I spent at least 2 weeks of work on it during the year I had it

To me, this is the most interesting, and the article doesn't really specify it. Excluding the building itself, why did you have to spend time on it? What kind of problems occurred?


I'm wondering whether it would be easier to run macOS in a VM. No more fear of updates, thanks to snapshots, and I imagine easier installation and fewer compatibility issues.

Does anyone do this? How's the performance and keyboard "tunneling" (CMD vs CTRL)?


macOS performance inside a VM is terrible, no matter your hardware.


How about VFIO using Qemu/KVM? That is the setup I have.


Nonsense; it depends on the VM.


I'm curious, why do you say so?


The new APFS filesystem supports snapshots, so I guess you could use those without a VM. It's already in production on iOS and will probably be released with the next macOS version.


VFIO with proper configuration and compatible passed-through hardware (sound card, USB hub, video card) is indistinguishable from native.
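
To give a flavor, the invocation boils down to handing whole PCI devices to the guest. A minimal sketch; the PCI addresses are placeholders, and a macOS guest additionally needs an OVMF/Clover boot chain that I'm omitting here:

    # Assumes the IOMMU is enabled (e.g. intel_iommu=on on the kernel
    # command line) and the devices below are already bound to vfio-pci.
    qemu-system-x86_64 \
        -enable-kvm -machine q35 -cpu host -smp 4 -m 8G \
        -device vfio-pci,host=01:00.0 \
        -device vfio-pci,host=01:00.1 \
        -drive file=macos.img,format=raw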

> keyboard "tunneling" (CMD vs CTRL)?

Better to use original Mac keyboards if you have them; it's not really a virtualization issue.


I've found it pretty tough to get macOS up and running in a VM. Anyone know of an easy way?


VMware ESXi officially supports Mac Pro hardware as a host. Guests will just work; you don't need to use any of the Hackintosh bootloaders.

If you aren't running ESXi (or VMware Workstation) on Apple hardware, you can use this to enable the functionality: https://github.com/DrDonk/unlocker


I've come across this[0] doc before, but I haven't tried it yet so I can't speak to ease of setup and performance.

[0]: https://github.com/geerlingguy/macos-virtualbox-vm


If you don't mind downloading a potentially sketchy random VM image from the web, this has worked for me:

https://techsviewer.com/install-macos-sierra-virtualbox-wind...
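
For reference, the VirtualBox side of guides like this mostly amounts to making the VM present itself as a Mac. Roughly the following, with the VM name "macOS" and the exact values being placeholders that vary by guide:

    VBoxManage modifyvm "macOS" --cpuidset 00000001 000106e5 00100800 0098e3fd bfebfbff
    VBoxManage setextradata "macOS" "VBoxInternal/Devices/efi/0/Config/DmiSystemProduct" "iMac11,3"
    VBoxManage setextradata "macOS" "VBoxInternal/Devices/efi/0/Config/DmiSystemVersion" "1.0"
    VBoxManage setextradata "macOS" "VBoxInternal/Devices/efi/0/Config/DmiBoardProduct" "Iloveapple"
    VBoxManage setextradata "macOS" "VBoxInternal/Devices/smc/0/Config/DeviceKey" "ourhardworkbythesewordsguardedpleasedontsteal(c)AppleComputerInc"
    VBoxManage setextradata "macOS" "VBoxInternal/Devices/smc/0/Config/GetKeyFromRealSMC" 1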


A few days ago I used this guide to install Yosemite on an AMD computer with Windows 10 as the host, using VirtualBox. It works, but there's no accelerated graphics, so it feels like working remotely.

https://techsviewer.com/install-mac-os-x-10-10-yosemite-amd-...

The trick is to use VirtualBox 4.3 instead of 5.


While this PC looks great visually and spec-wise, isn't this missing the point?

It seems like a company needs to build a _very_ good Linux distro with design-first principles. It needs to work on a number of devices. More importantly, it needs to be a paid OS. It can have an open-source distro underneath, but the UI needs to be created by people who are paid well.


This is pretty much what the elementary folks are trying to do https://elementary.io/


I've been using elementaryOS for a while now, and it doesn't really compare to macOS yet unfortunately.

It seems to me there is a disconnect between developers and designers. When I ask devs why there isn't a nice looking Linux distro, they ask me why I care about what the UI looks like. Designers don't use or care about Linux because of the lack of decent design software. As much as I love the concept of GIMP, it is not a realistic alternative to Adobe software (especially things like InDesign and XD) or Sketch, for example.

As a designer-developer, Linux is still not an easy choice for me, as much as I'd like it to be. elementaryOS is a nice start, but I still end up using macOS if given the choice.


Nice. Thank you for this; I've never heard of it before. I think it shouldn't be optional to pay. I want a high-end distro that's $50-60 and far more polished: better than or equal to macOS.


I bought my Mac the first month they appeared in 1984. I won't bother you with the ensuing history, but I will raise a question related to it:

I remember when Apple could not offer a retail OS, not without cannibalizing hardware sales. If Pro desktops fade, is that still true for that segment? Could they offer something Xeon-only (to keep out commodity laptops) as a legal Hackintosh? Or do certified configurations à la Oculus? If their profit is in mobile and cloud services, it might help more than hurt.

FWIW I like Debian now, and the non-intrusive UI.


I have an Ubuntu desktop (i7 6700K, Nvidia GTX 1080, like everybody else's) and a recent MacBook Pro.

Each time I open the latter, I have a mixed feeling of "how beautiful everything is!" and "how slow everything is!"

(I am a Scala software engineer occasionally working with GPU-based machine learning)


I have been using a similar (slightly lower-spec) Hackintosh for 2 years.

About two weeks ago, I decided to stop and look elsewhere.

I started a small series on my experience, if you are interested: https://medium.com/the-missing-bit/leaving-macos-part-1-moti...

I am still testing my current setup, but I guess I'll soon publish the last bit, with the setup I found and my conclusions on the switch.


> FUCKING FULL SCREEN MODE WHEN THE GREEN BUTTON IS CLICKED

Man this is so me when I updated to El Capitan. What is Apple thinking??? Now I have to hold alt just to increase my window size. At least give us an option to change it! How hard is that, Apple?


I used Moom to help me with that.

Another good point for Windows: it has nearly all the features of Moom, at least those I need.


I had a Hackintosh for years. I used the stock/retail CD from my former MacBook, some custom kexts, and a guide to generate the right plist edit to unlock my Nvidia card. I never had issues with updates either.
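
(For context, the Chameleon/Chimera-era edit was typically a flag in /Extra/org.chameleon.Boot.plist, roughly like the fragment below; the exact edit depended on the card, so treat this as a sketch:)

    <key>GraphicsEnabler</key>
    <string>Yes</string>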

It ran Snow Leopard and I used it for all my development, video editing, photo editing, etc. Eventually I left for Australia and decided to get a real MacBook and unfortunately it had Lion on it.

I hated Lion. Gone was Exposé. Gone were rows of virtual desktops (Mission Control had one row with multiple columns; I hated that shit). There was no way to get the old functionality back. Eventually I started using Linux again in a VM as my primary OS.

Today I'm back on Linux with i3 as my tiling window manager, and I don't think I'd ever go back to macOS. I think many of their design decisions from Lion onward have been terrible. I just keep around a Windows laptop or VM for when I need commercial products or want to play games.


>> Maybe Apple have been waiting for the recently released Ryzen CPUs from AMD?

You can't even run the stock kernel on AMD chips. How much QA and other work Apple would have to do, I have no idea.


That's when you'd see the whole toolchain (OS X, Xcode, compiler) optimized for Ryzen at WWDC. I hope.

I honestly don't think any consumer will stop buying Macs just because Intel isn't inside. And it would save Apple at least $200 in BOM per Mac, making the final selling price around $250 less. Slightly more affordable. Or, more likely, they'd keep the same pricing and give you double the SSD storage.


AMD would have to match the current power draw and integrated GPU that Apple can get from Intel before Apple would switch on the portable front.


I think AMD would make more sense on the desktop front. But TB3 is still a roadblock.


The crew over at amd-osx.com figured it out; I can't imagine it'd be much more difficult for Apple to do so.


There's more to it than just getting the kernel to boot on AMD. There are features like AirPlay that are Intel-specific.


There's more to it than getting it 'working'. Airplay support, for one.


I've been a Hackintosh user for almost 10 years (since 10.6!), though I do have a MacBook Pro with me when I'm out.

As other people stated, yes, it is very time-consuming to get it straight. Treat it as a hobby and you'll understand things about the Mac (and computers in general) that other people don't: DSDT patches, how drivers are loaded, Mac power management, etc.

If you use Clover and get all the patches right, you can get an almost update-proof setup (except when you go from, say, 10.11 to 10.12). And even in the worst case, people on the Internet usually figure things out fast enough for you to apply the new patches. Minor updates are really, really easy; I always click Update without batting an eye.
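
To illustrate, the update-proof part lives in Clover's config.plist: SMBIOS makes the box identify as a real Mac model, and KextsToPatch entries are re-applied on every boot, so updates don't wipe them out. A rough fragment, with the model name and patch contents as placeholders:

    <key>SMBIOS</key>
    <dict>
        <key>ProductName</key>
        <string>iMac17,1</string>
    </dict>
    <key>KernelAndKextPatches</key>
    <dict>
        <key>KextsToPatch</key>
        <array>
            <!-- Find/Replace byte patches (e.g. for audio) go here -->
        </array>
    </dict>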

It's a tinkerer's hobby. If you like doing research and are fine with spending time figuring stuff out on the Internet, I say go for it and try it out! The process is fun and the result is very rewarding.


So just like using Linux then? /s


I know I could Google and read about some experiences but since this made me think about it... have any of you HN'ers tried to virtualize OS X or run it virtualized (on anything other than an OS X host) on a regular basis?

I've got a (still pretty new) high-end MacBook Pro sitting at the end of my desk but -- after putting together a new, extremely over-built workstation a few months ago -- I haven't even turned it on since I don't know when. I've got KVM/qemu, VMware Workstation, and VirtualBox all installed on my workstation, though, and it might be interesting to try to get OS X running under one of them.


I got Snow Leopard running under VMware a few years back. It 'worked', in that I could do most things, but sound was glitchy, and updates would regularly break things.

My ultimate answer was just to give up on Apple on the desktop and stay in the Windows/Linux worlds. I'm just not going to invest any further time on a system that is owned by a company that clearly doesn't care about it, and is actively hostile to attempts (like virtualization) to use the system in a larger context.


There are Mac Vagrant boxes available that work on a Linux host. You may need to put

    vb.customize ["modifyvm", :id, "--usbehci", "off"]
in the Vagrantfile.
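
For context, a minimal sketch of the surrounding Vagrantfile; the box name is a placeholder for whichever macOS box you use:

    # "some-macos-box" is a hypothetical box name.
    Vagrant.configure("2") do |config|
      config.vm.box = "some-macos-box"
      config.vm.provider "virtualbox" do |vb|
        vb.memory = 4096
        # Disabling USB 2.0 (EHCI) avoids requiring the Extension Pack.
        vb.customize ["modifyvm", :id, "--usbehci", "off"]
      end
    end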


The best Mac I ever had is my Hackintosh. I built a Fusion Drive for it and use Clover instead of Chimera as the boot loader, so I can update across major versions without trouble.

I've never had any issues; I just shop for compatible chipsets. I've had tons of issues with Linux on the desktop in comparison to the Hackintosh. I never understood how people can say it's time-consuming to do!


It's time-consuming, depending on the particular hardware you have. The updates usually go smoothly, but then a sound driver or something will break and you have to go and fix it, which is pretty annoying if you want to spend your time getting something else done with your computer.


One day I want the Adobe suite to run natively on Linux; then I can leave OS X just like I left Windows. Local media editing is the last reason for Windows/OS X to exist.

Everything else (games, calculations, social networks, model rendering) is better served by a web or mobile app backed by a server running linux.


If I were Apple, I'd buy Adobe the very second I even heard a rumor of a Linux Adobe suite. I just wouldn't let it happen. Would be a few billion well spent.


What really needs to happen is for a big-name company with money (like Google) to go all out and put a decent Linux distro on the market for professionals, one that everyone can get behind, and also subsidise the development of big-name products from Adobe, drivers, and games in the short term to help the OS gain traction.


But knowing how Google works, they'd lose interest in it in three years, and then that OS would be in the same position as macOS.


There are some comments here as to motivation: why go through all this? I earn my money making video, mainly in After Effects. I'm chained to the oars. The choice of intermediate codec (ProRes) is surprisingly important; the other solutions don't work for my workflow. And it's not just the codec: lots of things about Windows 10 make it an unprofessional choice. I was planning to build a Windows box anyway, with an i7 6900K, 128 GB RAM, ASRock X99 Taichi, GTX 1080, NVMe, all that. When I overlaid that onto a Hackintosh, it seemed a bit past what is possible, at least from a cursory look at tonymacx86. It's worth a week of work to me; maybe I'll look harder. Thanks for listening!


I wonder why the OP didn't use PCIe NVMe drives.

They're night-and-day better than standard SSDs on homebrew Macs.


May I ask you what kind of work they shine at?

Right now I have 2 machines on my desk: one with a SATA SSD (Samsung 850 Pro) and another with a PCIe SSD (Samsung 950 Pro). I see a big difference in benchmarks but don't feel any difference in my day-to-day work. Boot time is shorter on the PCIe one, but I reboot these machines maybe once every couple of months.


I believe hackintoshes are unable to boot from NVMe drives at the moment (this may have changed)


This is false information.


I custom-built my PC before knowing what a Hackintosh was, so I didn't purchase my hardware specifically for it. According to the compatibility wiki, my config was compatible.

I remember wanting to do some iOS development during college but not being able to afford a Mac. I spent at least 20 hours tweaking kext settings and trying different distros. I finally got it to boot, but it crashed whenever I tried to run the emulator.

I haven't touched Hackintosh stuff for several years, but the grief and time wasted make it not worth it. It's a shame Apple limits their development tools to macOS. They could learn a thing or two from Microsoft.


I have a Toshiba Satellite Radius 11 L15W-...; it's a cheap $300 laptop running Windows 10. Changing the OS is not supported and is actually blocked by the manufacturer, but I was able to find a few tools to remove some of the countermeasures. It is now an Ubuntu laptop. I would use a Mac for development, but Macs are expensive; I have built 2 PC gaming rigs that cost more than the $1500 price tag of a Hackintosh. Bottom line: you can get any modern OS working on almost any machine. It just starts to get really hacky, and information can be hard to find.


Or you can get a Windows 10 machine from a reliable integrator (like Supermicro or ThinkMate) and it will "just work."


This seems reasonable: http://hackintoshmethod.com


I'm not a Hackintosh builder, though I've considered it many times. From the videos I've watched, the big trick, especially if you use Final Cut Pro X with OpenCL, is to avoid Nvidia and use AMD Radeon video cards. Supposedly this makes the build process a lot easier and less finicky, and even faster. Can anyone confirm?


Has anyone here tried running OS X under Parallels Desktop on Linux?

http://download.parallels.com/desktop/v4/wl/docs/en/Parallel...


> I've switched off auto-updates in Sierra. While system updates should work just fine, I prefer to hold off until the community over at tonymacx86 have confirmed there are no issues.

Does anyone know why some of the updates can brick the machine? Also, how often does this happen? What percentage of updates break things?


I ran a Hackintosh for years, but I used the stock/retail ISO along with a couple of custom kexts and the Nvidia enabler (the drivers will run other Nvidia/ATI cards; they're just not officially supported, and the stock drivers have a whitelist of the models that were sold with Macs).

I remember leaving the stock updates on and rarely ran into issues. I think I had video screw up once or twice, which just involved getting some of the latest kexts off the forums.


Definitely does not brick the machine.

Just that you restart and graphics are weird or audio is having issues, so you have to figure out how to resolve those issues.

By waiting a few days and seeing what issues you may encounter in advance, it's easier to avoid them.

(That being said, issues from updates have never actually happened to me.)


Is the motivation for using a Hackintosh over Linux mainly apps you can't get on Linux?


Pretty much, and it's a very polished OS overall. It's especially good for some niche things like audio, video, and media. Photoshop has always run like a dream on Macs, and I use Logic Pro on my Hackintosh, for which there's really no alternative on Linux.


A bit off topic, but the Nvidia driver for Mac kind of sucks. The bug that shows a transparent window whenever you try to open an EPUB file in iBooks has been there for a very long time, and there's still no fix.


Still! I was hoping for a fix for that. I hear that if you activate your onboard Intel graphics it will work, but I haven't managed to do that yet.


Those cases still look really ugly. They remind me of the look I was going for twenty years ago when I was a teenager, with all the neon. The Macs at least look elegant, like something I'd want in my house.


There are plenty of elegant PC cases available. The author went to great pains to put a ridiculous number of LEDs into the build.

Even with the LEDs, there is a ~$5k budget difference between the hackintosh and the trash can. That much will buy you a pretty wide range of aesthetic choices. For instance, you could probably have someone cnc something out of aluminum, or commission a hardwood case that matches the period of your office furniture.

Personally, I think the build would have been more memorable if it incorporated some strobes, a fog machine, and a plasma bulb. (Bonus points for replacing the plasma bulb with a Tesla coil without sacrificing stability/safety, and without exceeding the trash can price.)


I used to use this for my Hackintosh case:

http://www.silentpcreview.com/article163-page1.html

Lian Li PC-V2000.


That's your only criticism? Maybe choose a different case?


Well, and the airflow looks a lot like "uhm, I'm just gonna slap fans all over it, that's probably fastest ^W best, k?".


Curious: is there a window manager for Linux that behaves much like OSX?


ElementaryOS is probably the closest experience: https://elementary.io

I don't know if the window manager is easily usable on a different Linux distribution though.


In what way? IMHO window managers are something Linux excels at.

What does macOS have in its window manager that Linux lacks? (Other than the "no, you can't maximise easily" feature...)


I love Linux, but the WMs are not its strong point, IMHO. There are many of them, and each tries to invent some new desktop paradigm instead of polishing the existing experience. It's OK to be boring: just do your job and don't get in my way. I'm currently using Xfce and hope they don't take the KDE 4 route for a long time... (not that the experience is bugless or even great, but it's better than the alternatives).


Huh? Uncustomized KWin is about as boring as WMs get.


The latest GNOME Shell is very similar, both in look and in RAM usage (1 GB+).


After years of making fun of Linux folks for having to "know everything about the computer" to get things working, Mac folks seem to be embracing the culture these days.


I can't speak for everyone, but for me there's a big difference between wanting to know everything about the computer and needing to, particularly when I'm not at work.


These days, setting up a Windows or Apple system means fighting lots of relatively confusing dialogs and options to avoid using cloud services, etc. Modern consumer operating systems want to analyze every bit of your activity they can get their grubby mitts on, and you really have to fight them, saying "No thanks!" about 15 times on a modern phone, and fewer times (but still a lot) on Win/Apple.

I care about privacy, like most people, but I am able to control mine far better than the typical, non-technical computer user. I am reaching the point where these operating systems are becoming untenable for day-to-day computing. They lure you in with "everything just working", including the privacy-invasive "cloud" features, and increasingly they use ML to mine your behavior. I can run Linux. I don't even mind paying for privacy with a little bit of my time and convenience. I don't want to look back 25 years from now and say, "I really wish I had not given up this bit of privacy." Instead, I hope to look back and say, "It was worth the relatively small time sacrifice to avoid these things."

(Not confusing for technically savvy folks, but still, pretty hard).


Yeah, you might be confusing Windows and Macintosh there (or iOS and Android). It's trivial to set up a Mac: there is one dialog about collecting crash data that you can opt out of, and you don't have to use iCloud. Apple would like you to use iCloud but doesn't make you, doesn't want your private data, and is committed to protecting that data.

Windows is a little more frustrating, and I don't think Microsoft has the same level of privacy commitment as Apple does, but I also don't think it's that far behind.

Google is a special case, they are an advertising company and have virtually no revenue streams outside of that for Android. I'm sure they'd like to protect your privacy too if they could still serve super targeted ads to you, but they have no way of doing that.


Sierra got a lot worse; that was the end of the line for me. The App Store, System Integrity Protection being difficult to disable with no alternative APIs to replace the lost functionality, etc. But yes, Apple is the best of the lot. Android phones require a lot of battle to tame by default.

iOS is by default the most secure phone OS, etc. (Secure Enclave is nice, and you know it is available on all iOS devices newer than X). But it is a walled garden that is too tightly controlled in terms of what is allowed.

Still, if you enable iMessage it gets murkier. Siri, cloud backup, iCloud Keychain (which is good, but not for me). Lots of small pushes to remote services.


Sarcasm detected, but this is only a tiny fraction of Mac users. 99.9% will just go to an Apple Store, buy a Mac, and be sure that everything from power management to Wi-Fi works.


Be careful with OS upgrades. I recently upgraded my Hackintosh to macOS Sierra, and now it can't shut down or restart.


An unlocked CPU and a Z-series board but no overclock? With that config you should easily be able to run that CPU at 4.6 GHz.


Eventually you'll likely give in and install windows.


If you're not doing VFIO, you're doing it wrong.


Apple have made some terrible design decisions since Jobs died.

A 'thin iMac' - whose thin-ness was absolutely pointless.

The MacBook (2 lbs) and MacBook Pro (3 lbs) were designed to hit arbitrary weights instead of being designed around features: http://www.apple.com/mac/compare/results/?product1=macbook&p...

And the MacPro was possible the worst shape for upgrades.

Jobs inspired people to come up with great machines, but he also had a pragmatism which seems to have been lost at Apple.


>A 'thin iMac' - whose thin-ness was absolutely pointless.

For you maybe. Many of us (millions, judging from the sales numbers) appreciate the thin-ness, even if it's for a desktop machine.

I had to drag my old non-thin iMac several times to different locations, and it was no joy. Nor do I need a behemoth on my desktop.


Hmm, you're implying that everyone that bought a new iMac appreciates the thinness and bought it because of that?

Or perhaps they bought them because they had no other choice when choosing a reasonably fast desktop machine compatible with their software?

Seriously, the argument "oh, this feature must be popular because the whole machine is popular" really doesn't hold water in complex machines like computers, phones or cars.


>Hmm, you're implying that everyone that bought a new iMac appreciates the thinness and bought it because of that? Or perhaps they bought them because they had no other choice when choosing a reasonably fast desktop machine compatible with their software?

iMacs sold better even when the "cheese graters" were available, so yes.


The cheese graters cost between two and four times the price of an iMac, depending on the era we're talking about (price fluctuations of the PowerMac/Mac Pro). Add the fact that you need to buy an external monitor to go along with it, and that counts for something.

Apple never had a usable entry-level desktop tower: something with high performance on consumer-class CPUs rather than Xeons and ECC RAM. The Mac Mini was always crippled to keep it from competing with the iMac and the cheese graters among people who want something better-performing.

The iMac is popular because it has the best performance-to-price ratio, not because the form factor is any good. For the longest time, the entry-level MBP (and the unibody MacBook before it) was also Apple's best-selling laptop, and they only recently cut it from the line-up and replaced it with the Air as their entry-level offering. The Air will also exist for as long as they keep selling the current MacBook at those prices, because most people are not willing to spend 1,449 euros on a machine that barely performs better.


As someone who owned both an older iMac and a new iMac, I really fail to see much difference. Form factor, overall, is much the same, as is the weight. I think it's more than a little exaggeration to call the older iMac a "behemoth".


Depends on what you call the "older iMac".

The 27" Core 2 Duo 2008-9 era iMacs, one of which I had, weighted 13.8 kg!

The new 27" model is 9.5 kg. That's over 30% less weight...


Yes, are technologists so effete that they can't handle a 5lb laptop anymore? Size constraints I understand, as I fly on airplanes a lot, but weight constraints can really be annoying. At least for higher-end machines, it's okay for a lot of us to accept a bit of extra weight for more features. It can be both.


>Yes, are technologists so effete that they can't handle a 5lb laptop anymore?

Apparently they are. Just read this effete technologist's comment:

  I’m personally just hoping that I’m ahead of the curve in my strict requirement 
  for “small and silent”. (...) I want my office to be quiet. The loudest thing 
  in the room – by far – should be the occasional purring of the cat. And when I 
  travel, I want to travel light. A notebook that weighs more than a kilo is 
  simply not a good thing (yeah, I’m using the smaller 11″ macbook air, and I 
  think weight could still be improved on, but at least it’s very close to the 
  magical 1kg limit). -- Linus Torvalds


Size and noise I am totally on board with, just not the obsession with weight. And I am an ultra-light, gram-obsessed backpacker. I may just be an outlier or unusually strong. (Kidding on the last qualifier :))


The focus on weight baffles me. Most _bags_ are now heavier than laptops.


> I want my office to be quiet. The loudest thing in the room – by far – should be the occasional purring of the cat.

Any reasonably well built desktop PC does that. It's a matter of applying a modest amount of care when selecting and assembling components.


>Any reasonably well built desktop PC does that.

When it comes to off-the-shelf towers and laptops, you'd be surprised.

And if you mean it's easy if one personally "selects and assembles components", you'd be surprised again at how atypical that is for the average PC buyer.


So? I claimed neither.


Well, by writing "Any reasonably well built desktop PC does that", it seemed like you were implying that most models available on the market do as well.


No, most aren't well built, but that's nothing specific to PCs or even computers. Most stuff is designed to be assembled easily, quickly, and cheaply by unqualified workers. That doesn't per se contradict the result being well built, but it usually means compromises are made to accommodate the assembly process, which diminishes other qualities of the product.

E.g. you'll never find a high-end CPU cooler in a factory-assembled PC, simply because they are too difficult and slow to mount. Compared to a simple push-pin or lever-mounted cooler, these can have a dozen parts or more and require funneling screws through cutouts in the cooler itself, etc. You just won't see that on an assembly line.

There are also other issues, more specific to PCs, e.g. the standard case form factor does a poor job supplying graphics cards with fresh air. Other case form factors solve this (e.g. Silverstone has some; the "trash can" Mac Pro follows a similar concept), but tend to incur other compromises (and price tags).


> whose thin-ness was absolutely pointless.

Design isn't pointless; it has value in itself. I'm thinking the poor upgradeability is actually a feature (for the seller, not the buyer) and not arbitrary either.

