Unfortunately, the information is still very fragmented and it takes a lot of time and effort to get one running. Actually building the thing is the easiest part; it's the booting, installation, and OS patches/fixes that eat up the time.
The software is usually on SourceForge or something even shadier, it's rarely actually OSS-licensed, and configuration/documentation/interfaces are a nightmare of incomprehensible incoherence. There cannot be a less professional software community. I've seen Minecraft plugins by 8 year olds that do much better in these categories.
On one side there is a huge user base full of people who want to get something working with no effort and tend to make demands of developers, but very few can provide even a proper bug report. On the other side there is a bunch of "experts", some of whom leech off others' work, and an even smaller number of actually skilled developers.
So it's just stressful to participate in those projects. In the end, most of the enthusiasts who stick around must either have a very strong personality or be selfish people looking for cheap popularity.
There are also humble guys who silently work on their own thing, but they usually work alone or in very small groups and don't communicate with outsiders. And of course, most of the time they never get credit for their work on BSP reverse engineering and fixes.
Man, I remember feeling disappointed people weren't making more FPS games for DS, so I decided DOOM would be fun to port. I even got the wifi multiplayer working.
EDIT: The biggest unsolved hurdle was getting the music to play (it was MIDI and the DS hardware didn't really have anything to play it back). Eventually someone got a hand-optimized OPL3 ASM player to work on ARM7, but I don't think anybody ever managed to connect it to DOOM running on the ARM9 processor using the FIFO. The DS had two CPUs (ARM7+ARM9), and the ARM9 was completely utilized.
I also helped out in a very minor capacity with DS Linux, but pepsiman did nearly all the work.
I'm kidding, but only a little. This actually sounds a lot like the state of the Minecraft ecosystem last time I checked, about two years ago.
I always guessed Android ROMs were such a misery on the community side because they're very easy and relatively useless. All you need is a computer, which is easy enough, then some open and free software, and you're ready to go and compile your own ROM. The source code is often available, and you can change a color here and there, and maybe tweak some Linux kernel parameters so you can claim better battery life, or more free RAM if you're into that.
That's the easy part. Now the relatively useless part: phones don't last long. Someone who is keen on tweaking their phone will probably be looking for a new one every year or every other year; at best they last about three years. The changes to the software are minimal at best, and every custom ROM is practically the same. There is no marketability in these ROMs. There aren't millions of dollars of potential revenue to be had from a server that has to run with 20 years of uptime, the kind of thing a company can jump in and support, as you see with other uses of Linux. Red Hat supports a release for what, 10 years? Android devices get discarded in two. So there is little professional attention to third-party, open source distributions of Android.
Google develops it, companies alter it a little bit, then ship it. Hobbyists from all around the world take the open source aspects of this and maybe add some new stuff here and there. There's no need to expect professionalism from hobbyists.
(I find CyanogenMod, or LineageOS or something now, to be pleasant to use on a phone. It gives the option to run apps as root, and it comes with a terminal, and it tries to remain as open as it can.)
Absolutely. I especially like it because it gives me the ability to restrict network access from apps (I usually just deny background cellular access, but some apps, I think, have no business talking to the network even over WiFi). Moreover, the permission model is much better, which brings me to the reason I uninstalled Facebook Messenger: denying start-at-boot or any other permission to Facebook Messenger (or Facebook) on Lineage crashes the permission manager. WhatsApp and Instagram have no such issue. They will happily accept not being able to run at boot (I haven't looked into whether it actually works, but at least I can set the permission as I want).
I guess my number one request for a custom ROM would be the ability to say I don't want any app to run in the background/run at boot/use network connections unless I specifically whitelist the app for that purpose.
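For what it's worth, this deny-by-default model is roughly what root firewall apps (e.g. AFWall+) implement under the hood with iptables' owner match: Android assigns every app its own UID, so you can drop all outbound traffic and whitelist specific UIDs. A rough sketch, assuming a rooted device; the UID 10123 is a placeholder:

```shell
# Deny-by-default egress, per-app whitelist via Android's per-app UIDs.
# (Sketch only; 10123 is a placeholder UID -- find an app's real UID with
# `dumpsys package <package.name> | grep userId`.)
iptables -P OUTPUT DROP                                    # drop everything by default
iptables -A OUTPUT -o lo -j ACCEPT                         # keep loopback working
iptables -A OUTPUT -m owner --uid-owner 10123 -j ACCEPT    # whitelisted app
```

Run-at-boot and background restrictions would still need support from the OS itself, but the network half is achievable today on a rooted device.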
Similar good experience with Nexus 4 and NitrogenOS for my standby device. Android 7, works perfectly fine.
It seems strange to worry about licensing in a community predicated on violating a term of an OS license.
Is this a problem because source code isn't available?
(As to the _When_, that can be a point of contention, given that consumer software marketing material typically doesn't explicitly mention the license)
It isn't the same as some kid who writes a plugin, then uses it and keeps playing Minecraft.
As the owner of an early-2009 octo-mac-pro, I am very interested in learning about how to max that system out to the gills.
For instance, I am under the impression that it is now possible to boot from a pci based SSD card on that system.
Further, I believe it is possible to add a card that gives me USB3/SATA3 (although if I am booting from PCI, I don't really care if my slow mass storage internal four drives are SATA2).
Finally, I think there is some apple-blessed video card that is quite a bit stronger than the original GT120 cards I have in my system.
I've seen bits and pieces here and there about maxing out a mac pro and bringing it very close to "modern" but it would be nice if it was all in one place somewhere...
FWIW, other than maxing out the ram, I really have no problem with the existing RAM options and the dual-cpus that I already have (which bring me to octo-core). Even 8 years later, the machine is not slowing me down at all in the cpu/ram department (and I only have 6 GB).
The first thing I added was a CalDigit FASTA-6GU3 SATA3/USB3 card. This gives you two internal SATA3 connectors (bootable) which I'm using for my macOS boot SSD. It also has two eSATA ports and 2x USB 3.0 ports and only takes up a single PCIe slot. They've since come out with a new card that adds a USB 3.1 Type-C port. No drivers needed, it "just works".
Next was an NVIDIA GTX 970 card; this works perfectly in 10.11/10.12 once you've used the GT120 card to install the NVIDIA Web Drivers. You don't get the boot screen unless you get the card ROM flashed by macvidcards.com ($$), but I haven't found this too inconvenient, as I've left the GT120 in a spare PCIe slot, so if I need to use the boot selector for some reason I can just swap the mini-DisplayPort cable over. You can go up to a GTX 980 Ti, but you're limited to Maxwell cards, as NVIDIA hasn't released macOS drivers for Pascal.
Finally, I picked up 64GB of DDR3 1333 MHz ECC RAM on eBay that was pulled from a server, for under $200. This was mainly to fulfill my dream of being able to say I have 64GB of RAM, but I run a lot of VMs for network simulations and it really helps to have the extra memory.
I've also flashed the SMC so the machine thinks it's a MacPro5,1 of the 2010+ generation. You'll need to do this if you want to upgrade the CPUs (I haven't yet), but it has the added benefit of letting 10.12 install without any hacks or modifications to the OS. If you're on 10.11 now you'll have to disable SIP temporarily to run the firmware update, but it was otherwise quick and painless. This is the best place I found with clear instructions: http://forum.netkas.org/index.php/topic,852.0.html
All in all this is the best Mac I've ever owned and it amazes me it works as well as it does being almost 8 years old now.
"Next was a NVIDIA GTX 970 card"
How many PCI slots does that card take up ? I currently drive four monitors with my mac pro and the single slot aspect of the gt120s makes that easy ...
"All in all this is the best Mac I've ever owned and it amazes me it works as well as it does being almost 8 years old now."
I know, right ? I regularly have 20+ chrome tabs open along with 1-2 VMs running in vmware fusion and I have never once felt like the system was slow. My only issue is that I have only ever run snow leopard on it and now chrome no longer has updates, so I need to move to ... mavericks maybe ? I feel like mavericks is the most stable/sane OSX release since SL ...
This will obviously depend on the hardware/software you use but Mavericks was the buggiest and worst OS X release on my rMBP. I was rebooting weekly due to weird graphics drivers glitches and inexplicable OS slowdowns. El Cap had a rough start, but was fine after a couple point releases. I've had no issues whatsoever with Sierra.
EDIT: 10.12.3 doesn’t seem to be much better, sadly:
My interaction with PDFs starts with viewing a linked one in Safari and ends with "save as PDF to web receipts folder" and for that kind of usage there are absolutely no problems. I always thought that for anything half-advanced with PDFs (like filling out forms) you'd want to install Acrobat Reader anyway...
Then, I put this:
SAMSUNG 850 PRO 2.5" 1TB SATA III 3-D Vertical Internal Solid State Drive (SSD) MZ-7KE1T0BW
In it as my system drive. The volume looks external (it's got that orange and white icon), but everything has worked perfectly for about 18 months. It boots just fine.
For me, the performance increase with this change was incredible. It's like a brand new computer. I keep my OS and all my installed applications on it, and all my data on traditional drives in the other four bays. It was completely worth it. Right now I use it for very intensive workloads of various kinds, and never even think about replacing it.
Finally, you can find a lot of info online about whether you should enable TRIM or not. FWIW, I did, and it's been working fine for 18 months.
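For anyone doing this on a recent OS: since OS X 10.10.4, Apple ships a built-in switch for enabling TRIM on third-party SSDs, so the older kext-patching tools aren't needed:

```shell
# Enable TRIM on a third-party SSD (OS X 10.10.4 and later).
# Prompts for confirmation and reboots the machine when it finishes.
sudo trimforce enable
```

It prints a scary warning about data loss before proceeding, which is part of why opinions online differ, but it's the officially supported path.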
I cannot remember at this point how I copied the data over. I do remember that whatever method I found on google worked quite quickly, and that there were no hassles with it. After doing some kind of copy operation, I booted to the new drive and never went back.
Hope this helps.
Yep, put a Samsung SM951 M.2 (AHCI version, not NVMe) on a Lycom DT-120 PCIe mounting card. Got two of those in a MP5,1 myself.
i'm sure it's probably very easy to make a non-mac video card work but i can't be bothered.
Small issues since, but there's always a solution + guide within 15 minutes of searching (and it's usually just flipping a flag in Clover).
When an update comes out, I just wait a week or so to make sure there are no issues and then update (no issues at all so far with that).
Having a ridiculously powerful computer running macOS at a great price has been such a joy.
If I'm talking to someone who is the kind of person that builds computers and is comfortable with a bit of command line and config, I always recommend they look into a hackintosh, because a lot of the uncertainty has been removed thanks to the great community around it.
However I'm starting to give it serious consideration as Apple is signaling pretty hard that they don't give a shit about desktops any more, and MacOS is not seeing much development other than some sloppy seconds and half baked ports of iOS ideas.
I pretty heavily depend on MacOS for my daily workflow these days, but the writing is on the wall that I've got to start thinking about Linux or even Windows over the next 5-10 years.
In that light, the benefit of the hackintosh is not only being able to get current hardware at a fair price, but also being able to hedge my bets and dual boot to Linux to get my feet wet.
Just to see if it would work, I hackintoshed it prior to installing FreeBSD. Apart from the network ports, everything else (Nvidia card, USB, etc) worked.
Network connectivity was fixable by just plugging in a USB3<->ethernet adapter (also had one handy), and voila, usable desktop.
It wouldn't be surprising (to me), if newer generation Supermicro boards also worked. That would open up possibilities for lots more cpu grunt in an OSX box. :)
While I may not wholly disagree with this, what OS is doing much better? Linux is still slowly catching up to the macOS graphics compositor from 10 years ago, and Microsoft is busy integrating its own half-baked port of Windows Tablet Edition... True, Windows does have some momentum with the new Ubuntu subsystem, but that's still catching up, when macOS is a real native UNIX from the ground up.
There's not much more I want out of my OS... a new filesystem to replace the 30-year-old-design HFS+ is already on the way, and while I'd love a more modern CLI userland, that isn't going to happen due to legal reasons, so I'm fine with homebrew. They just need to fix SMB and clean up bugs and performance issues for me to be totally happy.
[UPDATE] Case in point: one of the banking apps on my iPod just stopped working, insisting that I upgrade to the latest version. So I did, but the new version doesn't work either. It just hangs when I try to log in. Granted, this is an iPod, not a Mac, but the iOS philosophy of having Apple control the device is slowly creeping in to the Mac. Every day and every upgrade is a crap shoot in terms of whether or not what worked yesterday will still work today.
Sure, I see fewer of them, but that's hardly unexpected considering people buy fewer of them (and they're less targeted by malware).
On the other hand, I hate dealing with them because they can suffer from the same "it should work but doesn't" crap that Windows machines do but without the critical mass of support via Google.
And unless you get your issue escalated, the Apple Geniuses are about as much use as a chocolate fireguard.
To a certain extent that is certainly true. But there was a time during the Snow Leopard era when Macs were extraordinarily reliable compared to the then-contemporary competition, and even compared to anything before or since. Things have gone downhill since then because Apple decided to de-emphasize reliability in favor of other things like thinness, and the merging of desktop and mobile environments.
My conjecture is that my hatred for any system will increase the more I use it, to the point of switching to another system, then the cycle repeats.
Of course, it helps if you have a macbook pro in addition to your desktop; you can check out the new stuff on the genuine apple machine, and maybe get away with postponing updates on your desktop even more.
Only thing is iMessage/iCloud. It takes a little to set up correctly, as you'd have to "hack" the serial numbers etc., but once it works, it just works.
However, if you stick to known hardware (mainly this means specific motherboards with well-supported chipsets), it's quite straightforward and reliable. If you're starting out without existing hardware, just stick to the known motherboards and you shouldn't have any issues. Very dependable hardware lists are available at tonymac and the osx86 project; just use those.
Hardware support has come a long, long way in the past few years.
With the NCASE, you can get a LGA 2011-v3 mini-ITX board and get as fast of a computer as you want. Though truthfully, I don't think there's much of a point.
Likewise, an old NVIDIA card for a Hackintosh? Also not much of a point. The web drivers are so bad. Who knows when Apple will ship another NVIDIA chip in a Mac?
If you need "CUDA stuff," use Linux or Windows. Software like Octane is so buggy and suffers from worse performance on Mac anyway. Final Cut and After Effects both support OpenCL acceleration. Besides, the RX 480 is $189.
If you're doing "VR stuff," well pity on you if you're developing for an Oculus on a Mac. The Vive doesn't, and probably never will, support Mac. Whatever VR solution Apple releases will obviously run great on the video cards they support, so that again strongly points to purchasing an AMD card at this time over your alternatives.
With regards to this specific build, a high DPI display will greatly improve the enjoyment of this computer. The Dell P2715Q is the best deal. Mac OS has such good HiDPI support compared to Windows (and especially Linux). Enjoy the features you pay for!
Truthfully, I'm hard pressed to see the point of a Hackintosh, and I own one.
I managed to install TensorFlow with CUDA on Windows just by following the official installation guides.
I also managed to compile Tensorflow with GPU support from source, also by just following the official guide.
Sure, it's a bit more work than "sudo apt install".
Short story: worked great, running in about 5 minutes. (With VSCode, I even get autocomplete out-of-the-box on the TF libraries, imports, etc.)
I think it was much different before 1.0. Also, building on Windows is a little more difficult. But just installing it really isn't. And Keras was far more annoying because it wants scipy, which wants BLAS, and I don't use Anaconda. I haven't gotten around to that one yet...
Christoph Gohlke to the rescue: http://www.lfd.uci.edu/~gohlke/pythonlibs/#scipy
I also don't think that the numpy installed via pip is using mkl, so his version of numpy should be faster too: http://www.lfd.uci.edu/~gohlke/pythonlibs/#numpy
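If you're unsure whether your pip-installed NumPy is linked against MKL, NumPy can report its own build configuration; an MKL-linked build (like Gohlke's wheels) mentions mkl libraries in the output:

```python
# Print the BLAS/LAPACK libraries this NumPy build was compiled against.
# A Gohlke/MKL wheel lists "mkl" entries; a plain pip wheel typically
# shows OpenBLAS (or no accelerated BLAS at all) instead.
import numpy as np

np.show_config()
```

That makes it easy to confirm you actually got the faster build before benchmarking anything.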
I wish I knew what Apple was thinking. One would have thought that the "iPhone halo effect" was something they would have wanted to give momentum to. Instead, people are looking at Windows machines again.
It's literally incomprehensible. Just make a really good fast desktop computer. They're one of the largest companies in the world now with essentially unlimited financial resources, and decades of core competency in making high end desktop computers. So like why don't they just fucking do it.
Google shovels billions into these absurd projects that never go anywhere and nobody blinks because it's Google. They announce projects and then cancel them just before they hit store shelves, or worse, after they've sold some and then decide to arbitrarily brick them. It's just Google being Google. Nobody cares.
Apple removes a single connector on their product and everyone goes apeshit. "Apple's doomed!" "Apple profits are going to suffer! Sell!" Doesn't matter that their device sold in record numbers. It's about perception.
Apple has to tread carefully. They took unbelievable amounts of shit over the Cube, a computer that in retrospect was just one stepping-stone on the path towards the very successful Mini, and over other "failures". They're allergic to public humiliation like that, and a desktop computer that sells only in low volumes is a failure.
And concerning Apple, their connector choices have been really stupid for the past 10 years, only the magsafe was nice.
Apple might have stubbornly stuck to their 30-pin connector for way too long, but at least people who bought accessories for it weren't frozen out when the latest incarnation of USB Mini/Micro/Whatever caught on.
"Really stupid" is shit like the Nexus Q (https://en.wikipedia.org/wiki/Nexus_Q) discontinued before launch, or the half-assed effort to get Android vendors to update their software more often. It's Microsoft's utter failure to capitalize on any meaningful market share in the mobile space when they were a leader in mobile/PDA/tablet applications for years.
What I really don't understand is why they didn't foresee this, and just updated their cheese grater Macs. After all, it's way easier to simply update a motherboard to fit a faster CPU than it is to rearchitect the Garbage Can design every time there's a new CPU and GPU architecture out there.
The only thing I can think of is that they've either just downprioritized a lot to focus on $new_secret_product, or that they've developed a new Mac Pro that for some reason didn't pan out, so they needed to start over again.
The business partner has an old one that he uses because it has some old software he needs for design files.
When it eventually dies I'm converting it to a normal PC case and using it for my next build.
It's just a beautiful design.
For the sake of argument, why? It's probably one of their worst selling products. Nobody defends the remaining iPod enthusiasts when they demand a new giant capacity iPod.
Maybe we just need to accept that Apple doesn't want to be in the desktop computer business for much longer?
Their inability to make a top of the line desktop is threatening my ability to justify buying their laptops or anything else from them. I run a company. I'm not going to mix and match operating systems and if I can't have high end desktops for heavy media work I'm going to switch everything eventually. I'm not abnormal.
People keep trying to come up with some 4D chess explanations for why this makes sense for the always wise and prescient Apple. It's much simpler to use Ockham's razor to conclude that they are making a shortsighted mistake.
You saying this doesn't make it true. To a neutral observer, Apple looks like a technology company that makes consumer products, including sometimes computers.
Edit: From Wikipedia:
During his keynote speech at the Macworld Expo on January 9, 2007, Jobs announced that Apple Computer, Inc. would thereafter be known as "Apple Inc.", because the company had shifted its emphasis from computers to consumer electronics. This event also saw the announcement of the iPhone and the Apple TV.
Still, I think there are two arguments:
They are sitting on so much cash. There is obviously a cultural or organizational problem that produces too much friction to keep this product line up to date. I'm fairly certain many other corporations, if provided access to basically infinite cash, could figure out how to put new RAM in an existing product line every year.
There's huge value in having creative professionals and developers use your product. Keeping people in your ecosystem is important. This might be hard to quantify on the bottom line, but it seems like an obvious strategic mistake to give up this advantage.
What if the ecosystem they want is iPads and iPhones?
Apple can have a flagship line that's break even or worse to anchor the high end. It's literally inexplicable, and I think framing it in the rational actor model is a mistake, it's best explained as a fuckup.
Bleeding power users to alternative systems creates vulnerability.
These arguments would be the same for Microsoft circa 2007. Why should Microsoft care if the designers, developers, and tastemakers hate their products? They've got so much lock-in that people will be forced to use their stuff forever. Wheee, we're invincible. We all know how this turned out.
I'm running up against low memory issues on my 16gb MBP lately. I absolutely cannot responsibly update my Apple product right now because of the 16gb limitation on the MBP. It does matter!
I'm getting ready to bite the bullet and go back to Linux desktop.
If you haven't built a desktop in the past few years, the performance boost from PCIe NVMe SSDs is great, and Intel i5-7600K (now retails for $200) can run at 4.5 GHz reliably and stay cool. I'm impressed.
I myself don't dare to go Windows because of malware. Linux is where it's at.
However, what I did appreciate was the (in-the-case) logistics difference of installing an M.2 drive; one screw, no cables, almost as easy as a RAM upgrade. Not having to fuss with SATA was a pleasure, and reminded me of the switch from IDE to SATA (you just... plug it in? Where are the jumpers?).
I get all of the arguments on how long it's been, though.
Over $20B in annual revenues with at least $3B in net profits, that's a hellofa niche. So the question is, why is Apple not updating/refreshing Mac lineups more often, and why don't they have a decent pro model?
As for getting to even better numbers, well, they had decent pro models (and even servers) back in the day, and they (presumably) know that they don't sell that much.
i could almost turn "could macs be good value?" into a mandatory interview question, because those who say no don't understand how trading money for time works. working with such fools is incredibly tedious and painful.
I would advise against that, unless you are in the habit of interviewing people who value their time at $200+ per hour, such as in your above example. Otherwise, their answer may be coming from a slightly different mindset.
Most people who buy Macs are not very price-sensitive, and they decidedly buy them so they don't have to personally build that "much better hardware".
So, this option does exist, but it's the exact opposite of what they want.
Assuming they even need the increased power in the first place, which most (except creative pros in video and 3D) don't.
Besides, being "the crazy ones" and being a cpu/gpu power hungry is orthogonal. Apple never sold the most powerful computers (except briefly and accidentally, when Motorola did something right).
Just friendly and easy-to-work-with computers -- which the "crazy ones" could get going with without needing an IT department. A "crazy one" can be, e.g., a human rights activist or some novel graphic artist who just needs a simple computer to work with.
Is it safe? That is the question.
I've been using hackintosh for a while, and it's almost a vanilla OSX with the exception of Clover and FakeSMC. I doubt there's any malware in there...
Famous last words. Unless the builds are verifiable, they're not trustable. Open source doesn't do you any good if you don't know that the binary that you download and run doesn't do something weird.
Open source gives you the possibility of knowing your binary is good, by compiling it yourself. I chose not to do so, because I couldn't really be bothered. But others who need this kind of security could, and I would hope they do.
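Short of building from source, checksum verification at least tells you that the binary you ran matches what the project published. A minimal illustration of the idea (substitute the real package and the hash from the project's release page; `download.bin` here is a stand-in):

```shell
# Record a file's SHA-256 at download time, then verify it later --
# the same idea applies to a Clover or FakeSMC package checked against
# the project's published hash. (Use `shasum -a 256` on macOS.)
printf 'example payload' > download.bin
expected=$(sha256sum download.bin | awk '{print $1}')

# ...later, after re-fetching the file or copying it to another machine:
actual=$(sha256sum download.bin | awk '{print $1}')
[ "$actual" = "$expected" ] && echo "checksum OK"
```

Of course, this only shifts the trust to whoever published the hash, which is why verifiable (reproducible) builds are the stronger guarantee.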
Would all that's necessary be to put the Hackintosh behind a network inspector, say Wireshark, to check whether anything nefarious or unexpected is going on?
I despise MacOS's keyboard shortcuts. I just want a stable Linux laptop that is fast enough for emacs + web browsing and has an impeccable screen/keyboard/trackpad/battery.
My HW budget is high enough to buy a high end apple laptop and tiny (27"?) apple-ish display from LG.
If I move to an overkill-nice PC laptop, and also buy a 43" 4K (non color-matched) monitor, I think I'll be able to buy a Ryzen cluster with the leftover cash.
I'm not sure if that will be faster or slower than my xeon VM in the data center, or if I want to spend the time setting up distcc.
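For what it's worth, distcc setup is less work than it sounds: a daemon on each helper box plus an environment variable on the client. A sketch (the hostnames and subnet are placeholders):

```shell
# On each helper machine: run the distcc daemon, allowing clients from the LAN.
distccd --daemon --allow 192.168.1.0/24

# On the client: list the helpers (plus localhost) and parallelize the build.
export DISTCC_HOSTS="localhost ryzen1 ryzen2"
make -j12 CC="distcc gcc"
```

The main caveats are that all machines need compatible compiler versions and that only the compile step distributes; preprocessing and linking still run locally.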
Isn't that a Dell XPS? https://arstechnica.com/gadgets/2016/06/the-xps-13-de-dell-c...
I used to say that their laptops are fine, and it's only the desktop where i need to hackintosh, but ... touch typing hostile emoji keyboard?
Ubuntu is missing Office 365, all Adobe Products and much more. Yes you can run it with Wine, but that's not that easy.
I also don't like the dock but at least that can be made to auto hide. I do like lenses and I wish I could have them in the Gnome 2 look alike DE I'm using on Gnome 3 right now (Ubuntu 16.04 with gnome-flashback and some customizations, included merging the top bar in the bottom one.)
* Hidden menus. Can be changed, but bad defaults.
* Top Bar.
* Left aligned window controls. It's not an option to change anymore.
Con: All the UX I don't like about MacOS. Pro: At least Edit->Cut works in the file manager. YMMV 100%.
Gnome3: Has a "launcher", but it's default hidden. I use meta/win key and type what I want.
HiDPI support has been garbage for at least 2 years. I'll get scaling and resolutions right for my XPS13 QHD, and then I'll plug in a 1080 monitor and it goes back to looking like crap. The sound goes out unless I cold boot twice. Then I need to use Wine to run Photoshop in a barely legible font size because hidpi scaling won't work. God help you if you want to manually configure your display manager, as you'll end up with an unusable display and hours of SO on your hands.
That's not to mention the ton of visual and "feel" issues behind using even Pantheon/eOS, which feels like a fork of OS 10.5.
I recently built my own hackintosh and am quite happy with it. I do all my work on the Mac with integrated graphics (don't need a GPU, which is great because the Nvidia web drivers are indeed terrible) and switch to Windows for games. Couldn't be more pleased.
I'm a power user and a developer and I find the 2016 MBPs to be the best laptops around. My only misgiving was the hefty price. The only "dongle" I needed was a single USB/USB-C adapter but I really don't have to use it most of the time; everything I have — phone, tablet, portable hard disk — works wirelessly.
The battery time (I think you mistyped "life time") estimation was dumb, never accurate or consistent, and people don't mind the lack of it on billions of other mobile devices anyway.
The battery issues with the 2016 MBPs were found to be on the software side and have since then been resolved, and they have been reported to go for 18! hours on a single charge. Certainly lasts me a whole day and then some of non-continuous use.
Re: the LG shielding mishap, you're blaming Apple for a completely different company's product, okay.
> still doesn't serve my demographic
Which demographic and what do you require?
> touch typing hostile emoji keyboard?
> The battery time (I think you mistyped "life time") estimation was dumb, never accurate or consistent, and people don't mind the lack of it on billions of other mobile devices anyway.
It was accurate as long as you kept doing the same task. For example, I knew how much longer a gaming session or a long build could last on battery. But this is laptop stuff and the original article and my whining is about desktops.
> Re: the LG shielding mishap, you're blaming Apple for a completely different company's product, okay.
A completely different company's product that is the only solution that makes use of the emoji mbpro's features, is sold and promoted a lot by Apple, and thus endorsed by them on their web site.
> Which demographic and what do you require?
Independent dev working from home. Running two VMs simultaneously at times, and also using the same computer for entertainment, namely gaming. Working on compiled software, not scp-ing stuff to a server for web development, so I actually need CPU power.
I could almost use the trash can, except they want me to pay for two slow video cards that are for some reason labeled "pro". Put an NVIDIA 1050+ in it and I'll buy it, although I'll be a bit space-challenged; some source trees take a lot of space. I'll even pay extra for the Xeon and ECC RAM I don't really need, but give me a video card I can game on.
Before you ask, no, the iMac is not a dev's computer. It thermally throttles if you actually use the CPU, and makes laptop-like noise when compiling. No thank you. I've heard even the trash can throttles, and i blame the "pro" video cards.
What you read exactly. Please explain how you touch type on the touch screen that replaced the F keys and how it is useful to me when I, like anyone who works a lot with the keyboard, have memorized every shortcut I use. It's mostly good for emoji in instant messaging, and I don't use that either.
The touchbar is really there to be a little touchable display, like an iPad, or the bottom screen of a Nintendo DS. It's for virtual knobs, rather than virtual keys: it makes perfect sense once you use it to scrub or drag-select in iMovie or Final Cut Pro. It's basically a shrunk-down embedded platform for those "live controller" iPad apps (DJay Pro; that thing Bret Victor built in the dead fish video; etc.) to run on.
"Problem is, Apple doesn't have anything for power users or developers right now."
You complained about everything from Apple across the board, including blaming them for some other company. 
> Independent dev working from home.
> Running two VMs simultaneously at times, and also using the same computer for entertainment, namely gaming.
Ditto, but just one VM; Windows, for some games. Everything I want to play, even very recent games like Paragon, runs just fine on the 15" 2016 MBP, in Boot Camp if not inside macOS.
> so I actually need CPU power.
Haven't experienced any throttling yet, myself.
> What you read exactly. Please explain how you touch type on the touch screen that replaced the F keys and how it is useful to me when I, like anyone who works a lot with the keyboard, have memorized every shortcut I use.
The Touch Bar is always in your vision, and I don't have to take my fingers off the keyboard to use it, as opposed to moving my hand to the mouse/trackpad to perform functions that I can now do from the Touch Bar.
"Looking" at it is no different than moving your eyeballs a nanometer to look at a menu bar or any other element on the screen.
Even if looking at something were such an arduous action, having to continuously refer to a secondary resource (be it your memory, in-app help, keybind settings, or online) to know what each F-number does with each modifier in each context of each app is considerably more backwards.
It's a monitor Apple effectively blessed as their monitor replacement. They sold it. So... yeah. Yeah, I do blame them for their lack of due diligence.
Why is this their problem?
Because, how am I supposed to rely on Apple to meet my needs if they are perfectly happy dropping products a business depends on?
Not the OP, but none of the half-dozen developers/tech-oriented people I know who have switched to the latest laptops prefer them over the older ones. I've heard nothing good about them besides being a bit lighter, but that was one person and it was more a "the only thing better is it's a bit lighter" before listing complaints.
The complaints range from the touch bar to performance being far worse than older models. Running side by side, it's noticeable just by watching things run.
Are you not aware of the touch bar?
While simultaneously absolving yourself of having to do any research.
> Because, how am I supposed to rely on Apple to meet my needs
I often see people saying Apple owes it to its customers to make x, y, or z product, as if Apple's role in the ecosystem needs to be expanded further. Apple is a company, not a nanny, they don't owe you anything, and I feel we should keep their scope limited rather than expanded.
Apple don't make the LG Ultrafine Display. Apple offer many other third-party products through their stores, but it's the manufacturer's responsibility to support those.
> how am I supposed to rely on Apple to meet my needs if they are perfectly happy dropping products a business depends on?
Apple still make amazing 5120×2880, 10-bits-per-channel screens: The Retina iMacs. If your business depends on such screens, you'd get the iMacs, not the MacBooks, or get a MacBook with any of the many third-party 4K displays.
> none of the half-dozen developers/tech-oriented people I know who have switched to the latest laptops prefer them over the older ones.
> before listing complaints. The complaints range from the touch bar
Like? What complaints specifically?
> to performance being far worse than older models.
They have some of the fastest SSDs on the market, their batteries can go up to 18 hours, they have the fastest GPUs ever in a MacBook, and how many laptops do you know that have high-DPI Wide Color screens?
> Are you not aware of the touch bar?
I use it daily, and it's better than the archaic row of limited and cryptic FN keys that it replaces, especially once you've tweaked it a little.
The FN keys:
• Have to remember what each number does in each app
• Different numbers remain "fixed" for different functions in different operating systems (e.g. F1 for Help, Alt+F4 for Quit)
• Not customizable
• Cannot give at-a-glance status without taking up main screen space
The Touch Bar:
• More than 12 keys
• Not limited by physical space
• Context-sensitive and adaptive
• More types of controls than just buttons (e.g. sliders, color pickers)
• Can display at-a-glance status such as time, battery etc.
As for other companies that offer high-end laptops, take a look at people's experience with Razer for example. 
All of the over-a-dozen developers/tech-oriented people that -I- know prefer the 2016 MBPs to everything else. They've also outsold all other competitors in just a single month. 
But they did make monitors. And Routers. And other things that people bought into. Because Apple "just works." But now it doesn't.
> Apple still make amazing 5120×2880, 10-bits-per-channel screens: The Retina iMacs. If your business depends on such screens, you'd get the iMacs
iMacs are not replacements for MBP + Apple's monitors.
Because the previous models have features and capabilities that the new ones do not that they prefer.
> Like? What complaints specifically?
For example, the touch bar buttons activating when someone doesn't want them to activate. The latest case specifically: a friend joked about how he's changed the background color of his terminal so many times since moving to the latest MBP. That's just one example.
> They have some of the...
*shrug* I know what I see. Sorry, but running scripts side by side on an older MBP compared to the latest MBP, the older one won hands down.
> I use it daily
And you weren't clued into the GP's remark about the touch screen? You honestly couldn't figure it out?
> They've also outsold all other competitors in just a single month.
And just because others like something, that means it's good for me?
I would just like to say that that's BS.
You have to be very deliberate to change the Terminal window color: It takes a tap on a specific button to bring up the color picker, then you have to lift your finger to the slider and move it, or keep your finger held on the Touch Bar for a second then slide it around without lifting it to choose from a palette.
It's not something that can be done accidentally, as it takes multiple specific actions in a row.
If you still keep fumbling it "so many times" then you can always customize the Touch Bar and just remove the offending buttons.
> Can display at-a-glance status such as time, battery etc.
You look at the keyboard when you type? Not my demographic then, i type too much for that. It's probably fine for you.
> All of the over-a-dozen developers/tech-oriented people that -I- know prefer the 2016 MBPs to everything else. They've also outsold all other competitors in just a single month.
You don't know any developers working on compiled software. And the mbpros outsell everything not because they don't annoy people, but because there's almost no alternative. Perhaps the Dell Developer Edition laptops, but Apple would have to piss me off more than they have so far for me to try a Dell instead.
And again, the article - and my OP - are from people who need a powerful desktop for work.
I do. I expect your next response will impose qualifiers on "developers" and "compiled software." :)
> my OP - are from people who need a powerful desktop for work.
Your OP said:
> Problem is, Apple doesn't have anything for power users or developers right now.
> I used to say that their laptops are fine
My experience is that their laptops are still more than fine enough for power users and developers.
And my experience is that their laptops are not more than fine enough for power users and developers.
Now, why can't we both be right? That your needs are clearly met by the laptop, but others' aren't? Are you somehow impacted if people who used to like the products put out are now disappointed by them? That people who were looking to buy new machines spent their money elsewhere and had better results?
Do you feel that if these people are right, that somehow this negates your choice and needs? Why are you working so hard to dictate that their needs are unimportant and that your needs are superior?
None of the complainers seem to be able to say exactly WHY the new MacBooks are bad.
• They complain about performance when they're the fastest MacBooks ever, surpassing many competitors and with a very good battery charge-to-performance ratio, not to mention the best display ever on a MacBook.
• They complain about needing "5 dongles" when you need at most 1 USB/USB-C adapter + sometimes a multiport hub or dock, for almost any use case.
• That one guy keeps calling it an emoji keyboard, when there aren't any emoji on the keyboard.
• They complain of not being able to touch-type when the physical keyboard is still there.
• They complain about noise when these are the quietest MacBook Pros ever, without being willing to show what they're comparing them with.
• They say Apple is losing favor with customers when the new MacBooks have outsold everything and Apple has continued to top rankings and stock prices.
So yeah. I have the thing in my hands, and I use it daily, and there really hasn't been a better MacBook before. I DO concede that they may be priced a bit too steeply.
Meanwhile I only see complaints from people who have clearly never even used the things! It's pretty obvious with their "emoji keyboard" and "dongles everywhere" hyperbole, like going back to the old "Micro$oft" and "Windoze" days.
So of course I have to wonder and ask: By WHAT metric are they bad? I can only chalk it down to a concentrated anti-PR effort, or some desperate brand envy (e.g. adamantly putting the blame for LG's monitors — that have now been fixed, by the way — on Apple.)
Well, [part 1] I don't use Xcode, I touch type, [part 2] and a lot of the stuff I work on involves looong builds and/or going through a few VMs. I also HATE fan noise and I work from home in a quiet room. Part 1 disqualifies the emoji keyboard, part 2 disqualifies anything with laptop-like internals, namely the laptops and the iMacs. I hope that's enough qualifiers :)
I always bind my keyboard to use the fn# keys without pressing "fn", usually in the BIOS, so I only press "fn" for the other special features (brightness, media, etc., which laptop makers map arbitrarily to the fn# keys).
Why do you keep saying this? The Touch Bar replaces the FN keys, not the keyboard.
There are no emojis on the Touch Bar unless you specifically bring them up. You can modify the Touch Bar to only present you with the controls you need. 
> I also HATE fan noise and I work from home in a quiet room. ... disqualifies anything with laptop-like internals
The 2016 MBPs are some of the quietest laptops around. Which computer do you use? Is it a desktop without any fans?
I use those keys. Without looking down.
> Which computer do you use? Is it a desktop without any fans?
Let's say I build my systems according to the advice on silentpcreview.com. Apple laptops on full load may be quiet for a laptop, but they have nothing on a custom built desktop.
Unless you slide your finger across the keyboard to "count" which FN key is under your finger, you can still use FN keys on the Touch Bar, without looking down; they are in the same positions.
The Touch Bar is always within your peripheral vision when you're looking at the screen anyway.
> Let's say I build my systems according to the advice on silentpcreview.com. Apple laptops on full load may be quiet for a laptop, but they have nothing on a custom built desktop.
Well if you're willing to record/measure the noise of your current machine under your usual workload, I could try running the same tasks on my 15" 2016 MBP to compare.
Mind you though, a desktop chassis is usually far away from you, unlike any laptop right under your hands, unless you use it with an external keyboard. So objective noise vs. perceived noise will be different, and have to be compensated for.
If that was true, we could just use an iPad instead of a physical keyboard. But there's a reason people don't like typing on chiclet keyboards or glass, and some developers even go to great lengths of geekery to build custom mechanical keyboards with different types of switches.
Typing on glass just doesn't give you the same feedback you get from physical keys.
I'm not saying the Touch Bar is not an improvement for many people. Many people (even some developers or professionals) rarely use the F-keys. But there is a certain demographic which deeply cares about the keyboard, and there's no denying parts of this demographic overlapped with that of MBP users.
This is simply not true.
Being an iOS developer really sucked, as I needed to upgrade OS X for Xcode, but the CPU wasn't supported by Sierra for 6 months.
Also, I spent at least 2 weeks of work on it during the year I had it. So, not worth it. But there are slower CPUs that are better supported. It was fun the days it worked though :)
To me, this is the most interesting, and the article doesn't really specify it. Excluding the building itself, why did you have to spend time on it? What kind of problems occurred?
Has anyone done it? How's performance and keyboard "tunneling" (CMD vs CTRL)?
> keyboard "tunneling" (CMD vs CTRL)?
Better to use original mac keyboards, if you have them, not really a virtualization issue.
If you aren't running ESXi (or VMWare Workstation) on Apple hardware, you can use this to enable the functionality: https://github.com/DrDonk/unlocker
The trick is to use VirtualBox 4.3 instead of 5.
It seems like a company needs to build a _very_ good Linux distro with design-first principles. It needs to work on a number of devices. More importantly, it needs to be a paid OS. It can have an open-source distro underneath, but the UI needs to be created by people who are paid well.
It seems to me there is a disconnect between developers and designers. When I ask devs why there isn't a nice looking Linux distro, they ask me why I care about what the UI looks like. Designers don't use or care about Linux because of the lack of decent design software. As much as I love the concept of GIMP, it is not a realistic alternative to Adobe software (especially things like InDesign and XD) or Sketch, for example.
As a designer-developer, Linux is still not an easy choice for me, as much as I'd like it to be. elementaryOS is a nice start, but I still end up using macOS if given the choice.
I remember when Apple could not offer a retail OS, not without cannibalizing hardware sales. If Pro desktops fade, is that still true for that segment? Or could they offer something Xeon only (to keep out commodity laptops) as a legal Hackintosh? Or do certified configurations a la Oculus? If their profit is in mobile, and cloud services, it might help more than hurt.
FWIW I like Debian now, and the non-intrusive UI.
Each time I open the latter, I have a mixed feeling of "how beautiful everything is!" and "how slow everything is!"
(I am a Scala software engineer occasionally working with GPU-based machine learning)
About two weeks ago, I decided to stop and look elsewhere.
I started a small series on my experience, if you are interested.
I am still testing my current setup, but I guess I'll soon publish the last bit, with the setup I found and my conclusions on the switch.
Man this is so me when I updated to El Capitan. What is Apple thinking??? Now I have to hold alt just to increase my window size. At least give us an option to change it! How hard is that, Apple?
Another good point for Windows: it has nearly all the features of Moom, at least those I need.
It ran Snow Leopard and I used it for all my development, video editing, photo editing, etc. Eventually I left for Australia and decided to get a real MacBook and unfortunately it had Lion on it.
I hated Lion. Gone was Exposé. Gone were rows of virtual desktops (Mission Control had one row with multiple columns. I hated that shit). There was no way to get the old functionality back. Eventually I started using Linux again in a VM as my primary OS.
Today I'm back on Linux with i3 as my tiling window manager and I don't think I'd ever go back to macOS. I think many of their design decisions from Lion onward have been terrible. I just keep around a Windows laptop or VM for when I need commercial products or to play games.
You can't even run the stock kernel on AMD chips. How much QA and other work Apple would have to do, I have no idea.
I honestly don't think any consumer will stop buying a Mac just because Intel isn't inside it. And it would save Apple at least $200 BOM per Mac, making the final selling price around $250 less. Slightly more affordable. Or, more likely, staying within the same pricing and giving you double the SSD storage.
As other people stated, yes, it is very time consuming to get it straight. Treat it as a hobby, you'll understand things about the Mac (or computers in general) other people don't, such as DSDT patches, how drivers are loaded, and Mac power management, etc.
If you use Clover, and get all the patches right, you can almost get an update-proof setup (Except when you go from say 10.11 to 10.12). But even at the worst case, usually people on the Internet will figure out fast enough for you to apply the new patches. Minor updates are really really easy and fine. I always click Update without batting an eye.
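For reference, patches like these live in Clover's config.plist. A minimal illustrative fragment below uses key names from Clover's documented schema; the values are examples only, not a recommended configuration for any particular machine:

```xml
<!-- Fragment of a Clover config.plist (not a complete file).
     Key names follow Clover's schema; values are illustrative only. -->
<key>KernelAndKextPatches</key>
<dict>
    <!-- Patch AppleRTC so CMOS isn't reset on reboot -->
    <key>AppleRTC</key>
    <true/>
    <!-- Patch kernel power management for otherwise unsupported CPUs -->
    <key>KernelPm</key>
    <true/>
</dict>
<key>Boot</key>
<dict>
    <!-- Verbose boot makes failed updates much easier to diagnose -->
    <key>Arguments</key>
    <string>-v</string>
</dict>
```

Keeping these patches in the bootloader, rather than modifying the OS itself, is what makes the setup mostly survive minor updates.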
It's a tinkerer's hobby. If you like doing research and are fine with spending time figuring out stuff on the Internet, I'd say go for it and try it out! The process is fun and the result is very rewarding.
I've got a (still pretty new) high-end MacBook Pro sitting at the end of my desk but -- after putting together a new, extremely over-built workstation a few months ago -- I haven't even turned it on since I don't know when. I've got KVM/qemu, VMware Workstation, and VirtualBox all installed on my workstation, though, and it might be interesting to try to get OS X running under one of them.
My ultimate answer was just to give up on Apple on the desktop and stay in the Windows/Linux worlds. I'm just not going to invest any further time on a system that is owned by a company that clearly doesn't care about it, and is actively hostile to attempts (like virtualization) to use the system in a larger context.
vb.customize ["modifyvm", :id, "--usbehci", "off"]
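For context, that line belongs in the VirtualBox provider block of a Vagrantfile. A minimal sketch, where "macos-box" is a placeholder name and not a real published box:

```ruby
# Minimal Vagrantfile sketch. "macos-box" is a placeholder, not a real box.
Vagrant.configure("2") do |config|
  config.vm.box = "macos-box"

  config.vm.provider "virtualbox" do |vb|
    # Disable USB 2.0 (EHCI) emulation; with it enabled, VirtualBox
    # requires the Extension Pack to be installed on the host.
    vb.customize ["modifyvm", :id, "--usbehci", "off"]
    vb.memory = 4096
    vb.cpus = 2
  end
end
```

The `vb.customize` call passes arguments straight through to `VBoxManage modifyvm`, so anything that utility supports can be set the same way.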
I've never had any issues, I just shop for compatible chipsets. I've had tons of issues with Linux on the desktop in comparison to Hackintosh. I've never understood how people say it's time-consuming to do it!
Everything else (games, calculations, social networks, model rendering) is better served by a web or mobile app backed by a server running linux.
Night and day better than a standard SSD on homebrew Macs.
Right now I have 2 machines on my desk: one with a SATA SSD (Samsung 850 Pro) and another with a PCIe SSD (Samsung 950 Pro). I see a big difference in benchmarks, but don't feel any difference in my day-to-day work. Boot time is shorter on the PCIe one, but I reboot these machines maybe once every couple of months.
I remember during college wanting to do some iOS development but not being able to afford a Mac. I spent at least 20 hours tweaking kext settings and trying different distros. I finally got it to boot, but it crashed whenever I tried to run the emulator.
I haven't touched Hackintosh stuff for several years, but the grief and time wasted make it not worth it. It's a shame Apple is limiting their development tools to macOS. They could learn a thing or two from Microsoft.
Does anyone know why some of the updates can brick the machine? Also how often does this happen? Or like what percentage of updates break things.
I remember leaving the stock updates on and rarely ran into issues. I think I had video screw up once or twice, which just involved getting some of the latest kexts off the forums.
Just that you restart and graphics are weird or audio is having issues, so you have to figure out how to resolve those issues.
By waiting a few days and seeing what issues you may encounter in advance, it's easier to avoid issues.
(That being said, issues from updates has never actually happened to me)
Even with the LEDs, there is a ~$5k budget difference between the hackintosh and the trash can. That much will buy you a pretty wide range of aesthetic choices. For instance, you could probably have someone cnc something out of aluminum, or commission a hardwood case that matches the period of your office furniture.
Personally, I think the build would have been more memorable if it incorporated some strobes, a fog machine and a plasma bulb. (Bonus points for replacing the plasma bulb with a Tesla coil without sacrificing stability/safety, and without exceeding the trash can price.)
Lian Li PC-V2000.
I don't know if the window manager is easily usable on a different Linux distribution though.
What does macOS have in its window manager that Linux lacks? (Other than the "no, you can't maximise easily" feature...)
I care about privacy, like most people, but I am able to control mine far better than the typical, non-technical computer user. I am reaching a point where these operating systems are becoming untenable for day-to-day computing use. They lure you in with "everything just working", including the privacy-invasive "cloud" features. Increasingly these things are using ML to mine your behavior. I can run Linux. I don't even mind paying for privacy with a little bit of my time and convenience. I don't want to look back 25 years from now and say "I really wish I had not given up this bit of privacy". Alternately, I am hoping that when I look back I can say, "It was worth the relatively small time sacrifice to avoid these things."
(Not confusing for technically savvy folks, but still, pretty hard).
Windows is a little more frustrating, and I don't think Microsoft has the same level of privacy commitment as Apple does, but I also don't think it's that far behind.
Google is a special case, they are an advertising company and have virtually no revenue streams outside of that for Android. I'm sure they'd like to protect your privacy too if they could still serve super targeted ads to you, but they have no way of doing that.
iOS is by default the most secure phone OS, etc. (Secure Enclave is nice and you know it is available on all iOS devices newer than X) But it is a walled garden that is too tightly controlled in terms of what is allowed.
Still, if you enable iMessage it gets murkier. Siri, cloud backup, iCloud Keychain (which is good, but not for me). Lots of small pushes to remote services.
A 'thin iMac' - whose thin-ness was absolutely pointless.
The MacBook (2 lbs) and MacBook Pro (3 lbs) were designed to weigh arbitrary weights instead of thinking about features: http://www.apple.com/mac/compare/results/?product1=macbook&p...
And the Mac Pro was possibly the worst shape for upgrades.
Jobs inspired people to come up with great machines, but he also had a pragmatism which seems to have been lost at Apple.
For you maybe. Many of us (millions, judging from the sales numbers) appreciate the thin-ness, even if it's for a desktop machine.
I had to drag my old non-thin iMac several times to different locations, and it was no joy. Nor do I need a behemoth on my desktop.
Or perhaps they bought them because they had no other choice when choosing a reasonably fast desktop machine compatible with their software?
Seriously, the argument "oh, this feature must be popular because the whole machine is popular" really doesn't hold water in complex machines like computers, phones or cars.
iMacs sold better even when the "cheese graters" were available, so yes.
Apple never had a usable entry level desktop computer tower. Something with high performance on consumer class CPUs, rather than Xeon and ECC ram. The Mac Mini was always crippled to keep it from competing with the iMac and cheese graters among people who want something better performing.
The iMac is popular because it's the best performance to price ratio. Not because the form factor is any good. For the longest time the entry level MBP (and unibody Macbook before then) was also Apple's best selling laptop and they only recently cut it out from their line-up and replaced it with the Air as their entry level offering. The Air will also exist for as long as they keep selling the current Macbook at those prices because most people are not willing to spend 1449 euros on a machine that barely performs better.
The 27" Core 2 Duo 2008-9 era iMacs, one of which I had, weighed 13.8 kg!
The new 27" model is 9.5 kg. That's over 30% less weight...
Apparently they are. Just read this effete technologist's comment:
> I’m personally just hoping that I’m ahead of the curve in my strict requirement for “small and silent”. (...) I want my office to be quiet. The loudest thing in the room – by far – should be the occasional purring of the cat. And when I travel, I want to travel light. A notebook that weighs more than a kilo is simply not a good thing (yeah, I’m using the smaller 11″ macbook air, and I think weight could still be improved on, but at least it’s very close to the magical 1kg limit). -- Linus Torvalds
Any reasonably well built desktop PC does that. It's a matter of applying a modest amount of care when selecting and assembling components.
When it comes to off-the-shelf towers and laptops, you'd be surprised.
And if you mean it's easy if one personally "selects and assembles components", you'd be surprised again at how atypical that is for the average PC buyer.
E.g. you'll never find a high-end CPU cooler in a factory-assembled PC, simply because they are too difficult and slow to mount. Compared to a simple push-pin or lever-mounted cooler, these can have a dozen parts or more and require funneling screws through cutouts in the cooler itself, etc. -- you just won't see that on an assembly line.
There are also other issues, more specific to PCs, e.g. the standard case form factor does a poor job supplying graphics cards with fresh air. Other case form factors solve this (e.g. Silverstone has some; the "trash can" Mac Pro follows a similar concept), but tend to incur other compromises (and price tags).
Design isn't pointless. That in itself has value. I'm thinking the poor upgradeability is actually a feature (for the seller - not the buyer) and not arbitrary either.