Apple’s 2019 Mac Pro will be shaped by workflows (techcrunch.com)
198 points by jbegley 11 months ago | 345 comments

What are Pros supposed to do in the meantime? Keep buying PCs, I guess. Why they haven't kept selling the cheese grater is beyond me.

They are spending a fortune to understand what pros need, while we have been screaming it for the better part of a decade: lots of user-accessible RAM, storage, PCI & GPU. How much would it cost R&D to update the Mac Pro? Teenagers can do it for peanuts; I'm sure the largest company in the freaking world could manage. If they were truly sorry or marginally “cared”, that's what they would have done years ago.

I'm fine with them playing the “let's reinvent the Pro computer” game as a side project, they have this massive amount of money burning a hole in their pockets after all, but some of us need to work.

> What are Pros supposed to do in the mean time? Keep buying PCs I guess.

> How much would it cost R&D to update the Mac Pro?

I hope/think that Apple is thinking that bringing back the cheese grater, or in other words a PC running macOS, is not going to be enough to bring back the pros. They need to come back with something unique.

The TrashCan failed but you can see where Apple was going. A small, silent, powerful machine infinitely customisable on the fly via TB.

So of course it failed, and this attempt has a good chance of failing too. I'm not quite sure that Apple has in its DNA a deep understanding of the type of pro that is not content with either the iMac Pro or the MBP, the way it understands the consumer sphere. For a long period of time the intersection between the two worlds was large, but they are now diverging. Perfectly happy to be wrong though.

> I hope/think that Apple is thinking that bringing back the cheese grater, or in other words a PC running macOS, is not going to be enough to bring back the pros.

Before they can think of "bringing back the pro", they need to stop the bleeding: pros who need to upgrade this year have no choice but to go PC. An upgraded cheese-grater would have been a good stop-gap while they go back to the drawing board on their 2019 product.


Right now I program Rails on an old 2012 MBPr. I want to get a faster machine. Possibilities:

1) Spend $5k on an iMac Pro. That's an absurd amount of money.

2) Get a new MBP limited by USB-C, the new keyboard, and the abominable Touch Bar. That's a lot of money for a downgrade.

3) Get a decently specced iMac or Mac Mini. Most viable option.

4) Ditch Rails, start learning .NET, and go 100% Windows so I can build my own PC. I save money. I lose a lot of time.

5) Build PC, install Linux. Viable but might be a configuration nightmare. I just want to work.

6) Run Linux under Windows host as a VM. Second most viable option.

I'm pretty frustrated. Suggestions?

I've had pretty good luck with Linux "just working" these days. That said, this is coming from the kind of guy who runs Arch Linux, so take that as you will.

System76 sells machines with Linux pre-installed. I've had good luck with these (replacing the default Ubuntu with Arch, of course).


7) continue developing Rails using Windows 10 and the Windows Subsystem for Linux

Overall my experience with WSL has been great.

7) buy an XPS or Precision. Install Ubuntu or another Linux on it, or buy one with it preinstalled.

I did this, bought an XPS and a Precision. Both refurbed in sales/with coupons, great specs. Cost me ~$500 less than retail. Both like new.

Sure, I'd prefer a Mac, but it's over double the cost for similar specs. I could look past the Touch Bar and ports, but not the keyboard.

8) buy a librem. I don't know how good they are, though

> I'm pretty frustrated. Suggestions?

I suggest an upgraded 2009 Mac Pro.

The upgrades would include: firmware flash to 5,1, a 3.46 GHz Intel hexacore processor, 32 GB RAM, an SSD on a PCIe card, and a 5770 GPU. This should cost $1K USD or less; dual CPU (12-core) for a bit more. Mine Geekbenches at 3800 (single-core) and 17,800 (multi-core).

I'm running one of these and it has been bulletproof and 100% hardware/software compatible.

That actually sounds fantastic. Can it drive a 4k display?

On 5, it's my home setup. SSH'ing to a local Linux box is fast and I'd already be using a terminal. I use it from both Linux and Macs and the only thing that bothers me a little is syncing the working directories. Most server grade boxes from Dell or Lenovo work with Linux perfectly and the tower ones are great workstations.

The iMac Pro is overspecced for Rails development. You can get a very decent Mac for much less. If you keep your work environment synced via iCloud or Dropbox, the problem of syncing I mentioned doesn't exist (I prefer the minor discomfort to syncing through a third party).

As for 6, get a Linux PC and run a Windows VM when you need it. It's better to have the decent OS host the crappy one than the other way around.

What's wrong with the iMac for development? $2,500 for a quad-core 3.44GHz Core i5 with 32GB of RAM would make a nice development machine.

If you need more power, for $2,900 you can get a 4.2GHz quad-core Core i7 with the same specs.

I'm not a Mac fanboy by any stretch of the imagination - I haven't bought a Mac in 12 years but I want to get away from Windows and don't want to futz with Linux. Getting an iMac would allow me to develop for iOS, Android, Windows, and give me a real Unix environment.

You can do .Net Core with a Mac. I am a Windows developer now, and will probably stick with .Net, but especially for hosted solutions, using Linux is a lot cheaper for deployment.

I agree that the iMac is a good choice, although it's a bit pricey for a stationary machine that shouldn't be constrained by power, size, or cooling. I am OK with paying a 100% markup for a well-designed laptop, however. Even with these objections I think that I will end up getting an iMac. Reluctantly.

You can run ubuntu natively without a VM on windows 10. It works really well.

Also configuring Linux is not the beast that it was years ago. Even on laptops it is fairly straightforward.

I feel the pain here too. It feels like macbooks have stagnated the last 5 years. I am struggling to do java / microservices development on my MBP. I just don't have the RAM and cores to run a lot of server processes.

I wouldn't touch the current slate of MacBooks. I prefer the Dell 2-in-1s - either the business line or bought from the Microsoft store to avoid the bloatware.

The only Macs worth buying now are the iMacs.

7) Build a Hackintosh

While I have a Hackintosh, I'm not sure I'd want to get one if it was my primary computer. Granted, I haven't had any problems with mine, but all it takes is an update from Apple to break or brick it. Native Macs don't have that issue.

It's also a legal risk to do any kind of billable work on a Hackintosh.

It might be unethical, but the risk is zero. Apple isn't going to sue you. Running unlicensed Windows is a significantly bigger risk.

The risk is pretty real, see my other thread.

If you don't have a legal office declared for business activities to the regulators, then yeah it might be zero.

If you have a real Mac/Macbook and a Hackintosh it would probably be hard to prove which was used for the work.

Except if you happen to live in a country where government authorities are entitled to search your office for illegal software.

Then good luck in court.

7) remote into a box in Azure or AWS when you need more power, still would be Lin or Win tho’

> They need to come back with something that has something unique.

That would be MacOS itself. That's all they need to bring a huge number of Pros who bailed on the entire Apple ecosystem back into the fold.

Well yes, but macOS was also much less buggy and much more stable a few years ago than it is now. I miss the days of Snow Leopard, which was rock solid.

But until they bring back that stability, macOS won't be a sufficient argument to bring back pros, I think.

> They need to come back with something that has something unique.


The margins are too thin on a boring desktop PC. If they spec bump the cheese grater, that will cannibalize the ground-breaking new Mac Pro. Same reason why they won't put it out this year: they don't want to cut into iMac Pro sales.

A crappy PC with a best-in-class 5K monitor can easily get more expensive than an iMac Pro.

They could at least spec bump the trash can. Thermal design issues aside, performance per watt has only gotten better for the design they already had, right?

They said they can't spec bump it. https://daringfireball.net/2017/04/the_mac_pro_lives

> I think we designed ourselves into a bit of a thermal corner, if you will. We designed a system with the kind of GPUs that at the time we thought we needed, and that we thought we could well serve with a two GPU architecture. That that was the thermal limit we needed, or the thermal capacity we needed. But workloads didn’t materialize to fit that as broadly as we hoped.

Seems like a poor excuse. Just add in modern CPUs with the same TDP, and for the GPUs two Quadro 5000s (I think even the 6000 would work) have lower TDP than the FirePro D700s currently used.

To be more precise, according to AnandTech, their design wasn't able to cool a 250W GPU. This significantly limited the GPU selection they had.

The biggest problem is: how did they not know this when they were designing it? And how did they manage to find out in 2014 and not speak a word until 2017? And now their answer is 2019.

Not to mention, as the OP suggests, they could have fitted a modern Intel CPU and a 150W Vega GPU in it, and sold it for a lower price.

Pros are supposed to buy an iMac Pro in the meantime, and then buy the modular Mac Pro Pro with all the pro trimmings - iPads for touch control, and such - when it all arrives next year.

It's going to be interesting to see if they stick with Intel for the Pro. It's entirely possible they won't, especially given Intel's security issues.

What that means for performance remains to be seen.

> It's going to be interesting to see if they stick with Intel for the Pro.

Wouldn't the apps that the pro users depend on dictate that to a large extent? If the truly "pro" apps don't get rebuilt for ARM, then ditching Intel so early into the Mac Pro's return would be like Apple never really released it at all.

Rosetta 2: Stone Harder

No Intel != no x86. There's this one company that also makes x86 CPUs.

I think they meant go AMD Ryzen 2 or, more likely, Epyc (or possibly Rome).

ARM is not going to cut it in the dual-Xeon workstation role.

Aren't Apple developing their own CPUs?

They are supposedly not directly x86 compatible and thus will need to emulate it at a performance penalty. Not something you'd want in a "pro" machine.

I don't see that as the major problem. It seems like they could rely on third parties to have them ported in a matter of a few months.

I'm more doubtful of their ability to produce something sufficiently competitive.

Even if they were fast enough you would still want to launch those chips on less performance critical machines first and give the pro apps plenty of time to update before migrating the pro hardware.

Usually consumer software moves faster to adopt new tech stacks anyway. OS X supported x86-64 in v10.5 in 2007, but Photoshop didn’t go 64-bit on Mac until 2010 with CS5.

My best guess is that it’s because big pro applications have a mess of dependencies and some inner guts in the high performance code where it drops to assembly. Building that for alternate architectures isn’t as simple as checking a box in Xcode.

Critical software or drivers may not be supported any longer. Source may not be available.

If going from almost a standing start to fabbing chips that compete on par with Intel were that easy, someone else (Elon Musk or China) would have done it already.

I am sure major shareholders of Apple will not want to bankroll a 7nm fab and the massive costs of producing a competitor to AMD and Intel.

I completely agree with that. Actually building pro-quality chips that compete with top end Intel chips is really hard and is likely more of a bottleneck than 3rd party software.

I really still have a lot of doubts about this idea that they will even try it. It doesn't seem that there is really anything in it for them in doing this.

Mate, AMD themselves are already fabless. Apple uses suppliers. Apple makes their mobile SoCs already. This is feasible.

Steve would have unreasonably said get it shipped by June 2018.

And it would have shipped.

No, they're spending a fortune redefining what "Pro" means in order to make everyone feel like they're a pro and need pro hardware.

Reading this it had me reminiscing about Sun back in its heyday as a workstation vendor. Sun used to do this stuff, they would bring in a partner who had some critical application and the Sun engineering team and the partner's engineering team, and some "power users" of the tool would all work through what it would take to make the tool work better on a Sun Workstation than any other workstation. Sun engineering got a collection of bugs, features, and investigations that would come out of those exercises. When my wife was at Xerox they were doing similar sorts of studies for document preparation and presentation.

You could do that when the workstation cost $25,000, (in 1990 dollars!) I'm not sure how viable it is when the workstation is less than $5,000.

That said, I really miss using software products where that level of thought has gone into its design. The three 'creative' apps I use (other than writing code) are drawing, schematic capture/board layout (EDA), and writing. All systems that benefit from people investing in how the flow of these things work, and all of which have degraded over time.

> You could do that when the workstation cost $25,000, (in 1990 dollars!) I'm not sure how viable it is when the workstation is less than $5,000.

I think it worked because the customer spend is of comparable magnitude. Back in the mid 90s you might have a couple of dozen game devs working on $2000 PCs with a few $45K Onyx machines used for back-end rendering or part-time use by a few artists = ~$200K. Or (in my case) you only had a couple of AI developers but each had a pair of $50K 3650s = ~$200K. Or your prop traders had a $25K Sparcstation on their desk and one at home, but the rank and file just had a Bloomberg terminal.

Now you have much larger teams with more uniform machines (+ some cloud resource). The total institutional spend is probably higher in constant dollars.

Speaking of which, it will be interesting to see if Apple is able to push some of their APIs into the cloud so you can develop on a Mac Pro and dynamically push the heavy lifting into an Apple rendering cloud. This article just talked about scaling iOS <-> macOS <-> macOS+iOS+eGPU (nice!) but left out "<-> cloud". That's where editing on iOS could really shine: chop up / assemble your downsampled rushes on your tablet and then stream them to the Apple TV at your boss's office. Remember, Peter Jackson used to bring his rushes to LA every week on an iPod in his pocket.

I think there might be some people interested in a $25K machine, but only if it really improves their workflow; I don't think adding more money/processing power does much for most people. Maybe for 3D rendering, but they have specialized hardware for that, which gives them more raw bang for their buck than Apple could.

I'm wondering if it'd be viable to have a 'compute unit' shared by multiple users. I/O would probably be a bottleneck, so maybe have it link up Macs via USB-C? It might help with e.g. compile times (which, last time I did iOS development two years ago, were kinda slow: >1 minute for our app, CPU bound).

But that's probably too specialized a thing. Apple wants to sell units by the million, not the hundreds (if that). Which is why they discontinued the mac pro and the 17 inch MBP.

Something like the DGX-1/DGX-2?


The world is bigger and wealthier today. You sell significantly more workstations which translates into way higher profit.

Not to mention Apple is overflowing with profit. Their failures around the pro market in the last decade have not been about resource constraints but poor vision and execution.

They could have put out a cheese-grater with fresh Intel chips every year like clockwork, but they were too cool for that. And the failure is all the more striking when they continue to "crush it" in other categories.

The only reasons I spent that kind of money on workstations back then were that I wanted my desktop to be Unix and I wanted good 3D graphics performance. The only way to get those in one compact box was to make a deal with SGI.

These days I get astoundingly better performance than whatever that low volume hardware did back then out of a $700 desktop with a $800 graphics card and a $500 monitor.

$5000 is probably still too high, unless you see the Apple product line as your only option.

I remember those old Sun Blade processors. They were absolutely beastly, provided you had the know-how to extract parallel performance from them :D

What drawing apps do you use, on what devices, and what is worst about those apps to you?

On my PC I use Corel Draw; on the iPad, Autodesk Sketchbook, Touch Draw, or the Notes application; on the Surface it's usually Sketchbook, but I've also got Sketchable; and on Linux, Inkscape.

The worst thing is the linkage between knowing what you want to do and assembling the right tools to do it in the app you're looking at. Most of my drawing is technical in nature (systems, small components, software architectures, Etc.)

The Mac Pro cheese grater going away was a bad move and a missed opportunity by Apple. Today they have already lost the power pro users, and they had them for a while.

Most Mac Pro users have moved back to PC, or to iMacs/minis just for iOS/macOS-related dev, with most work being done on PCs.

Apple had a window where developers needed a heavy pro Mac for iOS and *nix development like Python/Ruby/etc., even Unity dev back when it was Mac-only (2007-2010 ish). It was also great to have a Mac after the 2006 Intel processor move, as it was the sexiest *nix and great for developers from 2006 on, but those days are over.

Somewhere around 2013, they just moved on from the cheese grater and the pro market that included developers and content creators. Now they want to regain it? There was so much momentum squandered here. Hearing Tim Cook repeat "post-PC" and Apple's messaging that desktops were like trucks that only developers need made it clear they had moved on, so the pro users and developers did too. Apple even watered down their developer laptops and MacBook Pros; 17" screens were relegated to being 'lapzillas' and Apple went mainstream-only. Content creators and programmers were not something they focused on anymore after the iPhone took over, even though those influencers were always the focus before. I thought they would use the iPhone and the iOS/macOS platforms to get more people onto their desktops as well; instead they went the other direction.

We used to be Mac Pro heavy; now we just have iMacs/minis for the last mile of iOS/macOS export and testing, with beefy, performance-heavy PCs for most of the day-to-day work. Additionally, taking your jet-engine/trashcan Mac Pro or iMac to the mall for repairs instead of just popping in a new video card or drive also sucks.

Developers can get two performance PCs for the price of one Mac Pro, and usually more power out of both, since Mac Pros from 2011 on were 1-2 years behind. There is no getting back pro users such as devs/game devs who switched over and have now gone back to Windows/PC.

>caused them to move on, so did the pro users and developers

Wasn't there a statistic from GitHub recently that 75% of the PRs created in 2017 were done from Macs? They are bigger in the market for devs than ever before; the thing is, the majority of this peer group is totally fine with a MBP. I see this whole new Mac Pro story more as a marketing campaign: the driving factor behind this is not demand from the market, it's only their long-term reputation.

Yeah, Apple still owns the laptop market, but even their latest MacBook Pro wasn't really pro, more mainstream. The Mac is still a better development environment because it is *nix based, for most things webdev related and iOS appdev mobile related.

As far as desktops go, though, even creative/advertising agencies are moving to PC for 3D, photography, graphic design, audio production, video production and more. These were always the target market of Apple, but they lost many.

Apple's iPhone lost ground to Android, and laptops may also do so, starting with development. I see many devs who went from Mac to PC desktops also go from MacBook Pros to Surface Pros, but that is just starting. Eventually it could also start switching people from iPhone to Android/Samsung/Google hardware, as the hardware has caught up.

Apple being a hardware company, it is strange they did not take more advantage of the desktop inroads they were making in 2006+ with Intel and iOS development.

I see a lot of developers embarking on the Linux on Windows approach. It makes sense if one's goal is to learn to manage Windows boxes, but it misses a lot on the Linux side of things a good developer needs to know (unless, of course, they're developing for Windows, in which case the whole Linux side is unnecessary).

Apple can't regain the 3D, design and production market unless they can get a solution that's competitive in price with PCs. I'd suggest keeping Macs as the most delightful to use desktop and transparently offloading the heavy lifting to racks of commodity PCs running headless Apple software (which may even run on top of Darwin if that makes the porting easier).

OTOH, if Apple can develop a significant edge in performance like it had with the early PowerPCs, then Apple becomes the best bang for the buck again. Since they are designing CPUs, they can be pretty creative here.

> As far as desktops go, though, even creative/advertising agencies are moving to PC for photography, graphic design, audio production, video production and more. These were always the target market of Apple, but they lost many.

I don't think this is really true, all the creative agencies I'm aware of are still firmly mac shops. Do you have a citation or anecdotes to back up your assertion?

Most of them are still pretty Mac heavy, I said 'moving to PC'. The agencies I work with and worked at have begun to purchase more and more PCs over Macs.

Adobe Creative Cloud apps tend to work better on PC now so this is a big reason.

The areas I mentioned (video, photography, audio, gamedev, 3d etc) are moving to hefty PC machines that need 64GB/128GB/256GB+ of ram, latest GPUs, many drives and many cores (16-40+)[1][2][3][4][5][6][7][8], no iMac can do that and Mac Pros have left the building. Many of the PCs are cheaper and more powerful as well, custom built ones especially. Many web developers or graphic designers still on iMacs, but the other areas on PC.

Basically, anyone who needed a Mac Pro, where an iMac or MacBook Pro was not enough, has probably switched (back) to PC. You'll see most people did this around 2013-2014, when the Mac Pro was un-cheese-grated and put in a jet-engine/trashcan: looks cool, but too expensive, and not for hands-on pros or easily expanded.

One video/photography/3d guy I know is running 40+ core/thread machines 256GB RAM, hard for Mac Pros to compete with that[9][10].

[1] https://www.creativebloq.com/advice/the-digital-artists-guid...

[2] https://www.slrlounge.com/apple-is-dead-to-me-trey-ratcliff-...

[3] https://www.stuckincustoms.com/2017/02/10/switching-from-mac...

[4] https://petapixel.com/2016/12/03/im-leaving-apple-microsoft-...

[5] https://petapixel.com/2017/07/18/5-reasons-pick-pc-macbook-2...

[6] http://philipbloom.net/blog/makingtheswitch/

[7] https://uxdesign.cc/designers-workflow-on-windows-57393856ae...

[8] https://medium.com/charged-tech/why-i-left-mac-for-windows-a...

[9] https://www.techspot.com/review/1218-affordable-40-thread-xe...

[10] http://www.titancomputers.com/Titan-X399-Dual-Intel-Xeon-E5-...

Thank you, that makes sense. I haven't seen any Mac Pros or iMac Pros yet, so I guess I haven't been exposed to that end of the segment yet.

"fine", yes, but it could be a lot better still. I'd sacrifice the small size and touch bar and such for a faster cpu, or a docking station (like the plan they had for a new display, which included its own discrete graphics card to help power the 5K display - actually there was an article recently saying Apple now supports external gpu's, so that might still be a thing), or a better keyboard, bigger screen, etc.

(and personally, yes I'm stuck with a laptop due to work atm. Wouldn't mind having an office or fixed work place with an imac pro though)

>Wasn't there a statistic from github recently, that 75% of the PRs created in 2017, were done from Macs?

No, there wasn't. Github posts lots of stats but nothing about OS shares.

There was. Apple shared it in the WWDC keynote when talking about changes to Xcode.

There's a full transcript of the last WWDC keynote on Apple's website and it has zero mention of Github.


Yeah but most of the people using OSX aren't really thrilled about it. I'm typing this on a 2012 MBP- there's not much that's pushing me to upgrade to their latest machines.

How many game devs actually used Mac Pros instead of Windows machines?

Not a ton, but for iOS games you have to export at least on an Apple machine. At our studio most people had PCs but people working on iOS/Unity games had Mac Pros and iMacs solely or in addition to PCs.

There was also a window where Unity was only available on Mac from 2007-2010 where lots of Macs were purchased by game devs.

Also, for making Linux/Apple/web games there was a little blip for a while where Macs were making inroads, especially after the move to Intel processors. OS X is arguably the sexiest *nix-compatible machine and is quite fun/efficient for development.

Overall game devs have always used PC as the main for desktop, but with mobile and iOS taking over handheld gaming in 2007, game studios all had to buy them.

Now, most end up just getting a few iMacs or Mac Minis to export on and debug otherwise much of the game dev work is done on PCs.

Apple truly lost game devs, but also other content creators: video editing/production, graphic designers, photographers, audio production and more. These professionals were always the focus of Apple; it's amazing they just let those people wander off, as they are also influencers. I even see lots of devs going from MacBook Pros to Surface Pros now because their main machines are back to PC; that could eventually start causing people to switch from iPhone to Android/Samsung/Google as well.

But how in the world are you going to manage without ecc memory? You might have a bit get flipped by a cosmic ray once every 5 years of 24x7 computing.

Switched to a Threadripper 1950 with ECC memory and an NVMe drive. Fastest computer I could build at the moment; cost less than half of a worse Mac, even with a 34-inch curved widescreen. Plus I got to go back to awesomewm. Finally I don't hate my dev environment. Feels good to get back to my pre-2013 roots.

According to this IBM study [2] / Wikipedia article [0], the probability is much higher; per the study, with 128GiB of memory it is one error every 1h 25min.

There was an anecdote [1] about a Power Mac cluster at a university which basically couldn't get past boot because the memory errors were so frequent.

[0] https://en.wikipedia.org/wiki/Cosmic_ray#Effect_on_electroni...

[1] https://spectrum.ieee.org/computing/hardware/how-to-kill-a-s...

[2] http://www.pld.ttu.ee/IAF0030/curtis.pdf
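The scaling implied by that figure is easy to sanity-check. A minimal sketch (the 1h 25min number is the one quoted from the study above; the linear-scaling assumption is mine):

```python
# Back-of-envelope: soft-error rate scales roughly linearly with the
# amount of RAM installed. Starting from the figure quoted above
# (one bit flip every ~1 h 25 min at 128 GiB), derive the expected
# interval for smaller machines.
GIB_128_INTERVAL_H = 1 + 25 / 60  # ~1.42 h per error at 128 GiB

def mean_time_between_flips(gib: float) -> float:
    """Expected hours between single-bit errors for `gib` GiB of non-ECC RAM."""
    return GIB_128_INTERVAL_H * 128 / gib

for gib in (8, 32, 128):
    print(f"{gib:>3} GiB: one flip every ~{mean_time_between_flips(gib):.1f} h")
```

By this estimate a typical 32 GB workstation would see a flip every few hours, not every few years, which is the gap the thread is arguing about.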

An article on the frequency of errors: http://lambda-diode.com/opinion/ecc-memory-2

There are plenty of papers that put the error frequency orders of magnitude lower than that.

But even if true, so what? What are you doing on a workstation where a bit flip every 3 days is going to cause a problem? Anything critical is going to have a software check. I've just never seen any legitimate argument for needing it in a workstation.

> What are you doing on a workstation where a bit flipped every 3 days is going to cause a problem?

Depends on which bit flips. Given some of the file formats out there, a bit flip just before the data is written to disk might be fatal.

> Anything critical is going to have a software check.

I would love to see someone go through github and analyze if that is true.

I just cannot imagine why it would be acceptable to have bit errors when a preventative measure is available. If the industry would stop looking at it as a value add, the cost would come down and we all would benefit from a more stable platform.

>we all would benefit from a more stable platform.

What instability are you talking about? Has anyone ever experienced any instability on a workstation that could be attributed to a cosmic-ray bit flip?

I wouldn't trade a 10% memory performance drag for ECC even if the memory were cheaper than standard.

> What instability are you talking about?

By definition, if the bits can flip in a way that wasn't intended, then it is not stable. I'll take the 10% hit, because I'm pretty sure getting the wrong answer 10% faster is not worth it to me.

Having worked on a system where we had a CRC on almost all data structures, let me tell you that bit flips happen more often than you want to know.
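The pattern described here is simple to sketch: store a checksum alongside the payload and verify on every read, so a silent flip surfaces as a detectable error instead of bad data. A hypothetical illustration (the `CheckedBlob` type is mine, not from the system described above):

```python
# Minimal sketch of "CRC on every struct": wrap data with a CRC-32,
# recompute on read, and fail loudly on mismatch.
import zlib
from dataclasses import dataclass

@dataclass
class CheckedBlob:
    payload: bytes
    crc: int

    @classmethod
    def wrap(cls, payload: bytes) -> "CheckedBlob":
        return cls(payload, zlib.crc32(payload))

    def read(self) -> bytes:
        if zlib.crc32(self.payload) != self.crc:
            raise IOError("checksum mismatch: payload corrupted in memory")
        return self.payload

blob = CheckedBlob.wrap(b"important state")
assert blob.read() == b"important state"

# Simulate a single-bit flip in the payload, as a cosmic ray might cause:
corrupted = bytearray(blob.payload)
corrupted[0] ^= 0x01
blob.payload = bytes(corrupted)
try:
    blob.read()
except IOError as e:
    print("caught:", e)
```

The point of the comment stands either way: without a check like this, the same flip passes through undetected.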

Not everyone crcs 100% of their structs.

If they don't, data is being slowly corrupted.

How do you know they were bit flips? And if they were, how do you know they were soft errors that could have been corrected with ecc and not hard errors that would have happened either way?

Because we had insanely good crash reporting, and we root-caused every crash report that came in with more than half a dozen hits in the wild.

Hardware was at fault. Turns out suppliers stretch the truth a bit here and there about their chips: what is supposed to be an identical part number isn't always identical, parts from different manufacturers that are supposed to be interchangeable aren't, and technical manuals aren't always the best.

We had (what we thought was) rigorous memory testing in place at the factory, but under certain extreme conditions there was a 100% reproducible flip of a few bits in RAM. It was almost always the same bits, thankfully, which made it possible to track down!

Intel Xeon E3 or AMD

I switched to Apple product in 2002, because at the time they were absolutely better than Windows 2000 for my needs (mostly Protools, and eventually Logic 7). Windows at the time was a mess and poorly supported most pro hardware and vendors (specifically Avid) were slow to support releases.

It really felt like the best platform. I had a nice G4 tower, and eventually used a G5 and then Mac Pro tower. Things generally just worked, and you could expand things as needed.

Roll around to today, and it's a mess. Apple canceled a lot of their pro products. Aperture? Dead. Final Cut got nerfed pretty hard. Logic is still reasonably well maintained, but Ableton Live took over a lot for me.

The hardware options aren't that great right now for my needs. Last week, I sold my 2011 Macbook Pro to a friend and switched all my audio stuff to my old Windows gaming system. Picked up a Firewire PCI-E card for $20 and I was in business. It's fast, has a ton of ram and is easily expandable. Windows 10 drives me nuts, but eh...

I've still got three MacBook Pros for software development, and they are pretty great for that (minus the new keyboard, and the fact that TensorFlow has dropped GPU support on OS X). I don't think I'll be switching laptops anytime soon. But ugh, audio work on them just feels like an overpriced joke at this point.

I really never thought I'd be saying it, but for photography and audio stuff, Windows seems to be the place right now!

As someone who uses a MBP as a daily driver laptop: Windows has been a better platform for a lot of the photography software for over a decade. Photoshop and Lightroom just work a ton better on Windows and have since right before the CS days.

But the native terminal, etc, has won me over as a *Nix person for a long time. I don't like Linux desktop interfaces at all, and the new Windows linux/bash stuff hasn't been compelling or integrated enough for me to want to switch (ConEmu and all of the terminal stuff being garbage is a big issue). I'll be sticking with a MBP and OSX for my laptop and work related stuff, but I don't understand why anyone would still be using OSX for a photography workflow. It's just so much more polished on Windows.

> Lightroom just work[s] a ton better on Windows

That’s not my experience, at least with Classic.

Mine is a Mac house. My parents, though, are long-time Windows users; I only managed to get my parents to run a single iMac, which they gave away as soon as they could financially justify it.

I tell you that because that is where my cross-platform Lightroom experience comes from: almost all Mac, with occasional mentoring/support sessions under Windows at the parents’ house. With Lightroom on Windows, I’ve repeatedly observed:

1. Windows’ brain-dead default-mandatory file locking prevents perfectly reasonable operations, simply because Lightroom is busy working with one or more files in the background. Advisory locking as is default on POSIX type systems allows many things that Windows refuses to allow by default. (E.g. Rename a parent folder while a file in that folder is open for writing. Who cares, the FD is still valid!)

2. Lightroom can take a long time to shut down when it gets busy, as it too-frequently does. On the Mac, the app icon remains marked “running” while Lightroom grinds away, trying to figure out how to shut down, but on Windows, the app icon disappears from the task bar almost immediately, but Lightroom.exe remains running in the background, so that it is not obvious why reopening the program fails for minutes at a time.

(And why reopen? Because relaunching Lightroom often solves slowdown problems, and has for years upon years, which is a separate rant.)

3. Some plugins simply won’t run on Windows, at least not without dragging along a bunch of compatibility junk. Anything that depends on ExifTool, for example, requires dragging over a whole Perl environment just to run the plugin. If you have multiple plugins dependent on ExifTool, as I do, each one usually comes with its own Perl environment. Compare macOS, where at worst you have the CPAN module alone, and at best, it might simply require that you install ExifTool separately, that being a reasonable user requirement on an OS like macOS.

If your comment was referring to GPU acceleration and such, Adobe’s recent focus on that isn’t helping me anyway. Almost all of my problems with Lightroom’s speed are in the Library module, not the Develop module, where GPU acceleration doesn’t help much anyway.
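Point 1 above is easy to demonstrate. On a POSIX system the following runs fine, because the open file descriptor survives the parent-folder rename; the equivalent sequence on Windows typically fails with a sharing violation. (The folder and file names here are made up for illustration.)

```python
import os
import tempfile

def rename_parent_while_open(base):
    # Open a file for writing, then rename its parent directory while the
    # file is still open. On POSIX the fd remains valid; default-mandatory
    # locking on Windows generally refuses the rename instead.
    old_dir = os.path.join(base, "catalog")
    new_dir = os.path.join(base, "catalog-renamed")
    os.mkdir(old_dir)
    f = open(os.path.join(old_dir, "photo.xmp"), "w")
    f.write("before rename\n")
    os.rename(old_dir, new_dir)  # parent moves out from under the open file
    f.write("after rename\n")    # the fd is still valid on POSIX
    f.close()
    with open(os.path.join(new_dir, "photo.xmp")) as g:
        return g.read()

with tempfile.TemporaryDirectory() as base:
    print(rename_parent_while_open(base))
```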

> they are pretty great for that (minus the new keyboard, and that Tensorflow GPU has dropped OS X support).

If you're training a model on your laptop, instead of a server, does it really matter? It doesn't seem likely that you'd want to deploy a model trained locally.

Have you ever considered a hackintosh?

I think a big part of @tibbon's complaint is the lack of pro software (from Apple).

Please excuse the venting here for a minute.

After "can't innovate anymore my ass" Apple released their architectural dead-end trashcan Mac Pro in _2013_ we have to wait until _2019_ for an update? What am I supposed to buy? I'm certainly not going to buy anything with a Touch Bar. It's 2018 and I had my job order me the 2015 Macbook Pro for my work computer so I could skip the Touch Bar and still have USB ports.

What's going on at Apple?

Don't forget the keyboard on the new Macs, which is the other big reason not to get them (besides ports and touch bar).

I really wish I could get the 2015 MacBook Pro with USB-C ports and the Touch Bar MBP's size.

As painful as it was to have to buy dongles and whatever else, having monitors + USB docks using USB-C has been pretty pleasant.

I use a 2016 and recently used my friend's 2015. Honestly, it felt amazing. I had to do a double take. The new keyboards sure do suck.

Do they really think a 10% size difference is more useful than a good feeling keyboard? What are they smoking.

I felt like you about the keyboard but after using it for a while I have to admit I love it. Sometimes I have to use my old Macbook and the new one feels definitely better.

I had to do a demo on another dev's 2016 Pro in front of a client. It was embarrassing how many typos I was making. It's such a horrible feeling keyboard.

The new iMac Pros are a good stopgap for pro users on the Apple platform.

They are nice machines with plenty of performance. With 4 Thunderbolt ports, you have some expandability.

They're powerful machines for sure. I just can't see myself buying a "desktop" machine that's non-upgradeable and has an integrated monitor. It's basically a big laptop.

The things you want to upgrade are the CPU, GPU, RAM and disk. The RAM is upgradable, and through Thunderbolt you can upgrade the GPU and storage.

It’s not the same as internal upgradability, but it’s also not insurmountable.

IIRC the CPU is a standard socketed desktop Xeon too and can be upgraded if one were so inclined.

Yup. And "we painted ourselves into a thermal corner with this non-upgradable trash can", so while you wait we give you a non-upgradable iMac Pro with the same thermal problem.

If only the 2013 Can Pro used normal PCIe connectors for the GPUs. Then you could bungee cord a couple of new GPU cards on the outside of the can, connect them using PCIe risers and cables, and power them off an external PC power supply instead of using the Can's power supply.

I hate to admit this, but Apple should sell off its Mac division. They are simply not hungry enough.

For the first time in 33 years, I no longer have access to a Mac. Apple doesn’t sell one that either my employers or I want to buy.

That it has taken so long to course correct would have sunk most companies. It is probably a couple percent hit on Apple revenue at most.

Wow, yes they've definitely lost their "eye of the tiger," haha.

Doubt there is a Lenovo waiting in the wings, nor would they allow it, a shame.

I wonder who the target market is, though? The iMac Pro can cost £5000, so I assume more performance and hence price than that? The only users who I can imagine willing to justify paying such a premium are those who feel they must remain at the cutting edge of performance and are willing to pay for it, but surely they will have left the Apple ecosystem by then?

It's as if Apple are saying "we're making a computer for people who demand peak performance, but are willing to use cheap outdated hardware until we get round to it."

If you do any iOS work you need macs as build machines because of Apple's lock in. Spending the money on an iMac is just a waste of a monitor and if you're already spending the money on the hardware you might as well get something that can do render jobs as well. You need GPUs for that.

The Mac Pro doesn't include a monitor, so I can see it starting at less than the iMac Pro.

Err, no. Not if they do it right: they will be offering the top-of-the-line Xeons and one or two hefty GPU cards.

The new iMac Pro has the entry-level Xeon and still throttles after 10 minutes.

They might offer that as a configuration option but not the base config.

I'd assume mostly people who want really hefty GPUs (and for whom eGPUs aren't an option). The iMac Pro won't help you there.

In the article it sounded as if they actually intend to achieve modularity for the Mac Pro with eGPUs, or peripheral hardware in general. A mistake, in my opinion.

> I wonder who the target market is, though?

Schools for sure. The staff believe that Macs are better and have bought trash cans to replace the cheese graters.

I think this is where Hackintosh comes in. You can buy top of the line, macOS-compatible hardware for 1/3 of the Apple premium.

It takes effort to set up, but not that much.

This is a myth that keeps going and going.

A Core i7 is not the same as a Xeon. ECC memory matters. The Apple premium simply isn't that much higher.

I also wanted to add that the cost of a person's time eclipses the cost of hardware (especially with computers). As a business, I simply wouldn't risk having days of downtime from a bad update, just to save $2,000 for something that's used for years, when the lost time in paying a creative person's salary will eclipse that.

And everytime, someone mentions how ecc memory is important. But it's just not. There is a performance penalty, and the real use case for ecc memory is vanishingly small for a workstation.

What could you possibly be doing on a workstation that requires ecc?

For every 8 GB of memory, you are potentially looking at a single bit flip every 9 years. With ECC, only once every 45 years. And in exchange, you are looking at a 10% performance penalty.

What could you possibly be doing on a Mac pro that would need that kind of accuracy?
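Taking the figure above at face value (one flip per 8 GB every 9 years; published studies report widely varying rates, so treat this as an illustration of the scaling rather than a measurement), the expected interval shrinks linearly as you add RAM, which is worth noting for a maxed-out workstation:

```python
# Scale the comment's own figure to different amounts of installed RAM.
# Assumption: flips are independent across memory, so the rate scales
# linearly with capacity.
YEARS_PER_FLIP_PER_8GB = 9.0

def years_between_flips(ram_gb):
    return YEARS_PER_FLIP_PER_8GB * 8.0 / ram_gb

for gb in (8, 32, 128):
    print(f"{gb:>4} GB: roughly one flip every {years_between_flips(gb):.2f} years")
# 8 GB -> 9.00 years, 32 GB -> 2.25 years, 128 GB -> 0.56 years
```

So even under this generous estimate, a 128 GB workstation sees an expected flip every several months, which is one reason the ECC argument keeps resurfacing.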

But that's not my argument (Even though I said ECC matters. maybe you're right).

My argument is that every time someone says "I can build the same thing for 1/3 the price", they are literally using completely different parts. They AREN'T building the same thing.

The idea of an Apple Tax is that apple arbitrarily marks up everything. But if you bought the same parts independently, you'd find the markup is very little. More like 10%.

Look at those $10k HP workstations. They are more comparable to Mac Pros.

But then the problem is just that Apple makes component choices with very poor value. A Threadripper CPU is not the same thing as the Xeon in the Mac Pro, but who cares? You can still build a way faster machine (with ECC) for half the price.

Apple techies hate to admit it. But in the end, it's the same reason people pay $2k for a plastic Louis Vuitton purse.

Except, at least with a fancy purse, you can carry it around and signal your wealth to others. My workstation is hidden away under my desk.

No I’d say a better analogy for what you’re trying to say:

I just want to buy a car for $20k that goes 70 miles an hour. But you’re only selling me cars for $50k cuz you insist on putting things like carbon fiber brakes and titanium alloys. I don’t need any of that. I can build something that perfectly meets my needs for way less. Because those enhancements are meaningless for what I’m trying to do (go 70 MPH)

Ya, or maybe even more precise:

You want to sell me a car for $50k that can go only 60 mph. But it has carbon fiber brakes and titanium alloys, requires premium gas, and has a fancy logo. Oh, and it has a flux capacitor which does nothing.

Or I could just buy a car that can go 100 mph for $20k.

Ya, it's not the same parts, but if the result is a higher performing system for way less money, what's the difference?

You can use different parts and end up with an equal or better outcome.

A lot of the Mac faithfuls point to ECC and Xeon to defend the price of the entry level iMac Pro.

Yet every single one of them I follow has used only MacBook Pros for the last 5 years or so which obviously don't have ECC so yeah I feel the importance of it is overstated.

The error rate is significantly higher than that. ECC also makes RowHammer much harder to pull off.

Even if it is higher, what are you doing on a workstation that would benefit from lower rates? This study found 2 (yes, 2) "suspected" soft errors in 300 machines running for multiple months.


And not that Row Hammer should really be a concern for a workstation, but ECC does not necessarily protect against it. Some types of ram are vulnerable and some aren't. But ECC is not necessarily a factor.


ECC for security on a desktop to combat RowHammer?

I simply cannot imagine running something like that in a professional context. For me, "maybe this next update will install correctly" is a non-starter.

For consumers who just really want to run macOS, I can see the appeal.

I agree, in an educational institution or large corporation, no way. But in a small shop that lives and dies on productivity, where one can't (won't?) migrate away from MacOS, a Hackintosh may be the only option available.

It's the market filling in the gaps of Apple's engineering & marketing plan.

> in a small shop that lives and dies on productivity

A computer that takes a bunch of fiddling to set up and might not work if updated doesn't seem like a productivity booster to me.

IT does the fiddling. Creators gain performance and productivity.

You're a small shop. Do you even have the funds for IT, or is it just one of the hats that your main system admin or reliability engineer puts on?

That depends how much your creators' time is worth, i.e. the return on increased performance.

If Hackintoshes have motivated Apple to release a new Mac Pro after a multi-year gap, then Apple-only shops will benefit from the Hackintosh investments of other shops.

Yes, that's my point. How much money are you saving? Maybe a couple thousand dollars? Balance this with the time you'd need to spend on setting this up and getting it to work right–a couple days, a week? It's not obvious that this would always work out, especially so if you don't have in-house IT and would either have to get a creator to learn how to do something like this or hire someone that does.

Right, so those people keep using Apple and wait N years for a new computer.

Other people use a Hackintosh and help to define requirements for Apple's next computer.

Apple gets free R&D. Those who need performance get it. Everyone eventually gets the benefits in the next computer.

I agree.

Works perfectly in a home office and saving $2500 is not a small thing.

Not all updates are critical.

To me the value of a Mac is not macOS. It’s having a complete device solution that just works.

I don’t want to fight with my workstation all the time. I want to use it to do... work.

I have been battling Apple about my 2013 mac pro for a year. You have to be careful about the monitors you buy because Apple will say it's not their problem when your monitor knocks your wifi offline (personally) or some weird shit. Personally, I think for anyone deliberating between an iMac Pro and a Mac Pro, you should go with the iMac Pro because Apple has always been an all-in-one company--it's one of the few things they are still good at these days. I used to work on the older mac pros in high school and never had problems, but those days are over. If I knew hardware well enough to fix it I would have gone Hackintosh.

I used to have a '14 RMBP with the Apple display and it was wonderful. Now I have a '17 13" MBP and it only has two USB ports so I have a third party dock and monitors because Apple doesn't offer a solution. The entire experience is awful but still more reliable than any Windows laptop I have ever run, although the gap is closing.

Hackintosh doesn't work for me because my business isn't fixing laptops. Apple used to provide a full solution and it was really great.

Hackintoshes run very stable these days, even more so now with Clover. But you need to enjoy that kind of tinkering.

Yeah I get it, I even consider that kind of thing fun in my spare time although I have other hobbies these days. In a professional (corporate) environment I don't have time to fix my laptop. I'm lucky to work in a place that has an outstanding IT department who provides what we need (including outstanding Macintosh support) but a Hackintosh is not on the list.

I am also sure the legal department would shut down a hackintosh real quick.

You can't knock everything else just because it's not a Mac. I've personally had my fair share of Mac problems, and when I worked Help Desk we had our fair share of Mac laptop problems that were comparable if not greater than our failure rate with ThinkPads. This was in 2017 too.

I’m not knocking it because it’s not a Mac. I’m saying when Apple sells a complete solution it is generally trouble free.

That may have been the case 5 years ago, but it certainly isn't now. I work in a mostly mac office and not a single one of my mac fanboy coworkers has a single nice thing to say about the 2017 macbook pros.

Right because it's just a laptop and you have to buy a bunch of other stuff from third parties to go with it. I explained this in a sibling comment.

In 2014 Apple would sell you a monitor that doubled as a dock which worked great with the RMBP and Apple's own wireless KB and Mouse and even Time Machine.

Today you're stuck with USB-C docks which are pretty awful. Pretty much on par with a Dell or HP laptop dock. The new Macbook Pros are the worst device I have used from Apple.

The thread you are replying to is about hackintoshes. The point of a Mac for me is that it Just Works (tm). If I want to fight with something I'll buy a Thinkpad and install Ubuntu. I have no interest in fighting with macOS, it's only useful to me if it is easy.

> Right because it's just a laptop and you have to buy a bunch of other stuff from third parties to go with it.

No. Their complaints are about the laptop itself. The keys are constantly breaking. The software constantly crashes. We're all a bunch of consulting devs and the 2017 macbook pros have cost every single one of them at least a week's worth of billable hours in the past 6 months. Most of them have gone back to their 2015 models.

This solution is fine for individual developers, or even small shops / startups, but it would never fly in a more corporate environment. Coincidentally, those environments are also the most likely to drop $5K+ on a development machine without batting an eye.

Perhaps this is part of the reason why Apple is moving away from Intel chips to some of their own creations?

Statistically there are almost no hackintoshes; it would be a very dumb reason to switch from Intel chips. The much more likely reason is that Intel chips have hit something of a performance wall and Apple also runs a much more popular ARM-based platform.

More likely still is that converging on one chipset lowers the cost of OS development by combining both platforms into one.

This list is a joke, that's not even close to a 1:1 comparison. It looks like they intentionally picked the most expensive options for everything. You don't need a $160 AIO water cooler, a $250 1000W power supply, or a $650(!?!?!) motherboard. The Vega in the iMac doesn't compare to a 1080 at all.

To anyone with an understanding of PC hardware it's clear this list was composed specifically to justify the iMac pro's price.

Here's a much more reasonable list: https://pcpartpicker.com/user/xanderstrike/saved/mNDWXL

Why would you need to buy a 1080 ti graphics card when imac pro has an amd vega. That makes the entire parts list suspect to me.

This is for Logic and Final Cut users who want to have the best and don't know enough about computers to understand they're being ripped off.

Or they bill $200-300/hour so they don't really care. If their tooling is on the Apple side, they don't really have the option of going for a high-end PC.

There's actually even more to it than just not caring. There is also the marketing side of it. Recording studios will sometimes make choices to buy certain gear because of the name and recognized brand. Apple computers are certainly the recognized brand for digital recording.

The software stack is very relevant, Apple shops would have to retrain to go PC, and some of the software they have is sub optimal.

My wife is a designer, she needs much more powerful computers than I do (as just a developer). She constantly runs out of memory, GPU performance is very relevant for Photoshop, Illustrator, and other tools she uses in her work. So spending $2500 or $3000 for a high end (but not pro) imac is a given, even though I would be perfectly happy with the $1900 one.

For her, the advantage in productivity is worth it. I can imagine there are many people who would see the imac pro as being worth it to them also and the price isn't going to bother them much because their time is already very valuable.

From 2007 to 2016 I was all-in on OS X / MacBook Pros / iMacs for work (software dev). Since then I've gone back to a Dell XPS 15 and a custom-built AMD Threadripper machine. As much as I like the Mac hardware, I could no longer justify the cost. The XPS is a nice machine (no touchbar - it's a feature) and my custom PC is a monster. Windows 10 with all its warts is plenty serviceable, especially now with its Linux Subsystem (primarily because I prefer to use bash as my shell).

I guess I'm no longer part of their target market for their computer systems.

Dell XPS laptops are very high quality and a definite suitable replacement for a Macbook Pro if you can justify getting rid of OSX. XPS laptops are also fully Linux supported out of the box.

I could justify the cost that whole time until the 2017 Macbook Pro. Or was it the 2016 that moved to the tiny-action keyboard switches?

My 2017 MBP is less than 6 months old and two keys don't attach to the switch anymore. They literally come off with my finger as I type. They aren't broken, the C-clamps are still intact. They have just widened enough with wear such that they don't clamp anymore.

It's so bad that I use my 2014 Macbook Air when I need to do a lot of typing.

Most of the people I know IRL are having the same issues with their newer model Macbooks, especially my developer friends since we of course tend to be harder on our keyboards.

So you aren't missing anything.

My daughter's 2016 12" Macbook's space bar only works on a hard key press. I have AppleCare on it, but taking it to either Apple Store in our area is a ridiculous hassle. Drive 30+ minutes, service will take a minimum of 2-3 hours (at least that's been my experience with our phones.) I feel that Apple used to have a competitive advantage in that I could bring my devices in, have them serviced in a reasonable amount of time while I wait (an hour?)

My dad had his iPhone 6+ battery replaced under the $29 plan. 6+ week wait for the battery. They call, the battery is in, dad shows up at 11:30am on a weekday, it'll be ready by 3pm. He shows up at 3, and it takes until nearly 4 o'clock to get someone to get the device back out to him. I'm reaching the point where I may as well go with Dell/Samsung because shipping a device in is guaranteed to be less of a hassle than the "Apple Store Genius Bar" experience.

It's almost what you might expect from an executive team whose vision/talent is focused on manufacturing efficiencies and the aesthetic side of industrial design...

> I guess I'm no longer part of their target market for their computer systems.

I've seen the phrase "not part of their target market" deployed at developers and particular users with increasing frequency for years.

Seems inevitable that eventually people will go somewhere else.

> Linux Subsystem

Wat. I just went googling after reading this comment and at first glance, it looks like this is an officially supported method to run GNU apps in a Windows terminal? Can you boot directly into Ubuntu desktop without messing with BIOS and UEFI?

>Wat. I just went googling after reading this comment and at first glance, it looks like this is an officially supported method to run GNU apps in a Windows terminal?

It's hardly limited to GNU apps. Most of your Linux applications should work fine on it. I've had no problem with all sorts of non-GNU releases when I use it on my desktop.

>Can you boot directly into Ubuntu desktop without messing with BIOS and UEFI?

No, it is a Linux userland on top of the NT kernel with an emulation layer written to handle all of the syscalls.

It works well. The only really glaring issue is ConEmu and related Windows terminal ecosystem is still hot garbage compared to any Linux or OSX terminal.

> The only really glaring issue is ConEmu and related Windows terminal ecosystem is still hot garbage compared to any Linux or OSX terminal.

...and that bites you in a surprising number of ways.

I recently tried porting some software that runs just fine under Ubuntu-on-x86 to Ubuntu-on-WSL, and I found three different failure modes in the console/pty mechanism before I gave up. I then tried running it under Cygwin and it ran correctly out of the box.

Perhaps the easiest way to see this is to try to run a program like GNU screen under WSL, but the core problem isn’t any specific application. Any program that does anything even moderately tricky with ptys is likely to fail under WSL.
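For a taste of the pty plumbing in question, here is a minimal sketch: allocate a pseudo-terminal pair and push data through it, letting the kernel's line discipline do its transformations (on Linux, the default `ONLCR` flag maps `\n` to `\r\n` on output). Programs like GNU screen exercise far trickier corners of this machinery, which is where WSL reportedly fell over.

```python
import os
import pty

# Allocate a master/slave pseudo-terminal pair and pass a line through it.
master_fd, slave_fd = pty.openpty()
os.write(slave_fd, b"hello\n")    # "output" from the slave side...
data = os.read(master_fd, 1024)   # ...arrives on the master, post-termios
print(data)
os.close(master_fd)
os.close(slave_fd)
```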

"It works well. The only really glaring issue is ConEmu and related Windows terminal ecosystem is still hot garbage compared to any Linux or OSX terminal."

Does that mean Ctrl-C is still "copy" on Windows even in a bash shell?

There are many tools like valgrind that are necessary for development that are broken in the subsystem. I would call it a step in the right direction, but it is somewhat half baked.

Here's a link on how to get Linux GUI going: (edit - better link) http://blog.sqlyog.com/how-to-add-a-gui-to-the-new-bash-cons...

Reading this article was really annoying. First, there were grammatical issues like these, which may have been intentional, but made understanding the article difficult:

> I saw a bunch of them walking by in Apple park toting kit for an outdoor shoot on premises while walking

> Is it the OS is it in the drivers is it in the application is it in the silicon and then run it to ground to get it fixed.

Then, I clicked over to my Hacker News tab and when I went back, the content was replaced by another page (!). Apparently it was changed via JavaScript when I reached the bottom of the page. This is stupid and user-hostile.

I agree.

When scrolling to the bottom of the article the last thing I expected was the entire content of said article to be replaced by headlines of other stories from TC.

Viewing the article in portrait mode on desktop displays the main contents on the left 50% and complete whitespace on the right 50%.

Overall the TC redesign is god-awful and has degraded usability substantially.

I think Apple is needlessly re-inventing the wheel here. While their intent to analyze real-world usage and optimize every detail by bringing in actual pro users is admirable, I'd prefer them to use a simpler, more streamlined approach: release standard, state-of-the-art hardware in a tasteful, quiet case and continuously work on optimizations on the software side - a process that is required anyway and not exclusive to the Mac Pro, which makes me wonder why it should dictate the Mac Pro roadmap and schedule in such a significant way.

Furthermore it's concerning that the article mentions modularity mostly in ways that involve external hardware and peripherals. If that turns out to be the approach Apple is considering, it sounds worryingly similar to an iMac Pro without the fantastic display.

“And then we take this information where we find it and we go into our architecture team and our performance architects and really drill down and figure out where is the bottleneck. Is it the OS is it in the drivers is it in the application is it in the silicon and then run it to ground to get it fixed.”

This approach is reminiscent of Apple's approach to low-latency touch scrolling on the original iPhone; to Quicktime and MIDI a decade and a half before that; and, more recently, to the Apple Pencil.

I knew executives at Palm who were frustrated by the challenges that both the tech stack and the organizational structure (Conway's Law) posed in the (failed) effort to reduce touch latency to where it felt physical. I know that it took Android many years to reach the point where objective observers described it as equally “buttery”. IMO this full-stack integration and optimization is one of Apple's core competences. It will be interesting to see what it leads to this time. The effort may be too late, but if the past predicts the future, it will not be too little too late.

I waited and waited and waited. I liked the design of the Trashcan but by the time I could afford one it was woefully out of date and I needed Nvidia GPUs for the rendering engine I wanted to use.

In the end I moved my creative work over to PC and currently run a nice compact mATX PC built for GPU computing with two 1080Tis. Still cost me less than half the eventual iMac Pro and presumably this Mac Pro.

If this had been released a few years back maybe I'd never have moved to PC but at this stage they'll have to be still supporting the new Mac Pro in 3 years before I even consider moving back at this point.

Been a Mac only user for 15 years now, I didn't move away lightly, but now I've switched it'll be hard to convince me back.

I wonder at what scale has this occurred? Has video editing industry moved away from Apple computers in a substantial way?

They have managed to keep the music recording industry; a new macbook pro or imac is plenty powerful enough to be the heart of a recording studio.

Many people in the pro audio world have moved to Windows over the last 5 years. Apogee, who I think Apple paid to only support OS X at one time, now makes Windows drivers and officially supports both platforms. MOTU has been cross-platform since like 2006. Logic is really the only audio product that does not support Windows, but only because Apple killed the PC version in like 2004. Logic for PC was amazing and might even still run on Windows 10.

The main reason I'm not using Logic right now on my MBP is that it doesn't run on other platforms, and Apple's behavior over the last half-decade has convinced me that after almost two decades of knowing their platforms were the right choice for me, I can't trust their judgment when it comes to disruptive or dealbreaking changes to both hardware and software.


What are you basing that on? There is plenty of Windows now being used in audio. I use a MacBook Pro 2017 QuadCore, and still offload audio work to a PC. The laptop uses the fan really heavily. Granted the i7 iMac is pretty damn good, but you still end up with lots of things hanging off it.

It also depends on the music you do and the plugins you run.

I do not need a Mac Pro, nor could I afford one if I wanted to (I do own one, though, albeit 11 years old; I got it preowned from a friend).

But I wonder what the hell the people at Apple were thinking when they put out the "trashcan" design. It looks awesome, I cannot deny it, and I imagine it is very convenient to handle.

But something deep inside of me cringes when a computer in this price class has everything soldered on and has no real option to extend it. You cannot upgrade the CPU, the RAM, the GPU, nor can you install any additional PCIe cards.

It wasn't that convenient when you wanted to put stuff inside it or rack it. Hence the third-party rack chassis market.

When I said "convenient to handle", I was thinking, small enough to sit on top of my desk, lightweight enough so I can carry it without breaking a sweat (the cheesegrater is really heavy!), that sort of thing.

I assume Jony Ive said "I'm so bored of boxes" and that was the end of the cheese grater.

It’s the ultimate rounded corner

I just hope they don’t break any new ground with it. Zero. Just make a big tower that does exactly what a HP workstation tower would do, and match the price. No need for any Apple magic of any kind here. Don’t make a trash can, don’t make a touch bar.

That makes zero sense from a business perspective.

"I know, let's put out a boring commodity product that has no unique selling points, doesn't match our ethos, and has a tiny niche market. And we'll sell it at the same razor-thin margin that everyone else does!"

That couldn't be further from Apple's way of doing business.

If you want a cheap commodity tower, just buy one, but don't expect Apple to make one for you.

Apple's uniqueness here is their OS and the applications that run on it. That’s why people would buy a Mac over an HP. Only that. If they want pros to keep using Mac OS for video etc., then they need to have competitive hardware.

Apple's other concern is the developer side and the app ecosystem: high-end build machines and developer workstations are also relevant to their actual business, which is selling apps and phones.

Computer hardware for professionals is just a boring service Apple needs to provide in order to support the rest of their business. It’s not even a visible product like a MacBook or an iMac- it’s designed to be hidden on the floor because it’s huge and noisy (the trash can was a failed attempt at working around that fact)

Edit: to follow up on the “powerful machines are a boring service” idea: if Apple didn’t want to risk the brand damage of launching a boring product, they could simply license Dell or HP to sell certain servers and workstations with Mac OS, to let the highest-end render machines and build servers run Mac OS. They could easily charge a 20% premium and people would be happy to buy them.

"Apple's uniqueness here is their OS and the applications that run on it."

I really think you're wrong. Even a lot of people who run Windows or Linux prefer Apple hardware.

I should clarify that I mean on high end workstations and compute/build servers now. Machines that no one sees or touches, and are only required because you are already in the Apple ecosystem, and need to do e.g. builds using xcode, or video rendering using some OS X software.

Apple make excellent laptops, phones etc. but those are products in a completely different category.

I think lots of people use macs to run linux (especially developers). The number of people that run Windows exclusively on Mac is probably pretty small though.

Ignoring the GP, most of the comments in this thread are people begging for an overpriced commodity tower. People are happy to pay the Apple premium, but they expect the latest specs.

With “match the price” I’m completely fine with a reasonable Apple tax, and obviously a bigger one if there is a quality difference. I’m not expecting Apple to launch a plastic tower, and if they make a 2-socket Xeon machine looking like the old Mac Pro, then it’s certainly fair to charge a bit of a premium over an HP tower.

Most importantly though, it needs to be compatible with most high-end hardware from the day it’s launched. If Nvidia releases a GTX 3080 the day after you bought your Mac Pro, you shouldn't have to worry about when the Mac-compatible version comes out, or whether it will cost twice what the PC-compatible card does.

And yet that product is absolutely critical for them to keep selling their iPhones, iWatches, and iEverythings.

They might not make a single dollar on the sale of the pro machines - and it would still be worth every dollar of engineering put into the project to keep the devs happy in order to keep their other markets healthy.

There's zero evidence that this is true. Apple is aggregating an incredible amount of consumer demand on iOS, so developers will jump through a ton of hoops to serve that market.

Is there any evidence that iOS doesn't have as many great apps because developers are fleeing because there's not a Mac Pro that they like?

I'd be curious - from an academic/economist-like perspective - if that is a curve you can't see in data until it's broken. People stay and try and tough it out, then once they reach their tipping point they rage-quit and you never get them back.

I kind of suspect that's what happened last year - everyone was just toughing it out waiting for the pro kits, and when it never came the threats that came in from big shops were very real. Enough so that Apple broke every tradition it had regarding future product announcements, and laid a lot of their goodwill on the line to promise a fix.

If you're talking about the video, audio, photo, 3D, etc. markets, then maybe, though I'm really skeptical. I think sales of the Mac Pro have probably been sliding for awhile and Apple is turning their attention to it. I doubt there's some cataclysmic cliff or that it can't be turned around.

Regardless, this doesn't have much to do with the macOS / iOS dev market, which is what I was responding to. No chance that there are a bunch of big dev shops that got fed up with not having the Pro desktop / laptop hardware from Apple that they want and decided they'd rather just abandon the Apple customer base rather than deal with it. The value of that customer base is huge, and those dev shops will go through a lot to reach it.

"Our business has always been building apps for iOS and Android, but we don't like that we can't expand the memory in the iMac Pro, so we're just building Android apps from now on, even though more than half of our clients and customers are on iOS."

No way.

> I doubt there's some cataclysmic cliff or that it can't be turned around.

For any shop with a dozen mac pros running some mac software package it's a huge step to switch to PC if it also means switching the whole software stack. At this point there are probably a whole lot of shops that struggle with very old pros, or have started buying the iMac pro. But I'm sure there are shops that also held on for a long time and finally gave up and swapped both hardware and software because they grew tired of waiting (I know of a couple - but anecdata isn't data, so these are guesses).

The only reason Apple broke tradition and disclosed that a new pro is coming is to convince shops like these to hang in there for another year.

Sadly that doesn't seem to be the case. People want the cheese grater Mac Pro, with modern internals. Maybe Apple could offer it in gray and (gag) rose gold. Just make it the dependable cheese grater. They could have done that years ago.

But no. It's going to end up being the desktop PC equivalent of a RED camera system, with a price to match.

I hope you are one of the people that is consulting for them. This is exactly what is needed.

Which is why Apple will do the opposite. Their business model is the old Henry Ford line "if I asked the customer what they wanted they would have said a faster horse". They think they know what you need better than you do. Sometimes this works really well, sometimes not.

Get ready for an underpowered $10k pyramid whose only input is Siri.

And a proper freaking keyboard and mouse

Apple already lost the Mac Pro market when they made Final Cut Pro X, and stopped regularly releasing Mac Pros. They completely disregarded the professional market by killing functionality that was critical to the users. That combined with the lack of powerful hardware made a lot of the professionals go to PC and Avid. They addressed some of the issues in future releases but it was too late.

Creators were pretty much the only people keeping Apple alive in the tough years and they abandoned them.

So now, two questions:

- Will the Mac Pro actually be Pro?

- Will they ever bring back a Pro computer in a notebook form factor?

What's non-'Pro' about the current 15" MBP that was 'Pro' about the previous design? They're practically the same.

If you mean no SD card reader etc, it's my understanding that pro photographers tended to use a USB3/thunderbolt reader anyway, as the integrated reader was connected over USB2 internally and was far too slow for pro use.

- Escape button is on the touchbar

- Touchbar at all

- Battery reduced from 99 whr to 76 whr

- Lost magnetic charging port

- Lost every other useful port

- Keyboard generally considered worse

I currently have a 15" mid-2015 MBP, and I use the DisplayPort, HDMI, and USB-A ports on a daily basis. Losing the ports doesn't bother me too much, but similarly priced or cheaper laptops don't ask me to compromise there. On the other hand, the battery life, touchpad, and keyboard will have to be absolutely stellar or I'm not buying it, especially in this price range. Again, others ask me to compromise less. While Apple still has the best touchpads in the industry, others are catching up. Meanwhile, Apple has regressed in several ways.

I've got a 15" 2017 MBP, replacing one from 2014 or so. I think on balance the 2017 is better for my needs.

I don't particularly like the touchbar (though that may change if IntelliJ ever bring out their promised support and it's any good), but it's not a huge deal for me (at least after I remapped esc to caps lock; I now do this on every keyboard I use, as I find it more ergonomic). I do love the touchId thingy, but they could have done that without the touchbar.

I prefer the keyboard to the retina MBP one, though I preferred the pre-retina MBP one to either. There was something about the feel of the rMBP (and MacBook Air) one I never liked. This is very subjective, of course.

The port situation mostly hasn't bothered me, as I just use a USB-C hub with power passthrough. It would have been nice to have at least a single USB-A port, but realistically I wouldn't use it much.

I find battery life to be similar to the previous one, generally. I think the size sacrifice is worth it, at least for my needs, because it makes the machine very portable. I barely notice the weight of it in a backpack now.

My biggest complaint is one you haven't mentioned; you can't get an integrated GPU only version. You can't even fully disable the discrete GPU. This means that you're stuck with all the quirks of the dual GPU setup. This is a step back, IMO; I've no real use for a discrete GPU on my work machine, and these dual GPU Macs have never quite worked properly.

I wasn't expecting to like it as much as I did, really, but I was mostly impressed.

I hadn't realized that they didn't sell them with just the integrated GPU anymore. Combined with the reduced battery, that seems like a poor decision.

As far as I'm aware, you've never been able to fully disable the dGPU. On my previous 2011 MBP, the dGPU failed. You could twiddle some UEFI variables to prevent switching to it so the system would boot, but it didn't actually cut power to it so it just ran hot and the battery died in 3 hours. Also, all the DisplayPorts were attached to the dGPU, so you couldn't plug up external monitors anymore if you did that. Units with the dGPU have worse battery life when there's a monitor plugged up, because it forces a switch to the dGPU.

what usb-c hub do you use? does it support two screens at once?

I have tried 3 and all 3 have not worked well with 2 screens (one screen being 2k)

> was far too slow for pro use

I read this every time it comes up on HN yet every single one of the five photographers in our shared office laments nothing more than the missing card reader.

If it was so slow why not just upgrade it to a USB3/PCIe-connected reader? and being a 'pro' device why didn't they do that already when USB3 became available?

I've no idea why they didn't do it, but they didn't.

An SD reader connected over USB2 is better than no SD card reader. If they cared they would have made it PCIe, but instead they took it away in a move that benefited only Apple. No pro benefited from the removal of the sd slot. It didn't make the machine cheaper, or faster, and in fact it made it harder and more complicated to use. And then they had the audacity to charge more for it.

My 9-year-old Toshiba has a PCI-based SD card reader. The possibility for it to be internal and also offer reasonable speeds is not beyond the pale.

Some of the top end cameras use both CF and SD, so sure, it doesn't cater for everyone, but SD has become so ubiquitous it seems odd it's missing.

Sure, it's definitely possible to have a fast SD card reader in a Mac laptop, but no Mac laptop, even ones with Pro in the name, ever actually did.

(I'm actually not sure what the reason for this was; iMac SD card readers were PCIe).

My 12-year-old Asus has a PCI-based SD card reader too; the same chip also provides the FireWire interface.

FWIW, the SD card slot in newer (but not newest, ugh) Mac models is connected via the PCIe bus and not USB.

Huh, really? I had a 2014 or something rMBP in work at one point; was almost certain it had a USB2 SD card reader.

EDIT: According to Apple, all the laptops are USB2: https://support.apple.com/en-us/HT204384

Desktops are PCIe.

I have a 2015 MacBook Air and wanted the SD card reader to belong to a Linux guest inside VMWare Fusion (for dd'ing custom RPi images). I just couldn't get it to work. Then I found this article https://kb.vmware.com/s/article/1035825 which says: "This issue occurs because the SDXC card slot included in newer Mac models is connected to the Mac using the PCIe bus and not using the USB bus that the original SD card slot uses."

This process is a mistake for two reasons.

The first is time. If they were ready to release a simple, expandable, better-everything box a few months from now, every single person who’s been chomping at the bit for a new pro Mac would buy it immediately. That would fix their market, gain them goodwill and give them plenty of time to explore what all these amazing professionals actually want to do, later.

Second, this just doesn’t seem like the hardest list of requirements to guess. Heck, there are practically no constraints, not even price! It doesn’t have to be thin, it can guzzle power, they don’t have to compromise on any ports (put 4 of everything you can think of in, old and new). Theoretically they could build a slightly better box with some analysis but literally no one is asking for that gargantuan task to be done yet, at least not before 2022.

If they just make an expandable ATX/cheese grater style case with generous cooling and allow top Nvidia GPUs, I will pay a large premium for this. Ideally there would be the option for high end consumer gear (i7/i9, vanilla RAM) in addition to the Xeons/ECC/FBDIMMs that get so expensive so quickly, but two tiers with different architectures is probably too much to hope for.

Still hoping for a professional MacBOOK, but Apple doesn't ever listen to their pro customers or do they?

? Isn’t that a MacBook Pro?

By comparison with high-end mobile workstations, it's a toy.

A maxed out Macbook Pro has 16GB of non-upgradeable, non-ECC RAM and a GPU with 4GB of vRAM and a Passmark score of 3,553.

Lenovo will sell me a machine with 64GB of socketed ECC RAM and a Quadro P5000 with 16GB of vRAM and a Passmark score of 10,281.

The Macbook Pro doesn't have built-in LTE. It has a non-removable battery. It throttles badly under sustained loads. It has chronic issues with display calibration. If you want a useful selection of ports, you'll need a bag full of fragile and easy to lose dongles. They've only just rolled out eGPU support and there's still no official support for Nvidia GPUs. There's no 17" option. It has that damnable butterfly keyboard.

Yes, the Macbook Pro is very thin and lightweight, but Apple have made a lot of performance and usability tradeoffs to get there. That doesn't sound very "pro" to me.

I don’t get it.

Apple doesn’t make a mobile workstation. They make a Pro version of the MacBook.

Take a MacBook, give it some more power, and you mostly have a MacBook Pro.

If the complaint is why doesn’t Apple make a high-end mobile workstation I'm guessing because the market isn’t there for it? At least not at the scale for Apple to touch it?

A lot of professionals end up being serviced just fine by a “Pro MacBook”. I don’t get this obsession with acting like Apple is doing something wrong by making the MacBook Pro a pro version of the MacBook instead of something like a Dell Precision, they’re two very different things with two very different markets.

There used to be a 17" MacBook Pro. Last one came out in 2009. It was a bit of a beast to carry around, but not prohibitively so. It was .98 inches thick and weighed 6.6 pounds. The current 15" MacBook Pro is .61 inches and weighs 4.02 pounds. So not a huge difference. I commuted on the bus and subway with a 17" pro with only minor discomfort.

It had gigabit ethernet, FW800, three USB ports, 1 thunderbolt port, ExpressCard/34 slot, audio in, audio out.

With the exact same form factor and updated ports, Apple could offer a hell of a Pro Macbook Pro. Just give it a 4k screen and an option for a second SSD in the space formerly used by a DVD drive.

While I agree with you, I spec'd out the Lenovo that the above commenter was referencing: it costs $5500 with all the "discounts" tossed in. The MacBook Pro tops out at $3000 (without going crazy on SSD). I understand that the market for people wanting to spend over $3k on a laptop may be thin, but it seems quite easy to just give someone the option to spend $5k on a MacBook Pro if they just have to have it. Why exclude the option?

Form factor? Could a MBP in its current form factor actually fit all that?

The form factor is one of the issues with the current design. Or simply add a third product line: MacBook Workstation. Thicker for a larger battery, cooling and higher end CPU/GPU and moar RAM. Xeon class processors as an option.

OSX is a closed platform. If you're an FCPX or Logic Pro user, you're totally reliant on Apple's hardware offerings. If Apple don't sell the hardware that you need to do your job, then you have the choice between switching over completely to Windows (not a straightforward task) or tolerating the least-worst option from Apple's lineup.

If Apple honestly say "we couldn't care less about professional users, we're making money hand over fist anyway", then at least we all know where we stand. Apple haven't taken that approach - they keep making grand overtures about how much they care about professional users. They keep making grand promises about how the perfect professional machine is just around the corner, how they're not updating the current machine because the next big thing is so incredible, amazing, awesome.

There are a lot of very loyal Mac users who have kept Apple in business through thick and thin, but their patience has been tested to breaking point. Those users have been a key part of Apple's marketing; Apple make computers for dreamers, visionaries, artists, the cultural elite.

I don't know anyone who is genuinely happy about being a Mac user, but I know a lot of people who grimly tolerate giving Apple their money. They know what to expect from next year's Macs - they'll be thinner, lighter and subtly worse in a lot of important ways.

What happens to the Mac brand if those users leave? What happens if the resentment felt towards Apple by photographers, video editors, 3D artists and recording engineers turns into a full-scale revolt? What happens if owning a Mac marks you out not as a sophisticated member of the cultural elite, but a nerd who wants a slick *nix experience or a sucker who paid $2000 for a Facebook machine?

Apple are playing a dangerous game. Apple don't need to care at the moment because the iPhone is so gratuitously profitable, but they should know better than anyone that the market is incredibly fickle. They're burning decades of goodwill for no obvious reason.

It wouldn't be particularly expensive to make the majority of pro users happy. Bring back the old cheesegrater Mac Pro and stuff it with the latest commodity components. Bring back a previous-generation MBP chassis and fill the extra space with an i9 or a Xeon E3, four SODIMM slots and a GTX 1070 Max Q or a Quadro P5000.

They don't have to promote these machines, they don't have to display them in Apple stores if they're ashamed of making a functional computer, they just need to make them and promise to keep making them to secure the continued loyalty of their most loyal customers. Apple won't do it, for reasons known only to them. In the long term, that could prove to be a very expensive oversight.

Honestly they should consider bringing back the clone program. HP/Dell can sell me a workstation that runs OSX, that they update on their schedule, and I'll be happy. I don't want it pretty, I just want it functional.

You mean the one without a real keyboard or the one with only one port while plugged in?

I disliked my new MBP at first, but honestly it’s not that bad if you can bring the rest of your hardware up to speed.

My monitor does video, charging, and USB 3 with full size ports with one cable.

And the keyboard is subjective; I got used to it pretty easily. I actually feel like it’s more tactile than the old one.

"it’s not that bad"

That is not the most ringing endorsement for a very expensive and supposedly premium laptop.

> My monitor does video, charging, and USB 3 with full size ports with one cable.

My Lenovo X1 Yoga also does everything via one Thunderbolt cable, but it also has a full selection of normal ports on the notebook itself rather than only on the docking station. It also has a 1440p touch screen and a nice keyboard, and still costs 1000 € less than a Macbook Pro ^W Deluxe.

Ok. When I can walk into a Lenovo store and walk out with a loaner while mine gets fixed I’ll buy it. I’d also like OS X, but I can settle for Windows these days.

Lenovo has 3 years next business day on site service included with many models - you can choose to add the accidental damage option and it still costs less than AppleCare, not to mention options to upgrade to 4 or 5 years cover.

I just searched and couldn’t actually find any models listed with it available, but even so the search I did came up with more results for “they couldn’t fix it onsite” than anything.

I’d rather pay the difference in AppleCare. Most developers on this site are making large sums of money with their machines; $1000 or even $2000 is hardly going to make a measurable difference in the ROI of my laptop.

Let's hope Apple release an MBP that you like better then.

If the $2k isn't that significant, and minimal downtime is so important; may as well keep a slightly older refurb model on-site as a hot-spare ready to go just in case there's a problem... A policy that would work, whatever your OS / brand preference.

I love my MBP, I replied to the comment about my endorsement but for some reason the comment doesn’t show up if I’m logged out?

The endorsement was tempered by the fact that around these parts endorsing the new MBP is apparently treason. I’m getting my comments downvoted for implying the MacBook Pro is a fine machine.

They give loaner Macbook Pros?

If you set up Joint Venture at purchase. If you make a living off your Mac I’d recommend it, because for one $500 fee you can cover multiple devices.

I tried to live with mine for about 6 months before giving up, selling it and buying a second hand 2015.

Needed keyboard repair twice in that time, and was horrible to type on. Useless touchbar, and an over-large trackpad that was constantly activating when typing. The last was odd as I regularly touch the old size trackpad with thumbs when typing but never had an issue. Add in the occasional freeze and crash and I feel it was a complete lemon.

I'm very happy with my 2015 and get longer battery times than I did with the newer model. I have no idea what I'll do when it needs replacing.

Which monitor do you have?

Acer xr382cqk

IMO best all around monitor being produced today. 38” 1600p ultrawide (equivalent density to a 1440p 34” ultrawide), 75hz which is high enough for gaming without going into gamer aesthetic/retail markup territory, Freesync.

I can’t recommend it enough tbh

The Macbook Pro has 4 total USB C ports. You can use any of them to charge, and still use all 3 of the others for whatever else needs doing.

The keyboard is subjective.

> The keyboard is subjective.

The feel of the keyboard is indeed subjective (I happen to like the feel of the keyboard on the new MacBook Pro), but in this case Apple genuinely built a buggy keyboard, so much so that people are writing songs about it [1].

[1] https://youtu.be/FdS3tjEIqUA

I ordered one and then returned it (for the first time ever after ~10 or so MBPs) -- ended up just purchasing a new refurb non-touchbar MBP, which is what it was replacing too.

Largest reason for return: G key. 30% of the time = two Gs. 50% = 1 G. Remaining 20% = 0 Gs/no keydown whatsoever. Hah.

> The keyboard is subjective.

Yeah, I think if the keyboard didn't break from a speck of dust it would be fine.

The 13" Pro model has two USB-C ports.

My 13" Pro has four USB-C ports. I think it's just the non-touchbar one that only has two.

Not all of them. The models with the Touch Bar have four.

16gb ram isn’t very pro.

Honestly I buy this argument way more than any complaints about the touch bar.

With my work my swap usage is 10-20GB at a time, between my IDE, app, tests, and browser tabs. 16GB is utterly painful but it’s the most you can buy.

The CPU also isn't really up to par for pro use cases. I understand not putting a burly CPU and pile of RAM into the mid-high-tier laptop but developers and media people need something. Running parallel make on my brand new touch bar macbook is like 2-4x slower than doing it in a linux VM on my ~$1500 windows desktop from a year ago, to the point where I'd work more efficiently running remote desktop on a $200 chromebook.

> Running parallel make on my brand new touch bar macbook is like 2-4x slower than doing it in a linux VM on my ~$1500 windows desktop from a year ago

It's also more than two times slower on my 2013 Dell Precision Xeon workstation.

But that comparison is not really fair: a MacBook Pro has to use a mobile CPU or it won't be able to dissipate the heat. Also, depending on ambient temperature and available airflow, the MacBook's CPU probably throttles fairly early to prevent overheating.

A desktop CPU can use larger power envelopes and typically throttles less, and most virtualization has gotten so good that for CPU-bound workloads the difference between running on bare hardware and in a VM is barely noticeable.
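To make the parallel-make comparison concrete, here is a toy sketch (made-up targets, not anyone's real project; assumes GNU make 3.82+ for `.RECIPEPREFIX`) showing why core count and sustained clocks dominate build times:

```shell
# Create a toy Makefile with four independent one-second targets.
mkdir -p /tmp/parmake_demo && cd /tmp/parmake_demo
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
objs := a.o b.o c.o d.o
all: $(objs)
%.o:
> sleep 1; touch $@
EOF

# -jN runs up to N independent recipes concurrently; with four or more
# cores this takes roughly 1s of wall-clock time instead of the ~4s
# you'd see with -j1. A throttled mobile CPU loses on both fronts.
make -j"$(getconf _NPROCESSORS_ONLN)" all
```

The effect compounds on real builds: a throttling laptop CPU both drops its clocks under sustained load and has fewer cores for `-j` to exploit.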

> to the point where I'd work more efficiently running remote desktop on a $200 chromebook

Maybe it depends on the field I am working in, but I can't really run stuff on any laptop or desktop unless it is a > 5000 Euro workstation with plenty of RAM, storage, and a beefy GPU. The value that the Mac platform provides for me are the many applications that Chromebooks/Linux don't have good substitutes for, such as OmniGraffle, Deckset, Things, Microsoft Office, Little Snitch, Pixelmator, Acorn, Lightroom, and Affinity Designer. So, it's not just an expensive terminal ;).

I have one because of iOS development. The touch bar is stupid, but easily ignored. Touch ID is actually pretty awesome, and using it for sudo is pretty convenient, especially in a group setting where I’m nervous about people noticing keystrokes (e.g., corporate hackathons and stuff).
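For anyone wanting the same setup: Touch ID for sudo is, to my knowledge, a one-line PAM change on macOS (the file is typically reset by OS updates, so the line has to be re-added afterwards):

```
# /etc/pam.d/sudo — add above the existing auth entries
auth       sufficient     pam_tid.so
```

With that in place, a fingerprint satisfies the sudo prompt and it falls through to the normal password modules if Touch ID fails.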

But the memory is noticeable. For general performance, but also for running lots of vms and stuff in different configurations.

There are still situations where it's convenient to spin stuff up locally rather than in the cloud.

No. The MacBook Pro as implemented is a prosumer machine at best. A Pro machine would have higher-end components at the BTO level: 1) larger size for greater battery capacity, 2) Xeon-D as an option, 3) 32-64 GB of RAM, 4) high-end graphics, 5) a better keyboard, and user-replaceable drive and memory. Apple has the capital to build a halo laptop, just as they do for phones. They simply chose not to.

He probably wants a portable Mac that the user can upgrade RAM/SSD/battery with a non fluffy keyboard. Yeah, that ship has already sailed.

Fluffy is about the worst adjective imaginable to describe Apple's butterfly keys. They are crisp and sharp, though lacking in tactile response.

Agreed, I wasn't clear. I meant the touch bar.
