New Mac Pro Tech Specs (apple.com)



Still don't understand the justification for this machine. I waited years for a Mac Pro, but when it became clear that Apple actually considered it a dead product and was building the iMac Pro behind the scenes to fill the niche, I switched to Windows and now run a dual 1080 Ti workstation (it uses CUDA for path-traced 3D rendering) for around half the price of an iMac Pro.

Apple bloggers said time and time again that the reason they were not making a tower was that there wasn't really a market for one anymore, yet when they finally made it they decided to build something that only really serves the highest of high-end video editors.

Completely ignoring 3D, mid-range video editors, developers who need high core counts + ECC, deep learning, etc.

Before they made it, we kept being told "There isn't enough of you to justify them making it."

They finally make it and the narrative turns into "It's not for you, it's for people who edit Marvel movies."


Apple didn't say those things. Like you said, it was bloggers and critics, people whose interest is in making you read about Apple and keep clicking on articles, who made up all these different reasons why Apple would or wouldn't make a tower.

I don’t really think Apple cares about that drama. I think they know they have customers who will pay the premium. I think the machine is squarely aimed at businesses making the purchase, not consumers.

Is it really a problem that the Mac Pro is only for ultra high end users? Apple hasn't made a mid-range consumer tower in over a decade now. If you walk into Best Buy, how many towers do you think they're selling compared to laptops?

That’s the market Apple sells in, not the low-margin custom built PC parts market.

Apple doesn’t cover every use case of the personal computer. They are just one OEM. Unfortunately if you like macOS they are the only OEM.

As far as the mid-range video market that you talk about, in what way is the iMac Pro (which has ECC memory) or a high-spec iMac insufficient for that task? Sure, it's not as nice as your dual 1080 Ti setup, but also, Nvidia isn't actually a viable option for Apple anymore thanks to their disastrous support for the platform in the past. If you made a Hackintosh system with Nvidia you'd still be SOL. You aren't getting CUDA on Mac no matter what hardware configuration Apple comes out with. Is Metal supposed to cover that use case and compete with CUDA?


Find me a comparable computer that:

- Ships with anything other than Windows (Linux, OSX, BSD, etc)

- Has a corporate warranty

You'll quickly find that Apple has this market cornered. To many people, you're not paying the extra money for the goodness of Apple; you're paying to avoid the badness of Windows. There's also some software that works on Macs but not Linux machines, which may be necessary for the job.


"Find me a comparable computer that:

- Ships with anything other than Windows (Linux, OSX, BSD, etc)

- Has a corporate warranty"

Support for mac-only software notwithstanding, Dell's workstations officially support RHEL, have Nvidia GPUs for CUDA workloads, and come with up to five years warranty with on-site service. You can probably find comparable HP Z-series workstations too.


>-Has a corporate warranty

Has Apple finally started to offer one comparable to the Big Three? (SLAs, Onsite service with guaranteed reaction times and HW replacements and so on) Serious question - this was actually a big argument against Pro Apple workstations in the past.

>-Ships with anything other than Windows

Well, you won't get macOS of course, but all big workstation manufacturers sell workstations with Linux preinstalled. It's really nothing unusual and hasn't been for quite some time.


In at least some cases, yes.

https://support.apple.com/applecare-enterprise-service

I do not know what the criteria are to be able to get on that program, though.


Huh, consider me (cautiously) impressed. (And slightly ashamed, since I completely failed to find this information before writing my post. :) )

At least at first glance, it seems they have learned their lesson in this area.


Not really comparable if it's only US and Canada.


Dell will ship you a dual-socket workstation with either Red Hat or Ubuntu; even better, you can get it with an Nvidia GPU.


Apple’s warranty service is terrible compared to the alternatives like Dell. A couple years ago I had a failed key on a laptop with a service warranty. Dell had a tech in my living room with the repair parts in less than 24 hours.

Apple could not possibly do that today. Maybe you don’t need it, but if you care about warranty service Apple is not the answer



> You're not paying the extra money for the goodness of Apple

> You're paying to avoid the badness of Windows

Are you telling me there is a market for an intermediate OS? Because I would be ready to pay about $250/year/user for a Linux that is as good as Mac, but with cheaper and more maintainable gear than the iMac. It's almost as if Apple were trying to tell us there's an intermediate market up for grabs, but they're still too close to it for any incumbent to try their luck. Canonical was close to it, but stuck to the wrong business model and decided to switch to Unity in 2013 instead of stabilizing Ubuntu. Product roadmaps are hard. Jony Ive is available, just saying ;)


Apple service is atrocious. Don't they make you book an appointment and turn up at their store just to get someone to look at it? I hear of people who spend more time taking their laptops to the local Apple store than most people with major illnesses spend going to the doctor...

I bought a Dell XPS 13 laptop recently which unfortunately had a non-functioning motherboard. I contacted Dell and a technician came to my house first thing the next day and replaced it, no questions asked. Totally hassle-free. I'd take that any day over having to book an appointment to see a 'Genius'.


> Apple service is atrocious. Don't they make you book an appointment and turn up at their store just to get someone to look at it?

I had a macbook pro with a logic board that died. I phoned Apple, they couriered me a replacement device next day, and that courier picked up my old device. Literally couldn't ask for better service.


Seriously? My friend was forced to drag his 27 inch iMac into the Woodfield shopping mall location, which if you've ever been there, is a quarter mile minimum walk from the parking lot to the store.


He probably chose to do that. You can't force someone to physically appear in your store. What if they're disabled?


Pedantry. They didn't give him an option besides physically bringing it into the store.

Good question about accessibility. I have no idea. But it's not as though our society is a perfect utopia for the disabled. I can only imagine it would have gone far worse.


Reading the comment that you replied to, I was just thinking that this goes one of two ways. Some people seem to get wonderful service, others get crap for a relatively high price.


To be fair, we experimented with a Dell XPS 13 laptop that had a succession of problems, and the service was the worst I have ever encountered in IT, taking several months of elapsed time before we finally got an on-site visit from someone who knew what they were doing (who then fixed the laptop in under an hour). That was what "next day" level support actually looked like in our case.

We were attracted to these as there was a Linux version that potentially offered a good alternative to Windows 10 for some of our people, but the experience was so bad that instead we immediately ruled Dell out as a supplier for any serious equipment for the foreseeable future.


Interesting. I've never had an issue with getting a next day tech to our office from Dell.


I think we just got stuck in endless loops of tier 1 support people trying to run through checklists over the phone and then again via email. It was essentially the business class version of "Have you tried switching it off and on again?" repeated seemingly endlessly, bouncing from one support tech to another. Clearly several of the techs didn't even understand that they also sold these laptops with Linux on them and that's what we had.

Eventually, literally months later, someone finally seemed to escalate it to a person with the authority to send out a technician, who as mentioned before then fixed the actual problem in barely any time at all. We were on the point of just writing off the machine by then, as the amount of time we were wasting dealing with Dell was in danger of costing more than just buying a new box, and at least we would have been reasonably confident of having a working system the next day in that case!


> Apple service is atrocious. Don't they make you book an appointment and turn up at their store just to get someone to look at it?

I can't speak for anyone else, but the last time I had trouble with my MacBook (five years or more ago now, the touchpad had cracked, IIRC), I made an appointment in the morning to come in in the afternoon. Walked up, explained the problem, gave them the machine, and they called me back to pick it up a few hours later. That's not bad, IMO.

Home service is awesome, but unless things have changed recently, I don't think that's common. It also may be different for wear-and-tear fixes vs. DOA replacements; Dell has a strong interest in fixing the latter as quickly as possible to protect their reputation. Most companies, most of the time, expect you to come to them to get service.


I've dealt with both Lenovo and Dell with on-site next day service. I thought it was a standard option across the industry for business PC warranty.


It may be. I was thinking of standard warranties, though. I'd expect a business warranty to be different.


They also will ship you a box and ship it to a service center; I've been shocked at the turnaround time on that even with major service. From me getting the box to getting the laptop back, it was just two days (I ship on day one, they get it overnight on day two, they ship back out with a new logic board and cables, then I get it on day three).


>- Ships with anything other than Windows (Linux, OSX, BSD, etc)

It doesn't exactly sound like it would be any kind of a problem for you to install your own OS if you're willing to accept a manufacturer supplied BSD.


Dell does this.


HP offers Linux on their Z workstations, which have been reliable workhorses for decades. I believe Dell does, too.


Hmm, well I'd honestly do just about anything I could to avoid macOS. If only I could compile iOS apps without it, I wouldn't have to deal with their backwards and incapable UI.

Hopefully soon I'll be able to use VSCode to remotely work on my Mac though! - https://github.com/microsoft/vscode-remote-release/issues/24

> You'll quickly find that Apple has this market cornered.

No they don't. Not the market for people who just need a corporate Unix box. At my consultancy we have a mix of machines, with many people running a System76 tower or laptop, and you'll find plenty of folks here on HN who will name Dell, Lenovo or HP as their supplier.

Perhaps you're thinking of just the market for people who do the absolute highest end video work for the film industry? Other than that, I don't see it.

And Windows has been rock solid for a massive amount of users and developers of various types since Windows 2000.


>> I wouldn't have to deal with their backwards and incapable UI.

Well, that’s, like, your opinion, man...

No, seriously, I feel the same way about Windows’ UI. I mean, did you ever use Windows 8? And Windows 10 has built-in ads by default in the UI/UX?

Windows XP SP3 was peak Windows, IMHO - the awful part about them making an abortive mess bastard child of the UI/UX in 8 was that there are millions of non-techie people who literally know how to follow one sequence of events on their computer, and that usually starts with ‘press start’.

Windows 8’s awfulness is probably what drove a lot of those people to iPads. If you’re learning a new user interface idiom anyways, and even Microsoft Office is on the iPad; why stick with Windows?


I don't know anybody that does serious work on iPads. Every quarter there are only ~9 million iPads sold for every ~34 million Lenovo/HP/Dell laptops and half of those iPads have got to be for kids from what I can see.

Sure, an iPad is fine for consuming documents and doing light markup. However, if you're going to create things you're going to need to be multi-tasking and - hey, I have multiple iPads and I use them all the time - but I'd love to have a contest between what I can get done with Windows/Linux versus what you can do on an iPad because there's just no comparison as far as I can see.

I'll add macOS to that too. It's not even in the same class as Windows/Linux. I watch people using macOS daily and I swear, they are constantly swiping to find that full-screen app they lost because of the complete lack of window management in macOS. They'll put Chrome into full screen and then struggle to get the detached devtools window back up. They'll have to install things like iTerm, with its own tiling manager, to manage 3 terminal windows.

Apple just doesn't care about practical things. They are constantly focusing on how things look, how thin or light they are, or how they can make the most amount of money by removing options and claiming everything is always better that way, when really it just serves to reduce the amount of work they have to do to support things like, you know, physical buttons, headphone jacks, options/modes in software and so forth.

Anyway, the UI in Windows 8 and 10 was also completely configurable to make it more like the original Windows UI. If you don't like the default configuration you can change it, or install one program (7+ Taskbar Tweaker) to make it just about perfect. What I really, really like is being able to do things the way I want to do them and not the way some godless corporation has decided it should be. Apple just gives you nearly zero choice compared to Windows and, most obviously, Linux. They're just on the wrong end of the spectrum for how I like to do things.

And I never got any ads on Windows - just pre-installed apps like Candy Crush and Skype. I'm assuming they installed Candy Crush because it's a lot, lot more popular than Solitaire or Minesweeper with today's crowd. This is no different than Apple pre-installing things on macOS/iOS. And before anyone says anything about Apple not pre-installing 3rd party software...I think that's incorrect. If you want to use any of the Unix aspects of macOS, you have to start off with Apple's lame and old versions of even basic Unix utilities and programming environments until you go and install some other 3rd party things to fix the situation. That's way worse than having to right-click a Candy Crush icon to remove it once IMO.

Pressing Start is one popular way of starting a program on Windows, so I don't understand that line of argumentation.


I don't see the contradiction here honestly.

The only tangible market for a Mac Pro is professional Final Cut users in a professional setting, no?

If this is hogwash, tell me so, but it just seems that any other realistic scenario that requires this level of hardware (like research, rendering and AI) would be significantly cheaper and better supported outside of Apple's ecosystem.

Short of having a pretty device to sit in a studio, what other reason is there for this to exist? (And how much of that audience is more likely to just buy iMac Pros).

As regards development - most development tasks won't significantly benefit from the performance offered here, and anyone who needs that performance is likely going to buy something with significantly better value for money (as regards tech specs) than a Mac Pro.

I'm not aware of any significant tools for 3d modelling or video editing (besides FCP) that are OSX-exclusive, and that audience is surely better served by a much cheaper Windows/Linux machine.


There aren't any serious edit houses that use FCP X outside of gimmick advertising deals like some late night shows. Everyone is using Avid or increasingly Premiere anyway.

In my experience, the past five years have seen a dramatic shift to windows in professional facilities (Oscar winning editors).

I do know one very high end editor who cuts on a Mac mini. The old school guys are used to proxy workflows and you don't need lots of power for that anyway.


Unfortunately they will probably point to the poor sales of this model as justification that this pro tower market really is dead. "Sorry guys, for some reason only Marvel editors bought these, so we're canning the line."


I kind of feel that Apple needs its own pragmatic Satya Nadella to regain relevance in many niche applications. Sure, Apple is nowadays mainly a consumer company, but what is the problem with having a competitive professional line as well?


Profit? And that the audience is limited.

If I have a resource heavy problem, I can solve it for a fraction of the price outside the Apple Ecosystem.

Who buys these? When you look to the source you will understand.


Were people editing Marvel movies really waiting a decade for this though? Pretty confident they moved on to PCs years ago. I struggle to find the target audience for this, now that they are alienating home users.


>Were people editing Marvel movies really waiting a decade for this though?

Given the number of posts telling people in Hollywood not to restart their trashcan Mac Pros because of the Google Raven screw-up, I would say that yes, in fact, plenty of people in Hollywood are using Mac Pros, and likely will buy this new one.

> I struggle to find the target audience for this, now that they are alienating home users

This machine has NOTHING, and I mean nothing to do with home users.

Next you'll tell me that Tesla have alienated "normal" car buyers because they make a $100k P100D rocket ship. Tesla also make a $35k regular sedan. Apple also make much cheaper iMacs, MacBooks and MacBook Airs for home buyers.

I don't understand why people time and time again bash Apple for making something that isn't in any way designed for "home users", while they still make plenty of things that are.


The base price has more than doubled from the original cheese grater we all fell in love with. I'd be willing to bet the home user Mac Pro community is much larger than Hollywood. People have been clamoring for an updated cheese grater for the better part of a decade. They are STILL actively developing hacks to keep their 2008/2009 models running. Shit, I just retired mine this year in favor of a Hackintosh, because I'm not dumping $10,000 into a modular Mac.

I think this new Mac Pro is going to be a huge failure. Professionals are running PCs now, and home users won't spend the money.


Many of the home users who were buying $3000 base model cheese grater Mac Pros could get by now with a 6-core i7 Mac Mini.

Price would be around $1300 for the computer, $300 for an eGPU enclosure, and $700ish for a Radeon R7, plus aftermarket RAM. AMD's not in a great spot for high end GPUs right now, but when the Navi 23 cards land next year it will be looking better.

This doesn't scale as well for multi GPU machine learning workloads and Apple needs to get over their shit with Nvidia, but as a lower end "modular" Mac than the $6000 cheese grater 2019, it's an option.


> Price would be around $1300 for the computer, $300 for an eGPU enclosure, and $700ish for a Radeon R7, plus aftermarket RAM. AMD's not in a great spot for high end GPUs right now, but when the Navi 23 cards land next year it will be looking better.

I used to want an eGPU. Then I learned that you need to disable SIP in order to do so...

> This doesn't scale as well for multi GPU machine learning workloads and Apple needs to get over their shit with Nvidia, but as a lower end "modular" Mac than the $6000 cheese grater 2019, it's an option.

No one is going to wait for that hypothetical future where MacOS supports Nvidia GPUs.


> I used to want an eGPU. Then I learned that you need to disable SIP in order to do so...

I hadn’t heard that. A quick search suggests it’s only necessary for loading Nvidia kexts or using TB2 https://www.reddit.com/r/eGPU/comments/8lybin/macos_egpu_wit...
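
If you want to check before buying an enclosure: SIP status is readable from a normal Terminal session, though changing it requires booting into recovery (Cmd-R at startup). A quick sketch:

    $ csrutil status
    System Integrity Protection status: enabled.
    # csrutil disable only works from the recovery environment;
    # per the link above, you may not need to touch it at all for TB3 + AMD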

> No one is going to wait for that hypothetical future where MacOS supports Nvidia GPUs.

True, but their Navi cards are at least competitive in the price ranges where they exist. Hopefully the high end ones next year continue that. If you’re looking at Titan or whatever the current ML thing is in the $1000+ range, then you might be stuck with Nvidia.


> I hadn’t heard that. A quick search suggests it’s only necessary for loading Nvidia kexts or using TB2 https://www.reddit.com/r/eGPU/comments/8lybin/macos_egpu_wit....

Interesting. I might ask around to verify this.

> True, but their Navi cards are at least competitive in the price ranges where they exist. Hopefully the high end ones next year continue that. If you’re looking at Titan or whatever the current ML thing is in the $1000+ range, then you might be stuck with Nvidia.

I am rooting for AMD's Navi cards too. It's just unfortunate that CUDA seems to be better supported than OpenCL.


Agreed on that. AMD put a bunch of work into the Cycles rendering engine (for Blender) to get their OpenCL support up to par with CUDA, and now it's completely disabled in the Mac version thanks to Apple deprecating it. Disappointing.

https://developer.blender.org/rBbb0d812d98


You could, sure, but people buying Mac Pros bought them because they wanted modularity. A glorified laptop attached to a giant monitor isn't an option.


My Windows machine is admittedly more modular than a mini+eGPU would be. I can pull the CPU out and put in a new one whenever I want! But over the course of 12 years and 3 computer builds, I've never done that once. By the time there's a noticeable CPU upgrade available I'd need a new motherboard to go with it.

So I think there's a big segment of the "modular" market that only really cares about having GPU options and upgradeable RAM.

It's not for everyone, but the people in between the high end Mac Mini (6-core i7 + thunderbolt GPU) and the low end Mac Pro (8-core Xeon W and internal expansion slots) are a small enough slice that Apple doesn't care.


Missed the edit window but for GPU I meant the Radeon VII not R7. Double checking benchmarks, the RX 5700 XT compares pretty well to that, but 2nd gen Navi will have a new higher end card.


You can only focus on so many products... I don't see why it's confusing when people are upset by what they choose to focus on


Yea, I remember when PowerMacs could be had at a wide variety of price points. That was nice.

If Apple doesn't want to serve the "I need a decently powerful machine but don't want to waste money on what is basically a status symbol" market, maybe they should license MacOS to someone who does. Something like that might actually bring me back to the platform. As it is, I stick to PCs running Linux.


Yeah, I've waited years for this machine, was excited when it came out, and in an epiphany last night, will probably pass on it and opt for a beefed-up MacBook Pro instead. It will be cheaper and fill my needs.

I'm a programmer. If I want to prototype a multi-component stack, I've got a Raspberry Pi cluster running k8s for that. If I want to play around with deep learning, I'll just spin up EC2 instances. Games? Let's be honest. That war was lost decades ago. I've got a good PC rig for that now. The days of a general machine to do everything are gone.

The only issue is the storage space, which I can get from a multi-bay drive enclosure.


> Let's be honest. That war was lost decades ago. I've got a good PC rig for that now. The days of a general machine to do everything are gone.

Businesses listen when their bottom line is hit.


They used to do this and it almost led to the death of the company. I'm sure they're quite hesitant to try again. Instead, you can look into the Hackintosh project.


They’re making money, now - they have no reason to license MacOS, and all the more reason not to.

Apple’s hardware/software combination is the reason a lot of us have stuck with them for so long. It’s quality control. No drivers to fuss with; just plug in and go.


>It's quality control

My MacBook's keyboard disagrees


Yes. The mighty have certainly fallen since 2015.


That was back when Apple was primarily a desktop computer manufacturer. In any case, they could place restrictions on licensed clone makers to avoid cannibalizing sales of their more popular products (e.g., no laptops).


> ... I switched to windows ...

Some people do not want to do that.


I didn't do it lightly; I hadn't owned a Windows machine in 17 years.

But as I said, a dual-GPU machine for half the price of an iMac Pro that can run Octane Render [0] (which requires Nvidia cards with CUDA) left me asking why I didn't do this years ago. After using it for 15 minutes I couldn't go back to how I did my work on my old Mac.

If Apple wanted my business I would have given it to them but they were pretty insistent that they had no interest in my money.

[0] https://www.youtube.com/watch?v=QE83n0qhe48


I agree! I find Windows quite cumbersome and unpleasant to use and would rather pay more for macOS; not because it is a status symbol, but because it is simply a better product for me.


I find macOS cumbersome and unpleasant to use and would rather pay more for windows or linux.


Isn't it wonderful that you can have an opinion that works for you too?


It's sad that Windows users have a plethora of hardware options available to them while Mac users are stuck with few options.


I need a Mac where I can replace the hard drive when it fails after 18 months, like Mac drives always do. I'd also like a bay for a decent optical drive (not the low-quality Apple SuperDrive, which they finally gave up on), and the ability to add a card so I can have more than 1 USB port, which is ridiculous.

And it should cost around $500, not $12,000.

$500 with what I request is certainly available on the PC side of things; I have that and run Linux on it, and have transitioned everything I can off Apple because they can't produce the desktop that I need. I've put Mac versions of my products on bug-fix-only status, and when customers ask me when the new version comes out I tell them never and advise them to try Linux.


Doesn't the new top tier Mac Mini fill a lot of the role the trash can Mac Pro did? You can hook up GPUs to it with the thunderbolt ports. I think the top model is faster than the trash can Mac Pro.


> You can hook up GPUs to it with the thunderbolt ports

Thunderbolt GPU cases are hundreds of dollars.

For a dual GPU Mac Mini setup (which would be a mass of cables and require 3 outlet plugs) you could afford a 3 GPU Windows tower.

As someone who used to be a pretty extreme Mac evangelist I really did look into all this before I made the switch to a Windows machine.


Sure, Windows machines have always been cheaper for similar specs and performance. The Mac Mini and GPU setup is way cheaper than the new Mac Pro and hits a bunch of the use cases (development, machine learning, gaming) that people wanted a Mac Pro for.


The current and last generation of Nvidia GPUs are not officially supported by macOS. How is that good for use cases which use CUDA cores?


I think they should include AppleCare with their professional equipment. This won't put any more money in Apple's pocket, but I could see it being another differentiator and a good bit of PR, especially against all of those price arguments. Plus, it's only $300 and I'm sure Apple could suffer that "loss" on a $6,000 - $30,000 purchase.


I've always argued that, for the prices Apple charges, AppleCare should be included with all its products.

The first Apple computers I bought lasted for years and I never thought Apple Care was necessary, but since the fiasco with the 2011 MBP and the butterfly keyboards I'm not buying another of its products without Apple Care. Yes, Apple ended up doing the right thing in both cases but it took years after the problems started and a couple of class action lawsuits.


It presumably is in some areas. In New Zealand we have The Consumer Guarantees Act. Things are expected to have a reasonable life span. What this means is not specified but is generally accepted to be relative to the price paid (eg if it’s expensive, it should last). It’s a fantastic piece of legislation. I’ve had a 2.5 year old iPhone covered for example.


I believe the parent meant that Apple charges enough (in the US) that it should have the extended guarantee included there. In NZ, things cost extra due to that law, correct? The difference is that everybody is paying for that extra protection; there's no choice.


If you have this protection, it's like insurance. You, and everybody else, pay an insignificant amount of money and you get much more extensive treatment in case of failure. Sure you have a choice: back the politicians who are against this.

I don't get why freedom-loving people would rather have a corporation take away their freedoms than a democratic, self-governed body. Both are power clusters, but with one you have moral ground to criticize it when it's acting against the populace's interests (democratic government). Abuses of power by corporations, meanwhile, cannot be argued against morally: you don't own them, they are playing by the rules, they manage to make a buck, all is above board morally, while in actuality masses of people might be hurt by corporate abuses of power.

Freedom is extremely important, but it sometimes conflicts with other values. A reasonable discussion of the conflict and tradeoffs is a much better way to improve life for people than fanatically protecting the one single value you hold most dear.


We do have very high retail mark ups. Perhaps this is due to that law. But it is very nice protection to live under.


Aside from the additional taxation and regulations in NZ, it does have some pretty unavoidable issues which drive up the prices of consumer goods. It’s a very small market, and very far away. Getting things there is just going to cost more anyway, and the market isn’t big enough to invite significant competition, so there’s usually a large (or at least larger than a lot of other places) markup on imported goods.


Is it really far from China compared to US/Europe?


It’s not just far, it’s far away from everything, including all the major trade routes. In Europe and Asia, a lot more goods are produced close by, and if you ship a product to any country in Europe or Asia, there’s a lot more people to buy it, and a lot more markets close by.


And there are lots of regulations too. For example I sold an oil burner light on eBay to someone in NZ (didn't actually know as there was a UK address that forwarded the package) and I included some oils for free...

Big mistake. Package was returned and I had to remove them.

My only point being it's a poor port in part because of its delicate ecosystem, which necessitates very strict regulations.


My friend living in NZ has a fantastic quote about the distance:

> NZ is a fantastic place. It takes 24 hours to get anywhere around the world.

Yep, it's that far.


It's far, but 24 hrs is stretching reality a bit, e.g.:

~12 hrs to Santiago, Chile; ~9 hrs to Bali, Indonesia; ~13 hrs to Los Angeles, USA; ~4 hrs to Sydney, Australia.


Add a layover and you could easily be at 20+ hours (which is common since NZ is small so there aren’t that many flights). Europe is well over 24 hours even to the major hubs.


Granted, however the flights I listed are all non-stop.


There’s a non-stop Auckland > Denpasar flight?


> Yep, it's that far.

It'll be handy when the zombie apocalypse occurs though: it'll be quite the swim for any ghouls.


I don't live in the US. I live in Mexico where Apple products are on average about 20% more expensive when converting currencies.

If you compare the value of the peso in Mexico vs the dollar in the US, Apple prices are in comparison much more expensive than in the US.


>relative to the price paid

A TV set you get on Black Friday works until Xmas. It was a bargain, what did you expect?

That discount you got on your car? You think it was because of your outstanding negotiation skills? Think again, the seller swapped in a cheaper bifurcator, meaning your mileage will suffer. But is it wrong? You got a discount.

If you sell it, it needs to be good. If it is not good, you cannot sell it. There's no price in question.


In Norway, it's not based on the price, but the product type. So TVs in general are said to last at least 5 years. So if you sell a cheap TV that breaks after 4 years, you have to replace it.

This of course means one doesn't get as much ultra-cheap stuff sold, but at the same time most of that stuff is crap, and it's better for the environment to build stuff that lasts.


In most European countries it doesn't matter if it was a bargain or not, warranty law is the same for everyone.


I don't think Apple should necessarily include coverage for user-caused damage as standard, but having only a single-year warranty on manufacturer defects is kinda shameful for a premium product. I expect premium products across most product categories to carry a three or five year warranty as standard.

A single-year warranty is in general an indicator that a) the manufacturer doesn't have faith in reliability past the first year and b) the manufacturer is begrudgingly just meeting the minimum standard for warranties in most jurisdictions. Even if neither of these are the case, the optics aren't good.


>A single-year warranty is in general an indicator that a) the manufacturer doesn't have faith in reliability past the first year and b) the manufacturer is begrudgingly just meeting the minimum standard for warranties in most jurisdictions. Even if neither of these are the case, the optics aren't good.

Dell's Precision workstations come with a 3-year warranty, upgradable to 5 years of coverage with on-site service. There is no reason Apple can't match that.


This is a really good thought and I would be OK with Apple extending the warranty on "professional" products instead of full Apple Care.


I live in Indonesia where there is no Apple store and where time for repair of an issue such as keyboard replacement is six weeks.

This wasn't an issue years ago because reliability wasn't an issue. My ~2005 30" Apple Cinema Display is going strong connected to my 2010 Mac mini. My 2017 MBP, however, has been the most unreliable hardware I've ever had (my first computer was a 286). Six weeks for repairing a pro device with a hefty premium price.


Still using my 2004 20" ACD.


AppleCare should not cover things that are flaws in the design caused by Apple.

These types of design flaws should be fixed as part of their normal warranty process.


In Europe a lot of products have to include 2 years warranty by default, which makes Apple Care a lot less attractive. You're basically paying the same price for just one more year instead of two.


Not every EU country has such a law. In the UK 1-year long warranties are standard, but some other states mandate 2 years on all electronic devices.

Unless of course you are talking about the EU-wide consumer protection rights, which apply for 2-6 years after purchase, but which people very mistakenly call a guarantee. The problem with this protection is that it protects you only against manufacturing defects. And for anything that happens after the initial 6 months, it is up to you to prove that it existed at the time of purchase. So no, if your MacBook suddenly dies 23 months after purchase, Apple doesn't have to repair it, unless you can demonstrate to them that it died because of a manufacturing defect. In comparison, AppleCare would get your laptop replaced under the same circumstances.


We have the CRA 2015 here in the UK. 6 years on everything. Apple ask me what I want to claim under when I take my stuff in there. I only need AppleCare for when I break something :)


2 years for people, 1 year for businesses. That includes self employed people if they want to enter it as an expense in the accounting books (I'm not sure about the terms in English, I hope you understand what I mean.)


I'm wary about AppleCare BECAUSE of the MBP keyboard incident. They refused to replace mine or my wife's. I have purchased well into the 6 figures of Apple devices over the decades.

p.s. Switched to Windows when I bought my laptop for that reason.


What is the MBP keyboard incident?


I hear you. My current strategy, unfortunately, is to buy a new MBP and sell it at exactly 23 months, so that I can advertise "over 1 year left of AppleCare".


Insurance is often a form of self-selection bias: people conscious about damaging their property are those more likely to take care of it. AppleCare for everyone could dramatically shift its profitability.


But then they would have to either extend that to all MacBook Pros, iPad Pros and iPhone 11 Pros for consistency, or admit that these devices only got the "Pro" slapped onto the name for marketing.


> I think they should include Apple Care for their professional equipment. This won't put any any more money in Apple's pocket but I could see it being another differentiator and a good bit of PR, especially to all of those price arguments.

Do Apple hardware failures happen often?

In ~20 years of assembling PCs with consumer grade parts I've only had 2 legit non-keyboard/mouse hardware failures (3 if you count lightning destroying a power supply before I knew about surge protectors). For context, I've had 0 hardware failures in the last 10 years. I keep my computers on 24/7 too.


Funny, in about the same period, ~27 years or so, I've seen enough PSU failures alone that I always use a UPS. I've seen RAM fail a couple of times, and quite a few motherboards die in the early '00s. A motherboard that was DOA and a video card that died after a year. Oh, and a first-gen Intel SSD that suddenly thought it was an 8MB drive.

I still prefer to DIY for value... my 4790K build lasted 5 years; I just put together a new one, with a placeholder CPU while waiting for the 3950X. It will probably, again, last 5 years or so.


What's really funny is in the late 1990s and early 2000s I always overclocked my systems because it gave a noticeable FPS boost in the games I played. Dual Celerons (physical chips) running 366@550 (a classic) and then a Pentium III 733@900 which was a huge upgrade, especially for a consistent 125 FPS in Quake 2 / 3.

My 2 legit failures were an old IDE HDD that audibly clicked like crazy for months before failing completely. Since it gave such a long warning I was able to back everything up thankfully. The 2nd failure was a different power supply that just stopped working without any notice or grand events. I never had any RAM or video cards go bad, but after assembling a new machine I always run them through rigorous automated tests to help detect faults. I've had 1 stick of RAM be DOA but I don't count that as a failure.

My current system is an i5 4460 (3.2ghz) with a bunch of other components. It's been going 5ish years now without issues and I work this thing pretty hard. Full time development, video recording and editing, gaming, etc..


Oh yeah... I've also had a few HDDs fail too... Two that were in Raid-1 within a week or so of each other (before the RMA for the first drive came back).

Aside, kernel (5.3) and drivers (mesa 19.3) are all updated, can finally drop in the rx5700xt card I'd been sitting on and play with it. Amazing how many things are working without issue via Steam's Proton (Wine, dxvk, etc)



It goes both ways. If you can afford to spend this (crazy) amount of money, you can pay for Apple Care as well.


Why would they price it so low, and not at 10%?


A great machine to be sure for high end content creation but Apple is not chasing the deep learning dev market because of the lack of CUDA. That market is better served by Linux boxes with appropriate GPUs for TensorFlow, PyTorch, etc. support.


Nvidia really has the GPU market under their thumb. It would be one thing if their pitch was "We're not as good as AMD for games, but we've got CUDA for professionals and researchers!". At least that way there could be some sort of segmentation and I could tell people shopping for GPUs to "get the right one for what you want to do!"

But that's not the world we live in. Nvidia's pitch is "We make the only hardware that runs the framework used by almost all deep learning research and media creation software, and we're also the only folks operating at our level when it comes to high-end video game graphics, and if our high-end cards are too expensive for you, we have cheaper ones. And when our competitors start to think they can catch up, we'll drop RTX and Tensor Cores on their face."

AMD seems stuck on "We have great value middle-to-high-tier video game graphics cards" at best. I have no idea how they can get out from under that rock. They've been turning the CPU market upside down and smashing Intel's once proud "near monopoly" status. Nvidia seems like exactly the kind of prideful company that would be poised to fall, but I have no idea how AMD could make it happen.


I'd guess that within 2 years AMD will be competitive with Nvidia at the high end. Their strategy is a bit different: they're delivering good value at 2/3 of the way up the performance graph... not as big a markup as a $1200 RTX 2080 Ti, but the RX 5700 XT is decent, especially for the price.

Most people aren't spending over $500 on a GPU, so they get the volume sales. The better aftermarket cards have been selling out pretty consistently. And the longer term strategies are similar to how they approached Ryzen. So, I'd think that Navi can definitely succeed in that light.

The real lock-in for Nvidia is all the existing CUDA development. Intel and AMD will have a lot of work to do; oneAPI may help there, so long as Intel and AMD can work together without Intel's frequent and weird lock-in attempts.


Yes, because nobody replicated the pragmatism and power of CUDA. OpenCL is much uglier and lower level. So AMD decided to do something about that... and invented ROCm, which is somehow even uglier and more low-level! A reference FFT implementation in CUDA is about 150 lines of code... and it is almost 10 times more in ROCm.

It was always about software. Granted, CUDA is not the best and most elegant platform in the world... but AMD seems not to be able to reach even that level.
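
To make the ergonomics point concrete, here is roughly what "pragmatic" looks like: a complete CUDA program (a minimal SAXPY sketch, not the FFT above) where the kernel, the launch, and memory management fit in about 20 lines, with none of the platform/context/queue boilerplate OpenCL demands before a single kernel can run.

    // saxpy.cu -- y = a*x + y, the "hello world" of GPU compute
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        cudaMallocManaged(&x, n * sizeof(float));  // unified memory: no manual host/device copies
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // one thread per element
        cudaDeviceSynchronize();

        printf("y[0] = %f\n", y[0]);  // expect 4.0
        cudaFree(x);
        cudaFree(y);
    }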


Nvidia is still better for games than AMD. Haven't seen the GPU that beats them.


Isn’t that only if you ignore price? If you only have a limited budget, then AMD will get you better performance. For absolute maximum performance at any cost AMD has nothing that can beat Nvidia though.


It depends on your electricity cost however.


so all the high margin business goes to nvidia you're saying...


He's saying that the performance gain may not be worth the huge price tag. It's similar to Backblaze, which decided to use consumer-grade hard drives for their storage solution rather than pro-grade drives because the cost increase was not worth it.


> ... rather than pro grade drives because cost increase was not worth it.

Or that it was cheaper to get reliability at another layer of the stack (rather than hardware).


This will probably only get more difficult as a bunch of AMD GPU engineers defected to Intel.

I'd still likely only go AMD GPUs in the near future, just because I don't have particularly demanding GPU requirements and they have better Linux drivers.


The worst thing is that macOS will likely not have Nvidia drivers anytime soon.

Lose-lose for customers.


Meh, just buy AMD because Nvidia sux. For gaming, the price to framerate comparison is so close anyhow.


Exactly, and to me it's a deal breaker.

Anyone privy to what holds Apple back from supporting Nvidia? The only info I read was that it was due to some earlier bad blood, but I wonder why you'd hurt your users just because of some old scuffles...


I've heard it's down to patent disputes. Wouldn't surprise me.


Lawyers.

They probably each want too much money.


Bingo. Although it looks clear that there is a lot of movement towards broader support of AMD in DL via ROCm: https://rocm-documentation.readthedocs.io/en/latest/Deep_lea...


A reference FFT implementation in CUDA: about 150 lines of code. A reference FFT implementation in ROCm: about 1200 lines of much uglier code using advanced C++ idioms.

So no, it is not a replacement, again. Elegance matters.


I'm struggling to think of any super-high-end content studio that hasn't moved on by this point. All these tools are available on Windows, and no one still in business was waiting 12 years for a worthy upgrade.


I don't know about the compute capability of the GPUs on this Mac Pro, but if one were to write CUDA-like programs, HIP is basically the same API as CUDA.

For an engineer or computational scientist, I think this is not a bad investment.

But yes, existing CUDA libraries won't work.
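
To make "basically the same API" concrete, here is a minimal SAXPY sketch under HIP. For most runtime calls the port from CUDA is a mechanical cuda*-to-hip* rename (the hipify tools automate it), and kernel code is unchanged. A sketch only, but the calls shown are real HIP API:

    // saxpy_hip.cpp -- note the near 1:1 mapping to the CUDA runtime API
    #include <hip/hip_runtime.h>
    #include <cstdio>

    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // identical kernel code
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        hipMallocManaged(&x, n * sizeof(float));  // cudaMallocManaged -> hipMallocManaged
        hipMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

        hipLaunchKernelGGL(saxpy, dim3((n + 255) / 256), dim3(256), 0, 0,
                           n, 2.0f, x, y);  // args: kernel, grid, block, shared mem, stream
        hipDeviceSynchronize();              // cudaDeviceSynchronize -> hipDeviceSynchronize

        printf("y[0] = %f\n", y[0]);
        hipFree(x);
        hipFree(y);
    }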


With the new MPX module spec, Nvidia should be able to release cards for this.


The hardware isn't the problem, it's the drivers.


Nvidia already produce third party drivers for their Mac cards.


Not for Mojave+, and it’s a bleak outlook.


Slightly off-topic, but who here uses a HEDT (high-end desktop) or workstation computer for software development? Does it make a significant difference in comparison with a standard business laptop?


Yes. I work on a Linux distro, and the amount of time my 1st-gen 32-core Threadripper has saved me is truly mind-blowing. It made it possible to make changes I wouldn't have dreamed of.

As an example, I worked a bunch on our PostgreSQL infrastructure, but we support 5 years of versions. So you have to CI/integrate your changes and test them across all 5 versions, every time you test. My machine could do this in on the order of 2-3 minutes -- recompiling everything from scratch every time, and running all integration tests on all versions. There's no CI system or cost-effective cloud hardware that would have given me turnaround like that. In the end I was doing hundreds of builds over the course of just a few days.
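
To give a rough feel (made-up version list and paths; the real setup differs), the whole matrix can be as dumb as a shell loop, and with 32 cores every branch of it runs concurrently:

    # build and test several PostgreSQL majors in parallel
    for v in 9.4 9.5 9.6 10 11; do
        ( cd postgresql-$v && make -j8 && make check ) > log-$v.txt 2>&1 &
    done
    wait  # failures show up in the per-version logs

(make check is PostgreSQL's standard regression-test target, so each subshell compiles and then tests one major version; 5 builds x 8 jobs each keeps all 32 cores busy.)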

In contrast, at $WORK we have a C codebase that takes < 1 minute to compile on my 4 core work laptop. YMMV.


Why not a dedicated CI server with all this power and a more modest and cheap workstation?


Because then you need to commit your changes, push to a remote Git repo, wait for the CI server to check for changes, wait for the CI server to finish previously scheduled builds, wait for the CI server to start building, ...

Also, all of that needs to be set up. If you need to change some little thing, you're much faster doing it locally, rather than trying to wrap your head around the build server configuration and figure out how to run this one test differently, but only for this branch and not for the normal builds ...


How's kernel support and VM support, such as QEMU, lately? I'm thinking of buying an i7 for embedded OS dev but that sounds very tempting. I heard AMD CPUs had some breaking issues last year.


There was a breaking issue, but it didn't manifest by itself.

It happened when the 2nd-gen TR arrived. It used the same mainboards, so all the manufacturers issued BIOS updates.

Unfortunately, these updates claimed to support SEV (Secure Encrypted Virtualization). Linux of course tried to initialize it at boot/module-load time and the whole thing hung, because TR CPUs do not support SEV; only EPYCs do.

So there were the following fixes:

1) downgrade BIOS back to pre-TR2 version,

2) blacklist the ccp module, which would make kvm_amd non-functional (see the snippet below),

3) wait for a fix in the Linux kernel, which initializes SEV with a timeout.

So it wasn't that tragic an issue if you had a first-gen TR.
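
For reference, fix 2 is a one-liner; the file name under /etc/modprobe.d/ is arbitrary:

    # /etc/modprobe.d/blacklist-ccp.conf
    blacklist ccp
    # side effect, as noted above: kvm_amd becomes non-functional

(If ccp is baked into your initramfs, regenerate it before rebooting so the blacklist takes effect early.)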


I haven't had any issues with my 1st gen Threadripper for about a year now. I'm running Unraid with QEMU from that box and couldn't be happier with performance and reliability whilst running 2-5 VMs at once along with 20+ Docker containers. No issues with Windows 10 and *nix VMs; I haven't been brave enough to attempt a hackintosh yet though.

When it was bad, it was so bad. I never wanted to know as much as I do now about IOMMU groups and PCIe lanes.


I use an 8-core iMac Pro with 64GB RAM; it’s wonderful.

When I bought it I was doing some embedded device work involving a Windows VM, as well as heavy web dev on a frontend JS app and a backend app with tons of Chrome tabs open. All these things crave memory, and my 16GB MacBook Pro was swapping itself to death. This was pre-32GB MacBook Pros, so I bit the bullet and couldn’t be happier with the setup. It’s dead silent basically no matter what I do and doesn’t get thermally throttled during heavy workloads. Having the extra RAM also makes a huge difference.

Now that I’m doing more regular web dev on an Elixir & React app, I benefit from the 16x parallel compilation & test suite, as well as the ability to keep basically anything open without resource issues.


I have a similarly specced iMac Pro and I use it for very similar tasks. Could not be happier with the machine. I think the RAM + the very speedy storage make this a great choice for the kind of dev work I do.


I do similar work (arguably a bit more intensive) on my 2016 MBP with the lowest specs possible (8 GB RAM, 2 GHz CPU) and it's surprisingly silent most if not all of the time.

I have around 5-6 windows with 10-15 tabs each, as well as some SSH tunnels and some other applications open (I do heavy front/backend work with Docker builds over SSH, as well as ML work over SSH).

Not sure what's going on, but my lowest-end Mac is running everything I need like butter.


I got a Lenovo laptop with roughly the same specs, though.


Your Lenovo laptop has an 8 core 3.2GHz Xeon with 4.2GHz Turbo Boost, 19MB Cache, 64GB DDR4, silent cooling even under heavy load (which it can sustain), and an SSD that does 1700MB/s with full disk encryption?


Actually, the P53 and P73 can be specced with all of those - and more. Up to 128GB RAM, Quadro graphics, 3 drive slots/bays (2 NVMe, 1 2.5")... And the chassis is not compromised for thinness, so cooling is adequate.


I'm thinking of getting either a P1 or a P53 for my next work upgrade. The P53 will likely have better thermals, but the P1 has an option to remove the discrete GPU, and just use the integrated GPU. As I have no need for a fancy GPU, I figure it would reduce unnecessary power draw.


Forgot to add - and you get a decent display, and it's portable, and it's still cheaper than MP.


And, as an added bonus, you can have major shoulder problems for life from lugging that 17" behemoth around.


Hah, I agree... But, on the other hand, you can carry it when needed, and I do regret not buying one of those for work instead of a Razer Blade, "because it's thin and light" - it gets overly hot and noisy when doing heavier work and is definitely not up to par with these...


What operating system does it come with?


I think Lenovo (together with other questionable actions that alienate their userbase) removed the option to choose Linux for workstations, so it comes only with Windows 10. But Thinkpads are traditionally very well supported on various Linux distributions.


Not at my laptop until Monday so I'm not sure about all the details, but almost that, yes. It's a top-specced P1. Xeon, but 6 cores I think. 64GB DDR4. 1500MB/s read, I think it was, on the SSD.

Not silent, exactly though, hehe.


Wow, that’s a beast of a laptop. I just configured one and the cost isn’t all that different from my 18mo old iMac Pro :) I have to imagine that the battery life isn’t great given the non-mobile parts, and that it must weigh a ton?

I’m enjoying the combo of a powerful workstation where I work most of the time, and then a thin & light portable device for when I need portability. For me that combo works well and hits the right trade offs, especially the silence I get to enjoy.

I’m also remembering that Apple ships the same SSD setup in the newer MacBook Pro machines, which weren’t yet out when I bought this iMac Pro. Definitely nothing unique or unusual about that speed anymore!


Yeah, not cheap, but my company let me choose between that and a top-of-the-line MacBook, and since my previous MacBook was a catastrophe I switched setups entirely.

The weight isn't too bad; I have it in my backpack commuting by bike in hilly Norway. But it's maybe too heavy for a shoulder bag. The battery time is useless, yes, so that's the big tradeoff. But since I bike instead of commuting by train or so, I never used my previous laptop unplugged anyway.


I don't think the P1 is all that heavy. Maybe 4lbs?


That ~1700MB/s FDE speed is a limit coming from Intel AES support. The drives themselves (assuming NVMe, 4 PCIe gen3 lanes) are faster.

Even those crippled by being connected to two lanes can achieve this speed.


I just upgraded from a 2700X to a 3900X. Maybe it's not quite HEDT but it's closer, and my god, it's literally mind-blowing. I measured a 47% wall-clock improvement on a test compilation. If you compile lots of C++ you are in for some serious surprises. Especially so if you are doing things that cross-cut hundreds of C++ packages at once; it basically enables things that aren't possible on, say, a mid-range laptop.

That said, if you are just running simple Go or Rust compilations, or small single-package C++ compilations, caching and a decent processor should be good enough, and you probably won’t benefit much from a lot of threads. (You may want it for your Webpack builds still ;)

One tip: scale up your RAM, pay attention to clocks and latency relative to your processor (especially with AMD where it really matters.) 16 GB is easy to kill when you are running 24 instances of GCC and a VM or two.


Rust actually benefits a lot from more threads once you have a couple of dependencies. A personal project using amethyst/nalgebra dropped its compile time for a fresh release build from 20 minutes to 2 when I upgraded from a i5-4670k to a r9 3900x.


This is true; I've never worked on a huge Rust project, only fairly large C++ projects (at home, anyway). Rust compilations always felt fast enough not to matter, similar to Go, although maybe not quite that fast. (With Go, it never felt too slow to, say, build Kubernetes from scratch; it's just fast.)


Regarding webpack builds... absolutely go NVMe. When I went from a SATA SSD to NVMe, it was the Node/webpack build times where I really noticed the performance difference.

I would think the extra CPU would be more of an impact with Rust than even C++... I wouldn't know, running a 3600 as a placeholder until the 3950X comes out. I couldn't handle the 4790K anymore; going from 32GB to 64GB and the couple of extra cores made a huge difference for my Docker/VM workflows. Can't wait for the 3950X. I'm sure the TR 3rd gen will be similarly impressive with the changes to IO.


Oh yeah, NVMe is an absolute given. It only took one NVMe drive experience and I have never had a desktop or laptop since without a large NVMe SSD as the boot and primary disk. It is in many cases a substantial boost and you can benefit across more things than a bigger CPU since many things these days are IO bound to begin with!

I’m rocking a Samsung 970 Pro 512 GB on my desktop. I never thought I’d need more space than that, since I can always use my NAS or the spare spinning disk I have installed. But, the more CPU power you have, the more you can feed it... I find myself building entire fragments of Nixpkgs now and it takes substantial disk space to do it.


I don't have an HEDT, but an outdated Haswell i5 laptop (not an Ultra Low Voltage model at least, so it's quite fast). Last year I spotted a bug in Firefox and I thought it was time to put a line in the contribution section of my resume.

The contribution experience was a nightmare because a full build of Firefox took 3 hrs and running the entire testing framework took 4 hrs, though it turned out that I needed to run only a part of the testing framework. Changing a single line and building again still took more than 30 minutes.

That was the first moment that I wanted an HEDT in my life. It feels like devs who work with big C++ projects would want a bigger workstation because of the significant build times.
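
For what it's worth, mach has shortcuts that would have softened this; a sketch from memory (check the current Firefox build docs before relying on it):

    ./mach bootstrap       # one-time toolchain setup
    ./mach build           # the full multi-hour build
    ./mach build faster    # rebuilds frontend (JS/CSS) changes only

    # for patches that don't touch compiled code, artifact builds
    # download prebuilt binaries instead of compiling; add to your mozconfig:
    ac_add_options --enable-artifact-builds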


Maybe spin up an instance on GCE or EC2 and remote into it? If you live anywhere near a data center it should be under 50ms delay. Your time is certainly worth more than whatever the cost of the instance.


Probably. But I always try to build Linux programs natively or in an officially supported environment on the first attempt, because I'm not sure what could potentially cause problems when I build in an unsupported environment. On top of that, running the testing framework needed an X11 environment. You know, it sometimes also takes significant time to figure out how to set things up in a different environment. I might try EC2 if I become a regular Firefox contributor.


> If you live anywhere near a data center it should be under 50ms delay.

Hm, I guess I don't live anywhere near a data center. 1M+ North American city, but lowest ping to any AWS/GCE region is ~130ms.
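
For anyone wanting to reproduce that measurement, a quick check from a shell looks something like this (the hostnames are just example regional endpoints, and some endpoints rate-limit or drop ICMP, so treat the numbers as indicative):

  ping -c 5 ec2.us-east-1.amazonaws.com
  ping -c 5 storage.googleapis.com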


My RTT from the midwestern US to AWS/Azure/GCP in Germany is less than that. I think your ISP is more to blame than your distance to the nearest zone in NA.


Yes, you guys are entirely correct, something looks fishy with my ISP. I'll give them a call tomorrow morning if it still looks like this.

Edit: Now resolved, pings (barely) over 50ms.


Most likely that latency comes from the link between your place and your ISP.


New York to LA ping time is something like 70ms. Are you going through a proxy?


And it's much easier to set up a build env on a VM than on a local machine.


Woof. I wasn't aware compilation times got that long for people who aren't building a whole OS.


Try building Chromium from source. AnandTech uses it as a multithreaded benchmark. The top CPUs average just under an hour of compile time.

https://www.anandtech.com/bench/CPU-2019/2224


I wouldn't have tried contributing if I'd been aware of that, lol. Imagine if I had done it for Chromium; I was told it takes about 20 GB to build Chromium.


It's usually more, actually. We have seen 100+ GB build directories when building the "all" target, and a non-GOMA build easily takes multiple hours on a desktop i7.


A browser nowadays is pretty much a whole OS.


Hmm... Like when Oracle created their "bare metal" product. There just wasn't much left to add to the database engine to eliminate the need for a general-purpose operating system. I suspect graphics, audio, and pointing input may be significant barriers to doing the same thing here.


Well I work in games and building our thing without distributed compilation is inconceivable.


I remember building Mozilla took around 6 hours on my PC back around 2003. Granted, that was a Pentium II with only 128 MB of RAM, and most of those hours were probably spent with the OS (Linux) swapping stuff :-P.


Are compile farms still relevant in these types of scenarios these days?


Yes. As a CI pipeline, the Firefox team runs their own compile farm called TryServer [1], which builds various different versions (possibly all accepted versions) at a time.

Level 1 contributors (contributors who are acknowledged by one of the Mozilla devs) are granted access to TryServer.

[1]: https://wiki.mozilla.org/ReleaseEngineering/TryServer


3 HOURS? Does that say more about your workstation or the project?

I've recently built PHP + most of the bundled extensions in a VM (on a 2018 Mac mini), and it takes only a couple of minutes at most to build.


Hey, it's Firefox, one of the biggest C++ projects you can build right now.


A modern browser is a MUCH larger codebase.


Well perhaps if browsers didn't try to cram every feature of a desktop operating system into their own runtime, it wouldn't be?

Seriously. Chrome and Firefox are both guilty of this.


Okay, support every CSS & JS DOM/API feature without doing so. Create your own browser, and get people to actually use it.


I do, and it makes a significant difference in most of my workloads. A lot of my work requires testing bleeding-edge software, for example applying a small patch to the latest version of Kubernetes and then running a small virtualized cluster locally to make sure it works.

Even if not for that, I tend to cross-compile a lot of software, since all the engineers at my company use Macs for development but we deploy to Linux servers, so I often end up building Rust binaries for Linux, and that's fairly computationally intensive.

As an anecdote: on my work laptop (i9/32 GB/512 GB SSD 2018 15" MBP) I can compile the dev environment from scratch in 40 minutes, whereas it takes 4 hours on the company-standard dual-core/8 GB/256 GB 2015 13" MacBook.
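
For the curious, the cross-compile itself is only a couple of commands; a minimal sketch assuming the musl target (static linking sidesteps glibc version mismatches on the servers, though depending on the crates you may also need a musl cross-linker installed):

  # one-time toolchain setup
  rustup target add x86_64-unknown-linux-musl
  # build a Linux binary from a Mac
  cargo build --release --target x86_64-unknown-linux-musl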


I do, although not the kind of software development most people do.

I’m a robotics controls engineer. The code I write is pretty short compared to most of yours, and it doesn’t take long to compile. But to test it I have to run very complicated dynamic simulations. These never run in realtime. It’s not uncommon for a thirty-second simulation to take five, ten, fifteen minutes to run; sometimes I have to run many iterations of the same sim to collect valid statistics. So both CPU speed and the number of cores translate directly into time spent waiting for the sim run to finish.

I have a 2013 trashcan Mac Pro with 12 cores. It’s still a good machine.


Oh yes, it makes a _huge_ difference. Most modern laptops use constrained, low-power CPUs and GPUs. They are dog slow. Even my 5-year-old desktop is substantially faster (like, builds-take-half-the-time kind of difference) compared to my <1-year-old max-spec'd ThinkPad.

Possibly the only laptops that could compare to desktops are gaming laptops, and I know developers who buy things like the Razer gaming laptop to try and get desktop-like performance.


For me the problem with gaming laptops for work is their GPUs. I honestly don't need anything beyond the built-in Intel GPUs, but I would love a powerful CPU + a lot of RAM. High-end GPUs drive laptop prices way up, so essentially I'd be paying a lot for hardware I would never use. That's why I prefer desktops; there's no problem having a Threadripper and 128 GB of RAM in a case with some basic $30 used GPU.


I think good GPUs are necessary to support external monitors. My ThinkPad really struggles to power two external normal-DPI monitors. If they were high-DPI I don't think it would even be possible. I suspect the gaming laptops with their full-powered GPUs, usually Nvidia or AMD, would have no problem driving several high-DPI external monitors.


That's true, but you still don't need a Titan to power two 4K monitors; a good old 750 Ti or even an older GPU will probably be enough. For me it's even easier because my sweet spot is a single 1440p display (that's where I work most efficiently), so anything will be good enough.


It probably depends what sort of software.

I migrated from a MBP to a 2018 Mac mini with 64GB, and the ability to run, say, 10 VMs at once without having to limit them to half a GB of RAM each is amazing.

I'm not in the market when this thing (Mac Pro) is actually released, but in a year or two I may well be migrating to the latest of the Pro line. I don't care about the GPU (much; it just needs to drive some displays), but I do care about CPU speed + core count, memory, and fast storage.


I do, I really like it. Current uptime, ~90 days, got everything the way I like it with i3... I can have hundreds of tabs open in various browsers, I've got 9 workspaces with various projects and things in them and a 4k display.

Actually my main rig is the last Mac Pro, the 2013 one; I just don't use Apple's OS. I really like the rig and I expect to use it until at least 2023 (10 years of use).


I use a dual 10C Xeon (40 threads) with 128 GB of RAM as my work desktop. I do build and development work for C++, Java, SystemVerilog, and Python codebases. JetBrains IDE products like IntelliJ, PyCharm, and CLion can use the extra cores and RAM for indexing our large codebases. The high core count is also useful when compiling C++ from scratch (common for me because I'm often doing weird things that defeat the bazel caches) and running large test suites or medium simulations.
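
For reference, forcing a truly from-scratch build in a bazel workspace looks something like this; --expunge wipes the entire output base, so nothing gets reused from the cache:

  # nuke all cached outputs, then rebuild everything in the workspace
  bazel clean --expunge
  bazel build //...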

The latest MacBook Pros (8 cores, 16 threads, 32 GB RAM, 2 TB flash disk) are finally fast enough, with enough RAM, to both run CLion with clangd indexing and do C++ compilation/testing. Full builds are still not advisable (not that a full build is particularly useful, since our prod is Linux-only and we have CI/desktops to avoid cross-compilation).


I use a desktop, though I'm not sure I'd call it high end (32 GiB, 6 cores, NVMe). I do lots of large builds of things like Linux kernels. I doubt any laptop would be able to keep up, simply due to thermals.


Yeah, what's considered high-end does shift, although that still sounds higher end than most laptops. I think a typical "high-end" desktop is one that has more cores, memory channels, and PCIe lanes than a typical "mainstream" desktop (whatever that happens to be at the time). Support for ECC memory is also sometimes a requirement.

I use a 3 year-old laptop with 32GB of RAM, an SSD, and a dual-core i7 (U-series) processor. It's reasonably efficient. I think the amount of RAM and the IO speed of my SSD remove most of the hardware bottlenecks I face.

However, I would like compile times to be faster, I do sometimes notice applications hanging, and IntelliJ (we use Java at work) seems limited on how many projects I can run efficiently in a single workspace. I'm just wondering whether a workstation-grade laptop (Dell Precision or Thinkpad P-series) would be a sufficient upgrade, or investing in a desktop would be worthwhile at some point.


> I'm just wondering whether a workstation-grade laptop (Dell Precision or Thinkpad P-series) would be a sufficient upgrade, or investing in a desktop would be worthwhile at some point.

Yes, either 17" machine offers an 8-core CPU and plenty of thermal capacity (i.e. a larger chassis than their 15" equivalents), along with more than enough memory (128 GB) and, in Dell's case, a boatload of drive slots (4, run in whatever RAID config you want). These are desktop replacements, heavy (as in 6.5+ lbs), with 240W power bricks, so you'll probably want them to stay on the desk most of the time.

I'm personally looking to replace my old Dell Precision with a 15" 5540, maxed out. For Scala work the extra cores help reduce build times, probably the same for Java.


My company provides a pretty standard laptop, but they rarely have more than 2 cores.

In native app development, Android Studio and Xcode are severely bottlenecked by 2 cores. The difference between my laptop and my Mac Mini (6 cores) is astounding.

The difference between a build taking 6x minutes and 1x minutes is more than just a factor of six, because it goes from "Well, guess I'll check email/messenger/HN real quick" to taking a sip of coffee.

It also gets real bad when your IDE is forced to fight with your browser, email client, etc. Then you're forced to do nothing, lest your casual actions further slow down your local build.

Usually the "breaking point" is when builds take more than 5 minutes, and especially when they take 30+ minutes.


Slow builds can definitely break concentration. I'm on a dual core as well, although most new business laptops come with 4 cores.

My build times aren't as long because I'm working with Java-based microservices; each service takes a minute or so to build (with tests), but even that delay can break concentration. Turning off tests helps, but then you don't always have the immediate feedback of the test results (and don't worry, I always run the tests before committing).
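
For the record, the skip itself is trivial, assuming Maven (both flags are standard; the second also skips compiling the tests, not just running them):

  mvn package -DskipTests
  mvn package -Dmaven.test.skip=true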


I worked on a codebase a few years ago that took over 44 minutes to compile... a change you would think should take 10 minutes took 3 days. RSS was still a big thing at the time, and I did a LOT of reading while waiting.


I used to be happy with Java and .NET builds, until I got to deal with Android builds.

Android is probably the only platform where almost every conference has at least one talk on how to improve build speed, thanks to Gradle + Android Studio.


A little off topic, but what does your microservice build with tests look like? Trying to figure out what I need to do if I want to try microservices correctly...


The 2018 6-core Mac mini is underrated as a HEDT, in my opinion. It's surprisingly powerful and silent. I really have to thrash something to even get the fans to spin up.


When I started working at my current company, the dual-core, 16 GB MBP I was given was enough, since I was working on a single project.

But now that I am working across projects that span 2-3 repositories at a time with JetBrains IDEs, the computer can barely keep up. There are times when typing will actually lag.

I am contemplating asking them for a Mac Mini instead since I rarely ever venture out from my desk.


Your employer supplied you with the wrong tools for the job. Explain why you would need a more powerful computer and you'll probably get it.


On extremely large projects (15-20 minutes to compile from scratch or longer), the more compute power available, the more productive you can be.

Cumulative wasted minutes from compilation over the course of a year can add up to a lot.
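
To put a rough, purely illustrative number on it: ten 15-minute builds a day is 2.5 hours, which over ~230 working days comes to well over 500 hours a year per developer.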


I use a first-gen AMD Threadripper 1950x (16 cores x 2 HT, or 32 logical cores), if that counts as HEDT.

It's absolutely faster than any business laptop for doing builds of medium sized C and C++ projects.

(But I think a non-HEDT desktop would also outperform a typical laptop in this use case -- thermal throttling is a real problem for laptops.)


I bought a used Dell workstation with an 8c/16t Xeon E5-2670 and 32 GB of ECC DDR3 RAM a few years ago. It's a bit outdated now, but it's still faster than most laptops (for multi-core workloads). At the time I had a MacBook Air, so this completely blew it out of the water. I remember one time, while trying to debug an issue, I had 10 Docker instances of our CI running (Rails + Cucumber + Firefox) and there was absolutely no performance impact on the desktop experience.

Last year I bought a T470s with 20GB DDR4 and an i7 (U series) as I was mostly working away from home. It's good enough for most of the work I do, but it can be a bit slow at times. The processor just isn't as fast and the integrated graphics struggle with a 4K desktop (I think that's mainly Linux/GNOME being unoptimised though). I haven't noticed it throttling, but my workload usually isn't that CPU intensive.

If you mainly work from home I'd definitely suggest building a desktop machine for your needs.


You could get a T480 with the quad-core CPU, or a P5x with a 4, 6, or 8 core CPU (like the 15" MBP) and 64GB :)


To add to the chorus, Threadripper has been incredible for deep learning related workloads, partly for the high core count for prepping batches, but also because it can support multiple GPUs and gobs of RAM. If you have a lot of big compile jobs, it shines there too.


Not a desktop, but... My employer typically gives people a MacBook Pro, but when a colleague left a couple of months ago I snarfed his laptop - a ThinkPad with 24GB RAM and an i7. I put Slackware on it (for I am Old Skool) and it's really handling many many tabs, plus our many-docker local dev environment far better than everyone else's Mac.

Of course, it's far bigger and heavier than the Mac, but also of course (for I am Old Skool) I'm using it with a decent keyboard, a decent-sized monitor, a decent mouse, and a USB-C charger. It does mean I can't go to meetings, but that's ok with me...


Does wanting one count? I don't have a need right now, but some years ago I was doing iOS development; compile times started to get to a point where a more powerful system would've come in handy.

Mind you, my workflow could probably do with adjusting as well, I tended to work like I do with webapps and just continually rebuild/restart and review my changes on the device itself (or an emulator).

Part of me also wishes we had a big setup with multiple Docker images running in parallel, but ATM we have the luxury of working on 'just' a website (React, some lambdas; the back-end is all third-party services, and we connect to a staging environment during development) so it's not too bad.

But I'd still like a permanent machine, there's something about (and this is me idealizing) having a fixed workspace you don't have to pack up every day. I mean sure disconnecting a laptop and yeeting it into a backpack isn't that much effort but it's the little things.


I did once. It was wonderful not having to take the thing home.


Yes. I have a company issued 2018 MBP 2.6 GHz i7 & 16GB RAM but I prefer to do most of the dev work on my HEDT running Linux (Ubuntu 16.04), 4.0 GHz i7 & 32GB RAM.

It's way better at running Docker instances, a bunch of electron apps, and tons of Chrome tabs simultaneously without a hitch.


I do all my work on desktop workstations; I have a little burner laptop I lug around to meetings, but all it really does is browsing and Outlook.

Having a real keyboard, mouse, and four 27" monitors is something I will never leave behind at this point. All that screen real estate to spread out over helps enormously. I can have a browser with our application pulled up, Visual Studio, another browser with doc, Webstorm with the front-end code, SSMS connected to the database, an email client, Notepad++ with logfiles, all the different chat clients I have to use, and more if I really need to, all on screen and available at the same time. I don't have to alt-tab around, I just look and it's there.


HP Z800 with 96GB of RAM, 2TB of RAID 0 rust and a 1TB SSD. C++ dev using Fedora 29.


6-core Xeon, 64 GB RAM, some builds take 4 hrs...


Holy moly. What language, and roughly how many lines of code?


Perhaps he's building Windows, or some equally gigantic code base?


C++, and N > 5.


I do, but mostly because there are a few projects where running multiple virtual machines with multiple cores and lots of RAM is useful. Also because I run a VM for CAD work for my 3D printer on the same machine.


Yeah, I sometimes will be running a dozen or so docker containers, although the CPU utilization on those tends not to be very high - RAM is the main bottleneck for me. However, I also don't typically leave those running. I just spin them up when I need to for testing. I'm not sure how fast my other applications would run if I left those up.


I could do with one, since VS Code + SQL Server in Docker brings my new high-end laptop to a halt every now and then until I restart them.


My HEDT at home serves as my VR development machine and doubles as a gaming rig. I generally save basic research, design, and sandboxed coding for my MacBook Pro at a coffee shop.

If I had a similar performing laptop that could replace my desktop in terms of GPU performance and compatibility with the devices I use, my HEDT would start collecting dust.


Not a HEDT here, just a regular desktop, but it is significantly more responsive than a business laptop. It's also much cheaper than a laptop comparable in performance. Plus it's upgradeable, so more future-proof.

I still have an old laptop that I use to ssh into my workstation occasionally.


Depends what you're doing. I'm a low-level developer and my build times are less than a minute, so shelling out for a huge beefy PC would barely make a difference to my workflow.


The standard business laptop maxes out at 16 GB, which is good enough if you don't run all the secondary stuff like databases locally. But 32 GB would be great.


Reading the replies to this makes me very grateful that I just run a couple of Docker containers for local Postgres/Redis along with some non-compiled Node.js
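
For context, that whole local stack is a couple of commands, assuming the stock images (the container names and versions here are illustrative):

  docker run -d --name dev-pg -e POSTGRES_PASSWORD=dev -p 5432:5432 postgres:12
  docker run -d --name dev-redis -p 6379:6379 redis:5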


Not a great example.

Web dev. Going from a 2015 MBP to the new touchbar model, my cold build times were cut in half. Other than saving 15 seconds once per day, I noticed nothing.


Try going backwards. It’s easier to notice the speed difference the other way around :)


Yeah, I work on some pretty big codebases with an IDE. CLion, IntelliJ, and PyCharm are pretty resource-intensive, for instance.


Recently picked up a 9900X and now my company's JavaScript app builds almost twice as fast ;)


Not everyone does web development and can work with just an editor on their dev machine...


People here seem to think this is just a random set of components that's not as good as X+Y+Z. This machine was designed and specced based on talking with the people who will be buying it, not the general public. Complaining it's not X+Y+Z is like saying IBM didn't design a mainframe for your needs. Not everything has to be built for everyone.


330 comments, and no one asked why this was submitted.

We had a discussion on HN [1] when it was announced. The tech specs page has been there since day 1; I looked hard and don't see any significant changes, if any changes at all.

[1] https://news.ycombinator.com/item?id=20087315


That was 3 months ago; no one asked because no one remembers it was posted.


That was not the point, though. Its release was widely known and this spec page isn't news. It is generally accepted on HN that unless the last submission had little to no discussion, or there is something new related to the topic, we mark reposts as duplicates.


The starting 256 GB SSD sticks out like a sore thumb on a machine as expensive as this. The barebones Mac mini comes with 128 GB, and the 6-core Mini with 256 GB. And upgrading it is gonna cost an arm and a leg.


Depends entirely on workflow. A lot of shops will have all of their stuff on a 10G NAS share, so you’re actually forbidden from copying to and from your computer because it’s just stupid and sometimes slower to do so.


Still, it isn't as if 1 TB of SSD is expensive. So if they set the entry price at $6k, 1 TB of SSD would be an appropriate minimum.


Yeah, that’s what they give on the iMac Pro as the base, so it’s not like they’d never do it. But it’s clear they’re deliberately under-configuring the base to force complete customisation, which is what I assume the kinds of people buying this will want.

I’d go so far as to say they should sell it without a default hard disk if they could - let the buyers make the choice. I’m guessing it’s defaulting to the smallest disk just for optics.

This is clearly a start with the skeleton and build it yourself kind of machine, which is as it should be. It just looks odd when the selector defaults to the lowest available option on every selection.


"If you can afford the Mac Pro then you can afford the extra storage" as the Apple bloggers keep saying....


More like "If you can afford the Mac Pro, you probably already have many, many terabytes of storage arrays".


It's still not shipping, so there's not really a reason not to use a Threadripper for a lower price AND higher performance. Or even Epyc chips. We know the Titan Ridge Thunderbolt controllers work on AMD motherboards; Alpine Ridge had firmware problems, but Titan Ridge is fine.


You can likely Hackintosh it with a 64 core Threadripper, WRX80 8-channel board, 4TB ECC LRDIMM and run circles around Mac Pro for half the price. Ryzen seems to work quite well with the latest macOS.


Apple & Intel are deeply entwined, both at a technical and a contractual level. There's a reason Apple has never made an AMD computer.


Well, AMD really had nothing to offer, especially in the low-power space where Apple mostly plays, but Sharkstooth and Rome so thoroughly trounce Intel's high end it's not even funny. (Yes, Sharkstooth is not out yet, but the Ryzen 3000 chips already make a joke out of the lower end of Intel's Xeons, and leaked benchmarks paint an interesting picture of the upcoming Threadripper 3000s.)


Motherboard design takes time, and Apple would also need AMD firmware and software support. If Apple wants to go AMD, it would be next year's model at the very earliest.


Eh, no. Google had been testing Rome for a long time before the official announcement; this Mac Pro could easily have been the launch customer for Threadripper 3000.


Same. No Ryzen :(

