Apple bloggers said time and time again that the reason Apple wasn't making a tower was that there isn't really a market for one anymore, yet when they finally made one, they decided to build something that only really serves the highest of high-end video editors.
Completely ignoring 3D, mid-range video editors, developers who need high core counts + ECC, deep learning, etc.
Before they made it we kept being told "There aren't enough of you to justify them making it"
They finally make it and the narrative turns into "It's not for you; it's for people who edit Marvel movies"
I don’t really think Apple cares about that drama. I think they know they have customers who will pay the premium. I think the machine is squarely aimed at businesses making the purchase, not consumers.
Is it really a problem that the Mac Pro is only for ultra-high-end users? Apple hasn't made a mid-range consumer tower in over a decade now. If you walk into Best Buy, how many towers do you think they're selling compared to laptops?
That’s the market Apple sells in, not the low-margin custom built PC parts market.
Apple doesn’t cover every use case of the personal computer. They are just one OEM. Unfortunately if you like macOS they are the only OEM.
As far as the mid-range video market that you talk about, what about the iMac Pro (which has ECC memory) or a high-spec iMac is insufficient for that task? Sure, it's not as nice as your dual 1080Ti setup, but also, Nvidia isn't actually a viable option for Apple anymore thanks to their disastrous support for the platform in the past. If you made a Hackintosh system with Nvidia you'd still be SOL. You aren't getting CUDA on Mac no matter what hardware configuration Apple comes out with. Is Metal supposed to cover that use case and compete with CUDA?
Try finding a workstation that:
- Ships with anything other than Windows (Linux, OSX, BSD, etc.)
- Has a corporate warranty
You'll quickly find that Apple has this market cornered. To many people, you're not paying the extra money for the goodness of Apple. You're paying to avoid the badness of Windows. There's also some software that works on Macs, but not Linux machines that may be necessary for the job.
> Has a corporate warranty
Support for mac-only software notwithstanding, Dell's workstations officially support RHEL, have Nvidia GPUs for CUDA workloads, and come with up to five years warranty with on-site service. You can probably find comparable HP Z-series workstations too.
Has Apple finally started to offer one comparable to the Big Three? (SLAs, Onsite service with guaranteed reaction times and HW replacements and so on)
Serious question - this was actually a big argument against Pro Apple workstations in the past.
> Ships with anything other than Windows
Well, you won't get macOS of course, but all big workstation manufacturers sell workstations with Linux preinstalled. It's really nothing unusual and hasn't been for quite some time.
I do not know what the criteria are to be able to get on that program, though.
At least at first glance, it seems they have learned their lesson in this area.
Apple could not possibly do that today. Maybe you don't need it, but if you care about warranty service, Apple is not the answer.
Are you telling me there is a market for an intermediate OS? Because I would be ready to pay about $250/year/user for a Linux that is as good as Mac, but with cheaper and more maintainable gear than the iMac. It's almost as if Apple were trying to tell us there's an intermediate market up for grabs, but they're still too close to it for any incumbent to try their luck. Canonical was close to it, but stuck to the wrong business model and decided to switch to Unity in 2013 instead of stabilizing Ubuntu. Product roadmaps are hard. Jony Ive is available, just saying ;)
I bought a Dell XPS13 laptop recently which unfortunately had a non-functioning motherboard. I contacted Dell and a technician came to my house first thing the next day and replaced it, no questions asked. Totally hassle-free. I'd take that any day over having to book an appointment to see a 'Genius'.
I had a macbook pro with a logic board that died. I phoned Apple, they couriered me a replacement device next day, and that courier picked up my old device. Literally couldn't ask for better service.
Good question about accessibility. I have no idea. But it's not as though our society is a perfect utopia for the disabled. I can only imagine it would have gone far worse.
We were attracted to these as there was a Linux version that potentially offered a good alternative to Windows 10 for some of our people, but the experience was so bad that instead we immediately ruled Dell out as a supplier for any serious equipment for the foreseeable future.
Eventually, literally months later, someone finally seemed to escalate it to a person with the authority to send out a technician, who as mentioned before then fixed the actual problem in barely any time at all. We were on the point of just writing off the machine by then, as the amount of time we were wasting dealing with Dell was in danger of costing more than just buying a new box, and at least we would have been reasonably confident of having a working system the next day in that case!
I can't speak for anyone else, but the last time I had trouble with my MacBook (five years or more ago now, the touchpad had cracked, IIRC), I made an appointment in the morning to come in in the afternoon. Walked up, explained the problem, gave them the machine, and they called me back to pick it up a few hours later. That's not bad, IMO.
Home service is awesome, but unless things have changed recently, I don't think that's common. It also may be different for wear-and-tear fixes vs. DOA replacements; Dell has a strong interest in fixing the latter as quickly as possible to protect their reputation. Most companies, most of the time, expect you to come to them to get service.
It doesn't exactly sound like it would be any kind of a problem for you to install your own OS if you're willing to accept a manufacturer-supplied BSD.
Hopefully soon I'll be able to use VSCode to remotely work on my Mac though! - https://github.com/microsoft/vscode-remote-release/issues/24
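In the meantime it already works for Linux hosts over Remote-SSH, so once that issue lands I'd expect the Mac setup to look roughly like this (a sketch, not the documented flow; hostnames are placeholders):

    # on the Mac: enable the built-in SSH server
    sudo systemsetup -setremotelogin on
    # on the client, in ~/.ssh/config:
    #   Host my-mac
    #       HostName mac.example.com
    #       User me
    # then run "Remote-SSH: Connect to Host..." from the VS Code command palette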
> You'll quickly find that Apple has this market cornered.
No they don't. Not the market for people who just need a corporate Unix box. At my consultancy we have a mix of machines, with many people running a System76 tower or laptop, and you'll find plenty of folks here on HN who will name Dell, Lenovo or HP as their supplier.
Perhaps you're thinking of just the market for people who do the absolute highest end video work for the film industry? Other than that, I don't see it.
And Windows has been rock solid for a massive amount of users and developers of various types since Windows 2000.
Well, that’s, like, your opinion, man...
No, seriously, I feel the same way about Windows’ UI. I mean, did you ever use Windows 8? And Windows 10 has built-in ads by default in the UI/UX?
Windows XP SP3 was peak Windows, IMHO. The awful part about the abortive bastard-child mess they made of the UI/UX in 8 was that there are millions of non-techie people who literally know how to follow one sequence of steps on their computer, and that sequence usually starts with 'press Start'.
Windows 8’s awfulness is probably what drove a lot of those people to iPads. If you’re learning a new user interface idiom anyways, and even Microsoft Office is on the iPad; why stick with Windows?
Sure, an iPad is fine for consuming documents and doing light markup. However, if you're going to create things, you're going to need to multi-task. Hey, I have multiple iPads and I use them all the time, but I'd love to have a contest between what I can get done with Windows/Linux versus what you can do on an iPad, because there's just no comparison as far as I can see.
I'll add macOS to that too. It's not even in the same class as Windows/Linux. I watch people using macOS daily and I swear, they are constantly swiping to find that full-screen app they lost because of the complete lack of window management in macOS. They'll put Chrome into full screen and then struggle to get the detached devtools window back up. They'll have to install things like iTerm, with its own tiling manager, to manage 3 terminal windows. Apple just doesn't care about practical things; they are constantly focused on how things look, how thin or light they are, or how they can make the most money by removing options and claiming everything is always better that way, when really it just reduces the work they have to do to support things like physical buttons, headphone jacks, and options/modes in software.
Anyway, the UI in Windows 8 and 10 was also completely configurable to make it more like the original Windows UI. If you don't like the default configuration you can change it, or install one program (7+ Taskbar Tweaker) to make it just about perfect. What I really, really like is being able to do things the way I want to do them and not the way some godless corporation has decided it should be. Apple just gives you nearly zero choice compared to Windows and, most obviously, Linux. They're just on the wrong end of the spectrum for how I like to do things.
And I never got any ads on Windows - just pre-installed apps like Candy Crush and Skype. I'm assuming they installed Candy Crush because it's a lot, lot more popular than Solitaire or Minesweeper with today's crowd. This is no different than Apple pre-installing things on macOS/iOS. And before anyone says anything about Apple not pre-installing 3rd party software...I think that's incorrect. If you want to use any of the Unix aspects of macOS, you have to start off with Apple's lame and old versions of even basic Unix utilities and programming environments until you go and install some other 3rd party things to fix the situation. That's way worse than having to right-click a Candy Crush icon to remove it once IMO.
Pressing Start is one popular way of starting a program on Windows, so I don't understand that line of argumentation.
The only tangible market for a Mac Pro is professional Final Cut users in a professional setting, no?
If this is hogwash, tell me so, but it just seems that any other realistic scenario that requires this level of hardware (like research, rendering and AI) would be significantly cheaper and better supported outside of Apple's ecosystem.
Short of having a pretty device to sit in a studio, what other reason is there for this to exist? (And how much of that audience is more likely to just buy iMac Pros).
As regards development - most development tasks won't significantly benefit from the performance offered here, and anyone who needs that performance is likely going to buy something with significantly better value for money (as regards tech specs) than a Mac Pro.
I'm not aware of any significant tools for 3d modelling or video editing (besides FCP) that are OSX-exclusive, and that audience is surely better served by a much cheaper Windows/Linux machine.
In my experience, the past five years have seen a dramatic shift to Windows in professional facilities (Oscar-winning editors).
I do know one very high end editor who cuts on a Mac mini. The old school guys are used to proxy workflows and you don't need lots of power for that anyway.
If I have a resource heavy problem, I can solve it for a fraction of the price outside the Apple Ecosystem.
Who buys these? When you look at the source you will understand.
Given the number of posts telling people in Hollywood not to restart their trashcan Mac Pros because of the Google Chrome/Keystone screw-up, I would say that yes, in fact, plenty of people in Hollywood are using Mac Pros, and likely will buy this new one.
> I struggle to find the target audience for this, now that they are alienating home users
This machine has NOTHING, and I mean nothing, to do with home users.
Next you'll tell me that Tesla have alienated "normal" car buyers because they make a $100k P100D rocket ship. Tesla also make a $35k regular sedan. Apple also make much cheaper iMacs, MacBooks and MacBook Airs for home buyers.
I don't understand why people time and time again bash Apple for making something that isn't in any way designed for "home users", while they still make plenty of things that are.
I think this new MacPro is going to be a huge failure. Professionals are running PCs now, and home users won't spend the money.
Price would be around $1300 for the computer, $300 for an eGPU enclosure, and $700ish for a Radeon VII (call it roughly $2300 all-in), plus aftermarket RAM. AMD's not in a great spot for high-end GPUs right now, but when the Navi 23 cards land next year it will be looking better.
This doesn't scale as well for multi GPU machine learning workloads and Apple needs to get over their shit with Nvidia, but as a lower end "modular" Mac than the $6000 cheese grater 2019, it's an option.
I used to want an eGPU. Then I learned that you need to disable SIP in order to do so...
> This doesn't scale as well for multi GPU machine learning workloads and Apple needs to get over their shit with Nvidia, but as a lower end "modular" Mac than the $6000 cheese grater 2019, it's an option.
No one is going to wait for that hypothetical future where MacOS supports Nvidia GPUs.
I hadn’t heard that. A quick search suggests it’s only necessary for loading Nvidia kexts or using TB2 https://www.reddit.com/r/eGPU/comments/8lybin/macos_egpu_wit...
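For anyone checking their own machine before buying an enclosure, SIP status can be inspected from a normal shell, though changing it requires booting into Recovery (csrutil is Apple's standard tool; the rest is just my reading of that thread):

    csrutil status     # reports whether SIP is enabled or disabled
    # to change it: reboot holding Cmd+R, open Terminal from Recovery, then
    csrutil disable    # or csrutil enable to turn it back on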
> No one is going to wait for that hypothetical future where MacOS supports Nvidia GPUs.
True, but their Navi cards are at least competitive in the price ranges where they exist. Hopefully the high end ones next year continue that. If you’re looking at Titan or whatever the current ML thing is in the $1000+ range, then you might be stuck with Nvidia.
Interesting. I might ask around to verify this.
> True, but their Navi cards are at least competitive in the price ranges where they exist. Hopefully the high end ones next year continue that. If you’re looking at Titan or whatever the current ML thing is in the $1000+ range, then you might be stuck with Nvidia.
I am rooting for AMD's Navi cards too. It's just unfortunate that CUDA seems to be more supported than OpenCL.
So I think there's a big segment of the "modular" market that only really cares about having GPU options and upgradeable RAM.
It's not for everyone, but the people in between the high end Mac Mini (6-core i7 + thunderbolt GPU) and the low end Mac Pro (8-core Xeon W and internal expansion slots) are a small enough slice that Apple doesn't care.
If Apple doesn't want to serve the "I need a decently powerful machine but don't want to waste money on what is basically a status symbol" market, maybe they should license MacOS to someone who does. Something like that might actually bring me back to the platform. As it is, I stick to PCs running Linux.
I'm a programmer. If I want to prototype a multi-component stack, I've got a Raspberry Pi cluster running K8s for that. If I want to play around with deep learning, I'll just spin up EC2 instances. Games? Let's be honest. That war was lost decades ago. I've got a good PC rig for that now. The days of a general machine to do everything are gone.
The only issue is the storage space, which I can get from a multi-bay drive enclosure.
Businesses listen when their bottom line is hit.
Apple’s hardware/software combination is the reason a lot of us have stuck with them for so long. It’s quality control. No drivers to fuss with; just plug in and go.
My MacBook's keyboard disagrees
Some people do not want to do that.
But as I said, a dual-GPU machine for half the price of an iMac Pro that can run Octane Render (which requires Nvidia cards with CUDA) left me asking why I didn't do this years ago. After using it for 15 minutes I couldn't go back to how I did my work on my old Mac.
If Apple wanted my business I would have given it to them but they were pretty insistent that they had no interest in my money.
And it should cost around $500, not $12,000.
$500 with what I request is certainly available on the PC side of things, and I have that, run Linux on it, and have transitioned everything I can off Apple because they can't produce the desktop that I need. I've put Mac versions of my products on bug-fix-only status, and when customers ask me when the new version comes out I tell them never and advise them to try Linux.
Thunderbolt GPU cases are hundreds of dollars.
For a dual GPU Mac Mini setup (which would be a mass of cables and require 3 outlet plugs) you could afford a 3 GPU Windows tower.
As someone who used to be a pretty extreme Mac evangelist I really did look into all this before I made the switch to a Windows machine.
The first Apple computers I bought lasted for years and I never thought AppleCare was necessary, but since the fiascos with the 2011 MBP and the butterfly keyboards I'm not buying another of its products without AppleCare. Yes, Apple ended up doing the right thing in both cases, but it took years after the problems started and a couple of class action lawsuits.
I don't get why freedom-loving people would rather have a corporation take away their freedoms than a democratic, self-governed body.
Both are concentrations of power, but with one of them (democratic government) you have moral grounds to criticise it when it acts against the populace's interests. Abuses of power by a corporation can't be argued against morally: you don't own them, they're playing by the rules, they manage to make a buck, and everything is above board, while in actuality masses of people might be hurt by those abuses of power.
Freedom is extremely important, but it sometimes conflicts with other values, and a reasonable discussion of the conflicts and tradeoffs is a much better way to improve life for people than fanatically protecting the one value you hold most dear.
Big mistake. Package was returned and I had to remove them.
My only point being it's a poor port in part because of its delicate ecosystem, which necessitates very strict regulations.
> NZ is a fantastic place. It takes 24 hours to get anywhere around the world.
Yep, it's that far.
~12hrs to Santiago Chile. ~9hrs to Bali, Indonesia.
~13hrs to Los Angeles, USA.
~4hrs to Sydney, Australia.
It'll be handy when the zombie apocalypse occurs though: it'll be quite the swim for any ghouls.
If you compare the value of the peso in Mexico vs the dollar in the US, Apple prices in Mexico are comparatively much more expensive than in the US.
The TV set you get on Black Friday works until Xmas. It was a bargain; what did you expect?
That discount you got on your car? You think it was because of your outstanding negotiation skills? Think again: the seller swapped in a cheaper carburetor, meaning your mileage will suffer. But is it wrong? You got a discount.
If you sell it, it needs to be good. If it is not good, you cannot sell it. Price doesn't enter into it.
This of course means less ultra-cheap stuff gets sold, but at the same time most of that stuff is crap, and it's better for the environment to build stuff that lasts.
A single-year warranty is in general an indicator that a) the manufacturer doesn't have faith in reliability past the first year and b) the manufacturer is begrudgingly just meeting the minimum standard for warranties in most jurisdictions. Even if neither of these are the case, the optics aren't good.
Dell's Precision workstations come with 3 years warranty, upgradable to 5 years coverage with on-site service. There is no reason Apple can't match that.
This wasn’t an issue years ago because reliability wasn’t an issue. My ~2005 30” Apple Cinema Display is going strong connected to my 2010 Mac mini. My 2017 MBP however has been the most unreliable hardware I’ve ever had (my first computer was a 286). 6 weeks for repairing a pro device with a hefty premium price.
These types of design flaws should be fixed as part of the normal warranty process.
Unless of course you are talking about the EU-wide consumer protection rights which apply for 2-6 years after purchase, but which people very mistakenly call a guarantee. The problem with this protection is that it protects you only against manufacturing defects. And anything that happens after the initial 6 months is up to you to prove that it existed at the time of purchase. So no, if your MacBook suddenly dies 23 months after purchase, apple doesn't have to repair it, unless you can demonstrate to them that it died because of a manufacturing defect. In comparison, apple care would get your laptop replaced under the same circumstance.
p.s. Switched to Windows when I bought my laptop for that reason.
Do Apple hardware failures happen often?
In ~20 years of assembling PCs with consumer grade parts I've only had 2 legit non-keyboard/mouse hardware failures (3 if you count lightning destroying a power supply before I knew about surge protectors). For context, I've had 0 hardware failures in the last 10 years. I keep my computers on 24/7 too.
I still prefer to DIY for value... my 4790k build lasted 5 years, just put together a new one, placeholder cpu waiting for 3950X. Will probably, again, last 5 years or so.
My 2 legit failures were an old IDE HDD that audibly clicked like crazy for months before failing completely. Since it gave such a long warning I was able to back everything up thankfully. The 2nd failure was a different power supply that just stopped working without any notice or grand events. I never had any RAM or video cards go bad, but after assembling a new machine I always run them through rigorous automated tests to help detect faults. I've had 1 stick of RAM be DOA but I don't count that as a failure.
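For anyone curious, a burn-in of that sort might look like this (the tools are real; the sizes and pass counts are just illustrative):

    sudo memtester 8G 3              # exercise an 8 GiB region of RAM for 3 passes
    sudo smartctl -t long /dev/sda   # start a SMART extended self-test on the disk
    sudo smartctl -a /dev/sda        # read the results once the test finishes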
My current system is an i5 4460 (3.2ghz) with a bunch of other components. It's been going 5ish years now without issues and I work this thing pretty hard. Full time development, video recording and editing, gaming, etc..
As an aside, the kernel (5.3) and drivers (Mesa 19.3) are all updated, so I can finally drop in the RX 5700 XT card I'd been sitting on and play with it. Amazing how many things work without issue via Steam's Proton (Wine, dxvk, etc.).
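If anyone wants to check whether their own stack is new enough for Navi, something like this should tell you (assuming mesa-utils is installed for glxinfo):

    uname -r                             # kernel version; a 5.x kernel is needed for Navi
    lspci | grep -i vga                  # confirms the card shows up on the bus
    glxinfo | grep "OpenGL renderer"     # should name the card via the radeonsi driver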
These were ones that specifically came to mind for me:
2019: Battery Recall: https://www.wired.com/story/apple-macbook-pro-recall/
2018: Macbook Air Logic Board issue: https://www.laptopmag.com/articles/apple-macbook-air-power-i...
2018: Battery Recall:
2017: Screen staining:
2018 SSD Issue:
2017: Battery recall
"Flexgate" _ https://www.macrumors.com/guide/flexgate-macbook-pro-display...
2013 - 15 Retina Screen Issues:
White MacBooks yellowing: https://forums.macrumors.com/threads/unibody-white-macbook-t...
White MacBooks crack:
But that's not the world we live in. Nvidia's pitch is "We make the only hardware that runs the framework used by almost all deep learning research and media creation software, and we're also the only folks operating at our level when it comes to high-end video game graphics, and if our high-end cards are too expensive for you, we have cheaper ones. And when our competitors start to think they can catch up, we'll drop RTX and Tensor Cores on their face."
AMD seems stuck on "We have great value middle-to-high-tier video game graphics cards" at best. I have no idea how they can get out from under that rock. They've been turning the CPU market upside down and smashing Intel's once proud "near monopoly" status. Nvidia seems like exactly the kind of prideful company that would be poised to fall, but I have no idea how AMD could make it happen.
Most people aren't spending over $500 on a GPU, so they get the volume sales. The better aftermarket cards have been selling out pretty consistently. And the longer term strategies are similar to how they approached Ryzen. So, I'd think that Navi can definitely succeed in that light.
The real lock-in for Nvidia is all the existing CUDA development. Intel and AMD will have a lot of work; oneAPI may help there, so long as Intel and AMD can work together, without Intel's frequent and weird lock-in attempts.
It was always about software. Granted, CUDA is not the best and most elegant platform in the world... but AMD seems not to be able to reach even that level.
Or that it was cheaper to get reliability at another layer of the stack (rather than hardware).
I'd still likely only go AMD GPUs in the near future, just because I don't have particularly demanding GPU requirements and they have better Linux drivers.
Lose-lose for customers.
Is anyone privy to what holds Apple back from supporting Nvidia? The only info I read pointed to some earlier bad blood, but I wonder why you'd hurt your users just because of some old scuffles...
They probably each want too much money.
So no, it is not a replacement, again. Elegance matters.
For an engineer or computational scientist, I think this is not a bad investment.
But yes, existing CUDA libraries won't work.
As an example, I worked a bunch on our PostgreSQL infrastructure, but we support 5 years of versions. So you have to CI/integrate your changes and test them across all 5 versions, every time you test. My machine could do this on the order of 2-3 minutes -- recompiling everything from scratch every time, and running all integration tests on all versions. There's no CI system or cost-effective cloud hardware that would have given me turnaround like that. In the end I was doing hundreds of builds over the course of just a few days.
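A per-version loop for that kind of testing might look like this (a sketch; the paths assume the Debian-style /usr/lib/postgresql/<version> layout, and make installcheck is the standard PGXS test target):

    # build and test against each supported major version, in parallel
    for v in 9.4 9.5 9.6 10 11; do
      (
        export PATH="/usr/lib/postgresql/$v/bin:$PATH"
        make clean && make -j"$(nproc)" && make installcheck
      ) > "build-$v.log" 2>&1 &
    done
    wait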
In contrast, at $WORK we have a C codebase that takes < 1 minute to compile on my 4 core work laptop. YMMV.
Also, all that needs to be set up. If you need to change some little thing, you're much faster doing it locally, rather than trying to wrap your head around the build server configuration and figure out how to just run this one test different but only for this branch and not for the normal builds ...
It happened when the 2nd-gen TR arrived. It used the same mainboards, so all the manufacturers issued BIOS updates.
Unfortunately, these updates claimed to support SEV (Secure Encrypted Virtualization). Linux of course tried to initialize it at boot/module-load time and the entire thing hung, because TR CPUs do not support SEV, only EPYCs do.
So there were the following fixes:
1) downgrade BIOS back to pre-TR2 version,
2) blacklist the ccp module, which would make kvm_amd non-functional (see the sketch below),
3) wait for a fix in Linux kernel, which initializes SEV with a timeout.
So it wasn't that tragic an issue if you had a first-gen TR.
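For anyone who hits this on an affected BIOS, fix 2 is a one-liner (Debian/Ubuntu shown; remember kvm_amd depends on ccp, so virtualization stays broken while the blacklist is in place):

    echo "blacklist ccp" | sudo tee /etc/modprobe.d/blacklist-ccp.conf
    sudo update-initramfs -u    # rebuild the initramfs so it applies at boot
    sudo reboot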
When it was bad, it was so bad. I never wanted to know as much as I do now about IOMMU groups and PCIe lanes.
When I bought it I was doing some embedded device work involving a Windows VM, as well as heavy web dev on a frontend JS app and a backend app with tons of Chrome tabs open. All these things crave memory, and my 16GB MacBook Pro was swapping itself to death. This was pre-32GB MacBook Pros, so I bit the bullet and couldn’t be happier with the setup. It’s dead silent basically no matter what I do and doesn’t get thermally throttled during heavy workloads. Having the extra RAM also makes a huge difference.
Now that I’m doing more regular web dev on an Elixir & React app, I benefit from the 16x parallel compilation & test suite, as well as the ability to keep basically anything open without resource issues.
I have around 5-6 windows with 10-15 tabs each within them, as well as some ssh tunnels and some other applications open (I do heavy front/backend with docker builds through ssh, as well as ML work through ssh).
Not sure what's going on, but my lowest-end Mac is running everything I need like butter.
Not silent, exactly though, hehe.
I’m enjoying the combo of a powerful workstation where I work most of the time, and then a thin & light portable device for when I need portability. For me that combo works well and hits the right trade offs, especially the silence I get to enjoy.
I’m also remembering that Apple ships the same SSD setup in the newer MacBook Pro machines, which weren’t yet out when I bought this iMac Pro. Definitely nothing unique or unusual about that speed anymore!
The weight isn't too bad, I have it in my backpack commuting by bike in hilly Norway. But maybe too heavy for a shoulder bag. The battery time is useless, yes, so that's the big tradeoff. But since I bike instead of commuting by train or so, I never used my previous laptop not plugged in anyway.
Even those crippled by being connected to two lanes can achieve this speed.
That said, if you are just running simple Go or Rust compilations, or small single-package C++ compilations, caching and a decent processor should be good enough, and you probably won’t benefit much from a lot of threads. (You may want it for your Webpack builds still ;)
One tip: scale up your RAM, and pay attention to clocks and latency relative to your processor (especially with AMD, where it really matters). 16 GB is easy to kill when you are running 24 instances of GCC and a VM or two.
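Quick back-of-the-envelope: 24 GCC instances at roughly 1 GB each is already 24 GB before the VMs, so it can be worth capping make's parallelism by load as well as by core count (both are standard GNU make flags):

    make -j"$(nproc)"         # one job per hardware thread; fine when RAM is plentiful
    make -j24 -l"$(nproc)"    # -l backs off when the load average exceeds the core count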
I would think the extra CPU would be more of an impact with Rust than even C++... I wouldn't know, running a 3600 as a placeholder until the 3950X comes out. I couldn't handle the 4790K anymore; going from 32GB to 64GB and the couple extra cores made a huge difference for my docker/VM workflows. Can't wait for the 3950X. I'm sure the TR 3rd gen will be similarly impressive with the changes to IO.
I’m rocking a Samsung 970 Pro 512 GB on my desktop. I never thought I’d need more space than that, since I can always use my NAS or the spare spinning disk I have installed. But, the more CPU power you have, the more you can feed it... I find myself building entire fragments of Nixpkgs now and it takes substantial disk space to do it.
The contribution experience was a nightmare because a full build of Firefox took 3 hrs and running the entire testing framework took 4 hrs, though it turned out that I needed to run only a part of the testing framework. Changing a single line and building it again still took more than 30 minutes.
That was the first moment that I wanted a HEDT in my life. It feels like devs who work with big C++ projects would want a bigger workstation because of the significant build time.
Hm, I guess I don't live anywhere near a data center. 1M+ North American city, but lowest ping to any AWS/GCE region is ~130ms.
Edit: Now resolved, pings (barely) over 50ms.
Level 1 contributors (contributors who are vouched for by a Mozilla dev) are granted access to the TryServer.
I've recently built php + most of the bundled extensions, in a VM (on a 2018 Mac mini) and it's only a couple of minutes at most to build.
Seriously. Chrome and Firefox are both guilty of this.
Even if not for that, I tend to cross compile a lot of software since all the engineers at my company use macs for software development but we deploy to linux servers, so often I end up building rust binaries for linux and it's fairly computationally intense.
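Concretely, the macOS-to-Linux path for Rust can be as simple as the musl target plus a cross linker (the Homebrew tap shown is one common option, not the only one):

    rustup target add x86_64-unknown-linux-musl
    brew install filosottile/musl-cross/musl-cross
    # then point cargo at the cross linker in ~/.cargo/config:
    #   [target.x86_64-unknown-linux-musl]
    #   linker = "x86_64-linux-musl-gcc"
    cargo build --release --target x86_64-unknown-linux-musl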
For an anecdote, on my work laptop (i9/32GB/512GB SSD 2018 15" MBP) I can compile the dev environment from scratch in 40 minutes, whereas it takes 4 hours on the company-standard dual-core/8GB/256GB 2015 13" MacBook.
I’m a robotics controls engineer. The code I write is pretty short compared to most of you, and it doesn’t take long to compile. But to test it I have to run very complicated dynamic simulations. These never run in realtime. It’s not uncommon for a thirty-second simulation to take five, ten, fifteen minutes to run; sometimes I have to run many iterations of the same sim to collect valid statistics. So both CPU speed and number of cores translate directly to time spent waiting for the sim run to finish.
I have a 2013 trashcan Mac Pro with 12 cores. It’s still a good machine.
Possibly the only laptops that could compare to desktops are gaming laptops, and I know developers who buy things like the Razer gaming laptop to try and get desktop-like performance.
I migrated from a MBP to a 2018 Mac mini with 64GB, and the ability to run say 10 VMs at once, without having to limit them to half a GB RAM each is amazing.
I'm not in the market when this thing (Mac Pro) is actually released, but in a year or two I may well be migrating to the latest of the Pro line - I don't care about GPU (much - it just needs to drive some displays) but I do care about CPU speed + core count, memory and fast storage.
Actually my main rig is the last mac pro, the 2013 one - I just don't use apple's OS. I really like the rig and I expect to use it until at least 2023 (10 years of use).
The latest MacBook pros (8 core, 16 threads, 32gb ram, 2tb flash disk) are finally fast enough with enough ram to both run CLion with clangd indexing and do C++ compilation/testing. Full builds are still not advisable (not that a full build is particularly useful since our prod is only Linux and we have CI/desktops to avoid cross-compilation).
I use a 3 year-old laptop with 32GB of RAM, an SSD, and a dual-core i7 (U-series) processor. It's reasonably efficient. I think the amount of RAM and the IO speed of my SSD remove most of the hardware bottlenecks I face.
However, I would like compile times to be faster, I do sometimes notice applications hanging, and IntelliJ (we use Java at work) seems limited on how many projects I can run efficiently in a single workspace. I'm just wondering whether a workstation-grade laptop (Dell Precision or Thinkpad P-series) would be a sufficient upgrade, or investing in a desktop would be worthwhile at some point.
Yes, either 17" machine offers an 8-core CPU and plenty of thermal capacity (i.e. larger chassis than their 15" equivalents), along with more than enough memory (128GB), and in Dell's case, a boatload of HD slots (4, run in whatever RAID config you want). These are desktop replacements, heavy (as in 6.5+ lbs), with 240W power bricks , so you'll probably want them to stay on the desk for most of the time.
I'm personally looking to replace my old Dell Precision with a 15" 5540, maxed out. For Scala work the extra cores help reduce build times, probably the same for Java.
In native app development, Android Studio and Xcode are severely bottlenecked by 2 cores. The difference between my laptop and my Mac Mini (6 cores) is astounding.
The difference between a build taking 6x minutes and 1x minutes is more than just 1/6 because it goes from "Well, guess I'll check email/messenger/HN real quick." to taking a sip of coffee.
It also gets real bad when your IDE is forced to fight with your browser, email client, etc. Then you're forced to do nothing, lest your casual actions further slow down your local build.
Usually the "breaking point" is when builds take more than 5 minutes, but especially when they take more 30+ mins.
My build times aren't as long, because I'm working with Java-based microservices, so each service takes a minute or so to build (with tests) but even that delay can break concentration. Turning off tests helps, but then you don't always have the immediate feedback of the test results (and don't worry, I always run the tests before committing).
Android is probably the only platform where almost every conference has at least one talk on how to improve build speed, thanks to Gradle + Android Studio.
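Most of those talks boil down to a handful of gradle.properties settings (these are real Gradle properties; the heap size is just an illustrative value):

    org.gradle.parallel=true       # build independent modules concurrently
    org.gradle.caching=true        # reuse task outputs across builds
    org.gradle.jvmargs=-Xmx4g      # give the daemon enough heap to avoid GC thrash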
But now that I am working across projects that run 2-3 repositories at a time with a JetBrains IDE, the computer can barely keep up. I get times where typing will actually lag.
I am contemplating asking them for a Mac Mini instead since I rarely ever venture out from my desk.
Cumulative wasted minutes from compilation over the course of a year can add up to a lot.
It's absolutely faster than any business laptop for doing builds of medium sized C and C++ projects.
(But I think a non-HEDT desktop would also outperform a typical laptop in this use case -- thermal throttling is a real problem for laptops.)
Last year I bought a T470s with 20GB DDR4 and an i7 (U series) as I was mostly working away from home. It's good enough for most of the work I do, but it can be a bit slow at times. The processor just isn't as fast and the integrated graphics struggle with a 4K desktop (I think that's mainly Linux/GNOME being unoptimised though). I haven't noticed it throttling, but my workload usually isn't that CPU intensive.
If you mainly work from home I'd definitely suggest building a desktop machine for your needs.
Of course, it's far bigger and heavier than the Mac, but also of course (for I am Old Skool) I'm using it with a decent keyboard, a decent-sized monitor, a decent mouse, and a USB-C charger. It does mean I can't go to meetings, but that's ok with me...
Mind you, my workflow could probably do with adjusting as well, I tended to work like I do with webapps and just continually rebuild/restart and review my changes on the device itself (or an emulator).
Part of me also wishes we had a big setup with multiple docker images running in parallel but ATM we have the luxury of working on 'just' a website (react, some lambdas, back-end are all third party services, we connect to a staging environment during development) so it's not too bad.
But I'd still like a permanent machine, there's something about (and this is me idealizing) having a fixed workspace you don't have to pack up every day. I mean sure disconnecting a laptop and yeeting it into a backpack isn't that much effort but it's the little things.
It's way better at running Docker instances, a bunch of electron apps, and tons of Chrome tabs simultaneously without a hitch.
Having a real keyboard, mouse, and four 27" monitors is something I will never leave behind at this point. All that screen real estate to spread out over helps enormously. I can have a browser with our application pulled up, Visual Studio, another browser with doc, Webstorm with the front-end code, SSMS connected to the database, an email client, Notepad++ with logfiles, all the different chat clients I have to use, and more if I really need to, all on screen and available at the same time. I don't have to alt-tab around, I just look and it's there.
If I had a similar performing laptop that could replace my desktop in terms of GPU performance and compatibility with the devices I use, my HEDT would start collecting dust.
I still have an old laptop that I use to ssh into my workstation occasionally.
Web dev. Going from a 2015 MBP to the new touchbar model, my cold build times were cut in half. Other than saving 15 seconds once per day, I noticed nothing.
We had discussions on HN when it was announced. The Tech Spec page has been there since day 1; I looked hard and don't see any significant changes, if any changes at all.
I’d go so far as to say they should sell it without a default hard disk if they could - let the buyers make the choice. I’m guessing it’s defaulting to the smallest disk just for optics.
This is clearly a start with the skeleton and build it yourself kind of machine, which is as it should be. It just looks odd when the selector defaults to the lowest available option on every selection.