They are spending a fortune to understand what pros need, while we have been screaming for the better part of a decade: lots of user-accessible RAM, storage, PCIe & GPU. How much would it cost R&D to update the Mac Pro? Teenagers can do it for peanuts; I'm sure the largest company in the freaking world could manage. If they were truly sorry or marginally “cared”, that's what they would have done years ago.
I'm fine with them playing the “let's reinvent the Pro computer” game as a side project, they have this massive amount of money burning a hole in their pockets after all, but some of us need to work.
I hope/think that Apple realizes that bringing back the cheese grater (in other words, a PC running macOS) is not going to be enough to bring back the pros. They need to come back with something unique.
The TrashCan failed but you can see where Apple was going. A small, silent, powerful machine infinitely customisable on the fly via TB.
So of course it failed, and this attempt has a good chance of failing too. I'm not sure Apple has in its DNA a deep understanding of the type of pro who isn't content with either the iMac Pro or the MBP, the way it understands the consumer sphere. For a long time the intersection between the two worlds was large, but they are now diverging. I'm perfectly happy to be wrong, though.
Before they can think of "bringing back the pro", they need to stop the bleeding: pros who need to upgrade this year have no choice but to go PC. An upgraded cheese-grater would have been a good stop-gap while they go back to the drawing board on their 2019 product.
Right now I program Rails on an old 2012 MBPr. I want to get a faster machine. Possibilities:
1) Spend 5k on an iMac Pro. That's an absurd amount of money.
2) Get a new MBP limited by USB C, new keyboard, and the abominable Touchbar. That's a lot of money for a downgrade.
3) Get a decently specced iMac or Mac Mini. Most viable option.
4) Ditch Rails, start learning .NET, and go 100% Windows so I can build my own PC. I save money. I lose a lot of time.
5) Build PC, install Linux. Viable but might be a configuration nightmare. I just want to work.
6) Run Linux under Windows host as a VM. Second most viable option.
I'm pretty frustrated. Suggestions?
System 76 sells machines with Linux pre-installed. I've had good luck with these (replacing the default Ubuntu with Arch of course).
7) Continue developing Rails using Windows 10 and the Windows Subsystem for Linux.
Overall my experience with WSL has been great.
I did this: bought an XPS and a Precision, both refurbed in sales/with coupons, great specs. Cost me ~$500 less than retail. Both like new.
Sure, I'd prefer a Mac, but it's over double the cost for similar specs. I could look past the Touch Bar and ports, but not the keyboard.
8) Buy a Librem. I don't know how good they are, though.
I suggest an upgraded 2009 Mac Pro.
The upgrades would include: a firmware flash to 5,1, a 3.46 GHz Intel hexacore CPU, 32 GB RAM, an SSD on a PCIe card, and a 5770 GPU. This should cost $1K USD or less; dual CPU (12 cores) for a bit more. Mine Geekbenches at 3800 (single) and 17,800 (multi).
I'm running one of these and it has been bulletproof and 100% hardware/software compatible.
The iMac Pro is overspecced for Rails development. You can get a very decent Mac for much less. If you keep your work environment synced via iCloud or Dropbox, the problem of syncing I mentioned doesn't exist (I prefer the minor discomfort to syncing through a third party).
As for 6, get a Linux PC and run a Windows VM when you need it. It's better to have the decent OS host the crappy one than the other way around.
If you need more power, for $2900 you can get a Core i7 4.2 GHz quad core with the same specs.
I'm not a Mac fanboy by any stretch of the imagination - I haven't bought a Mac in 12 years but I want to get away from Windows and don't want to futz with Linux. Getting an iMac would allow me to develop for iOS, Android, Windows, and give me a real Unix environment.
You can do .NET Core on a Mac. I am a Windows developer now, and will probably stick with .NET, but especially for hosted solutions, using Linux is a lot cheaper for deployment.
Also configuring Linux is not the beast that it was years ago. Even on laptops it is fairly straightforward.
I feel the pain here too. It feels like MacBooks have stagnated over the last 5 years. I am struggling to do Java/microservices development on my MBP. I just don't have the RAM and cores to run a lot of server processes.
The only Macs worth buying now are the iMacs.
If you don't have a legal office declared for business activities to the regulators, then yeah it might be zero.
Then good luck in court.
That would be MacOS itself. That's all they need to bring a huge number of Pros who bailed on the entire Apple ecosystem back into the fold.
But until they bring back that stability, macOS won't be a sufficient argument to bring back pros, I think.
> I think we designed ourselves into a bit of a thermal corner, if you will. We designed a system with the kind of GPUs that at the time we thought we needed, and that we thought we could well serve with a two GPU architecture. That that was the thermal limit we needed, or the thermal capacity we needed. But workloads didn’t materialize to fit that as broadly as we hoped.
The biggest problem is: how did they not know this when they were designing it? And how did they manage to find out in 2014 and not speak a word until 2017? And now their answer is 2019.
Not to mention, as the OP suggests, they could have fitted a modern Intel CPU and a 150W Vega GPU in it and sold it at a lower price.
It's going to be interesting to see if they stick with Intel for the Pro. It's entirely possible they won't, especially given Intel's security issues.
What that means for performance remains to be seen.
Wouldn't the apps that pro users depend on dictate that to a large extent? If the truly "pro" apps don't get rebuilt for ARM, then ditching Intel so early into the Mac Pro's return would be like Apple never really released it at all.
ARM is not going to cut it in the dual-Xeon workstation role.
I'm more doubtful of their ability to produce something sufficiently competitive.
Usually consumer software moves faster to adopt new tech stacks anyway. OS X supported x86-64 in v10.5 in 2007, but Photoshop didn’t go 64-bit on Mac until 2010 with CS5.
My best guess is that it’s because big pro applications have a mess of dependencies and some inner guts in the high performance code where it drops to assembly. Building that for alternate architectures isn’t as simple as checking a box in Xcode.
I am sure major shareholders of Apple will not want to bankroll a 7nm fab and the massive costs of producing a competitor to AMD and Intel.
I really still have a lot of doubts about this idea that they will even try it. It doesn't seem that there is really anything in it for them in doing this.
And it would have shipped.
You could do that when the workstation cost $25,000, (in 1990 dollars!) I'm not sure how viable it is when the workstation is less than $5,000.
That said, I really miss using software products where that level of thought has gone into its design. The three 'creative' apps I use (other than writing code) are drawing, schematic capture/board layout (EDA), and writing. All systems that benefit from people investing in how the flow of these things work, and all of which have degraded over time.
I think it worked because the customer spend was of comparable magnitude. Back in the mid 90s you might have a couple of dozen game devs working on $2000 PCs with a few $45K Onyx machines used for back-end rendering or part-time use by a few artists = ~$200K. Or (in my case) you only had a couple of AI developers, but each had a pair of $50K 3650s = ~$200K. Or your prop traders had a $25K Sparcstation on their desk and one at home, but the rank and file just had a Bloomberg terminal.
Now you have much larger teams with more uniform machines (+ some cloud resource). The total institutional spend is probably higher in constant dollars.
Speaking of which, it will be interesting to see if Apple is able to push some of their APIs into the cloud so you can develop on a Mac Pro and dynamically push the heavy lifting into an Apple rendering cloud. This article just talked about scaling iOS <-> macOS <-> macOS+iOS+eGPU (nice!) but left out "<-> cloud". That's where editing on iOS could really shine: chop up / assemble your downsampled rushes on your tablet and then stream them to the Apple TV at your boss's office. Remember, Peter Jackson used to bring his to LA every week on an iPod in his pocket.
I'm wondering if it'd be viable to have a 'compute unit' shared by multiple users. i/o would probably be a bottleneck, so maybe have it link up macs via usb-c? It might help with e.g. compile times (which last time I did iOS development, two years ago, was kinda slow (>1 minute for our app, CPU bound)).
But that's probably too specialized a thing. Apple wants to sell units by the million, not the hundreds (if that). Which is why they discontinued the mac pro and the 17 inch MBP.
They could have put out a cheese-grater with fresh Intel chips every year like clockwork, but they were too cool for that. And the failure is all the more striking when they continue to "crush it" in other categories.
These days I get astoundingly better performance than whatever that low volume hardware did back then out of a $700 desktop with a $800 graphics card and a $500 monitor.
$5000 is probably still too high, unless you see the Apple product line as your only option.
The worst thing is the linkage between knowing what you want to do and assembling the right tools to do it in the app you're looking at. Most of my drawing is technical in nature (systems, small components, software architectures, Etc.)
Most Mac Pro users have moved back to PC, or to iMacs/minis just for iOS/macOS-related dev, with most work being done on PCs.
Apple had a window where developers needed a heavy pro Mac for iOS and *nix development like Python/Ruby/etc, even Unity dev back when it was Mac-only (2007-2010 ish). It was also great to have a Mac after the 2006 Intel processor move, as it was the sexiest *nix and great for developers from 2006 on, but those days are over.
Somewhere in 2013, they just moved on from the cheese grater and the pro market that included developers and content creators. Now they want to regain it? There was so much momentum squandered here. Hearing Tim Cook repeat "Post-PC", and Apple messaging that desktops were like trucks that only developers need, showed they had moved on, so the pro users and developers did too. Apple even watered down their developer laptops and MacBook Pros: 17" screens were relegated to being 'lapzillas' and Apple went mainstream-only. Content creators and programmers were not something they focused on anymore after the iPhone took over, even though those influencers were always the focus before. I thought they would use the iPhone and the iOS/macOS platforms to get more people onto their desktops as well; instead they went the other direction.
We used to be Mac Pro heavy now we just have iMacs/minis for the last mile or iOS/macOS export and testing with performance heavy beefy PCs for most of the day to day work. Additionally, taking your jet engine/trashcan new Mac Pro or iMac to the mall for repairs instead of just popping in a new video card or drive also sucks.
Developers can get two performance PCs for the price of one Mac Pro. Usually you can get more power out of both as well since Mac Pros from 2011 on were 1-2 years behind. There is no getting back pro users such as devs/game devs that had switched over and have now gone back to Windows/PC.
Wasn't there a statistic from GitHub recently that 75% of the PRs created in 2017 were done from Macs?
They are bigger in the dev market than ever before; the thing is, the majority of this peer group is totally fine with a MBP.
I see this whole new Mac Pro story more as a marketing campaign; the driving factor behind it is not market demand, it's their long-term reputation.
As far as desktops go, though, even creative/advertising agencies are moving to PC for 3D, photography, graphic design, audio production, video production and more. These were always Apple's target market, but they lost many.
The iPhone lost ground to Android, and laptops may do the same, starting with developers. I see many devs who went from Mac to PC desktops also go from MacBook Pros to Surface Pros, but that is just starting. Eventually it could also push people from iPhone to Android/Samsung/Google hardware, now that the hardware has caught up.
Apple being a hardware company, it is strange they did not take more advantage of the desktop inroads they were making in 2006+ with Intel and iOS development.
Apple can't regain the 3D, design and production market unless they can get a solution that's competitive in price with PCs. I'd suggest keeping Macs as the most delightful to use desktop and transparently offloading the heavy lifting to racks of commodity PCs running headless Apple software (which may even run on top of Darwin if that makes the porting easier).
OTOH, if Apple can develop a significant performance edge like it had with the early PowerPCs, then Apple becomes the best bang for the buck again. Since they are designing CPUs, they can be pretty creative here.
I don't think this is really true, all the creative agencies I'm aware of are still firmly mac shops. Do you have a citation or anecdotes to back up your assertion?
Adobe Creative Cloud apps tend to work better on PC now so this is a big reason.
The areas I mentioned (video, photography, audio, gamedev, 3d etc) are moving to hefty PC machines that need 64GB/128GB/256GB+ of ram, latest GPUs, many drives and many cores (16-40+), no iMac can do that and Mac Pros have left the building. Many of the PCs are cheaper and more powerful as well, custom built ones especially. Many web developers or graphic designers still on iMacs, but the other areas on PC.
Basically, anyone that needed a Mac Pro, where an iMac or Macbook Pro was not enough, has probably switched/back to PC. You'll see most people do this around 2013-2014 when the Mac Pro was un-cheese grated and put in a jet engine/trashcan, looks cool but too expensive and not for hands-on pros or expanded easily.
One video/photography/3d guy I know is running 40+ core/thread machines 256GB RAM, hard for Mac Pros to compete with that.
(and personally, yes I'm stuck with a laptop due to work atm. Wouldn't mind having an office or fixed work place with an imac pro though)
No, there wasn't. Github posts lots of stats but nothing about OS shares.
There was also a window where Unity was only available on Mac from 2007-2010 where lots of Macs were purchased by game devs.
Also, making Linux/Apple/web games, there was a little blip for a while where Macs were making inroads, especially after the move to Intel processors. OS X is arguably the sexiest *nix-compatible machine and is quite fun/efficient for development.
Overall game devs have always used PC as the main for desktop, but with mobile and iOS taking over handheld gaming in 2007, game studios all had to buy them.
Now, most end up just getting a few iMacs or Mac Minis to export on and debug otherwise much of the game dev work is done on PCs.
Apple truly lost game devs, but also other content creators like video editing/production, graphic designers, photographers, audio production and more. These professionals were always the focus of Apple, amazing they just let those people wander off as they are also influencers. I even see lots of devs going from Macbook Pro to Surface Pros now because their main machines are back to PC, that could eventually start causing people to switch from iPhone to Android/Samsung/Google as well.
There was an anecdote about a Power Mac cluster at a university, which basically couldn't get past boot because memory errors were so frequent.
But even if true, so what? What are you doing on a workstation where a bit flipped every 3 days is going to cause a problem? Anything critical is going to have a software check. I just have never seen any legitimate argument for needing it in a workstation.
Depends what bit flips. Given some of the file formats out there, a bit flip just before the data is written to disk might be fatal.
> Anything critical is going to have a software check.
I would love to see someone go through github and analyze if that is true.
I just cannot imagine why it would be acceptable to have bit errors when a preventative measure is available. If the industry would stop looking at it as a value add, the cost would come down and we all would benefit from a more stable platform.
What instability are you talking about? Has anyone ever experienced instability on a workstation that could be attributed to a cosmic-ray bit flip?
I wouldn't trade a 10% memory performance drag for ECC if the memory was cheaper than standard.
By definition, if the bits can flip in a way that wasn't intended, then it is not stable. I'll take the 10%, because I'm pretty sure getting the wrong answer 10% faster is not worth it to me.
Not everyone crcs 100% of their structs.
If they don't, data is being slowly corrupted.
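To make that concrete, here is a minimal Python sketch (illustrative only; the "struct" and field values are made up) of how a stored CRC catches a single flipped bit, provided something actually rechecks it:

```python
import zlib

# Pretend this is an in-memory struct we care about.
payload = bytearray(b"important struct bytes")
checksum = zlib.crc32(payload)      # computed when the data was known-good

payload[3] ^= 0x01                  # simulate a single cosmic-ray bit flip

# The flip is detectable only if the code re-verifies the CRC.
corrupted = zlib.crc32(payload) != checksum
print(corrupted)  # → True
```

The point of the thread stands either way: the check costs almost nothing, but most application code never performs it.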
Hardware was at fault. Turns out suppliers push the truth a bit here and there about their chips, what is supposed to be an identical part # isn't always identical, parts from different manufacturers that are supposed to be interchangeable aren't, and technical manuals aren't always the best.
We had (what we thought was) rigorous memory testing in place at the factory, but under certain extreme conditions there was a 100% reproducible flip of a few bits in RAM. It was almost always the same bits, thankfully, that made it possible to track down!
It really felt like the best platform. I had a nice G4 tower, and eventually used a G5 and then Mac Pro tower. Things generally just worked, and you could expand things as needed.
Roll around to today, and it's a mess. Apple canceled a lot of their pro products. Aperture? Dead. Final Cut got nerfed pretty hard. Logic is still reasonably well maintained, but Ableton Live took over a lot for me.
The hardware options aren't that great right now for my needs. Last week, I sold my 2011 Macbook Pro to a friend and switched all my audio stuff to my old Windows gaming system. Picked up a Firewire PCI-E card for $20 and I was in business. It's fast, has a ton of ram and is easily expandable. Windows 10 drives me nuts, but eh...
I've still got three Macbook Pros for software development, and they are pretty great for that (minus the new keyboard, and that Tensorflow GPU has dropped OS X support). I don't think I'll be switching laptops anytime soon. But ugh, audio work on them just feels like an overpriced joke at this point.
I really never thought I'd be saying it, but for photography and audio stuff, Windows seems to be the place right now!
But the native terminal, etc, has won me over as a *Nix person for a long time. I don't like Linux desktop interfaces at all, and the new Windows linux/bash stuff hasn't been compelling or integrated enough for me to want to switch (ConEmu and all of the terminal stuff being garbage is a big issue). I'll be sticking with a MBP and OSX for my laptop and work related stuff, but I don't understand why anyone would still be using OSX for a photography workflow. It's just so much more polished on Windows.
That’s not my experience, at least with Classic.
Mine is a Mac house. My parents, though, are long-time Windows users; I only managed to get my parents to run a single iMac, which they gave away as soon as they could financially justify it.
I tell you that because that is where my cross-platform Lightroom experience comes from: almost all Mac, with occasional mentoring/support sessions under Windows at the parents’ house. With Lightroom on Windows, I’ve repeatedly observed:
1. Windows’ brain-dead default-mandatory file locking prevents perfectly reasonable operations, simply because Lightroom is busy working with one or more files in the background. Advisory locking as is default on POSIX type systems allows many things that Windows refuses to allow by default. (E.g. Rename a parent folder while a file in that folder is open for writing. Who cares, the FD is still valid!)
2. Lightroom can take a long time to shut down when it gets busy, as it too-frequently does. On the Mac, the app icon remains marked “running” while Lightroom grinds away, trying to figure out how to shut down, but on Windows, the app icon disappears from the task bar almost immediately, but Lightroom.exe remains running in the background, so that it is not obvious why reopening the program fails for minutes at a time.
(And why reopen? Because relaunching Lightroom often solves slowdown problems, and has for years upon years, which is a separate rant.)
3. Some plugins simply won’t run on Windows, at least not without dragging along a bunch of compatibility junk. Anything that depends on ExifTool, for example, requires dragging over a whole Perl environment just to run the plugin. If you have multiple plugins dependent on ExifTool, as I do, each one usually comes with its own Perl environment. Compare macOS, where at worst you have the CPAN module alone, and at best, it might simply require that you install ExifTool separately, that being a reasonable user requirement on an OS like macOS.
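Point 1 is easy to demonstrate on a POSIX system, where locking is advisory. This Python sketch (the file names are made up) renames a directory while a file inside it is held open for writing, an operation Windows refuses by default:

```python
import os, tempfile

base = tempfile.mkdtemp()
parent = os.path.join(base, "catalog")
os.mkdir(parent)

f = open(os.path.join(parent, "photo.xmp"), "w")   # file held open for writing
os.rename(parent, os.path.join(base, "renamed"))   # succeeds on POSIX; PermissionError on Windows

f.write("the fd is still valid")                   # the open descriptor survives the rename
f.close()
```

The file descriptor keeps pointing at the same inode regardless of what the directory is called, which is exactly the "who cares, the FD is still valid" behavior described above.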
If your comment was referring to GPU acceleration and such, Adobe’s recent focus on that isn’t helping me anyway. Almost all of my problems with Lightroom’s speed are in the Library module, not the Develop module, where GPU acceleration doesn’t help much anyway.
If you're training a model on your laptop, instead of a server, does it really matter? It doesn't seem likely that you'd want to deploy a model trained locally.
After "can't innovate anymore my ass" Apple released their architectural dead-end trashcan Mac Pro in _2013_ we have to wait until _2019_ for an update? What am I supposed to buy? I'm certainly not going to buy anything with a Touch Bar. It's 2018 and I had my job order me the 2015 Macbook Pro for my work computer so I could skip the Touch Bar and still have USB ports.
What's going on at Apple?
As painful as it was to have to buy dongles and whatever else, having monitors + USB docks using USB-C has been pretty pleasant.
Do they really think a 10% size difference is more useful than a good feeling keyboard? What are they smoking.
They are nice machines with plenty of performance. With 4 Thunderbolt ports, you have some expandability.
It’s not the same as internal upgradability, but it’s also not insurmountable.
For the first time in 33 years, I no longer have access to a Mac. Apple doesn’t sell one that my employers nor myself want to buy.
That it has taken so long to course correct would have sunk most companies. It is probably a couple percent hit on Apple revenue at most.
Doubt there is a Lenovo waiting in the wings, nor would they allow it, a shame.
It's as if Apple are saying "we're making a computer for people who demand peak performance, but are willing to use cheap outdated hardware until we get round to it."
The new iMac Pro has the entry-level Xeon and still throttles after 10 min.
Schools for sure. The staff believes that Macs are better and have bought trash cans to replace the cheese graters.
It takes effort to set up, but not that much.
A Core i7 is not the same as a Xeon. ECC memory matters. The Apple premium simply isn't that much higher.
I also wanted to add that the cost of a person's time eclipses the cost of hardware (especially with computers). As a business, I simply wouldn't risk having days of downtime from a bad update, just to save $2,000 for something that's used for years, when the lost time in paying a creative person's salary will eclipse that.
What could you possibly be doing on a workstation that requires ecc?
For every 8 GB of memory, you are potentially looking at a single bit flip every 9 years. With ECC, only once every 45 years. And in exchange, you are looking at a 10% performance penalty.
What could you possibly be doing on a Mac pro that would need that kind of accuracy?
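For scale, figures like the ones above can be back-of-the-envelope checked. The FIT rate below (failures per 10^9 device-hours per megabit) is an assumption chosen to land near the "one flip per 9 years" claim; published soft-error estimates vary by orders of magnitude, so treat this as a sketch, not data:

```python
# Hypothetical soft-error rate; real measurements vary wildly.
FIT_PER_MBIT = 0.2                      # failures per 1e9 hours, per megabit (assumed)

mbits = 8 * 1024 * 8                    # 8 GB expressed in megabits
total_fit = FIT_PER_MBIT * mbits        # expected failures per 1e9 hours for the whole 8 GB
hours_between_flips = 1e9 / total_fit
years_between_flips = hours_between_flips / (24 * 365)
print(round(years_between_flips, 1))    # roughly 9 years at this assumed rate
```

Which is why the argument turns entirely on whose FIT numbers you believe: at rates some studies report, the interval shrinks from years to months.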
My argument is that every time someone says "I can build the same thing for 1/3 the price", they are literally using completely different parts. They AREN'T building the same thing.
The idea of an Apple Tax is that Apple arbitrarily marks up everything. But if you bought the same parts independently, you'd find the markup is very little. More like 10%.
Look at those $10k HP workstations. They are more comparable to Mac Pros.
Except, at least with a fancy purse, you can carry it around and signal your wealth to others. My workstation is hidden away under my desk.
I just want to buy a car for $20k that goes 70 miles an hour. But you’re only selling me cars for $50k cuz you insist on putting things like carbon fiber brakes and titanium alloys. I don’t need any of that. I can build something that perfectly meets my needs for way less. Because those enhancements are meaningless for what I’m trying to do (go 70 MPH)
You want to sell me a car for $50k that can go only 60 mph. But it has carbon fiber brakes and titanium alloys, requires premium gas, and has a fancy logo. Oh, and it has a flux capacitor which does nothing.
Or I could just buy a car that can go 100 mph for $20k.
You can use different parts and end up with an equal or better outcome.
Yet every single one of them I follow has used only MacBook Pros for the last 5 years or so which obviously don't have ECC so yeah I feel the importance of it is overstated.
And not that Row Hammer should really be a concern for a workstation, but ECC does not necessarily protect against it. Some types of RAM are vulnerable and some aren't, and ECC is not necessarily the deciding factor.
For consumers who just really want to run macOS, I can see the appeal.
It's the market filling in the gaps of Apple's engineering & marketing plan.
A computer that takes a bunch of fiddling to set up and might not work if updated doesn't seem like a productivity booster to me.
If Hackintoshes have motivated Apple to release a new Mac Pro after a multi-year gap, then Apple-only shops will benefit from the Hackintosh investments of other shops.
Other people use a Hackintosh and help to define requirements for Apple's next computer.
Apple gets free R&D. Those who need performance get it. Everyone eventually gets the benefits in the next computer.
Works perfectly in a home office and saving $2500 is not a small thing.
Not all updates are critical.
I don’t want to fight with my workstation all the time. I want to use it to do... work.
Hackintosh doesn't work for me because my business isn't fixing laptops. Apple used to provide a full solution and it was really great.
In 2014 Apple would sell you a monitor that doubled as a dock which worked great with the RMBP and Apple's own wireless KB and Mouse and even Time Machine.
Today you're stuck with USB-C docks which are pretty awful. Pretty much on par with a Dell or HP laptop dock. The new Macbook Pros are the worst device I have used from Apple.
The thread you are replying to is about hackintoshes. The point of a Mac for me is that it Just Works (tm). If I want to fight with something I'll buy a Thinkpad and install Ubuntu. I have no interest in fighting with macOS, it's only useful to me if it is easy.
No. Their complaints are about the laptop itself. The keys are constantly breaking. The software constantly crashes. We're all a bunch of consulting devs and the 2017 macbook pros have cost every single one of them at least a week's worth of billable hours in the past 6 months. Most of them have gone back to their 2015 models.
To anyone with an understanding of PC hardware it's clear this list was composed specifically to justify the iMac pro's price.
Here's a much more reasonable list: https://pcpartpicker.com/user/xanderstrike/saved/mNDWXL
My wife is a designer, she needs much more powerful computers than I do (as just a developer). She constantly runs out of memory, GPU performance is very relevant for Photoshop, Illustrator, and other tools she uses in her work. So spending $2500 or $3000 for a high end (but not pro) imac is a given, even though I would be perfectly happy with the $1900 one.
For her, the advantage in productivity is worth it. I can imagine there are many people who would see the imac pro as being worth it to them also and the price isn't going to bother them much because their time is already very valuable.
I guess I'm no longer part of their target market for their computer systems.
My 2017 MBP is less than 6 months old and two keys don't attach to the switch anymore. They literally come off with my finger as I type. They aren't broken, the C-clamps are still intact. They have just widened enough with wear such that they don't clamp anymore.
It's so bad that I use my 2014 Macbook Air when I need to do a lot of typing.
Most of the people I know IRL are having the same issues with their newer model Macbooks, especially my developer friends since we of course tend to be harder on our keyboards.
So you aren't missing anything.
My dad had his iPhone 6+ battery replaced under the $29 plan. 6+ week wait for the battery. They call, the battery is in, dad shows up at 11:30am on a weekday, it'll be ready by 3pm. He shows up at 3, and it takes until nearly 4 o'clock to get someone to get the device back out to him. I'm reaching the point where I may as well go with Dell/Samsung, because shipping a device in is guaranteed to be less of a hassle than the "Apple Store Genius Bar" experience.
I've seen the phrase "not part of their target market" deployed at developers and particular users with increasing frequency for years.
Seems inevitable that eventually people will go somewhere else.
Wat. I just went googling after reading this comment, and at first glance it looks like this is an officially supported method to run GNU apps in a Windows terminal? Can you boot directly into Ubuntu desktop without messing with BIOS and UEFI?
It's hardly limited to GNU apps. Most of your Linux applications should work fine on it. I've had no problem with all sorts of non-GNU releases when I use it on my desktop.
>Can you boot directly into Ubuntu desktop without messing with BIOS and UEFI?
No, it is a Linux userland on top of the NT kernel with an emulation layer written to handle all of the syscalls.
It works well. The only really glaring issue is ConEmu and related Windows terminal ecosystem is still hot garbage compared to any Linux or OSX terminal.
...and that bites you in a surprising number of ways.
I recently tried porting some software that runs just fine under Ubuntu-on-x86 to Ubuntu-on-WSL, and I found three different failure modes in the console/pty mechanism before I gave up. I then tried running it under Cygwin and it ran correctly out of the box.
Perhaps the easiest way to see this is to try to run a program like GNU screen under WSL, but the core problem isn’t any specific application. Any program that does anything even moderately tricky with ptys is likely to fail under WSL.
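As a rough illustration of the pty plumbing such programs rely on (a minimal sketch of the mechanism, not the actual failure case described above), Python's `pty` module exercises the same kernel facility; this runs fine on real Linux:

```python
import os, pty

# Fork with a pseudo-terminal attached: screen/tmux-style programs
# are built on exactly this facility.
pid, master_fd = pty.fork()
if pid == 0:                       # child: its stdio is the pty slave
    os.write(1, b"hello from the pty\n")
    os._exit(0)

output = os.read(master_fd, 1024)  # parent reads through the pty master
os.waitpid(pid, 0)
```

Anything beyond this trivial round-trip (window resizing, raw-mode switches, multiple layered ptys) is where an incomplete pty implementation starts to show.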
Does that mean Ctrl-C is still "copy" on Windows even in a bash shell?
> I saw a bunch of them walking by in Apple park toting kit for an outdoor shoot on premises while walking
> Is it the OS? Is it in the drivers? Is it in the application? Is it in the silicon? And then run it to ground to get it fixed.
When scrolling to the bottom of the article the last thing I expected was the entire content of said article to be replaced by headlines of other stories from TC.
Viewing the article in portrait mode on desktop displays the main contents on the left 50% and complete whitespace on the right 50%.
Overall the TC redesign is god-awful and has degraded usability substantially.
Furthermore, it's concerning that the article mentions modularity mostly in ways that involve external hardware and peripherals. If that turns out to be the approach Apple is considering, it sounds worryingly similar to an iMac Pro without the fantastic display.
This approach is reminiscent of Apple's approach to low-latency touch scrolling on the original iPhone; to Quicktime and MIDI a decade and a half before that; and, more recently, to the Apple Pencil.
I knew executives at Palm who were frustrated by the challenges that both the tech stack and the organizational structure (Conway's Law) posed in the (failed) effort to reduce touch latency to where it felt physical. I know that it took Android many years to reach the point where objective observers described it as equally “buttery”. IMO this full-stack integration and optimization is one of Apple's core competences. It will be interesting to see what it leads to this time. The effort may be too late, but if the past predicts the future, it will not be too little too late.
In the end I moved my creative work over to PC and currently run a nice compact mATX PC built for GPU computing with two 1080Tis. Still cost me less than half the eventual iMac Pro and presumably this Mac Pro.
If this had been released a few years back maybe I'd never have moved to PC but at this stage they'll have to be still supporting the new Mac Pro in 3 years before I even consider moving back at this point.
I'd been a Mac-only user for 15 years; I didn't move away lightly, but now that I've switched it'll be hard to convince me back.
They have managed to keep the music recording industry; a new MacBook Pro or iMac is plenty powerful enough to be the heart of a recording studio.
It also depends on the music you do and the plugins you run.
But I wonder what the hell the people at Apple were thinking when they put out the "trashcan" design. It looks awesome, I cannot deny it, and I imagine it is very convenient to handle.
But something deep inside of me cringes when a computer in this price class has everything soldered on and has no real option to extend it. You cannot upgrade the CPU, the RAM, the GPU, nor can you install any additional PCIe cards.
"I know, let's put out a boring commodity product that has no unique selling points, doesn't match our ethos, and has a tiny niche market. And we'll sell it at the same razor-thin margin that everyone else does!"
That couldn't be further from Apple's way of doing business.
If you want a cheap commodity tower, just buy one, but don't expect Apple to make one for you.
Apple's other concern is the developer side and the app ecosystem: high-end build machines and developer workstations are also relevant for their actual business, which is selling apps and phones.
Computer hardware for professionals is just a boring service Apple needs to provide in order to support the rest of their business. It's not even a visible product like a MacBook or an iMac; it's designed to be hidden on the floor because it's huge and noisy (the trash can was a failed attempt at working around that fact).
Edit: to follow up on the "powerful machines are a boring service" idea: if Apple didn't want to risk the brand damage of launching a boring product, they could simply license Dell or HP to sell certain servers and workstations with Mac OS, letting the highest-end render machines and build servers etc. run Mac OS. They could easily charge a 20% premium and people would be happy to buy them.
I really think you're wrong. Even a lot of people who run Windows or Linux prefer Apple hardware.
Apple make excellent laptops, phones etc. but those are products in a completely different category.
I think lots of people use macs to run linux (especially developers). The number of people that run Windows exclusively on Mac is probably pretty small though.
Most importantly though it needs to be compatible with most high end hardware from the day it’s launched. If Nvidia makes a gtx 3080 the day after you bought your Mac Pro you can’t worry about when the Mac compatible version comes out, or whether it will cost twice what the PC compatible card does.
They might not make a single dollar on the sale of the pro machines - and it would still be worth every dollar of engineering put into the project to keep the devs happy in order to keep their other markets healthy.
Is there any evidence that iOS doesn't have as many great apps because developers are fleeing because there's not a Mac Pro that they like?
I kind of suspect that's what happened last year - everyone was just toughing it out waiting for the pro kits, and when it never came the threats that came in from big shops were very real. Enough so that Apple broke every tradition it had regarding future product announcements, and laid a lot of their goodwill on the line to promise a fix.
Regardless, this doesn't have much to do with the macOS / iOS dev market, which is what I was responding to. No chance that there are a bunch of big dev shops that got fed up with not having the Pro desktop / laptop hardware from Apple that they want and decided they'd rather just abandon the Apple customer base rather than deal with it. The value of that customer base is huge, and those dev shops will go through a lot to reach it.
"Our business has always been building apps for iOS and Android, but we don't like that we can't expand the memory in the iMac Pro, so we're just building Android apps from now on, even though more than half of our clients and customers are on iOS."
For any shop with a dozen mac pros running some mac software package it's a huge step to switch to PC if it also means switching the whole software stack. At this point there are probably a whole lot of shops that struggle with very old pros, or have started buying the iMac pro. But I'm sure there are shops that also held on for a long time and finally gave up and swapped both hardware and software because they grew tired of waiting (I know of a couple - but anecdata isn't data, so these are guesses).
The only reason Apple broke tradition and disclosed that a new pro is coming is to convince shops like these to hang in there for another year.
But no. It's going to end up being the desktop PC equivalent of a RED camera system, with a price to match.
Get ready for an underpowered $10k pyramid whose only input is Siri.
Creators were pretty much the only people keeping Apple alive in the tough years and they abandoned them.
- Will the Mac Pro actually be Pro?
- Will they ever bring back a Pro computer in a notebook form factor?
If you mean no SD card reader etc, it's my understanding that pro photographers tended to use a USB3/thunderbolt reader anyway, as the integrated reader was connected over USB2 internally and was far too slow for pro use.
- Touchbar at all
- Battery reduced from 99 Wh to 76 Wh
- Lost magnetic charging port
- Lost every other useful port
- Keyboard generally considered worse
I currently have a 15" mid-2015 MBP, and I use the DisplayPort, HDMI, and USB-A ports on a daily basis. Losing the ports doesn't bother me too much, but similarly priced or cheaper laptops don't ask me to compromise there. On the other hand, the battery life, touchpad, and keyboard need to be absolutely stellar or I'm not buying it, especially in this price range. Again, others ask me to compromise less. While Apple still has the best touchpads in the industry, others are catching up. Meanwhile, Apple has regressed in several ways.
I don't particularly like the touchbar (though that may change if IntelliJ ever bring out their promised support and it's any good), but it's not a huge deal for me (at least after I remapped esc to caps lock; I now do this on every keyboard I use, as I find it more ergonomic). I do love the touchId thingy, but they could have done that without the touchbar.
I prefer the keyboard to the retina MBP one, though I preferred the pre-retina MBP one to either. There was something about the feel of the rMBP (and MacBook Air) one I never liked. This is very subjective, of course.
The port situation mostly hasn't bothered me, as I just use a USB-C hub with power passthrough. It would have been nice to have at least a single USB-A port, but realistically I wouldn't use it much.
I find battery life to be similar to the previous one, generally. I think the size sacrifice is worth it, at least for my needs, because it makes the machine very portable. I barely notice the weight of it in a backpack now.
My biggest complaint is one you haven't mentioned; you can't get an integrated GPU only version. You can't even fully disable the discrete GPU. This means that you're stuck with all the quirks of the dual GPU setup. This is a step back, IMO; I've no real use for a discrete GPU on my work machine, and these dual GPU Macs have never quite worked properly.
I wasn't expecting to like it as much as I did, really, but I was mostly impressed.
As far as I'm aware, you've never been able to fully disable the dGPU. On my previous 2011 MBP, the dGPU failed. You could twiddle some UEFI variables to prevent switching to it so the system would boot, but it didn't actually cut power to it so it just ran hot and the battery died in 3 hours. Also, all the DisplayPorts were attached to the dGPU, so you couldn't plug up external monitors anymore if you did that. Units with the dGPU have worse battery life when there's a monitor plugged up, because it forces a switch to the dGPU.
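For reference, the widely circulated workaround for those 2011 models was to set the `gpu-power-prefs` EFI variable so the firmware prefers the integrated GPU at boot. This is from memory of the community fix (GUID and value as commonly posted, so treat it as an unverified sketch), and as noted above it prevents switching but does not actually cut power to the failed dGPU:

```shell
# Force the iGPU at boot by setting the gpu-power-prefs EFI variable.
# GUID and value are as widely posted in the community workaround;
# this does NOT power down the dead dGPU, it only stops switching to it.
sudo nvram fa4ce28d-b62f-4c99-9cc3-6815686e30f9:gpu-power-prefs=%01%00%00%00
sudo reboot
```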
I have tried three, and none of them have worked well with two screens (one of them being 2K).
I read this every time it comes up on HN yet every single one of the five photographers in our shared office laments nothing more than the missing card reader.
If it was so slow, why not just upgrade it to a USB3/PCIe-connected reader? And, being a 'pro' device, why didn't they do that already when USB3 became available?
Some of the top end cameras use both CF and SD, so sure, it doesn't cater for everyone, but SD has become so ubiquitous it seems odd it's missing.
(I'm actually not sure what the reason for this was; iMac SD card readers were PCIe).
EDIT: According to Apple, all the laptops are USB2: https://support.apple.com/en-us/HT204384
Desktops are PCIe.
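To put the USB2 bottleneck in perspective, some back-of-the-envelope arithmetic. The throughput figures here are my own ballpark assumptions (real-world numbers vary by reader and card): USB 2.0 tops out around 35 MB/s in practice, while a UHS-II card in a USB3/PCIe reader can sustain a few hundred MB/s.

```python
# Rough ingest time for a full 64 GB card at assumed sustained rates.
CARD_MB = 64 * 1000  # 64 GB card, in MB

def ingest_minutes(rate_mb_s):
    """Minutes to copy the whole card at a given sustained MB/s."""
    return CARD_MB / rate_mb_s / 60

print(f"USB 2.0 reader (~35 MB/s):  ~{ingest_minutes(35):.0f} min")
print(f"UHS-II reader (~250 MB/s):  ~{ingest_minutes(250):.1f} min")
```

Roughly half an hour versus a few minutes per card, which is why pros reached for external readers.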
The first is time. If they were ready to release a simple, expandable, better-everything box a few months from now, every single person who’s been chomping at the bit for a new pro Mac would buy it immediately. That would fix their market, gain them goodwill and give them plenty of time to explore what all these amazing professionals actually want to do, later.
Second, this just doesn’t seem like the hardest list of requirements to guess. Heck, there are practically no constraints, not even price! It doesn’t have to be thin, it can guzzle power, they don’t have to compromise on any ports (put 4 of everything you can think of in, old and new). Theoretically they could build a slightly better box with some analysis but literally no one is asking for that gargantuan task to be done yet, at least not before 2022.
A maxed out Macbook Pro has 16GB of non-upgradeable, non-ECC RAM and a GPU with 4GB of vRAM and a Passmark score of 3,553.
Lenovo will sell me a machine with 64GB of socketed ECC RAM and a Quadro P5000 with 16GB of vRAM and a Passmark score of 10,281.
The Macbook Pro doesn't have built-in LTE. It has a non-removable battery. It throttles badly under sustained loads. It has chronic issues with display calibration. If you want a useful selection of ports, you'll need a bag full of fragile and easy to lose dongles. They've only just rolled out eGPU support and there's still no official support for Nvidia GPUs. There's no 17" option. It has that damnable butterfly keyboard.
Yes, the Macbook Pro is very thin and lightweight, but Apple have made a lot of performance and usability tradeoffs to get there. That doesn't sound very "pro" to me.
Apple doesn’t make a mobile workstation. They make a Pro version of the MacBook.
Take a MacBook, give it some more power, and you mostly have a MacBook Pro.
If the complaint is why doesn’t Apple make a high-end mobile workstation I'm guessing because the market isn’t there for it? At least not at the scale for Apple to touch it?
A lot of professionals end up being serviced just fine by a “Pro MacBook”. I don’t get this obsession with acting like Apple is doing something wrong by making the MacBook Pro a pro version of the MacBook instead of something like a Dell Precision, they’re two very different things with two very different markets.
It had gigabit ethernet, FW800, three USB ports, 1 thunderbolt port, ExpressCard/34 slot, audio in, audio out.
With the exact same form factor and updated ports, Apple could offer a hell of a Pro Macbook Pro. Just give it a 4k screen and an option for a second SSD in the space formerly used by a DVD drive.
If Apple honestly say "we couldn't care less about professional users, we're making money hand over fist anyway", then at least we all know where we stand. Apple haven't taken that approach - they keep making grand overtures about how much they care about professional users. They keep making grand promises about how the perfect professional machine is just around the corner, how they're not updating the current machine because the next big thing is so incredible, amazing, awesome.
There are a lot of very loyal Mac users who have kept Apple in business through thick and thin, but their patience has been tested to breaking point. Those users have been a key part of Apple's marketing; Apple make computers for dreamers, visionaries, artists, the cultural elite.
I don't know anyone who is genuinely happy about being a Mac user, but I know a lot of people who grimly tolerate giving Apple their money. They know what to expect from next year's Macs - they'll be thinner, lighter and subtly worse in a lot of important ways.
What happens to the Mac brand if those users leave? What happens if the resentment felt towards Apple by photographers, video editors, 3D artists and recording engineers turns into a full-scale revolt? What happens if owning a Mac marks you out not as a sophisticated member of the cultural elite, but a nerd who wants a slick *nix experience or a sucker who paid $2000 for a Facebook machine?
Apple are playing a dangerous game. Apple don't need to care at the moment because the iPhone is so gratuitously profitable, but they should know better than anyone that the market is incredibly fickle. They're burning decades of goodwill for no obvious reason.
It wouldn't be particularly expensive to make the majority of pro users happy. Bring back the old cheesegrater Mac Pro and stuff it with the latest commodity components. Bring back a previous-generation MBP chassis and fill the extra space with an i9 or a Xeon E3, four SODIMM slots and a GTX 1070 Max Q or a Quadro P5000.
They don't have to promote these machines, they don't have to display them in Apple stores if they're ashamed of making a functional computer, they just need to make them and promise to keep making them to secure the continued loyalty of their most loyal customers. Apple won't do it, for reasons known only to them. In the long term, that could prove to be a very expensive oversight.
My monitor does video, charging, and USB 3 with full size ports with one cable.
And the keyboard is subjective; I got used to it pretty easily. I actually feel like it's more tactile than the old one.
That is not the most ringing endorsement for a very expensive and supposedly premium laptop.
My Lenovo X1 Yoga also does everything via one Thunderbolt cable, but it also has a full selection of normal ports on the notebook itself rather than only on the docking station. It also has a 1440p touch screen and a nice keyboard, and still costs 1000 € less than a Macbook Pro ^W Deluxe.
I’d rather pay the difference in AppleCare. Most developers on this site are making large sums of money with their machines; $1000 or even $2000 is hardly going to make a measurable difference on the ROI of my laptop.
If the $2k isn't that significant, and minimal downtime is so important; may as well keep a slightly older refurb model on-site as a hot-spare ready to go just in case there's a problem... A policy that would work, whatever your OS / brand preference.
The endorsement was tempered by the fact that around these parts endorsing the new MBP is apparently treason. I’m getting my comments downvoted for implying the MacBook Pro is a fine machine.
Needed keyboard repair twice in that time, and was horrible to type on. Useless touchbar, and an over-large trackpad that was constantly activating when typing. The last was odd as I regularly touch the old size trackpad with thumbs when typing but never had an issue. Add in the occasional freeze and crash and I feel it was a complete lemon.
I'm very happy with my 2015 and get longer battery times than I did with the newer model. I have no idea what I'll do when it needs replacing.
IMO the best all-around monitor being produced today. 38” 1600p ultrawide (equivalent density to a 34” 1440p ultrawide), 75 Hz, which is high enough for gaming without going into gamer-aesthetic/retail-markup territory, and FreeSync.
I can’t recommend it enough tbh
The keyboard is subjective.
The feel of the keyboard is indeed subjective (I happen to like the feel of the keyboard on the new MacBook Pro), but in this case Apple genuinely built a buggy keyboard, so much so that people are writing songs about it .
Largest reason for return: G key. 30% of the time = two Gs. 50% = 1 G. Remaining 20% = 0 Gs/no keydown whatsoever. Hah.
Yeah, I think if the keyboard didn't break from a spec of dust it would be fine.
With my work my swap usage is 10-20GB at a time, between my IDE, app, tests, and browser tabs. 16GB is utterly painful but it’s the most you can buy.
It's also more than two times slower on my 2013 Dell Precision Xeon workstation.
But that comparison is not really fair, a MacBook Pro has to use a mobile CPU or it won't be able to dissipate the heat. Also, depending on ambient temperature, possible airflow, the MacBook's CPU probably throttles fairly early to prevent overheating.
A desktop CPU can handle larger power envelopes and typically throttles less, and most virtualization has gotten so good that for CPU-bound workloads the difference between running on bare hardware and on a VM is barely noticeable.
...to the point where I'd work more efficiently running remote desktop on a $200 Chromebook.
Maybe it depends on the field I am working in, but I can't really run stuff on any laptop or desktop unless it is a > 5000 Euro workstation with plenty of RAM, storage, and a beefy GPU. The value that the Mac platform provides for me are the many applications that Chromebooks/Linux don't have good substitutes for, such as OmniGraffle, Deckset, Things, Microsoft Office, Little Snitch, Pixelmator, Acorn, Lightroom, and Affinity Designer. So, it's not just an expensive terminal ;).
But the memory is noticeable. For general performance, but also for running lots of vms and stuff in different configurations.
Still situations where its convenient to spin stuff up locally rather than cloud.