It's interesting that development of the iMac Pro was already underway when Apple and pro users (mostly developers) had that big showdown after the touch bar came out. The touch bar seemed like the final slap in the face to the pro crowd, but this was already in the works then.
Every pro review I've seen so far has been that it's excellent - not because they're fawning over this or that, but just that it's a solid option with a great screen at what's actually a reasonable price for the components inside. If you think the price is unreasonable, you're probably not the sales target for this machine - the ones who really need this, like in the case of this reviewer, are buying three at a time - just because.
It's interesting because upgradability doesn't seem like a very big concern to anyone buying this - they're happy with what they're getting at the price they're paying. It almost seems like Apple needn't have promised everyone a new modular Mac Pro. If they'd just announced / launched this machine alongside the pointless MacBook Pro update, everyone would have been happy, with no accusations of taking the name of the Pro in vain.
I'm not so sure the reviews would have been _quite_ so positive if they hadn't also announced they're releasing a 'modular' traditional Mac Pro.
I think the knowledge that a new 'modular' Mac Pro is coming gives a different context to this machine. I think people are framing it around the idea that if you want a quiet, good looking machine with a great screen and the other 'benefits' of an all in one (and don't mind the trade offs), then it's good to have this option. If the trade offs don't work for you, then wait for the more traditional Mac Pro.
If it had been released without any mention of another option coming for pro users, then it would have been framed as "if you need workstation grade hardware running mac os, then your only choice is now an expensive, non-upgradeable all-in-one which has traded off performance for quietness and for other thermal constraints".
I'm definitely in the category you describe: owing to ancient personal prejudices, I consider the idea of having one's computer integrated with the screen to be an utter abomination. I have trouble suffering the trend towards laptops and notebooks precisely for this reason, let alone the rise of the integrated all-in-one.
Knowledge of the coming Mac Pro allows me to view this device with levity because I can view it as a herald of things to come in a more satisfying format; had I not known of the ‘imminent’ (end of 2018 or beginning of 2019) Mac Pro, I would have been utterly aghast at the idea of Apple pushing me towards the integrated solution.
I tend to cram my computers with cards (not graphics cards, but things like SDRs and FPGAs) but what really does it for me is having the monitor built-in: I'm used to multiple monitor desktops, and I like both screens to be absolutely identical make and model. That clearly isn't possible when you have an all-in-one computer with no available ‘identical’ screen.
Also, my computer belongs beneath my desk, and my monitors on top of it. It's just the preordained way. All else is heresy. ;)
Also, computers and monitors have vastly different life cycles; the former become obsolete much faster than the latter. Binding them together physically forces you to replace one or the other either too soon or too late for absolutely no benefit.
I agree. I was the very happy owner of an Apple Cinema Display that served me well for many, many years, and was still in perfect working order when I had to abandon it in favor of an ASUS display because I needed more resolution and Apple wasn't selling standalone displays any more :-(
I want to be an Apple fan. I really do. But they are making it impossible for me nowadays.
Apple has mentioned that the next Mac Pro will also have a matching display.
I'm fine with a typical 4k display (or better). The panels of the better ones seem to come from LG anyway and are widely available on the market. I wonder if Apple would be selling a good quality 5k monitor with TB3, or if they go for a high-end 8k screen. Personally I'm not in the market for the highest end. LG has announced a bunch of TB3 screens, where the 4k 32" version would be sufficient for me. But LG lately has problems bringing their better screens to market in sufficient quantities. Last year's 32" 4k screen is still not widely available here in Germany.
I actually think that the 27" 5K display in the iMac Pro is going to age quicker than the internal components. I suspect that ~32" 8K displays will be mainstream in a year or two.
Given your use case, is a Mac, any Mac, running OSX even an option? I also use FPGA cards and whatnot, but I do so in a separate Linux box because all the tools I need for the setup only have native Linux versions. And if all you need is a screen running OSX which remotes into other things, isn't that what the iMac is?
I wish they would have left the Xeon and Pro GPU components to the new Mac Pro instead of putting them in this. It would’ve been nice just to have top of the line consumer grade hardware available in a Mac instead.
That and the amount of time they’re taking on this modular one is just nuts. All people want is a simple tower. They could just bring the damn cheese grater back.
I only have one teeny tiny nit and I'm only mentioning it because I think it's a common misconception:
if you want a **quiet,** good looking machine with a
great screen and the other 'benefits' of an all
in one (and don't mind the trade offs)
I don't think "quiet" needs to be the domain of all-in-ones; with a $60 CPU cooler from NewEgg one can have a ~4ghz i5/i7 PC that's nearly silent. The old cheese grater Mac Pros were noisy, but that really doesn't need to be the case, especially if Apple would just release a damn desktop i5/i7 based Mac Pro.
Which might have been okay, because iirc the complaints weren't really 'we want upgradable pro machines' - they were more 'just build a pro machine'. Other than upgradability of core components, I'm not sure what the Mac Pro can possibly bring to the table now. And it seems like there aren't that many folks who care about upgradability alone.
because iirc the complaints weren't really 'we want upgradable pro machines'
That is totally untrue. The places I saw complaints were all about upgradability. That's why many people still get the previous generation of Mac Pros and upgrade the processors, graphics cards and add SSDs. My main machine at work is one of these, and it feels just as fast as my Mac Pro 6,1 (the trashcan version) at home.
> Apple and pro users (mostly developers) had that big showdown after the touch bar came out. Every pro review I've seen so far has been that it's excellent
Almost every Touch Bar MBP review has also been excellent. Sales of the Touch Bar MBPs surpassed the sales of almost all other competitors combined, in less than 1 week [0] (annnnd here comes the expected downplaying of why it sold so much, offering every convoluted reason but the possibility that it is actually, in fact, a good product.)
"Pro" users and Apple did not have a "big showdown." It's just an obnoxiously loud minority, with their puerile 4chan'ish dismissals like "Emoji keyboard!!1", who had never even really used a Touch Bar MBP, that were clogging up a few forums. People who barely exhaust even 16 GB in their daily work fake-whining about the inability to upgrade to 32 GB, even though almost no viable competing laptop supported that either at the time.
It would be fair to mention that any consumers (who far outnumber professionals) who wanted a MacBook but don't want a silly 12" screen would need to get a MacBook Pro.
And just to mention: the report does not split between Touch Bar and non-Touch Bar models. And we already know most premium-priced notebook sales go to Apple. That means Apple would have sold just as much if they hadn't had the Touch Bar.
Every other laptop manufacturer on earth had the opportunity to put together a computer with a not-the-weakest CPU and more than just USB ports, without a touchbar.
I bought a Touch Bar MBP because it was finally time to retire my 2011 MBP and I think it's the best computer I've ever owned, period. I do video editing and media production (podcasts, company videos, etc.) and I feel like I put this thing through its paces regularly. On top of that, although I mainly have it hooked up to a separate display, keyboard, and mouse, the Touch Bar is amazing when I'm editing on the go. I kinda wish that my regular keyboard had it because I can scrub through all my media without having to jump back and forth between keyboard and mouse. I also installed Better Touch Tool and I can customize the Touch Bar to do pretty much anything I want, which has made it infinitely more usable. I also do app development and web development for our shop, so having the terminal and being able to run node locally is wonderful.
I think the people that complained about it are people that either 1) didn't actually use one at all, 2) are 100% resistant to anything changing in their workflow, even if it might improve it in other ways (I'm looking at you, ESC key nihilists in VIM), or 3) would never buy a Mac anyways because they like to tinker and/or upgrade their PCs regularly.
Frankly, I don't understand when these types of people get butthurt about a product that they will never buy and don't intend to buy just because others might get them. Just don't buy the damn thing and get on with your life. For those that do buy it because it fits their workflow and lifestyle, I think the damn thing is amazing.
It's a shame BTT won the mindshare, and Jitouch devs abandoned it. BTT still doesn't have the ergonomics Jitouch had ~2 or 3 OS versions ago, but it has the userbase.
no viable competing laptop supported that either at the time
But, so what? You shop at Apple if you want better than the competition.
I work with multi-Tera datasets, and my use cases are common enough these days. Anyone doing CAD, bio, finance, video, etc etc etc would easily find uses for 32G or even more.
but just that it's a solid option with a great screen at what's actually a reasonable price for the components inside
PC Gamer did a price comparison that tried to match the components as closely as possible and found that the result is barely cheaper than the iMac Pro.
I am not in this market segment, but if you are doing something like high-end video editing, the iMac Pro is competitively priced and (probably) has great build quality compared to competing systems in this segment.
I hope that they won't forget some other market segments. It would be great to have a new Mac Mini in the 700-1000 Euro range.
If anything, they lowballed it. They used a $400 Broadwell Xeon, while the base iMac Pro has a Skylake Xeon W. The chip is a custom Apple SKU, but the closest is the $1,100 Xeon W-2145.
$650 is steep for a motherboard, but it has dual 10 gig-e ports and supports registered ECC. A SuperMicro motherboard with the Xeon W support will run about $350 (they’re not out yet) and an X550AT2 NIC is about $200 on eBay.
Slightly less: the iMac Pro uses the ~$100 AQC107 whereas the X99 motherboard chosen uses the more expensive X550AT2 because Aquantia only launched their chips this year.
But yeah, the Xeon they chose is quite a bit slower than the 8-core Xeon-W; the Broadwell equivalent was really the $1100 E5-1660 v4.
So -$200 for motherboard, +$700 for CPU, and -$100 for PSU if you're feeling generous. That's still $5k.
* Video Card: Zotac - GeForce GTX 1080 Ti 11GB Founders Edition ($678.99), whereas the actual GPU used in the Mac is $399
* Power Supply: SeaSonic - PRIME Titanium 1000W 80+ Titanium Certified Fully-Modular ATX Power Supply ($242.89), whereas they could have gotten a fine PSU for $99
* Operating System: Microsoft - Windows 7 Professional ($134.99). How about not and say we did.
* Monitor: LG - UltraFine 5K Display 60Hz Monitor ($1299.95), whereas one can probably get a really nice 4K monitor for $400-500. Some can spend the extra funds on another 4K monitor; others, you know, already have monitors.
* CPU Cooler: NZXT - Kraken X62 Liquid CPU Cooler ($158.99). Did I miss the part where the Mac is liquid cooled? Why is this here?
* CPU: Intel - Xeon E5-2620 V4 2.1GHz 8-Core Processor ($408.99). This is an amazingly poor choice: while relatively expensive, its performance is decidedly mediocre, being an older generation.
* Motherboard: Asus - X99-E-10G WS SSI CEB LGA2011-3 Motherboard ($649.00). Expensive not primarily because it's awesome but because it's Xeon.
All in all, a machine could be built that is nearly equivalent for about $3,544. Given the extra $1,500, one could buy a second 27" 4K monitor, upgrade to 16 cores, and buy a second GPU.
Pretending the Mac is comparable in price to a PC remains silly.
I can’t speak to specifics about the PSU, but to get a server-class Intel motherboard (Xeon support, ECC RAM) you really do have to spend quite a bit more than for the equivalent consumer board. Even if you reduce the cost of the motherboard to $200 and the PSU to $150, you’re still not saving a huge amount off $5k, and you’re not getting an all-in-one.
I'm not sure how Apple has people convinced they need ECC. It's practically worthless outside of very specific workloads. A workstation user does not fill several gigabytes of RAM with terse important data, query it millions of times/day, and do so 24/7 all year long.
I've had ECC RAM once; it failed and ECC caught it before it could damage any documents. ECC memory also keeps a log of how many corrections it's made. In a long running machine that gets hot there _are_ bit flips and they _are_ significant. Especially if you've spent days working on a video file and don't have the technical skills to fix it in a hex editor.
I've also had memory issues on non-ECC machines, it's a lot harder to diagnose and much more frustrating. ECC memory should be standard in modern machines IMO.
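For what it's worth, on a Linux box with the EDAC drivers loaded you can read those correction counts straight out of sysfs. A minimal sketch (exact paths vary by kernel and memory-controller driver):

```python
# Minimal sketch: read corrected/uncorrected ECC error counts from the Linux
# EDAC subsystem. Assumes EDAC drivers are loaded; paths vary by system.
from pathlib import Path

for mc in sorted(Path("/sys/devices/system/edac/mc").glob("mc[0-9]*")):
    ce = (mc / "ce_count").read_text().strip()  # corrected (caught) errors
    ue = (mc / "ue_count").read_text().strip()  # uncorrected errors
    print(f"{mc.name}: corrected={ce}, uncorrected={ue}")
```

A steadily climbing corrected count is exactly the early warning you don't get on non-ECC machines.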
Backups and ECC address completely different problems. Backup systems will happily store corrupted files into your backup disks. ZFS-style filesystem level checksums won’t catch many problems either, because they only check to see if what you wrote is the same as what was read back. If the data was corrupted in memory, it’ll happily pass you back corrupted data.
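To make that concrete, here's a toy sketch of the principle (not ZFS's actual code): if a bit flips in the buffer before the filesystem checksums it, the checksum faithfully protects the already-corrupted data.

```python
import hashlib

# Toy illustration: the checksum is computed over whatever is in RAM, so
# corruption that happens *before* checksumming is invisible to it.
data = bytearray(b"days of carefully edited video, sitting in memory")

data[5] ^= 0x04  # simulate a bit flip in RAM before the write

checksum = hashlib.sha256(data).digest()          # filesystem checksums the buffer
stored_block, stored_sum = bytes(data), checksum  # block + checksum hit disk

# On read-back, the block verifies perfectly against its checksum...
assert hashlib.sha256(stored_block).digest() == stored_sum
print("checksum OK; corrupted data handed back to the application")
```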
It only takes one bit flip in the FS cache, which is what your 'free' RAM will be used for, to make your day miserable. ECC (maybe) isn't worth it for consumers of content, but for creators it should be standard. Not to mention rowhammer-like attacks, which ECC makes harder though not impossible.
From what I understand (very roughly), with ZFS or BTRFS at least it takes one flipped bit in the particular block of memory which has just been successfully checksummed and 'approved' by the filesystem as valid for a write operation. Those are quite a few hoops to jump through, so let's not head towards melodrama just yet.
That’s fine but most of us are running on NTFS, HFS+ or ext4 where it matters.
And then there’s the same problem with MLC SSDs, which are just arrays of capacitors encoding state as different charge potentials. As a former EE, I can tell you capacitors leak regardless of implementation.
Also SSDs scare me more. MLC disks have so many states stored in a capacitor that is trusted not to leak over the lifetime of the device. All it takes is a few electrons to go walkies and it’s game over.
I ran a lab notebook in Word for a number of years and the document got fried by a filesystem error. The backup was toasted as well. This was corruption which broke the state machine or parser in Word, which acted as a damage amplifier.
As for the source of the error, we can only speculate but the possibility of a flipped bit is higher than you think.
Try running memtest86 on a PC without ECC for a week. You will be surprised at the results. I’ve seen perhaps 3 or 4 failures of this in new hardware. DRAM is not perfect.
Yeah might as well get cheaper RAM, a slower SSD, and a Pentium, because everyone knows quality components are a ripoff, and the people at PC Gamer are just Apple fanboys and don't know what they're talking about.
That's a horrible system for $3,700. You can easily shave $1k off if you're smart about shopping for the parts. Also, assuming you're not going for OSX (apples and oranges, I know), a Threadripper would be a much better choice for a CPU.
A bit silly, I reckon, as you can get similar single-threaded performance for much cheaper. If you need 18 cores, maybe, but $5k? Then again, Apple have always figured out how to upsell computers perfectly.
If you buy a workstation from HP/Supermicro, then you get something much more powerful for the same price. So to say they are very close is not quite correct. The problem here is that the assembled PC is a consumer PC assembled for workstation performance. Even then, they were not trying very hard.
If you buy a workstation from HP/Supermicro, then you get something much more powerful for the same price.
Not really. Go to HP's site and put together something like a Z640 with as similar a spec as possible and you're close to $4k before even adding a screen.
Point. Somehow, I have seen quotes from HP which were always much cheaper. Maybe it is because my employer had some sort of deal? Also, ProLiant servers were somehow cheaper than towers. Again, I am not sure how, but yes, point taken.
What HP (and others) does give you is the opportunity to buy something almost as good (and quite possibly more than good enough) for less money. They also let you buy something for the same price that is a lot better along the one axis you care most about (but worse along the axes you don't care so much about). But if you want to blindly match spec sheets you're going to pay about the same.
If you think the price is unreasonable, you're probably not the sales target for this machine - the ones who really need this, like in the case of this reviewer, are buying three at a time - just because.
I think it's pretty simple: people who want or need Macs and who have a workload that's highly parallelizable will like this machine. Or people who need ultra-fast I/O. That group is basically video editors, (some) programmers who do a lot of local compilation for whatever reason, and a couple of others.
It's a pretty small-ish group but also probably a noisy, discerning one online.
For much of that group, the computer is a tax write-off anyway—something individuals buying for themselves often forget.
Many people forget that, in smaller businesses, the older machines are also quietly wiped and then sold for cash on Craigslist.
> the computer is a tax write-off anyway—something individuals buying for themselves often forget.
While this is true, the people who are unaware of what tax-deductible expenses are often just assume that it is "free money". It is not. You still have to earn whatever you spend on the machine, and it is up to you to consider what is and isn't cost-effective for your specific situation.
And not only that; in many jurisdictions, there are limits on what can be considered OPEX. If you go above that, your spending has to be accounted for as CAPEX and you have to depreciate the item. Depreciating a computer over several years, instead of expensing it in the current year, can quickly decrease its appeal.
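A toy example with made-up numbers (30% tax rate, 5% discount rate, five-year straight-line schedule) shows why the deferral stings:

```python
# Hypothetical numbers only: present value of expensing a machine immediately
# vs. depreciating it straight-line over several years.
price, tax_rate, discount_rate, years = 5000.0, 0.30, 0.05, 5

expense_now = price * tax_rate  # full deduction this year: $1500

annual_saving = (price / years) * tax_rate  # $300 of tax saved per year
depreciated_pv = sum(annual_saving / (1 + discount_rate) ** t
                     for t in range(1, years + 1))

print(f"expense now: ${expense_now:.0f}, depreciate (PV): ${depreciated_pv:.0f}")
# -> expense now: $1500, depreciate (PV): ~$1299
```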
In the US right now, small businesses can depreciate capex fully in the year of purchase for up to $500k, about to become $1 million. So for many small businesses there's strong appeal.
Upgradability and serviceability is a big concern for a very niche market. If I ran a smaller shop, the iMac Pro would be a clean solution to a complex problem. But once you hit edit-bays at scale the technical problems exponentially grow. You also can't purchase every machine maxed out when you have to fine-tune a budget. We've often purchased almost all of our Macs without RAM to shave costs.
I'm personally holding out for the new Mac Pro specs to become public. A modular system is ideal in a large post-production facility. I have enough external peripherals and connections for 10GbE, fiber and SDI that attaching them all to an iMac is as much of a headache as anything. Especially when you're doing live productions and you have to swap out parts frequently.
Price is the least of my concerns though. If I can shave a lot off my budgets by upgrading RAM and GPU over time vs replacing entire machines, that's a big bonus. But I have more concerns with overheating in the iMac Pro than anything.
And the touch bar is useless to most of my editors and graphics team. If we could opt for a model without it, we would, every time...
They might care about having to round-trip them to a repair facility (even if that's the back of an Apple Store), though, and about not being able to take the storage out (without complete disassembly) before handing them over.
Quick comparison, HP: 3 year Next Business Day onsite Warranty - they come to you, they make assertions about how quickly they'll be there. Can be extended to five years (for a Z840 for less than £200 or ~5% of the machine's purchase price)
Dell Precision: same. To add another 2 years to a Precision 7920 costs a bit over £200 or ~7%.
Apple: one year. £159 (~3% of base price) to extend to three years, and no option for more. You have to take it to an Apple Store (good luck with that), or Apple can arrange to have it shipped by courier. Hope whatever courier they use doesn't drop it along the way, and there's no sign of a guarantee as to when they'll get it fixed.
I expect the target market for the iMac Pro overlaps strongly with Apple's other machines (i.e. the iMac Pro's replacing an existing Apple computer) so this won't be a surprise but it's disappointing that Apple won't stand more behind their machines.
FWIW Apple does onsite repair service as a part of standard AppleCare, or you can get Enterprise/Premium for next day onsite (there's also mention of a 4 hour option, but I'm not sure what plan that's a part of).
I had a 2011(?) iMac at my last job that was having GPU issues common to all of those models. The machine had left warranty a couple years prior, but Apple knew of this issue and paid my local Apple dealer to replace the GPU.
For wide-spread issues at least, they do stand behind their machines.
Frequently because they're forced to, by the threat of litigation and/or a smackdown from consumer-protection organisations. There's too often a window where Apple won't acknowledge there's an issue but your warranty's expired so eh...maybe they'll make an exception for you? Or maybe not.
I'm looking for something a bit more systematic. On an iMac Pro that starts at a shade under £5000, it wouldn't hurt them to make the default warranty 3 years and offer an extension to 5 (I mean, who's throwing away these machines after 3 years, never mind one?).
Exactly - I think by the time they decide to upgrade workstations every component in the stack has advanced enough to warrant a completely new machine.
That, and there are typically many other good reasons to just replace a machine:
- A new machine is in warranty.
- You don't need to keep track of what machines you have, how many memory slots they have, what DDR version and speed they require, etc. You don't need personnel to do the replacements.
- In large organizations, hardware is only a small cost compared to salary, etc. So, it is typically not a problem to replace machines every 2-3 years, if necessary.
Upgradability is more interesting to individuals and small businesses.
Warranties are for individuals. Businesses will use them if they exist, but when a machine fails you replace it with a spare off the shelf. All a warranty does is save you the cost of buying another spare, which amortized over a fleet is nowhere near the cost of buying all new equipment.
> You don't need to keep track of what machines you have, how many memory slots they have, what DDR version and speed they require, etc. You don't need personnel to do the replacements.
There is inventory software to automate this. And upgrading a machine requires much less work than replacing it because an upgrade doesn't normally require transferring programs, settings and user data.
> In large organizations, hardware is only a small cost compared to salary, etc. So, it is typically not a problem to replace machines every 2-3 years, if necessary.
That other costs are larger is no excuse to waste money. An upgrade will often make a 2-3 year old machine equivalent to this year's machine, which is typically less expensive than replacing the entire machine. Especially since you don't have to replace any component that isn't actually a bottleneck in your applications.
But the main value of upgradability to large companies is as a hedge against unexpected requirements. Some vendor's new software requires twice the resources as the old version two months after you've bought equipment using the old spec. Either you can upgrade the machines or you have to replace them. Even if you don't always have to do it, you always want to be able to.
I believe Nvidia drivers are already in beta; this or any other eGPU enclosure should be able to remove that sticking point.
There is a slight performance penalty because of the TB3 bus conversion, but it's only applicable to real-time work like gaming - pro work should be able to use Nvidia cards just fine.
> It's interesting because upgradability doesn't seem like a very big concern to anyone buying this
From my discussions with an iMac hardware engineer, he explained that the component set these machines are made with is at least 7-10 years away from being replaced, because it takes that long for the software to catch up. Whereas, traditionally, we're used to machines that last 3-5 years, but that you can easily upgrade in the future.
If true, it appears to be a major paradigm shift, which rightfully the market is having a hard time trying to swallow.
I bought a max-specced MacBook Pro and it's borderline unusable for me due to the touchbar. Apparently I rest my hands normally in the area where the touchbar is, and it's very hard to undo the habit. Since the touchbar doesn't need touch force, I often randomly activate one of the buttons. I really wish they offered it without the touchbar as an option.
> Secondly, you get a wireless mouse and extended keyboard. Both have to be plugged in to charge. In the case of the mouse, the cable plugs in at the bottom, rendering it useless during charging. Truly a bad design.
The Magic Mouse 2 is one of the worst mice I've ever used, but the charging thing is a non-issue. The battery lasts like a month on one charge. Mine has never gone below 50%.
The Magic Mouse 2 is one of the worst mice I've ever used, but the charging thing is a non-issue. The battery lasts like a month on one charge. Mine has never gone below 50%.
This would be terrible for me, my battery always dies in the middle of the day, because I do not want to keep track of charging. Luckily, the Magic Trackpad 2 does not have this shortcoming and can be used with a lightning cable attached.
(I absolutely love the Magic Trackpad 2. Not just because it is large, but some applications use haptic feedback. E.g. OmniGraffle uses a subtle vibration so that you feel when two objects are aligned.)
You don't have to keep track of it. The OS will tell you when you have 10% left which is about 2 days of use and it quick charges in about 20 minutes to give you a month's worth of use. It's a non-issue.
FWIW, the reason it's on the bottom (according to one of the engineers) is because they didn't want people to leave it plugged in all the time while using it as that severely degrades the lifespan of the internal battery. Forcing people to plug it in while not in use apparently extended the lifespan by something crazy like 20x.
>The Magic Mouse 2 is one of the worst mice I've ever used, but the charging thing is a non-issue. The battery lasts like a month on one charge. Mine has never gone below 50%.
Worst mice? It's one of the better mice I've used, if not the best; the touch interaction is fantastic and the accuracy spot on.
I'd only change the stupid location of the charger port.
My theory on the charging awkwardness was that Apple really wants people to use trackpads anyway.
But, since macOS is quite good about warning you when the battery is low with plenty of time to spare, I don't see the Sturm und Drang, except as usual "Gotta complain about Apple's unorthodox decisions" wankery.
With the prior version, I started out just changing out the normal AA batteries when they died every several months. That felt wasteful, though, so I got rechargeable AA batteries, but swapping them out every month or so was tedious.
Now I just plug it in every few weeks when I leave the office for the weekend. I definitely prefer it.
Handy to know. I didn't know if there was a subtle difference between the Magic Mouse 1 and 2 other than the charging port (which honestly is in a ridiculous position). I started using the original in 2010 and haven't looked back.
I honestly think it was a conscious decision. The problem with most wireless mice I see in our office is they end up being left constantly plugged in and are essentially wired mice. By placing the charge port on the bottom they are actively prohibiting you from using it as a wired mouse.
Others have mentioned other aspects, so I'll just pitch in with my own specific experience:
Have a Magic Mouse 1 at home. Used Magic Mouse 2 [MM2] as a daily driver for a freelance gig for a couple of months and I was very happy with it. MM2 feels more balanced/lighter and the feet are angled better and possibly made of a different material that slides with less friction. Overall I prefer the 2 and I never had any issues with the charge port placement. Low on battery? Plug in, grab a coffee or lunch, come back: Done.
It's the same, except it has a built-in battery and is rechargeable with a lightning cable (the port is located on the bottom of the device, so you can't use it whilst charging).
I don't understand battery charge cycles and battery degradation much, or really at all, but could the port on the bottom maybe be a subtle way to encourage people not to charge so frequently and weaken the battery prematurely?
There's no need to do that with properly designed circuitry, and since discharging fully is damaging, encouraging this behavior is probably harmful to battery longevity. Battery University has so much information on this, but this part is a good summary chart (see the lithium-ion column) [1]
People wondering about the unit construction, lack of upgradability, etc. may be interested to know that the "sealed box appliance" mentality has been in Apple's DNA since the design of the _first_ Macintosh:
"Apple's other co-founder, Steve Jobs, didn't agree with Jef about many things, but they both felt the same way about hardware expandability: it was a bug instead of a feature. Steve was reportedly against having slots in the Apple II back in the days of yore, and felt even stronger about slots for the Mac. He decreed that the Macintosh would remain perpetually bereft of slots, enclosed in a tightly sealed case, with only the limited expandability of the two serial ports."
In defending their choice of Macs, a decision that will likely come under fire no matter what, the author didn't even mention lower maintenance costs, which can mean more profit for the business if a machine would otherwise have been down. I suppose "solid and robust" would be the trait that leads to higher reliability. My boss personally hates Macs, but he recently switched the entire office to Macs because our IT service company's monthly maintenance fee is half price for Macs.
Are these maintenance costs for traditional corporate use, where your user base has a lot of "where's the any key" users and you may have hundreds of hosts to maintain?
I suspect maintenance costs in a high-end company with a handful of high-end Macs vs high-end workstations are quite different. Note that the production company seems to be using the Macs in a peer-to-peer setup,
and thus doesn't have all the overhead of domain controllers, Exchange servers, etc. that a traditional PC deployment has.
I have looked at the DIY tutorials for the later Macs - the ones that are made hard for end users to maintain - and they make installing custom cooling loops with hard tubing look simple.
There's no comparison to desktop PCs which, as you point out, are as simple as lego to build/fix and have part availability that puts lego itself to shame. I was really thinking more about laptops, where the need to peel/replace some glue is nothing next to the availability of parts.
Most enterprise laptops are vastly easier to fix than Apple hardware. The manufacturers of these machines have thought carefully about repairability and designed it in from the start.
For example, the Lenovo X270 opens with eight Phillips screws. Once you're inside, you'll find an array of standard parts in standard sockets. Lenovo provide a detailed service manual and will sell you anything from a replacement fan to a new LCD from stock. The logic board isn't caked in BGA underfill and has a sensible level of density, so component-level repairs are vastly easier and less time-consuming.
It's good to hear that Lenovo treats you better than Dell treated me. Is this their policy in general or do you have to buy a megabuck enterprise service contract (i.e. the one bonestamp2 identified as not competitive)?
Completely standard across the Think* line. Service manuals and diagnostic software are available to download directly - you don't even have to log in. You can buy all parts directly from Lenovo or from a distributor without a service contract and regardless of warranty status. Opening the chassis doesn't automatically void your warranty, and the terms specifically state which parts you're allowed to replace yourself.
The standard US warranty on ThinkPads is one year RTB, but you can upgrade to three years RTB for $99 or three years onsite for $159. International warranty coverage is an extra $10. The standard warranty in Europe is three years onsite. For minor faults, you can usually ask for replacement parts to be shipped for self-installation.
That's all standard on all the real ThinkPads (T and X series) since the IBM days. Most if not all of these models have 3 year warranty, and many with next business day on-site. Probably applies to the mobile workstation series too (used to be the 'W' series, now 'P' I think).
They also have spill resistant keyboards as standard (drainage channels to prevent catastrophe, rather than making colour-changing stickers so Apple can bill you for water damage should an accident happen).
I'm happy to see something with professional specs coming from Apple. I have serious concerns with the iMac Pro in terms of heat while rendering for many days of the year.
The Mac Pro trash can is notorious for overheating, so I'm not sad to see it go and I look forward to a new Mac Pro in 2018. The market for that unit will pay a premium for serviceability as well. I'm expecting a heavy price tag.
It's nice to see a real world example of the iMac Pro in use. I do a lot of sustained renders for TV and feature length content. Our PCs run laps around our trash cans, but a mixed environment is far from ideal. We spend more time troubleshooting with editors than anything.
I have more concerns about macOS High Sierra than anything at this point though. I've upgraded a few machines and we're seeing more crashes than ever. Sierra might have to be my staple for a long while.
> At the distance that the editors sit from a 27” display, there is simply little or no difference between the look of the 27” LED display and the 27” iMac Retina screens.
The author seems to be well into middle age, and wears glasses. His visual acuity might make his subjective experience irrelevant on that specific point.
I can tell the difference between retina and non retina 27” on a desk (but I have 20/10 vision, which makes my subjective experience irrelevant for most people too)
Even if you never upgrade the machine during its lifetime, you still have to pay Apple’s notoriously high ram and hard drive upgrade costs.
Example: when I bought a new iMac last year, Apple was charging $1400 for 64GB of RAM. That’s insane, so I went with the base configuration and purchased my own. Took five minutes to install and saved around $800, if I recall correctly.
Sure, if they figure that upgrading themselves would have a higher total cost (lost productivity, internal costs, ...) than what they would otherwise save. But blindly spending more than you have to because your fixed costs are high is just irresponsible, no matter how big the organization is.
I read in one review that, in addition to the non-upgradeability of the new iMac Pro, there is one real coup de grace: the beautiful integrated screen is not capable of being used as an external display for any future computer.
If true, doesn't this strike you all as truly unnecessarily egregious? Why should an iMac Pro stuffed full of chemicals to show beautiful 5K images, never be capable of being used as just a display? Is there a good reason Apple customers have to buy brand new displays?
This is worth asking because, unlike the components in the new machine, the display will probably be state of the art for a while.
I think you're right. But just as insane in that situation. Think of the work necessary to prevent that screen from ever being used to display a nasty video signal from –– gasp! –– another computer. It's just such an insult to customers. Am I crazy here? And isn't Apple supposed to be a company of environmentalists?
Think of the work necessary to prevent that screen from ever being used to display a nasty video signal from –– gasp! –– another computer. It's just such an insult to customers.
They just didn't bother adding the circuitry for "video in", due to cost; there is no active opposition here.
For many years now, you've been able to buy "LVDS converter" boards[1] which will let you use any display panel as a monitor. I'm sure demand will mean such boards appear for the iMac Pro displays soon, if the existing ones aren't already compatible. I haven't looked too closely at this but the display might even be eDP or similar, which means only a physical converter is necessary if you want to connect a DP output to it. That's been done with iPad displays before[2]
I understood this to be a technical limitation of Thunderbolt 2: it can't push enough data fast enough to drive the 5k displays. One would have hoped that Thunderbolt 3 was sufficient.
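The back-of-the-envelope bandwidth math supports that:

```python
# Raw pixel bandwidth for 5K at 60 Hz, before blanking intervals and
# protocol overhead (which only make the number bigger).
width, height, refresh_hz, bits_per_pixel = 5120, 2880, 60, 24

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"5K@60Hz needs ~{raw_gbps:.1f} Gb/s raw")  # ~21.2 Gb/s

# Thunderbolt 2 tops out at 20 Gb/s, so 5K can't fit even in theory.
# Thunderbolt 3's 40 Gb/s leaves room to spare.
```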
This is the key thing that's keeping me on my 2012 (!) iMac. I was really looking forward to the Pro until I heard this.
Wouldn’t it be more cost effective to sell your iMac Pro, buy a monitor and pocket the difference? Surely a computer + monitor is more valuable than just a used monitor.
I recall when the 27" iMac was first released, it was effectively the same price as a standalone monitor, although that calculation was probably somewhat influenced by the fact that only 1 or 2 such displays existed in the consumer space at the time.
MacBooks don't support 5K resolutions, so it would be a bit wasteful to buy a 5K monitor for one. Are you sure you're not confusing it with the MacBook Pro?
I want to see an apple designed chip running full out (no real thermal restrictions like in iPads/iPhone/Apple TV). Those chips already compare pretty favorably to Intel, and running at 3+GHz would likely have great single threaded performance.
I actually believe we're going to see macOS for ARM and a Mac mini running an ARM chip; it makes too much sense for Apple not to do it.
I know what you mean, but I'm not sure that's the way forward.
My understanding is that when you start really cranking up CPU speed -- let's say, with an imaginary 3.5GHz version of whatever their latest iPad Pro CPU is -- now you're just spending all your time waiting for RAM. So you need fancier and fancier speculative execution hardware in order for the CPU to have something, anything to do. You get into that game, and you wind up with as many transistors on a chip as Intel and AMD, hitting the same performance walls as Intel and AMD have been hitting for a long time.
What Apple could do is just go super wide, of course. Instead of bumping up the clock speed, they could cram 16 of those iPad Pro cores into a MacBook or whatever. Which would be fun, but for most workloads, those cores will just be idle most of the time.
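A crude way to see the "waiting for RAM" effect on any machine (just a sketch, nothing about Apple's chips in particular): a dependent pointer chase over a working set bigger than the caches is dominated by memory latency, regardless of clock speed.

```python
import random
import time

# Crude pointer-chasing microbenchmark: each load depends on the previous
# result, so the CPU mostly waits on memory instead of computing.
N = 1 << 22  # ~4M entries, big enough to spill out of cache on most machines

order = list(range(N))
random.shuffle(order)
chase = [0] * N
for a, b in zip(order, order[1:]):  # link the shuffled order into one big,
    chase[a] = b                    # cache-hostile cycle
chase[order[-1]] = order[0]

i, t0 = 0, time.perf_counter()
for _ in range(N):
    i = chase[i]
random_time = time.perf_counter() - t0

sequential = [(j + 1) % N for j in range(N)]  # same loads, friendly order
i, t0 = 0, time.perf_counter()
for _ in range(N):
    i = sequential[i]
seq_time = time.perf_counter() - t0

print(f"random chase: {random_time:.2f}s, sequential walk: {seq_time:.2f}s")
```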
> But is this really an issue? I’m sure Apple has user research numbers to justify their decisions.
They also have a direct incentive not to make user memory upgrades an option. It's a bit too charitable to think they're not supporting it because they somehow know nobody would want to do it anyway.
Plenty of people will be happy with the iMac Pro. If the number of people who'll pay $5k for a computer that will be much less useful in 3 years is relatively constrained, the number who'll pay $5k for a computer they expect to treat as a stopgap for a few months is miniscule.
There's no indication that Apple thinks it's a one-off: there's a lot of custom engineering in there.
"...there is simply little or no difference between the look of the 27” LED display and the 27” iMac Retina screens."
I feel sorry for the author, because your eyes have to be shit to come to this conclusion. If I remove my contact lenses, then yes, there is no difference.
I'm sure the difference in color gamut is no big deal to someone with colorblindness either.