It's only a matter of time before Windows 10's Unix features mature and pull me back to Windows.
> designers, architects. Nice form factor is probably even a benefit to a lot of them, more than the ability to swap parts.
You're right. These are the professions that need a lot of physical workspace, for whom the slim form factor makes sense. I just vaguely remember Phil Schiller saying that the new machines would better cater to developers.
For me, I'd probably be happy with a 2-core MacBook, but I use a 4-core MacBook Pro. If I'm doing something that needs a lot of compute, it runs in the cloud.
Yes, I can offload stuff to the cloud. I just don't want to, just like you're happier with a laptop as opposed to a desktop. I don't want to deal with issues like latency and configuration, or, more importantly, constantly think about whether I have enough resources to run something locally. There are enough things in my life that I already have to budget.
I have a 16GB MacBook Pro and I run two different IDEs, Firefox with 20-30 tabs, Eagle, and various other programs. I rarely run into swapping problems.
"with occasional heavy processing" Yes.
Besides, why would you want a powerful machine frozen in amber when it's not a laptop?
For me the real concern is the GPU. As soon as Apple comes out with a 32GB MacBook Pro, I'll probably upgrade and just attach an eGPU to it.
This is actually my main concern as well, and at least it's addressed with a solution. Upgrading the SSD is the other, but I don't like using Thunderbolt for that.
I really want to stay with Mac OS, but Apple just makes it harder and harder every year, while Windows keeps getting a little better under Nadella.
At the price of a MacBook Pro, one can hardly argue that it is an office productivity tool, since it is clearly aimed at the "Pro" market. If the situation does not change, it is going to force pro users to look at alternatives.
Now the line-up for developers is just 'meh'. 1500-2000 Euro buys you a baseline configuration that is weak (my ~5-year-old Dell workstation, bought second-hand for ~400 Euro, still has three times as much memory and is twice as fast on multi-core loads as my MacBook Pro). To get a reasonable developer machine, you have to drop 3000-4000 Euro, and you get something that is unexpandable and still does not have a modern CUDA-supporting GPU, except through a still somewhat experimental eGPU setup.
Apple is now a large consumer hardware company. Love letters to developers are a thing of the distant past.
(This year is my 10 year Mac anniversary, but there is not much to be happy about, except for the still excellent software from the Mac ISV ecosystem.)
Apparently Intel will fix this in their 2018 mobile SKUs, but until then Apple has chosen neither to kneecap battery life nor to build a different logic board for the people who want 32GB.
For me, this situation has a lot of parallels to the one Apple faced with IBM and the G5. Everyone at the time wanted the G5 in a laptop to replace the G4, but IBM couldn't get the power consumption down.
Now, over a decade later, Apple is taking shit for Intel's delays in supporting LPDDR4. I bet this is going to accelerate their plans to migrate the Mac to their own ARM designs.
That's a great observation, and having witnessed the PowerPC -> Intel migration I'm disappointed I didn't make it myself. Both Motorola and IBM, who were supplying Power CPUs to Apple, sold off their microprocessor divisions after a litany of manufacturing difficulties. IIRC, that was what drove Apple to abandon PowerPC in the first place. It would be ironic if Intel, having benefitted so greatly from the manufacturing shortfalls of past competitors, would find itself in a similar situation.
Intel is already in this position, though not because they sold off their fabs.
All the money these days is going into mobile SoC manufacturing by the likes of TSMC and Samsung. Intel simply lost because Samsung and TSMC are able to outspend Intel on fab R&D and it's showing in Intel's numerous node shrink delays.
People will argue that TSMC/Samsung 7/10nm is not the same as Intel's, and they're probably right, but only for now. TSMC/Samsung are eventually going to surpass Intel's fab technology because they're killing it in high-volume manufacturing of chips for Apple, Qualcomm, Nvidia, AMD, and others.
Meanwhile Intel is fabbing for... Intel. Plus some Altera FPGA IP they don't seem to be integrating very well into their product stack. If Intel wants to survive the next 20 years the only option I see is that they start fabbing for other people too.
ARM is moving into servers, and once its perf/watt surpasses Intel's, it won't be long before the hyperscale cloud companies like Amazon, Google, Microsoft, and Facebook migrate away from Intel. For those guys, a 10% TCO reduction is a big deal, and while some of the perf/watt advantage comes from the design, a lot of it comes from having the better process. If Intel loses their process lead, which is happening right now, then they're going to be second tier.
People will look back in 15 years at Intel snubbing Apple for the original iPhone SoC and mark that decision as the beginning of the end for Intel.
From Paul Otellini, Intel CEO at the time:
> At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.
The forthcoming Cannon Lake chips will support LPDDR4, enabling MacBook Pros with a higher RAM ceiling without sacrificing power efficiency.
Upcoming Intel mobile chips will resolve this by allowing the use of LPDDR4.
That claims to support up to 32GB of memory and, among other things, LPDDR3-1866. Are you saying that if you want to use LPDDR3-1866 with that CPU, you're limited to only 16GB? I can't find anything about that with some quick googling, but if it's true, I retract my snarky comment.
Also, I wish Intel would just list the maximum supported memory configuration for each memory type, instead of a worthless "(depending on memory type)".
Update that with current year components and shut up and take my money.
This machine is for designers and creative people.
A beefy Mac mini or powerful MacBook is what I need without a ridiculously high price tag.
Price wouldn't be such an issue if the thing was at all upgradable.
Except the lack of Nvidia GPU options means it's not for many creatives either - none of the three dominant GPU renderers will run on the Vega GPUs that Apple is offering.
This machine only serves rich YouTubers who use Final Cut Pro.
Did you not have this problem?
Switch off that Spaces crap Apple does; use HyperSwitch as a replacement for Alt-Tab; use SizeUp for window management (Ctrl+Alt+Cmd as the base combo, +M for fullscreen, +arrow key for the left/right/top/bottom half); use Karabiner to swap Ctrl/Cmd for C and V (see the sketch below); and use Accessibility to map Cmd+Q (which on German keyboards shares a key with the @ symbol) to "invert display colors" so a mistype doesn't quit your app.
Voila, a Mac you can use for work ;)
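For the Karabiner part, here's a minimal sketch of how that swap could look, assuming Karabiner-Elements and its complex-modifications JSON format. The Python below just generates the rule fragment; merging it into ~/.config/karabiner/karabiner.json and verifying the key names against your installed version is left to you.

    # Minimal sketch, assuming Karabiner-Elements' complex-modifications
    # JSON format; merge the output into ~/.config/karabiner/karabiner.json
    # by hand and double-check the key names against your setup.
    import json

    def ctrl_to_cmd(key):
        # One manipulator: Ctrl+<key> is rewritten to Cmd+<key>.
        return {
            "type": "basic",
            "from": {"key_code": key, "modifiers": {"mandatory": ["left_control"]}},
            "to": [{"key_code": key, "modifiers": ["left_command"]}],
        }

    rule = {
        "description": "Ctrl+C/Ctrl+V behave like Cmd+C/Cmd+V",
        "manipulators": [ctrl_to_cmd(k) for k in ("c", "v")],
    }
    print(json.dumps({"rules": [rule]}, indent=2))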
Edit: Tips on how to disable the Spotlight indexer (excessive CPU and RAM load) without breaking Outlook search are highly welcome...
You can't get that with a high core count, though. It's unfortunate that they didn't make a version of this with a cheap GPU and less storage (or alternatively a Xeon or i9 option on the normal iMac).
What BS. Isn't it literally an apology letter?
Apple: "Sorry we neglected and ignored you, but to make it up we didn't really listen to you, but got you a nice dinner of our favorite stuff (thinspo industrial design). Please don't leave us."
IIRC they used to do this with the tower Mac Pros; the base option had a cheapish graphics card. Base model price went up dramatically with the trashcan, which made high-end GPUs mandatory.
At home I have a 32GB 1800X system with a 2GB/s NVMe drive and a 1080 Ti. I chose Nvidia after years of AMD use because of performance. Of course I use it for gaming as well. At work I have a Dell XPS 15, and we've been buying those or P-series Lenovos for the devs there.
The new iMac is $6300 CAD. There's no way I'm buying it for myself or my devs.
The iOS devs at work get MacBook Pros and prefer their portability. The UX team gets those as well, for the same reason. Plus we'd never fork out that much money for a single machine.
Maybe people who work with 4K video or VFX and need OS X would go for these.
Even then, compared to buying equivalent workstation hardware (including a 5K display) from a manufacturer such as HP, this is no more expensive.
I'd probably choose one only if I could get someone else (e.g. an employer) to pay for it; otherwise the same money buys a much cheaper computer plus a nice overseas vacation, and that'd be better.
The iMac Pro is beautiful, and ridiculously powerful. I am back to the days of wanting to be able to justify the absurd price (that's what power costs) simply to have that icon on my desk.
I would not be surprised to see a lot of these turn up on senior executive desks, or as a status symbol from companies to their developers - showing that nothing but the best will do.
Retina iMacs are amazing - I'm currently using a maxed out iMac from 2 years ago, with a 4K screen plugged in too. It was impossible to justify the cost at the time, but now I cannot imagine anything less.
I currently use an old mac with the last disc based iteration of Adobe Creative Suite - something I suspect a lot of designers are still using - on the rare occasions I need to sit down and create.
I'm wondering about the Apple pro lines and who might use a non-upgradable, closed piece of expensive hardware going forward. It feels to me as though that market is shrinking unless they get serious about 'pro' meaning industrial-strength and configurable.
And, yeah, I do think Apple has in recent years been seriously underestimating the value of internal expandability, but I also think critics may overestimate that value to some degree. A lot of people really do buy whole new computers every four to six years; I'm not convinced "but this one is a really expensive computer!" is a definitive rebuttal in the iMac Pro's case. It seems like it'd be pretty easy to put together an iMac Pro that's going to last you past that six-year limit--and, let's face it, if you're in the market for a computer that could easily top $7K in a midrange configuration, you're probably at an income level where doing that again in six or seven years isn't going to kill you.
And, of course: we have yet to see what the 2018 Mac Pro is going to be like; the signs are that it's going to be closer to what people who don't like the closed box of the iMac would want.
(My biggest kvetch about the iMac Pro is that it seems kind of petty not to have user-upgradeable RAM. Maybe there's a design reason for that, but it just feels like awfully low-hanging fruit, given that the "non-pro" 27" iMacs allow it.)
PS: my Hackintosh build from last year, with 64GB of RAM for < 5K, is not impressed by those 128GB of RAM.
I hope they fix the new MacBook Pro; that would be a love letter!
But the best thing is definitely the 4K screen on those HP/Dells: with the default scaled resolution that Apple introduced last year, things look pixelated, while on 4K they don't. For a person who spends 12+ hours a day in front of a laptop screen, that's huge.
Maybe the colors are not as saturated, but I don't care whether my VSCode theme displays a billion more colors. Neither the iMac Pro nor the MacBooks are developer machines anymore. Creative professionals, yes; developers, no.