Hacker News

The lack of a 32GB option in the MacBook Pros is because Intel's memory controller cannot handle LPDDR4 (except in U SKUs, which Apple does not use for the MacBook Pro), and Apple decided that using regular DDR4 to reach 32GB would consume too much power. [1]

Apparently Intel will fix this in their 2018 mobile SKUs, but until then Apple has chosen neither to kneecap battery life nor to build a separate logic board for the people who want 32GB.

This situation has a lot of parallels for me to the situation they faced with IBM and the G5. Everyone at the time wanted the G5 in a laptop to replace the G4, but IBM couldn't get the power consumption down.

Now over a decade later, Apple is taking shit for Intel's delays in supporting LPDDR4. I bet this is going to accelerate their plans to migrate Mac to their own ARM designs.

[1] http://www.idownloadblog.com/2016/10/31/macbook-pros-lack-of...




> This situation has a lot of parallels for me to the situation they faced with IBM and the G5.

That's a great observation, and having witnessed the PowerPC -> Intel migration I'm disappointed I didn't make it myself. Both Motorola and IBM, who were supplying PowerPC CPUs to Apple, sold off their microprocessor divisions after a litany of manufacturing difficulties. IIRC, that was what drove Apple to abandon PowerPC in the first place. It would be ironic if Intel, having benefitted so greatly from the manufacturing shortfalls of past competitors, would find itself in a similar situation.


> It would be ironic if Intel, having benefitted so greatly from the manufacturing shortfalls of past competitors, would find itself in a similar situation.

Intel is already in this position, though not because they sold off their fabs.

All the money these days is going into mobile SoC manufacturing by the likes of TSMC and Samsung. Intel simply lost that game: Samsung and TSMC are able to outspend Intel on fab R&D, and it's showing in Intel's repeated node-shrink delays.

People will argue that TSMC/Samsung 7/10nm is not the same as Intel's, and they're probably right, but only for now. TSMC/Samsung are eventually going to surpass Intel's fab technology because they're killing it manufacturing chips in volume for Apple, Qualcomm, Nvidia, AMD, and others.

Meanwhile Intel is fabbing for... Intel. Plus some Altera FPGA IP they don't seem to be integrating very well into their product stack. If Intel wants to survive the next 20 years the only option I see is that they start fabbing for other people too.

ARM is moving into servers, and once its perf/watt surpasses Intel's, it won't be long before the hyperscale cloud companies like Amazon, Google, Microsoft, and Facebook migrate away from Intel. For those guys, a 10% TCO reduction is a big deal, and while some of the perf/watt is due to the design, a lot of it comes from having the better process. If Intel loses their process lead, which is happening right now, then they're going to be second tier.
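To see why 10% TCO matters at that scale, here's a back-of-envelope sketch. Every number in it is made up for illustration (fleet size, server cost, power draw, electricity price), not sourced from any of these companies:

```python
# Hypothetical hyperscale fleet: 1M servers, $8,000 capex each,
# 350 W average draw at $0.07/kWh, amortized over 4 years.
servers = 1_000_000
capex_per_server = 8_000            # USD, assumed
watts_per_server = 350              # average draw, assumed
usd_per_kwh = 0.07                  # assumed electricity price
years = 4

# Lifetime energy per server, then total cost of ownership.
energy_kwh = watts_per_server / 1000 * 24 * 365 * years
opex_per_server = energy_kwh * usd_per_kwh
tco = servers * (capex_per_server + opex_per_server)

# A 10% TCO reduction on the whole fleet:
savings = 0.10 * tco
print(f"Fleet TCO: ${tco / 1e9:.1f}B; 10% saving: ${savings / 1e9:.2f}B")
```

Even with these rough assumptions, 10% works out to hundreds of millions of dollars over one hardware generation, which is why a sustained perf/watt lead would be enough to justify a migration.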

People will look back in 15 years at Intel snubbing Apple for the original iPhone SoC and mark that decision as the beginning of the end for Intel.

From Paul Otellini, Intel CEO at the time:

> At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.

https://www.theguardian.com/technology/2013/may/20/intel-sma...



