
Macintel: The End Is Nigh - donmcc
http://www.mondaynote.com/2014/08/03/macintel-the-end-is-nigh
======
ggreer
_What would happen to the cost, battery life, and size of an A10-powered
MacBook Air?_

It would be worse in almost every aspect.

The cost wouldn't change much, and Apple wouldn't profit much more from the
switch. The cited CPU cost is the _suggested retail price_. Apple's volume
lets them negotiate huge discounts.

Battery life would go up a bit, but battery life is already pretty good on
Apple laptops. My 2013 11" Air gets 8 hours easily, but can go as long as 12
or as short as 2 depending on screen brightness and CPU usage. Unless you're
keeping the CPU busy, the screen is the biggest consumer of power. Cutting CPU
power consumption in half would only increase battery life by 20-ish%.
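
A rough back-of-the-envelope version of that claim; the power split between screen, CPU, and everything else is assumed, not measured, and the 38 watt-hour capacity is the figure mentioned below:

    # Hypothetical light-use power budget for an 11" Air (assumed split, not measured).
    screen_w = 2.5     # display backlight
    cpu_w = 1.5        # CPU package, mostly idle
    other_w = 0.75     # Wi-Fi, SSD, RAM, chipset
    battery_wh = 38.0  # 11" Air battery capacity

    def runtime(cpu_watts):
        return battery_wh / (screen_w + cpu_watts + other_w)

    baseline = runtime(cpu_w)      # ~8.0 hours
    halved = runtime(cpu_w / 2)    # ~9.5 hours
    print(f"{baseline:.1f}h -> {halved:.1f}h, +{100 * (halved / baseline - 1):.0f}%")
    # Halving CPU power only buys about +19% runtime, because the screen and the
    # rest of the system keep drawing the same amount.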

The PowerPC to Intel transition worked for several reasons. Most importantly,
Intel CPUs were much faster than PowerPC. In everyday usage, the fastest ARM
chips are 10x slower than a quad-core Haswell. Moving to x86 also increased
the features of Macs, allowing them to dual-boot Windows and efficiently
virtualize other x86-based OSes. Switching to ARM would backtrack on both
fronts. An ARM-based Apple laptop would have no Boot Camp, no Windows
virtualization, and no efficient emulation of legacy applications.

CPUs are only a small part of why tablets have longer battery life than
laptops. Tablets have no keyboard, trackpad, or hinge, so they can basically
be a giant battery with a screen attached. The iPad has a bigger battery than
the 11" MacBook Air, despite the Air weighing 50% more and taking up 30% more
volume. (Edit: This has recently changed. The 11" Air has a 38 watt-hour
battery. The iPad 3 and 4 had a 42 watt-hour battery, but the iPad Air has a
32 watt-hour battery. Still, it's even smaller than the earlier iPads, massing
less than half the 11" Air.)

In short, it doesn't seem worthwhile to do all this work and sacrifice so much
performance for some incremental increases in battery life and profit.

~~~
Symmetry
You say that ARM chips are 10x slower than Haswells, but the difference isn't
actually that large anymore and for Apple's A7 it's around 3x versus a normal
desktop chip like the 4770 and only 1.7X versus the 4250U you'd find in a
Macbook Air[1]. Diminishing returns is a constant factor in engineering and
I'd expect it to be very hard for Apple to make up the remaining distance
between themselves and Intel, but we shouldn't exaggerate how large that
difference is. And it is possible that some hypothetical A8 could be made
competitive with Intel in the 15W range due to not having to make the
compromises required to hit 4 GHz.

I do agree that the ~10W you'd save by using an A7 versus a 4250U in a
MacBook Air would be a false economy.

EDIT: To explain that bit about targeted power ranges a bit more: to hit high
frequencies at a given process node (like 45 nm or whatever), you need to break
up your logic into a fairly large number of simple stages separated by
clock-driven latches. A core that doesn't worry about hitting high frequencies
can divide its logic into fewer stages, simplifying design and requiring fewer
transistors for latching.
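
A toy model of that frequency/stage tradeoff; the delay numbers below are invented, just to show the shape of the curve:

    # Toy pipeline model: total logic delay is fixed, each extra stage adds latch overhead.
    logic_delay_ns = 10.0   # total combinational delay of the core's logic (assumed)
    latch_delay_ns = 0.1    # overhead added per pipeline stage (assumed)

    def max_clock_ghz(stages):
        stage_delay = logic_delay_ns / stages + latch_delay_ns
        return 1.0 / stage_delay

    for stages in (5, 10, 20, 30):
        print(stages, "stages ->", round(max_clock_ghz(stages), 2), "GHz")
    # More stages -> higher clock, but every extra stage costs latch transistors and
    # power; a core that never chases 4 GHz can stop subdividing much earlier.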

[1] http://www.computingcompendium.com/p/arm-vs-intel-benchmarks.html

~~~
ggreer
I think GeekBench is a poor benchmark, since it's measuring peak CPU
performance in a narrow domain, but the table you cited shows that the 4770 is
6.5x faster than the A7 (16774 vs 2564). Even at the same frequency and core
count, Haswell is 1.8x faster in the benchmark.

Whatever the exact number, it doesn't invalidate the point I was making:
Switching to ARM will ruin performance for x86 programs, which need to be
emulated. ARM has to be significantly faster than x86, or else the emulation
overhead will make users hate the transition.
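
Rough arithmetic for why that matters; both the native-speed ratio and the translation penalty below are assumptions, not measurements:

    # If x86 binaries must be translated/emulated on ARM, the native speed ratio
    # gets multiplied by the emulation penalty.  All numbers here are assumptions.
    arm_vs_x86_native = 1 / 1.7   # suppose the ARM core is 1.7x slower natively
    emulation_penalty = 3.0       # suppose dynamic translation costs another ~3x

    effective = arm_vs_x86_native / emulation_penalty
    print(f"Emulated x86 app runs at ~{effective:.0%} of its old speed")  # ~20%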

This is a huge challenge for the Mac Pro and rMBP, which are all about
performance. Generally, a given microarchitecture can only scale TDP by a
factor of 10 by changing frequency and number of cores. Satisfying the pro
line would require a new microarchitecture. Considering an x86 to ARM
transition would take 1-2 years, that's asking a lot of Apple's chip
designers. If Apple was going to go through all the trouble of building a new
ARM core for high-performance, high-TDP applications, they might as well build
their own x86 CPU. It would save a transition and the necessary emulation
overhead. They could roll it out for the MacBook Airs first, then address the
Pro line at their leisure (or not and just use Intel CPUs).

~~~
Symmetry
I was looking at the single-threaded numbers because those are usually more
relevant to what you would be doing on most MacBooks.

~~~
eliasv
Citation needed.

~~~
morenoh149
[https://en.wikipedia.org/wiki/Amdahl%27s_law](https://en.wikipedia.org/wiki/Amdahl%27s_law)
perhaps. Single-threaded speed is most relevant because it's difficult to make
software take advantage of multiple cores.
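
To make that concrete, here's Amdahl's law in a few lines; the 70% parallel fraction is just an illustrative guess, not a measured figure for any particular app:

    # Amdahl's law: speedup from N cores when only a fraction p of the work parallelizes.
    def amdahl_speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    p = 0.70  # assumed parallel fraction for a typical desktop workload
    for n in (2, 4, 8, 16):
        print(n, "cores ->", round(amdahl_speedup(p, n), 2), "x")
    # 2 -> 1.54x, 4 -> 2.11x, 8 -> 2.58x, 16 -> 2.91x: past a few cores,
    # single-thread speed still dominates what the user feels.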

------
fiatmoney
"The aging x86 architecture is beset by layers of architectural silt accreted
from a succession of additions to the instruction set... Because of this
excess baggage, an x86 chip needs more transistors than its ARM-based
equivalent"

I wish people would stop saying this. On the inside, x86 CPUs are basically
RISC. There is a translation layer from the publicly facing instruction layer
to the internal representation. The transistor and power budget for this
translation layer is absolutely trivial.

Intel has an enormous amount of expertise in the actual manufacturing and
design layers, as evidenced by the rate of improvement in their Atom CPUs
(within striking range of ARM, actually), integrated GPUs, and quasi-GPU
compute cards. They are not in danger of "losing" in the long term in a
performance or performance-per-watt race. The risk is that they get disrupted
by all CPUs turning into a commodity, in roughly the same way that RAM is a
commodity.

~~~
higherpurpose
Intel chips even support 16-bit processing. How is that not baggage?

~~~
nknighthb
An 8086 had <30k transistors. Haswell is well over a billion. They could add a
hundred dedicated 8086 cores to Haswell and you wouldn't notice the
difference.

------
JohnBooty
From the article:

    
    
      > Because of this excess baggage, an x86 chip needs more
      > transistors than its ARM-based equivalent, and thus it
      > consumes more power and must dissipate more heat.
    

This is true but it ignores the primary reality of "desktop-class" processor
design today: _RAM is the bottleneck in a really major way, and most of a
desktop-class CPU's transistors are dedicated to overcoming this._

In the ancient days, CPUs ran synchronously (or close to it) with main memory.
Hasn't been that way for decades. CPU performance has ramped up so much more
quickly than that of main memory that it's ridiculous.

And this is where most of your transistors are spent these days - finding ways
to allow the CPU to do some useful work while it sits around waiting for main
memory. Look at a modern i5 CPU die:

[https://www.google.com/search?q=intel+core+i5](https://www.google.com/search?q=intel+core+i5)

Things to note:

- Tons of L1/L2/L3 cache so we can keep things in fast memory. The transistors
dedicated to cache _dwarf_ those allocated to the actual processing cores, let
alone the parts of those processing cores dedicated to those crufty ol' x86
instructions.

- Lots of transistors dedicated to branch prediction and speculative execution
so we can execute instructions before we've even waited around for the data
those instructions depend upon to arrive from slow-ass main memory.
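
To put a rough number on "waiting for main memory" (the clock speed, DRAM latency, and issue width below are ballpark assumptions):

    # How much work a core could have done during one trip to DRAM (ballpark assumptions).
    clock_ghz = 3.0          # desktop-class core
    dram_latency_ns = 80.0   # typical main-memory access latency
    issue_width = 4          # instructions the core could retire per cycle, ideally

    cycles_stalled = dram_latency_ns * clock_ghz        # ~240 cycles
    instructions_lost = cycles_stalled * issue_width    # ~960 instructions
    print(int(cycles_stalled), "cycles,", int(instructions_lost), "instructions of potential work")
    # Caches, prefetchers, branch prediction, and speculation exist to hide exactly this gap.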

Sure, mobile ARM chips are tiny and efficient! They run at 1-2 GHz while paired
with RAM that's not _that much_ slower than their CPUs. They don't need to
devote gobs and gobs of transistors to speculative execution and branch
prediction and cache.

But all that changes if you want to scale an ARM chip up to perform like a
"desktop-class" Intel chip. You want to add cores and execution units? If you
want to keep them fed with data and instructions you're going to need all that
extra transistor-heavy baggage and guess what -- now you're just barely more
efficient than Intel, _and_ you can't match Intel's superior process
technology that's been at least a transistor shrink or two ahead of the
competition since the dawn of the semiconductor industry.

Eventually, yes, the ARM chip makers will solve this. RAM will get faster and
processes will be enshrinkified. Just understand that transistor size and
pokey RAM are the bottlenecks, not that nasty old x86 instruction set.

~~~
zanny
Problem here is that you are mixing memory latency and memory bandwidth
together. We have memory that can easily sustain 16 simultaneous cores in
bandwidth (and honestly, memory bandwidth potential is mostly untapped - you
only see higher bandwidth benefits for integrated GPUs because they have many
more execution units demanding more data).

Meanwhile, latency has been getting worse. Increasing refresh rates abate it
slightly, but that's outweighed by all the indirection needed to make
high-bandwidth RAM, plus the commoditization of RAM toward high capacity
rather than speed (transistor-only memory like cache shows what's possible, at
orders of magnitude more complexity and cost).

Adding more cores doesn't impact that latency at all, it just demands more
bandwidth. If anything, the diminishing returns of what Intel has done -
dedicating a lot of per-core die to prediction just to throw away computations
because the per core power is too high - make less sense than just putting a
lot more dumb cores on the die.
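
One way to see the bandwidth-vs-latency split is Little's law: the data in flight equals bandwidth times latency. A sketch with rough numbers (the bandwidth, latency, and line size are assumptions):

    # Little's law: in-flight data = bandwidth * latency.  To keep the memory bus busy,
    # the cores must collectively have this many requests outstanding.
    bandwidth_gb_s = 25.6    # e.g. dual-channel DDR3-1600
    latency_ns = 80.0        # main-memory access latency
    line_bytes = 64          # cache line size

    in_flight_bytes = bandwidth_gb_s * latency_ns      # GB/s * ns = bytes
    outstanding_lines = in_flight_bytes / line_bytes
    print(int(outstanding_lines), "cache-line misses in flight to saturate the bus")  # ~32
    # Extra cores add demand for that parallelism; they do nothing for the 80 ns itself.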

But then you get GPUs. Shitty latency, huge bandwidth, huge flops, terrible
context switching, etc.

It is worth mentioning that both sides of the equation are doing the same
thing, though. RAM makers are dedicating a majority of the silicon on RAM
modules to controllers that accelerate lookup, rather than to actual capacitive
storage.

For the average user, you don't need that hugely complex Haswell logic.
Tablet-class performance is perfectly adequate for the web, office suites, and
even programming (compiling aside). If we wrote software that better utilized
all the available cores, we would have gone down the route of 16-32 core main
CPUs sooner, instead of extreme precomputation. That has a lot more potential
performance, but it requires the software to use it.

ARM is kind of uniquely poised to do that as well. Most of its ecosystem is
fresh, it went through an extremely fast multicore expansion, and its
architecture lends itself to adding more cores rather than dedicating
everything to offsetting slow memory. If software architects started writing
their programs to be as core-count-agnostic as possible, ARM might be the first
realistic platform to break into consumer 16-core computing, because the
Windows world is frozen in time.

~~~
CHY872
1. Memory isn't just slow because they went for capacity over performance
(except vacuously); it's slow because of the laws of physics. Cf. L3 cache: it's
made of the same stuff as registers but takes about 30 times longer to access.

2. No, adding lots of dumb cores makes no sense.

3. GPUs are useful because many tasks are embarrassingly parallel. Many more
are not.

4. On "if we wrote better software": adding many more cores hugely increases
the difficulty of reasoning about software. Many tasks are not easily performed
in parallel, or the speedup is not impressive enough. Most operating systems
(my guess is that OS X is included) will choke if you give them too many
threads: performance drops hugely, or many threads are left totally idle. This
is due to lock contention etc.

5. Of course no one "needs" that Haswell logic, but it's sure nice having my
computer do stuff quickly. My top-of-the-line phone struggles to play through
its animations properly, and loading websites frequently takes a while.
Good-enough is not really a good place to be. Furthermore, greater performance
motivates more demanding applications.

6. We dedicate everything to offsetting slow memory because it's the only way
to get good performance from the majority of tasks. Sure, if your task can be
handled by a GPU, by all means run it on a GPU. For those that cannot, we have
a CPU. There's a reason why the iPhone and iPad only have two cores: it's not
worth Apple's while adding more, but it would add lots of cost and complexity.

~~~
JohnBooty

      > Memory isn't just slow because they went for capacity
      > not performance (except vacuously), it's slow because
      > of the laws of physics. 
    

Yes. The farther away RAM is from the CPU core, the more stuff needs to happen
before data can get into those precious, precious registers. Even if data from
main memory didn't have to travel over a bus/switch/etc. between the DIMM and
the CPU, it's not physically possible (in any practical sense) to have main
memory running at anything close to the speed of the CPU once we're talking
about multi-GHz CPUs. DIMMs and the CPU are running on separate clocks, you
have the sheer _distance_ and the speed of signals through the metal to
consider, etc.
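
The distance part is easy to put numbers on; everything below except the speed of light is an assumption:

    # How far a signal can travel in one clock cycle (rough physics, assumed clock).
    c_m_per_s = 3.0e8
    signal_fraction = 0.5     # signals in copper traces move at roughly half c
    clock_hz = 3.0e9          # a 3 GHz CPU

    cm_per_cycle = c_m_per_s * signal_fraction / clock_hz * 100
    print(round(cm_per_cycle, 1), "cm per cycle")   # ~5 cm, and a round trip is half that
    # A DIMM sitting several centimetres away can never answer within a handful of
    # cycles, even before you count the DRAM array's own access time.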

    
    
      > There's a reason why the iPhone and iPad only have two
      > cores - it's not worth their while adding more but does
      > add lots of cost and complexity.
    

Yes! There's a reason why the A7 in my iPhone 5S blows away the quad-core ARM
chip in my 2012 Nexus 7. That reason is because "adding more dumb cores" is
not the answer to anything, aside from marketing goals.

------
martingordon
I think it's funny that the main reason people cite for switching from Intel to
ARM is power consumption. A big reason to stay on Intel is that a lot of people
have real needs to run Windows software (whether in Boot Camp or a VM).
Switchers feel a lot better jumping to the Mac knowing they can fall back to
Windows if they want to.

Regarding battery life: The latest MacBooks get 9-12 hours of battery life. I
haven't experienced battery anxiety on a Mac in a long time. In contrast, my
iPhone is dead by the end of the day and watching it creep below 50% makes me
start thinking about the nearest Lightning adapter (yes, I understand the Mac
has a _much_ larger battery and cell radios are power hungry).

Put another way, max power consumption on an iPad Air is ~11W[1]. The max
power draw on a Haswell MacBook Air is 15-25W (a ~50% improvement in battery
life from 2012 to 2013, which had a 21-34W max draw)[2][3]. Given that Macs
have more space available for batteries due to larger screens and the need for
a keyboard and trackpad, I don't see the power consumption argument holding
water.

[1]: http://www.anandtech.com/show/7460/apple-ipad-air-review/3

[2]: http://www.anandtech.com/show/6063/macbook-air-13inch-mid-2012-review/7

[3]: http://www.anandtech.com/show/7180/apple-macbook-air-11-2013-review/2

~~~
RexRollman
I recently switched back to Mac for the first time since 2006, and one of the
factors in my decision is that I can run other x86-compatible OSes. I haven't
felt the need to, but it is nice to have the option.

~~~
Glide
Compatibility with x86 is a lot like why people don't buy purely electric
cars. Being able to run another OS is much like being able to take a very long
road trip.

~~~
sjwright
That's why I like BMW's solution for their i3. It's a purely electric car, but
if range anxiety is an issue, you can option a tiny petrol engine from a
scooter that acts purely as a generator to charge the battery.

------
PaulHoule
The real weakness of Intel in power consumption relative to ARM hasn't been so
much in the chips as the chip sets, software and the balance-of-system.

To create a low power system, ALL components need to be low power, and it is
easier to start from a system built to be low power and scale up capabilities
rather than go the other way around.

For instance, I have a Windows-based laptop which is a great machine, but if I
have a web browser open, any web browser, the fan spins, it gets hot, and
battery life is less than 1.5 hours.

Is it the fault of Windows, the browser vendors, the web platform, adware,
crapware, who knows what? It doesn't matter, but controlling power consumption
on a legacy platform is a game of whac-a-mole that doesn't end.

Because Windows users expect to plug in devices that draw power from USB, a
Windows tablet has to have a huge power transistor to regulate voltage and a
power supply system scaled up so it can supply enough power through the USB
port to charge an Android tablet. At this point you might add the fan, and then
you are doomed.

~~~
sliverstorm
To me it seems mostly a problem of unwalled garden. When Apple owns the
platform from the silicon to the development environment, it's easier for them
to manage power. Wintel on the other hand has so much variety all the way down
the software stack and even into hardware, it is more difficult to control...
x86+Linux (Lintel?) suffers the same problem.

The unwalled garden can still be made low power. Android has set an example of
tackling that problem with the battery tracker, which profiles which
application is chewing up battery.

For power over USB, what do you mean? My Android cell phone can power
keyboards, flash drives, and the like with an OTG cable. It certainly has no
fan.

~~~
jp555
If you think Android isn't a walled garden (unless you mean non-Google
Android?) then the devil has convinced you he doesn't exist.

~~~
sliverstorm
It is at least much _less_ of one than iOS

~~~
jp555
Which may appeal to you as a developer, but from a user perspective, that
isn't a good thing.

Consider this analogy: Would you buy an "open-source" deadbolt for your house?
Great feature for aspiring locksmiths! But very likely not such a great
feature for 99.9% of home owners.

------
twotwotwo
Apple does look like it's making chips to handle heavy workloads, not just to
compete with the latest Krait or whatever. But I'm not sure ARM Macs are the
direction they'll take that.

It'd make some business sense for them to instead position iOS so it can take
over more and more traditional Mac duties. The IBM push could be an example of
that. Investing in iOS gives Apple the tight control and the 30% cut they're
used to from that space, and it avoids the Windows-RT-ish heartbreak of "why
is this ARM OS like my Intel OS but without all my apps?" (If there _were_
Mac-on-ARM, I'd expect it to be Mac App Store only.)

Anyway, the A7 is already a beast
(http://www.anandtech.com/show/7910/apples-cyclone-microarchitecture-detailed
does various measurements,
http://cryptomaths.com/2014/04/29/benchmarking-symmetric-crypto-on-the-apple-a7/
is an interesting case study), and there are still future process nodes and
microarchitecture changes that will let them make better chips. I don't know if
an ARM MacBook Air is specifically where this goes, but they're certainly
making ARM capable of more serious stuff.

~~~
geon
> why is this ARM OS like my Intel OS but without all my apps?

The problem should be much smaller on OS X than on Windows. Apple has a
culture of breaking things regularly, instead of worshipping backwards
compatibility.

The result is that most Mac apps are kept up to date by the developer.
Recompiling for an ARM-based OS X shouldn't be an issue.

~~~
deong
> The result is that most Mac apps are kept up to date by the developer.

That's _maybe_ true for indie developers. It's not generally true for things
like Photoshop, Microsoft Office, MATLAB, Mathematica, Skype, and dozens of
other major titles that would likely be complete dealbreakers for a pretty
significant portion of their userbase.

~~~
MBCook
Apple kept Carbon (the OS9 to OS X transitional bindings) around for so long
because of Photoshop and Office.

From personal experience, I can tell you that this does not happen with big
software. Apple released the first Intel Macs in early 2006. The next year
Intuit released Quicken 2007 which was PPC only.

The first version of Quicken to be x86 for Mac was Quicken Essentials 2010.
First of all that's _four_ years after the transition. Second, Quicken
Essentials was crippled. Here's a quote from Wikipedia:

> Some of the features of Quicken are not present in Quicken Essentials for
> Mac, such as the ability to track investment buys and sells or to pay bills
> online from the application.

So unless you wanted something ultra-basic you were screwed. Intuit's answer
was that they'd give you a free copy of Quicken '08 for Windows. All you needed
to do was buy Parallels (etc.) and Windows.

This was a big problem because OS X finally dropped PPC emulation support,
called Rosetta, in 2011 with the release of Lion. So after 5 years Intuit, a
VERY big company, was unwilling to update their software or provide a real
equivalent for their Mac users.

If you look at the announcements for each version of Adobe Creative Suite for
Mac you'll find references to starting to use features of the OS that Apple
introduced years ago.

------
archagon
I'm saddened that nobody ever brings up video games when talking about these
architectural shifts. Practically all PC games in the past few decades have
been written for x86, and most of them will never be patched for ARM
compatibility. These games are as important to many of us as movies or music,
and yet I fear they're destined to disappear from our cultural memory if this
shift ever happens. Virtualization just won't cut it; most modern games barely
even run with Wine, to say nothing of performance. And given the slowing of
Moore's law, we can no longer count on emulation to give us seamless
reproductions a few years down the line. Does nobody care? Why doesn't anyone
say anything? I love my Mac, but if I have to choose between ARM and my Steam
library, I'll choose Steam and begrudgingly go back to whatever Windows
version still supports it.

On a related note, I think it's important to differentiate between utility
software that's assumed to be temporary, and one-off pieces of software that
are intended to live forever. I wish there was an easier way to write software
in such a way that it can easily be guaranteed to run in the future, no matter
the architecture. (Open source is not a guarantee. Ever try compiling the
source to a AAA game?)

~~~
eru
> I wish there was an easier way to write software in such a way that it can
> easily be guaranteed to run in the future, no matter the architecture.

Opening the source would help quite a lot with that.

------
jareds
If this happens I hope entry-level Macs get below $500. I bought a MacBook
Air for use as a Windows machine with the plan to learn iOS development at
some point. It was worth the cost since I can use it as my day-to-day machine
running Windows. I wouldn't be able to justify spending $1000 on a Mac laptop
that could not run Windows. I could justify $400 for an entry-level Mac Mini
running ARM, but not a lot more than that.

~~~
joshstrange
I find the "Apple is too expensive" argument to hold little water when you
really look at it. I won't argue that up front you will pay more for the
lowest-end Apple laptop/desktop over a Windows laptop/desktop but in my
experience Apple products hold their value much better than any Dell/HP/etc
product. Even if you hate OS X then you still are better off buying an Apple
laptop and running Windows than you are buying a Windows laptop. The hardware
is more reliable, the resale value is greater, and they look 100x better than
anything else out there.

Comparing a Dell XPS to a MacBook Pro Retina with very similar specs
(CPU/RAM/HD; the Mac actually has a few better ones) leaves a price difference
of $100 in Dell's favor, but I can promise you the MBPr will resell for much
more than the XPS in 1-2 years' time. I had friends in college who would joke
that I paid a small fortune for my MBP ($1500), but these were the same people
who would buy a $500 laptop every year or so because their bargain-bin laptops
just didn't last long before they started having hardware issues or massive
slowdowns. My MBP ran quite smoothly for 2.5 years before I needed a faster CPU
(I'm a developer), and I sold my machine for $900, which works out to an
operating cost of $240/yr for the period I owned it.

~~~
noir_lord
It is exactly for this reason that my next laptop will be an MBP running Linux
(I have zero interest in OS X).

I bought a mid-level Vostro a while back and it is an absolute piece of shit.

The touchpad detects my palm from across the room but is inaccurate when I
actually touch it, the screen is mediocre and lets dust in constantly, and the
keyboard is mushy with no positive click.

The spec looked good, and Vostros _used_ to be OK; the Vostro 1700 I had prior
was a fine machine.

It's so bad I've found myself using my ancient ThinkPad (Celeron M) when I
have to do a lot of typing.

~~~
vacri
Check out the current Thinkpads; they might also have something that suits
you.

~~~
noir_lord
Thanks I will :).

------
matthewmacleod
These arguments make no sense. Aside from anything else, the computing power
of the Mac I'm currently using (quad 2.4GHz Ivy Bridge) is so far in excess of
anything available in the ARM architecture that it's difficult to see this
transition happening at any point.

~~~
justincormack
There are people building serious 64-bit ARM chips for server-type
applications. No idea what the actual performance will be yet.

------
overgard
I'm probably in the vast minority, but I tend to use apple hardware to run
windows... so I'm hoping this doesn't come to pass. I use OSX occasionally,
but the thing that got me to switch to apple in the first place was bootcamp.

~~~
millstone
It's quite possible that ARM Macs will run Windows. Windows RT already runs on
ARM, and Microsoft seems to be moving in the direction of embracing
alternative platforms (see how they push Azure for iOS). They'd sell more
Windows licenses and gain more users, and at worst they'd lose some Surface
sales.

Whether Apple would allow it is a different question.

~~~
shawn-butler
Windows RT is a dead man walking.

I wouldn't predict a long maintenance life for it. The Surface 2 and the Nokia
2520 are about all the hardware left running it?

------
nsxwolf
Losing native Windows virtualization seems like a big deal. That was a huge
selling point with the switch to Intel. It's become indispensable for many Mac
users and was the reason many Windows users were able to switch.

Unless Microsoft also drops Intel, I don't see this happening.

------
ralphc
The "transitions" comments leave out a big piece of functionality; certainly
big for me, and I imagine big for others. That's VMs for Windows and Linux. I
have all kinds of Linux VMs I run on my Mac, and I imagine others that need
"that one Windows app" run Windows a lot in VMWare and VirtualBox. Going to
ARM would torch a big part of Mac functionality for me.

------
fsiefken
From what I've understood from benchmarks and reviews, the latest Intel Atom
processors have a better power/performance ratio than ARM processors. I'm not
sure if the performance/price ratio is better though; does anyone know?

------
SyneRyder
If Apple were to switch from Intel, I'd probably have to (reluctantly) go back
to a Windows laptop. I love my MacBook, but most of my money is still earned
working with clients who use Windows environments. There's still some software
that never made the jump to Mac either, for which I still have to use
Parallels Desktop. The best bit of having a Mac is that I can run OS X,
Windows & Linux all on the same box.

Of course, if Apple is also making their own x86 compatible chips, that's a
different story. I don't need an Intel chip specifically, I just need
something that runs Windows / x86 perfectly....

------
ksec
As much as I want this to happen, I don't see it coming in the near future.
Why would they release the new Mac Pro with Intel Xeons if they had ever
planned to switch away? And would the Mac Pro stay in x86 land if Apple decided
to switch the rest to ARM?

Another obstacle is Thunderbolt. This DisplayPort + external PCI Express cable
is totally controlled by Intel. AMD's version is based on USB 3, which is an
ugly hack that Apple is unlikely to use.

The performance gap between Haswell and the A7 is huge. Watt for watt, at
notebook/desktop power ranges Intel wins hands down, although the gap is
shrinking with each Ax SoC.

Then there is the part about Intel Atom losing on performance, which is wrong.
The Intel Atom SoCs perform really well. They didn't get many wins simply
because of their ecosystem and prices.

The mobile SoC market operates at thin margins, comparatively speaking. Even if
Intel offers Atom at the same price, why would any OEM want to be bound by
Intel and x86 again? So Intel decided to offer those SoCs at a loss, and what
happened? In the Western world it is dominated by Qualcomm, which offers a
better solution or a cheaper total cost with its integrated modem. In the
Eastern world, or China, it is hit by the "8 core" marketing from MediaTek.
Everyone thinks more cores = better.

I don't see how Intel is going to win this mobile battle. Apple will pretty
much drive Intel to where they want, which is to fab SoCs for them.

------
Grynn
I blogged about this in April: http://vishaldoshi.me/2014/04/25/apple-intc/

It looks like Chromebooks are a very popular form factor, and I think an
ARM-based 'AirBook' could compete in that space.

Napkin Math

MacBook Air current generation (mid-2013) retail price: $999 ($1099 for 13″
model with same CPU).

Intel Core i5-4250U, Tray: $315, http://ark.intel.com/products/75028/
(sure, Apple will be getting large discounts on this, but they can't be that
large, since $INTC has ~60% gross margin overall)

Apple A7, Tray: $20 (estimated)

Intel Atom E3827, Tray: $41

Tray = 1000 pcs;

The Core i5 has a 15W TDP; 1.3 Ghz clock (turbo to 2.6Ghz); 2 cores, 4
threads. Sunspider 250ms.

The Apple A7 has a 2W TDP; 1.3Ghz clock; 2 cores, 2 threads. Sunspider 397 ms.

http://cpuboss.com/cpus/Intel-Core-i5-4250U-vs-Apple-A7

It's not looking all that different! Especially when you take into account that
the A8 will be twice as fast (think Tegra K1), i.e. a SunSpider score of 200ms,
maybe?
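
Turning that napkin math into crude perf-per-watt and perf-per-dollar figures (SunSpider is lower-is-better, so performance is taken as 1/time; same caveats as above about real prices, and it's a single browser-ish benchmark):

    # Crude perf/W and perf/$ from the figures above; SunSpider is lower-is-better.
    chips = {
        # name: (tray price $, TDP W, SunSpider ms)
        "Core i5-4250U": (315, 15, 250),
        "Apple A7":      (20,   2, 397),
    }

    for name, (price, tdp, sunspider_ms) in chips.items():
        perf = 1000.0 / sunspider_ms   # arbitrary "runs per second" style unit
        print(f"{name}: perf {perf:.1f}, perf/W {perf / tdp:.2f}, perf/$ {perf / price:.3f}")
    # i5: perf 4.0, perf/W 0.27, perf/$ 0.013
    # A7: perf 2.5, perf/W 1.26, perf/$ 0.126
    # On this benchmark the A7 looks roughly 5x better per watt and 10x per dollar,
    # while delivering about 60% of the absolute performance.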

------
gchucky
Had a hard time loading it, so here's the cached version:
http://webcache.googleusercontent.com/search?q=cache:SDIaXc5Z72sJ:www.mondaynote.com/2014/08/03/macintel-the-end-is-nigh/

------
kabdib
The Windows kernel guys were evaluating ARM systems for us, and the gist was
"Run away, don't walk; the memory systems on those things are terrible."

I like the ARM architecture a lot. It's simple and easy to write software for,
all the way from embedded controller stuff to "real" operating systems. But
they're not all that great at doing massive computation. We were going to use
one in the video path of a popular gaming system for a while, and it turned
out to be inadequate by at least an order of magnitude (probably a factor of
100, but comparing CPU cycles to GPU cycles is pretty unfair).

Intel is executing really well, and it'll probably take an alien invasion to
dethrone them.

~~~
aidenn0
When was this? ARM has historically had abysmal memory systems, but there have
been significant improvements over the past few years (I don't know that
they've caught up, but they are in a different league from about 5 years ago
or so).

~~~
kabdib
Within the last three years, but projecting availability out about a year.

------
willyt
Apple needs x86 to run existing apps. Who would accept a big slowdown for a
slightly cheaper laptop? Intel's x86 chips are a RISC design with a glue layer
that grafts x86 onto the outside. Could Apple design an x86 chip with an ARM
RISC core hidden inside? It sounds pretty complicated, and I doubt they have a
big enough hardware engineering team. If there were an ARM core inside, could
it be exposed for recompiled software to take advantage of? Big issues managing
state/contention between two different ISAs?

------
davidgerard
This is a horribly low-quality article. The central proposition is _entirely
analyst speculation_ , with no actual information to hook it onto that doesn't
date back years. The author appears, from how he gets details subtly wrong,
not to actually know anything about the history of CPUs (and not to have
bothered e.g. checking Wikipedia).

There is nothing here that is backed-up news whatsoever.

------
bio4m
According to Geekbench, the current CPU in the iPad Air (the A7) scores about
the same as a Core 2 Duo from 2006 (the E6600). Nothing to sneeze at, and
definitely getting close to the low-power Haswell CPUs in systems like the
MacBook Air. But I reckon it'll be a few generations of A-series CPUs before
they reach the point where they can challenge Intel for the crown.
------
sudhirj
For someone who's not very knowledgeable about architecture shifts, what
software changes would be required to make this happen? I'd assume that OS X
itself might be able to make the shift easily, given that Swift and ObjC code
seems to run on both iOS/ARM devices and OS X/x86 machines. What else will have
to change? Or is this purely a hardware choice?

~~~
wyager
The fact that iOS shares a _lot_ of the same code as OS X will make this a
much easier transition. Apple can take a lot of the low-level stuff from iOS,
where they've had years to figure out how to get Darwin working smoothly on
ARM.

Beyond that, all apps on the OS X app store will have to be recompiled and
resubmitted. None of the binaries installed on current OS X machines will
still work, unless Apple includes an x86 emulator like they did for the
transition from PowerPC.

This will also open up the possibility of running iOS apps natively on OS X,
but I doubt Apple will pursue that at all (for UX reasons).

~~~
jiggy2011
The transition will be easy for applications that rely only on Apple's APIs;
the fun will be for programs that use proprietary third-party components, all
of which will need to be rebuilt, as well as all binary plugins to
applications.

I can imagine this might be a huge job for some of the more heavyweight
production programs for Mac, as well as AAA games.

It would basically spell doom for Parallels and other virtualization software
which give Mac switchers an escape hatch back to any Windows programs they
might use.

~~~
umanwizard
Another negative in the same vein is that it would prevent fast emulation of
Windows and many Linux/BSD distributions in something like VMWare.

~~~
dublinben
Most Linux and BSD distributions are available for ARM already.

------
imaginenore
_If you wish to make an apple pie from scratch, you must first invent the
universe_

How far does Apple want to go? The rabbit hole is pretty much infinite. Do
they want to make their own plastics and paints and metals? Do they want to
mine the materials for all of the above? Do they want to make the mining
machines for their mines? Etc, etc, etc.

------
sah88
An ARM-based Air might make a nice Facebook machine, but what about the
Pros/desktops and the high-performance market? I really doubt there are any ARM
chips anywhere near the performance of an i7. What are they going to do:
segment their PC lineup, or just abandon the high-performance side? I doubt
they are willing to do either.

------
eurleif
I wonder if this will mean the end of Flash Player for Mac? Will Adobe bother
releasing an ARM version?

~~~
PhantomGremlin
> I wonder if this will mean the end of Flash Player for Mac?

You say that like it's a bad thing.

I wish Flash on Mac would die, die, die. YouTube has the ability to show me
non-Flash video on a Mac, and yet they often won't. The exact same video that
won't play on a Mac will play just fine on an iPad. Evil bastards! With Flash
dead, Google wouldn't be able to demand I use it.

~~~
demallien
Activate Developer Tools in the preferences of Safari. This gives you a
Developer menu in the toolbar. When a site refuses to play video on your Mac
because it needs Flash, you tell Safari to pretend it's an iPad. The site
invariably serves you the video you wanted, without Flash.

~~~
PhantomGremlin
I read this advice previously on HN, and it worked for a while. Then Google
changed something, and it stopped working. I don't know, maybe I was screwing
up. Or maybe it's just easier for Google to roll ads ahead of the video when
it's in Flash. Or maybe Google wants me to download Chrome, which has a built
in Flash player. Not gonna do that until it supports NoScript. YMMV,
obviously.

~~~
demallien
No, it still works just fine. I was using it just yesterday...

------
ggreer
While I think the article's prediction is incorrect (see my other comment for
why), it did give me an idea: Why doesn't Apple design their own x86 CPU?

Their work on the Ax series has probably taught them quite a bit that could be
ported over to x86-land. Also, Apple could leave out x86 cruft they don't use:
legacy addressing modes, PAE, etc. And of course, they could design the CPU
specifically for their products instead of searching for the closest match
sold by existing vendors (or cajoling Intel to tweak their designs).

Apple already has strong relationships with fab companies. They have the
talent and teams to design such a CPU. One wonders if they're already working
on such a thing. Even if it never shipped, it could be used to negotiate lower
prices from Intel.

~~~
rodgerd
> Why doesn't Apple design their own x86 CPU?

There's the small matter of getting an x86 license. Which would mean buying
Intel or AMD, since it's vanishingly unlikely that Intel will be handing them
out any time soon.

~~~
PhantomGremlin
> Which would mean buying Intel or AMD

Apple could afford to buy Intel, but antitrust considerations would probably
prevent that from happening.

Apple could buy AMD, but the Intel/AMD x86 cross license probably terminates
in a "change of control" situation. Which might mean a restart of the
Intel/AMD lawsuits of years gone by.

------
glitch003
What seems more likely to me is a hybrid ARM / x86 machine that switches
dynamically between processors, kind of like how some macs can switch
dynamically between their integrated and discrete GPUs.

------
Scalar
Very interesting, wouldn't surprise me if we saw an AMD/ARM acquisition by
Apple by this time next year.

------
tanvach
I actually see Apple branding the A7 as "desktop class" as a way to leverage
better CPU pricing from Intel.

------
baq
apple has been known in the last decade for picking the best hardware for the
job. i don't see how ARM chips can compete with intel chips in terms of
performace or performance per watt - unless you want a 4W macbook. (hint: you
probably don't.) ARM competes very well in performance per dollar - but AMD
does, too, and I don't see macs with AMD CPUs in them.

i can see a laptop-sized ARM-based product from apple, but it won't replace
any of the macs, it'll be something completely new - let's call it a macpad.

------
jmmcd
> Googling “Mac running on ARM” gets you close to 10M results.

Why should we listen to someone who doesn't know what quotes do in search
queries?

------
lurkinggrue
Seriously? Not really.

------
lotsofmangos
Half right, but only by accident.

------
comrade1
The transition from PowerPC to x86 was surprisingly painless. There was some
pain, but most apps ran fine in the translation layer, and eventually the apps
moved over to native versions.

It'll probably be even easier the next time, since Apple has done it once
already and knows how to provide the proper dev tools.

~~~
gsnedders
They've done it twice, not once. They moved from 68k to PPC in the early 90s,
again with full emulation. In that case it was even more extreme, as much of
the OS ran emulated in early PPC releases (yet it was _still_ quicker than
running on actual 68k hardware!).

~~~
bryanlarsen
You've highlighted the big difference this time. Switching from Intel to ARM
would be a step _down_ in performance, or at least not a step up. There's no
headroom for a legacy emulation layer.

~~~
bostik
However, it does fit the playbook of moving ever closer to computing
appliances and away from general-purpose computers.

In the beginning there was the motherboard and a CPU. Then, before the
homogeneous PC era, we had dedicated chips that took care of certain
operations: the SID chip in the C64, the blitter chip in the Amiga (can't
remember the name, I'm sorry), even the x87 math coprocessor in the 386/486
age!

With the advent of the PC and the megahertz wars, dedicated peripheral chips
became less common - except in SoC environments. Where the x86 world went with
raw processing power, the embedded world had to find ways to fit specialised
chips on the board.

My experience is mostly centred around crypto accelerators, but I know from
_very painful_ experience that all Maemo devices had on-board DSP units to
handle some sound decoding, and pretty much all video processing. So the
pendulum swings: CPU for everything, then peripheral devices for specific
high-intensity jobs. Some of the most commonly used get integrated into CPUs,
making entire classes of chips irrelevant... until the next CPU-intensive
thing comes up, and the main processor is again too slow.

Apple is banking on their ability to both predict _and_ dictate the direction
of near-future computing needs. I expect the A7/A10 boards to come up with all
kinds of integrated support chips to handle the heavier loads.

As long as their predictions are correct, all is well. Any bets on what's the
next CPU burner that will require a dedicated ASIC to preserve even the
semblance of battery longevity?

------
epynonymous
this is about the most poorly written article i've ever read! well, second
only to some of the trash on techcrunch. how about some benchmarks of
compiling code on intel vs arm, instead of arguing that because ipads (arm
based) cost more than macbook airs (intel), intel will fade from apple's line
of laptops/desktops. i think arm processor laptops are going to become
mainstream, but not for most of the arguments specified in the article. i
believe this fate is still far away, as x86 still offers total raw compute
power over arm, even though arms are more energy efficient.

also, the self-noted digressions in the article aren't even funny. it feels
like someone with an english degree and a subscription to "i can spell x86"
magazine wrote this article.

