
Intel dismisses ‘x86 tax’, sees no future for ARM or any of its competitors - evo_9
http://www.extremetech.com/computing/130552-intel-dismisses-x86-tax-sees-no-future-for-arm-or-any-of-its-competitors
======
Symmetry
_There is nothing in the instruction set that is more or less energy efficient
than any other instruction set_

Oh, come on. The problem with x86 is that it's a variable-length instruction
encoding that isn't self-synchronizing. That is, you can start reading at byte
0 and read one set of valid instructions, or you can start reading at byte 1
and read a completely different set of valid instructions.
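
To see it concretely, here's a toy sketch using the Capstone disassembler (my
choice of tool, nothing to do with the article; the byte string is just an
illustrative example):

    from capstone import Cs, CS_ARCH_X86, CS_MODE_64

    # "mov eax, 1" when decoded from byte 0; two different but still
    # valid "add" instructions when decoded from byte 1.
    code = b"\xb8\x01\x00\x00\x00"

    md = Cs(CS_ARCH_X86, CS_MODE_64)
    for start in (0, 1):
        print("decoding from byte %d:" % start)
        for insn in md.disasm(code[start:], start):
            print("  %s %s" % (insn.mnemonic, insn.op_str))

Both decodings are perfectly legal instruction streams, so a wide decoder
can't tell where the instructions start without doing real work.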

This means that the logic to decode more than one instruction per clock cycle
is really complicated, and complicated means power-hungry. For instance, in
AMD's Bulldozer chip the instruction decode unit uses more power than the
integer execution cluster. Now, ARM has its troubles too (predication on
every instruction means another logical dependency, so if you go out of order
your scoreboard is more complicated), but they're not as bad, and they don't
really exist in the alternate encodings like Thumb-2 or the 64-bit one.
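
To make the predication point concrete, here's a toy dependency model (my own
illustration, not how any real scoreboard is built): with predication, every
instruction implicitly reads the condition flags, so the scheduler has to
track one extra source per instruction.

    # A predicated instruction implicitly reads the flags register, so
    # it depends on the most recent flag-writer even when all of its
    # register inputs are ready.
    def source_deps(reads, predicated):
        deps = set(reads)
        if predicated:
            deps.add("flags")  # the extra scoreboard entry
        return deps

    print(source_deps({"r1", "r2"}, predicated=False))  # {'r1', 'r2'}
    print(source_deps({"r1", "r2"}, predicated=True))   # plus 'flags'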

Intel is able to compete so well because they tend to have better fabs: they
tend to reach a given node like 22nm before everyone else, and their
performance on that node tends to be better than the other players' too. But
x86 does have intrinsic disadvantages.

~~~
callan
Disclaimer: I'm a software guy @ Intel.

I feel that a lot of sw folks have a lack of imagination when it comes to hw.
A lot of very smart people here work on making an efficient, fast front-end.
There's lots of research in this area. To my eyes, the implementations are
stunning.

To say that a variable-length CISC instruction set can never be as fast or
efficient as ARM is sort of like saying that an interpreted language can never
be as fast as C. And we all know how that goes over on HN.

~~~
Symmetry
I'm sure the front end has had a lot of work and cleverness put into it, and
I'm sure that it's able to, say, correctly guess where instruction boundaries
are more than 99% of the time without knowing for sure, and then remove those
instructions from the queue if it turns out that the guess was wrong. But that
just means that 4-wide decode is only on the same order of power consumption
as execution, instead of being an order of magnitude more like it would be if
you'd done things the naive way. I'm sure that the work they do _is_
stunning, but it's work that doesn't even have to be done on more regular
instruction sets.

It's not a problem with being variable length or CISC, it's a problem with the
lack of self-synchronization. If you, say, had 8- or 16-bit blocks where the
first bit of each block told you whether it was the beginning of an
instruction, you wouldn't have this problem at all. I'd generally say that for
a modern general-purpose computer, variable-length instructions and a large
number of opcodes are a good idea, though a large number of addressing modes
still seems to be a disadvantage. It's probably no accident that ARM, the
"CISCiest" of the RISC processors, is so popular right now.
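
A sketch of what I mean (a toy encoding of my own, not any real ISA): with a
start-of-instruction bit in every block, finding all the boundaries is a
trivial scan instead of a guess.

    # Toy self-synchronizing encoding: 16-bit blocks whose top bit
    # marks "this block begins an instruction". No speculation, no
    # squashing - just a scan.
    def instruction_starts(blocks):
        return [i for i, b in enumerate(blocks) if b & 0x8000]

    # Three instructions, of lengths 1, 2, and 1 blocks.
    stream = [0x8001, 0x8002, 0x0003, 0x8004]
    print(instruction_starts(stream))  # [0, 1, 3]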

------
rbehrends
On a somewhat unrelated note, Intel's cleanroom ad [1] that is reproduced in
the article really sets my teeth on edge because of the sheer level of
misogyny involved (little girl with toys in a pink bedroom vs. grown men in
blue high-tech room).

Could it possibly send a stronger message that women don't belong in tech?

(That the ad designers may not have intended that message does not mean that
it's not there.)

[1] <http://cleanroom.net/?p=966>

------
thoughtsimple
Android is no problem because of the Dalvik bytecode engine, but Apple is a
whole other matter. Apple has over 600,000 apps in the App Store that are all
coded for ARM. Moving to a new instruction set architecture is going to be
problematic.

If Intel had taken low power seriously back in the days of the Newton and the
StrongARM (which Intel actually owned for a while), they might have won the
Apple business instead of ARM back in 2007, but now it is probably too late.

Apple has switched CPU architectures before but never with the number of
applications that they are currently dealing with in the App Store.

~~~
jevinskie
I would think that there would be few problems porting your source from ARM to
x86. Heck, if you use the iOS simulator, it already compiles to x86! Maybe a
Rosetta-like shim could be used for apps whose authors don't update them.

~~~
thoughtsimple
I don't think it would take much to recompile existing apps, but there are so
many apps that just approving them would take Apple years.

Then think of all the apps that are tuned to take maximum advantage of the
platform. That code may still work, but that doesn't mean it will work
optimally.

Intel is going to have to do a lot better than "we win some, lose some" on
battery and performance to convince Apple to switch.

~~~
objclxt
The last time Apple did this, apps didn't _have_ to be recompiled - the
Rosetta translation layer did a pretty decent job. If Apple were to switch
architectures (rather unlikely, but still), I imagine they'd provide a
similar sort of translation layer for most apps to use.
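
For flavor, here's a toy of what any such layer does (purely illustrative,
nothing like Rosetta's actual implementation): translate each guest
instruction to a host operation once, cache it, and run the cached version
from then on, so hot code pays the translation cost only once.

    # Toy dynamic binary translation. Real translators work on machine
    # code; the symbolic tuples here are purely for illustration.
    cache = {}

    def translate(insn):
        if insn not in cache:
            op, dst, src = insn
            if op == "add":
                cache[insn] = lambda r: r.update({dst: r[dst] + r[src]})
            elif op == "mov":
                cache[insn] = lambda r: r.update({dst: r[src]})
        return cache[insn]

    regs = {"r0": 1, "r1": 2}
    for insn in [("add", "r0", "r1"), ("mov", "r1", "r0")]:
        translate(insn)(regs)
    print(regs)  # {'r0': 3, 'r1': 3}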

~~~
thoughtsimple
You aren't going to use Rosetta-style software on a battery-constrained mobile
device. There would have to be a huge benefit to switching to justify creating
that kind of problem for users. The iPhone is already bashed for having
battery issues. What would be the benefit of switching to Intel for Apple?

------
Codhisattva
_I begin to wonder what this means for Apple_

There is no doubt in my mind that Apple will jump to Intel mobile chips as
soon as Intel can meet Apple's needs.

Number one compelling reason: Samsung is Apple's foundry.

~~~
thoughtsimple
I can see Apple going with Intel as a foundry for ARM SoCs, but I doubt they
will switch to x86.

------
spiralpolitik
Generally, when a company dismisses a competitor in this way, it finds itself
on the wrong end of a disruptive collapse within 5 or so years. See Nokia,
RIM, Palm.

While it's difficult to bet against Intel, it will be interesting to see how
quickly they can ramp up sales of their mobile processors as sales of their
desktop processors decline. That will be the key to Intel's survival.

~~~
adrianmsmith
"as sales of their desktop processors decline"

Why will their sales of desktop processors decline?

~~~
spiralpolitik
People switching from buying x86 laptops or desktops to buying tablets
instead.

If the iPad/Android/Windows 8 tablet doesn't have an Intel processor in it,
then that's money lost for Intel.

------
mtgx
And what else would you expect them to say? Even if they actually believe it,
they're still seeing everything through their own eyes, from their own point
of view and their own market, just like an incumbent of an "old" technology
would.

------
excuse-me
Who cares about a tablet/cell phone's CPU? It's the GPU that matters.

So I buy an Nvidia Tegra2 chipset, and it happens to have an ARM tucked in the
corner. I buy an Atom, and it has the latest in a long line of Intel embedded
GPUs stuck in the corner as an afterthought.

~~~
dmm
The latest Atom SoCs have PowerVR GPUs.

~~~
mtgx
Old ones. Almost 2 generations old. They can barely get half the performance
of any other current high-end chip, and the Mali T604 and Adreno 320 should be
out in a few months with another 2-3x increase in performance.

