
The Intel Enigma - chmars
http://www.mondaynote.com/2014/12/14/the-intel-enigma/
======
fiatmoney
A few things to note:

\- Intel benefits from mobile growth even if it's not selling the phone
hardware. Each of those phones needs to connect to back-end servers for almost
everything, and those servers run x86. Every time they connect to a server, it
generates data, and that data is analyzed and monetized by servers that,
again, run x86.

\- Fabs, and by proxy fab time, are super expensive (and growing superlinearly
as process size shrinks). They may have made a strategic decision that they'd
rather spend that capital pumping out Xeons for back-end servers than Atoms
for phone clients, except to the extent that they design and produce enough to
"stay in the game" should they wish to change direction.

\- Judging from the rate of improvement of the Intel low-power CPUs and GPUs,
they are on a relatively short path to being extremely competitive with ARM
and the associated graphics chipsets.

\- Intel has a lot of experience writing "sufficiently smart compilers", and
doing CPU / GPU integration. Both of those are extremely handy for low-power
graphics-heavy environments.

I think this is just one of those things where it doesn't look like a trend
until it's inevitable.

~~~
jacques_chester
Sufficiently smart compilers never showed up where it mattered most for Intel:

Turning branchy C/C++ systems and application code into explicitly parallel
instructions.

And arguably, they can't.

Sufficiently smart compilers for GPUs aren't sufficiently smart. They work by
imposing sufficiently dumb limits on programmers instead.
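
To make that concrete, here's a toy sketch (mine, not from the thread; the
function names are invented for illustration). The first loop's early exit and
data-dependent branch resist lane-parallel execution; the second accepts the
"dumb limit" of uniform, branch-free iterations, which is the restricted shape
vectorizers and GPU kernels actually handle well:

```c
#include <stddef.h>

/* Branchy "systems" C: the early exit and the data-dependent branch
 * mean iterations can't run in lockstep across vector lanes. */
int find_first_negative(const int *a, size_t n) {
    for (size_t i = 0; i < n; i++) {
        if (a[i] < 0)       /* data-dependent branch */
            return (int)i;  /* early exit: control flow depends on data */
    }
    return -1;
}

/* The "sufficiently dumb limit": no early exit, every iteration does
 * the same work, and the branch becomes a select. This restricted
 * shape maps cleanly onto vector hardware. */
long sum_nonnegative(const int *a, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++)
        s += (a[i] < 0) ? 0 : a[i];  /* branch turned into arithmetic */
    return s;
}
```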

~~~
fiatmoney
Interestingly, Intel spends a lot of die budget on on-chip instruction level
parallelism, and their major instruction set-level enhancements have been
increasing vector operation lengths. They've also invested somewhat heavily in
GPGPU-style vector coprocessors in their Xeon Phis.
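
As a rough illustration of that vector-length trend (my sketch, not Intel
documentation; the function name is made up): each ISA generation widens the
registers, but the compiler can only exploit the width on loops whose
iterations are independent.

```c
#include <stddef.h>

/* Vector register widths by ISA generation:
 *   SSE      128-bit -> 4 floats per instruction
 *   AVX      256-bit -> 8 floats
 *   AVX-512  512-bit -> 16 floats
 * An independent-iteration loop like this SAXPY-style kernel is what
 * the compiler can map onto whichever width the target supports. */
void scale_add(float *dst, const float *a, const float *b,
               float k, size_t n) {
    for (size_t i = 0; i < n; i++)
        dst[i] = k * a[i] + b[i];  /* no cross-iteration dependency */
}
```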

~~~
jacques_chester
Right. But the whole point of EPIC was to do away with that part of the die
budget entirely, and a lot more besides.

GPGPU programming has worked because nobody proposed to run general code on it
from the beginning. It just so happens that there are some workloads that fit
the vector processing model.

But lots still don't. Lots and lots and lots.
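
A toy contrast (mine, with hypothetical names): an elementwise loop fits the
vector model because the lanes are independent; a linked-list walk doesn't,
because the next address isn't known until the previous load completes.

```c
#include <stddef.h>

/* Fits the vector model: every element is independent, so 4/8/16
 * of them can be squared per instruction. */
void square_all(float *a, size_t n) {
    for (size_t i = 0; i < n; i++)
        a[i] *= a[i];
}

/* Doesn't fit: each step must finish its load before the next
 * address is even known. Extra vector width buys nothing here. */
struct node { int value; struct node *next; };

int list_sum(const struct node *head) {
    int s = 0;
    for (const struct node *p = head; p != NULL; p = p->next)
        s += p->value;  /* serial pointer chase */
    return s;
}
```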

------
ChuckMcM
Jean-Louis raises some good questions about Intel. They aren't new questions,
and their shape has changed over the years, but they are good ones. When I
worked there in the '80s I was validating the 82786, which was a really cool
graphics chip. I tried to get Intel to consider these "Unix workstation"
things, and they couldn't see past the risk of harming their Microsoft
relationship.

To be fair, Intel has made a ton of money and they are widely regarded as the
premier chip manufacturing company, standing strong while people like Motorola
(the Freescale part of it), TI, NEC, Fairchild, and National faded. And by
that measure they are the unqualified gorilla of the marketplace.

So it is interesting to see things like ARM effectively encroaching into their
markets. I'm typing this on a desktop with an x86 processor, but I have way
more machines interacting with me running some OS on an ARM processor in my
house than I do x86 machines. That bodes ill for the future.

I've been playing with a Lenovo x86-based Android tablet. It's nice, but it is
no better or worse than an ARM-based tablet. And apparently Lenovo was being
paid $51 per unit to make them. You cannot retake market share by being "just
as good as" plus a bribe. You really need a compelling reason to get people to
switch to your processor. I wonder what that will be for Intel.

~~~
serge2k
> And tried to get Intel to consider these "Unix workstation" things and they
> couldn't see past the risk of harming their Microsoft relationship.

and by not listening they made how many billions?

~~~
ChuckMcM
That is kind of the point. They _have_ made billions, an unqualified success,
and they have triumphed over all of their rivals of the day. The only problem
is the one Jean-Louis points out: they are running out of road.

I was at Sun in 1988 when they decided to "go enterprise" in a big way. They
focused on killing DEC and all the other workstation vendors, and partnered
with AT&T to make UNIX a real enterprise OS, combining System V and BSD 4.2
(aka SunOS) into Solaris. It made them many billions of dollars; heck, it paid
for most of my house! But that road did not lead to them being a force in the
Linux/*BSD world, nor to supplying computers for that world. That was lost to
the other server manufacturers, and Sun Microsystems ran out of road. Long
before Sun was effectively dead, they could have moved into a new space. They
did not. Now they exist only as assets in the Oracle ledger.

Jean-Louis points out that Intel could be headed this way. They say big things
but they seem unable to face the reality that the microprocessor world is
changing in fundamental ways that make it hard to have giant gross margins on
a single chip.

Intel has done great things, it's a wonderful company, and it will die if it
cannot figure out how to compete with ARM.

~~~
e12e
Do you have any thoughts on the future (or lack thereof) of AMD?

I thought it was very interesting that they won both the Sony Playstation 4
and the Xbox One - in my eyes the first win for their long transition towards
"real" on-die GPUs.

I'm thinking that they'll now have the whole game industry making graphics
engines that work great on their stack (both major consoles, and on PCs). If
they can leverage some of that towards tablet/mobile -- that could be
interesting.

So far it seems that Nvidia is winning there, though.

(Obviously, while I have a bit of a crush on AMD as the underdog, and love
their push for better open source drivers -- I don't really want to see a new
monopoly emerge. But anyway, I'd be curious to hear what you think?)

~~~
ChuckMcM
I like AMD. I told the CEO of NetApp that he should fire me if going with an
AMD processor (Opteron) in the high-end (aka most profitable, flagship) filer
turned out to be the wrong decision. That said, I hate AMD: they can't execute
for crap.

It wasn't for lack of vision. Fred Weber, their CTO at the time, had solid
ideas about moving AMD into the lead for x86-scale chips. AMD kept thinking
they were a chip-making company and didn't unload what became Global Foundries
fast enough. Not to mention that everyone who has to play with x86 is playing
with the deck stacked against them. Massive-memory machines are coming; all
the cool kids are over at HP building one on "The Machine" project. AMD could
have built that machine 10 years ago, or at least laid the groundwork for it.
AMD owned a huge chunk of the FLASH market, they could have spun silicon in a
way that would allow for mixed FLASH and DRAM on the "north bridge", it would
have changed a lot of the economics of things. But when you're constantly
about to die, it is really really hard to make strategic moves. So unless AMD
stops perceiving itself as being on life support it won't be able to muster
the courage to move forward on things where it could make a huge difference.
And it's not clear to me if it has fallen under the power curve or not. Fred
took a huge risk when they did the Sledgehammer (aka Opteron) architecture.
And I watched how much pain people went through, trying not to piss off Intel,
to use it (including NetApp). And got to see how it totally stomped the living
crap out of Intel's NetBurst designs. (Intel was pushing Itanium as their
64-bit / large-memory answer, which also underperformed Opteron machines.)

But the reverberations of that triumph were not good, AMD's inability to
execute meant they stumbled on the next few bits. They got the jitters, canned
Fred, and went for something "safe". Not a good plan for world domination.

~~~
raverbashing
"AMD kept thinking they were a chip making company and didn't unload what
became Global Foundries fast enough"

Ah, but they were only capable of doing that after their settlement with Intel
(I mean, the Intel antitrust settlement).

Before that, if you wanted to make x86 chips, the condition was that you
manufactured the chips yourself, or something like that.

~~~
ChuckMcM
Very valid point. The albatross of x86 licensing/patents is a hard one to get
rid of.

------
higherpurpose
> Apple might feel that Intel’s process needs to mature before it can deliver
> 300M units.

We don't have to wonder about that at all. Apple will either use Samsung's
14nm or TSMC's 16nm (or both) for its chips next year.

It's funny that Intel keeps bragging about its process advantage, yet even
with that advantage they have _no real advantage_ over the 28nm planar ARM
chips in terms of performance/power consumption. (Sorry, Anand, it seems the
"x86 myth" hasn't been "busted" after all. Otherwise Atom on 22nm FinFET
should've _wiped the floor_ with _any_ 28nm planar ARM chip. But it doesn't.
Not even close. In fact many current high-end ARM chips on the older process
beat Atom on the newer one.)

Instead they have to beg tablet makers to take their chips for free (or they
_pay them_ to take the chips).

Also, Intel's actions speak volumes. They've started _licensing out_ their
Atom micro-architecture to Rockchip and Spreadtrum. Think about what that
means for a second: it means Intel thinks it _can't_ succeed in the mobile
market _making its own chips_. Instead it has to give its designs out as IP,
just like ARM, so _other companies_ make Atom chips. Even if Intel is
"successful" with this strategy, they'll be making pennies on the dollar in
mobile, just like ARM Holdings does (ARM is totally fine with that, given the
company structure; I doubt Intel would be).

Oh, and those companies aren't going to use Intel's "huge process advantage"
either. They're going to use TSMC's 28nm planar process, late next year. If
Intel is doing this, then Intel must think that having others make Atom chips
on an obsolete 28nm process (while ARM chip makers move to 20nm and 14nm
FinFET next year) is going to be "more successful" than making them itself
with that "huge process advantage". Let that hopeless strategy sink in for a
moment.

~~~
beagle3
While I personally am confused about this part of Intel's strategy, I can't
avoid the feeling that Intel (and Rockchip) know something I don't.

Maybe it's some kind of "white-label" conspiracy: Intel wanting to sell Atoms
cheaply through a third party so they don't need to dilute the Intel brand
(and the Wall-Street-observed profit margin).

Maybe they are planning subsidies for the next 20 years (Intel has always
played the long game), and this is their way to avoid antitrust action, which
would surely hit if they were actually successful.

They are at a huge disadvantage in the market, and they seem to have no
technical advantage in mobile and tablets so far. But this does not seem like
a desperate strategy - it seems the goal is different than "sell more now". It
might just be red queen style "keep running in order to stay in place until we
can figure out something else".

HP's PA-RISC CPUs are gone. Alpha AXP is gone. AMD is mostly playing catch-up.
IBM's POWER CPUs are sort of alive, but they are not an Intel competitor at
this point.
I hope Intel's dominance wanes and competition arrives again - but I'm not
holding my breath.

~~~
yaantc
Making ARM chips for low/medium-end tablets is a cut-throat market. Rockchip
did well initially, but now Allwinner has passed them and they seem to be in a
challenging situation. Just look at Allwinner vs. Rockchip growth. Not to
mention MediaTek, which will eventually come with integrated cellular
(including 4G). So for Rockchip it may simply be a defensive move: life is too
hard and competitive on the ARM side, so try to find a niche on the x86 side.

I'm very doubtful it'll work out: the volume is in the low/mid tiers, and the
average consumer doesn't care about the CPU architecture. It's just a basic
Android tablet, period. I don't see a worthwhile differentiation on Intel's
side, except the current crazy subsidies, and those can't last forever. And
this market is not faithful: the chip makers need to provide mostly everything
to the ODMs (MediaTek is quite famous there, providing even the production and
test process and tooling ready to deploy), and this makes changing chip
provider rather easy. There is no loyalty, only a focus on low prices.

------
jacques_chester
Itanium was not "an adoption" of PA-RISC. It was a completely novel design
co-developed with HP, after HP researchers argued that VLIW was the next step
forward.

It did, however, manage to kill several of Intel's potential high end
competitors with nothing more than hype.

~~~
beagle3
> nothing more than hype.

at least $10 billion (still bleeding, though at a much lower rate these days),
some goodwill from customers, and letting AMD become the premier 64-bit
platform for a couple of years while Intel had to play catch-up -- Intel was
practically forced to adopt the AMD64 architecture or they would have lost the
lucrative market for PCs and servers.

All things considered, I'm not sure that the price Intel paid for killing
PA-RISC was good value for money. It was a lot more than hype, even if in the
grand scheme of things it wasn't much.

~~~
jacques_chester
They spent billions of dollars on a dream that didn't pan out - and one that,
in retrospect, was a bad idea.

And maybe a few hundred thousand on press junkets to hype the dream. Millions,
tops.

Given that Alpha and MIPS dropped out before there was anything at all
concrete, I'd say the hype was what mattered.

But I guess it's easy for me to say that in hindsight.

~~~
beagle3
Oh, it was a bad idea from the get-go. That was the sentiment I had myself,
and heard from everyone who had actually looked at the details -- long before
actual hardware was available.

The market did NOT want Itanium, and Intel knew that - but Intel believed they
were strong enough to dictate. They weren't, and their internal culture made
it hard for them to accept that for a long time.

I actually find it fascinating: Intel had an OO processor (name escapes me
now) between the 286 and the 386 that has a story surprisingly similar to the
Itanium's: a radical departure toward an unproven instruction set which, when
it arrived, delivered too little, too late, causing the company to scramble to
retrofit the older cash cow for the future before someone else managed to eat
their lunch.

History does not repeat, but it often rhymes.

~~~
zvrba
It was iAPX 432. IIRC, its main problem was that it was ahead of its time. It
was complex, and the manufacturing process was not developed enough, so it had
to be manufactured as _three_ separate chips. This made motherboards more
complex, and also performance suffered.

~~~
beagle3
I don't think "ahead of its time" is a good description (or, alternatively, it
was so far ahead that its time has still not arrived).

There has been one CPU architecture designed since then, that I'm aware of,
with OO baked in as well as a stack-machine model (the defining features of
the software side of the 432, even if the hardware had been perfect). The Java
CPUs fared about as well as the 432, which is "not well at all". I think "bad
architecture" is a better description than "ahead of its time", and it is
just as true of the Itanium.

------
huxley
A bit surprising that Jean-Louis confused Itanium with PA-RISC. Itanium was
started at HP as a successor to PA-RISC and provided an emulation mode for
PA-RISC legacy users, but its EPIC (VLIW) architecture was pretty much a clean
slate.

~~~
abrowne
I noticed this too, and thought it was especially funny considering both names
are linked to their respective Wikipedia entries.

------
bobajeff
I wonder why Intel doesn't just design their own custom ARM core like Apple
and Qualcomm do. Why so focused on moving manufacturers to x86?

~~~
pjmlp
They had it (XScale) and sold the unit, maybe due to internal politics.

~~~
jotm
I'm surprised that so many people don't know this. Back in the day, StrongARM
and XScale were used in most (if not all?) PDAs. Intel could've easily had a
monopoly today if they had continued developing the technology.

~~~
wvenable
Except that they didn't own the technology like they own x86.

~~~
jotm
Well, they don't own x86-64, either - and that worked out incredibly well for
them :-)

~~~
wvenable
Actually they do. AMD licenses x86 from Intel on which x86-64 is based.

------
adwf
Isn't it the case that the Intel Atom has been just that little bit too
power-hungry compared to ARM chips? They can't quite convince anyone to use
them without large subsidies.

I know they've been making some headway in this regard, but the perception is
still there that Atom is too heavyweight to stick in a phone.

------
programminggeek
The danger Intel faces is at what point ARM chips become "good enough" to
encroach on the PC and server space. At some point the billions Apple and
Samsung are spending on manufacturing and R&D will likely make ARM chips fast
enough to handle all kinds of general usage in laptops and servers.

If Apple took their PC business away from Intel, they might not care, but if
Samsung decided to start selling $10 desktop ARM chips while the equivalent PC
chip was $100, that would potentially put a huge dent in Intel's core
business.

I don't know if this will happen soon, but the economics of it are not in
Intel's favor. If I were Intel, I'd be more worried about ARM encroaching on
desktop and server than getting a foothold in mobile. That is the risk Intel
faces.

------
wsxcde
> _I see three possible answers._

How about a fourth option? Intel's organizational structure doesn't really
allow it to build profitable mobile chips that are competitive with the ARM
chips, $51 subsidy notwithstanding.

This is a company that has gotten used to having literally thousands of
engineers work on each processor. And note this is a single _processor_, not
an SoC, and going from a processor to an SoC is a ton more work. I think they
simply don't have the organizational dexterity to effectively compete with the
lean and mean ARM shops.

------
saosebastiao
I'm surprised there was no mention of POWER8 and IBM's adoption of a licensing
model similar to ARM's. For the first time in at least 5 years, Intel has a
Xeon competitor.

------
joezydeco
If Intel thinks a 65% margin for IoT silicon will be achievable, they'll
probably lose again.

~~~
minthd
The new strategy in IoT isn't to sell silicon - they're trying to create
higher-value goods: an IoT gateway with secure software, wearables, smart
glasses together with famous optical brands, etc.

They're probably hoping to create some value through hardware-software
integration, à la Apple. But hardware integration (from the hardware side)
played only a small part in Apple's success, at least as far as I understand
it.

So we'll see about Intel.

~~~
joezydeco
_Intel_ might want to enter the IoT market this way, but all the other
chipmakers I've seen aren't going this route. They're trying to drop the power
consumption and cost as fast as they can and let someone else design the fancy
glasses.

Intel's record in consumer electronics to date is, well, pretty awful. So the
expectations are pretty low here. And, again, nobody gets 65% margin on
consumer products.

------
raverbashing
Or Intel is still betting on x86 and thinking ARM is "just a fad"

A lot of companies went down because of similar thinking

------
personZ
"Essentially no revenue for Mobile and Communications"

While technically correct, this misses the fact that Intel has been making
some huge inroads into mobile (particularly tablets). This isn't reflected in
revenue because Intel has quite literally been giving the chips away,
subsidizing adoption to make up for the industry being heavily ARM-centric.

[http://www.zdnet.com/article/intel-to-hit-40-million-mobile-chip-units-with-aid-of-subsidies/](http://www.zdnet.com/article/intel-to-hit-40-million-mobile-chip-units-with-aid-of-subsidies/)

There are tens of millions of devices out there running Intel chips, greasing
the skids for the platform. Right now there are people running Android on
Intel devices who have absolutely no idea that they are -- the various MeMO
Pads, for instance. The Dell Venues.

The MeMO Pad ME572C runs an Atom Z3560, a 64-bit x86-64 processor with four
cores, SSE4.2, and a powerful onboard graphics solution.

Intel is laying groundwork. Anyone who looks at the revenue number and counts
Intel out is being fooled. In a few years I suspect that many of the same
people will be crying foul about Intel having bought themselves into a market
they missed.

~~~
higherpurpose
How can you "make inroads" into a market when you're _giving away your
product_ in a non-sustainable way? Sure, tablet makers will take Intel's "just
as good" $0 Atom chip. But will they take it when it's "just as good" for $30
or $40?

~~~
beagle3
They might be trying to get some "lock-in" back, though I have no idea how.

Ballmer's Microsoft apparently missed the mobile/tablet revolution by assuming
that the desktop lock-in they still have could be leveraged. That didn't work
for years, but seems to have been ignored until Nadella came in.

Intel might be trying to gain lock-in by somehow putting the familiar x86 into
mobiles/tablets, in the hope that developers would release x86 binaries that
would make ARM undesirable. But that's an 80-degree-incline uphill battle.
They either have a crazy card up their sleeve, or they are unable to respond
properly and the $0 cost + $50 subsidy is the best they can do at present
while trying to craft that crazy card.

~~~
personZ
Quite the opposite: it's eliminating lock-_out_, which is where Intel was in
an ARM-only mobile world.

Intel has the most advanced fabs in the world, and the limits of their own
devices have primarily been their concern about competing with themselves
(which I'm sure is still the case; with each new chip they probably have
internal negotiations about how to ensure it doesn't threaten their desktop
and server chips). If Intel isn't disadvantaged in mobile, only a fool would
count them out.

