
Intel made a huge mistake 10 years ago - molecule
http://www.vox.com/2016/4/20/11463818/intel-iphone-mobile-revolution
======
AceJohnny2
TL;DR: Intel turned down the opportunity to make the iPhone's chip and gain a
foothold in the mobile market.

A more direct source, from Intel's own CEO (at the time):
[http://www.theinquirer.net/inquirer/news/2268985/outgoing-
in...](http://www.theinquirer.net/inquirer/news/2268985/outgoing-intel-ceo-
paul-otellini-says-he-turned-down-apples-iphone-business)

    
    
        Otellini said Intel passed on the opportunity to supply Apple because the economics did not make sense at the time given the forecast product cost and expected volume. He told The Atlantic, "The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do... At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it.
        "It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought."
    

But the thing I don't understand is why Intel gave up on XScale, their
ARM-compatible effort (they held one of the few expensive ARM licenses that
allowed them to extend the core architecture). How's Atom doing nowadays?
Last I heard, Intel had partnered with Dell to make the Atom-powered Venue
Android tablets. Can't say they're grabbing headlines with them...

~~~
dogma1138
They don't need Atom; as far as mobile devices go, they are bringing Core to
ARM's power envelope. You can get Core M CPUs today with a 3 W power
envelope and a considerable performance lead over most ARM SoCs.

Intel has made a bet that it would take just as much time for ARM to reach
x86 levels of performance as it would take Intel to bring x86 power
consumption down to SoC levels. Now Intel can play on both sides with their
low-power x86 parts: Xeon-D with up to 16 cores and Core M in 2c/4t
configurations.

~~~
agumonkey
The single-core Atom (2012 Z2480 in the Razr i) at 2 GHz has a significant
perceived performance advantage even compared to processors released later.
It's way snappier than the quad-core ARM CPU in the 2013 Moto G. I don't
know how much battery life is sacrificed for this, though. But I'd be very
eager to try a Core M smartphone, since Atom has always been the low end of
Intel's CPU lineup. That said, I've lost track of Intel's marketing; maybe
Core M is just a rebranding of the same or a similar arch.

~~~
mtgx
There's a very large difference between the Cortex-A7 in a Moto G and a
Cortex-A57 or Cortex-A72. The latter are about 3x faster.

Atom's single-threaded performance has barely increased, from around 300
points in Passmark to about 540 right now (in Pentium-branded parts, in PCs,
which cost $160 a pop).

Intel is not competitive on price. That's why Atom didn't pan out in mobile.
Intel tried to get in by heavily subsidizing its _high-end_ mobile chips,
which typically cost close to $50, to compete against a $15 mid-range
Qualcomm chip. That's why some people were "impressed" by what an Asus
Zenfone could do, for instance, for its mid-range overall price.

And if Intel can't compete with Atom on price, then there's NO CHANCE it
will ever compete with Core M in mobile. The only reason it even exists in
so-called "tablets" that go for $1000 right now is that Microsoft failed to
make a good case for ARM-based Windows machines with its poor app support.
But that will always remain a niche.

~~~
creshal
> And if Intel can't compete with Atom on price, then there's NO CHANCE it
> will ever compete with Core M in mobile. The only reason it even exists in
> so-called "tablets" that go for $1000 right now is that Microsoft failed
> to make a good case for ARM-based Windows machines with its poor app
> support. But that will always remain a niche.

And outside the Surface those aren't selling well either.

(IMO rightfully so, the Venue 11 Pro is the worst device I've seen in years –
hypothetical hardware dickwaving aside, even a $50 OEM Android tablet has
better UX and usability than Windows 10 without mouse/keyboard.)

~~~
B1FF_PSUVM
The Asus T100TA series works pretty well as Windows 8/10 tablets.

~~~
creshal
Yes, because they ship with a keyboard and touchpad. But you wouldn't buy
the Asus ZenPad instead and try to use Windows 8/10 solely with the
touchscreen, would you?

It's the complete opposite of how iOS/Android tablets operate.

------
osweiller
For the past decade-plus, Intel has been their own biggest competitor. Atom
processors aren't weak and built on an old process because Intel can't make
them better, but because Intel's greatest fear is undercutting their more
lucrative markets. Their very high-profit markets.

So if you go back ten years and say "what if Intel did _this_ " (which in
that case was making a processor for Apple that Apple was paying maybe $20
each for, estimating on the very high side), it is oversimplified to just
imagine that it's additive. Intel has been rolling on profit margins that
the hyper-competitive ARM market can only dream about. It may be time for
them to adapt (and arguably they have been), but those 12,000 didn't lose
their jobs because Intel didn't do something different ten years ago. They,
and thousands of others, might never have had an Intel job in the first
place if Intel had made different choices.

------
kristianp
It's not just Intel's (and the iPad's) fault that PC sales are down.

I think the big mistake PC makers are making right now is that the PCs they
make aren't improving from generation to generation for their mass-market
products. Sure, the processors aren't doubling in MHz like they used to, but
the rest of the machine isn't improving either. If I go into a shop with
$300-400 today and buy a laptop, the machine I get is the same as I would
have gotten 3 years ago:

1. 768-line display

2. 5400 rpm HDD

3. 2 GB of RAM (4 if I'm lucky)

4. Similar weight

5. Similar poor battery life

6. Loads of crapware.

The PC manufacturers aren't pushing hardware manufacturers to improve the
cheapest spec. Why don't cheap new laptops have greater DPI on their LCDs
than 3 years ago? Because manufacturers haven't changed their main
production lines. They are saving money on retooling, but on the other hand
their product isn't improving, and now they're paying the price. Apple is
doing the same thing with their Air line, which only gets a new processor
generation; it has the same body and screen as years ago.

If manufacturers improved their cheapest line every 3 years, people would see
enough of an improvement in their price range to buy a new machine every 3
years like they used to.

~~~
ac29
Exactly the same complaints could be made about the bottom of the smartphone
market ($100-200 range). Higher-end smartphones may have improved more than
high-end laptops over the past few years, but mid-range to high-end laptops
are definitely nicer than they were 3 years ago.

You're also being a bit disingenuous in your laptop complaints: the
best-selling laptop on Amazon in that price range has a 1080p screen, 4 GB
of RAM, and a decent Broadwell i3 (which is faster and uses less power than
the equivalent Sandy/Ivy Bridge processor you would have gotten 3 years
ago). Crapware is only a valid complaint if you choose to use Windows and
are incapable of taking an hour to install a clean copy when you unbox it.

~~~
pjmlp
> Crapware is only a valid complaint if you choose to use Windows and are
> incapable of taking an hour to install a clean copy when you unbox it.

Assuming you get a laptop that actually works with GNU/Linux.

The only laptop that I still run GNU/Linux on is an Asus netbook, which was
explicitly sold with Linux support.

Guess what: it took more than a year for Ubuntu to properly support its
Wi-Fi chipset, and I was forced to use a network cable if I wanted any form
of networking.

My first GNU/Linux kernel was 1.0.9 with Slackware 2.0, so it is not that I
am clueless in GNU/Linux land.

Nowadays I've stopped bothering and use Windows on all the other laptops
that I have.

~~~
ajford
My 2014 X1 Carbon runs perfectly fine on Debian Wheezy, Jessie, and Testing.
I also ran Linux Mint on it (I forget the version; circa late 2014). Not a
single problem, from touchpad to sleep mode to Wi-Fi. I even saw better
battery life.

------
hga
Beware, for this article includes a gem like this:

 _Instead, these companies turned to a standard called ARM. Created by a once-
obscure British company, it was designed from the ground up for low-power
mobile uses._

Nope, their price budget required plastic instead of ceramic packaging,
which imposed a 1-watt power budget. They were sufficiently conservative
that it ended up dissipating 1/10 of a watt. The usefulness for mobile
applications came later.

On the other hand, if Intel turned down an offer from Apple to supply the
iPhone CPU, well, that sounds like a mistake. Then again, it's such a
different business that it's not clear it would have worked for them,
especially given the opportunity cost. So different, in fact, that their
FPGA acquisition Altera is still having its lower-end, more price-sensitive
chips fabricated by TSMC, apparently because Intel is just too expensive for
that market.

And Apple could well have changed to ARM later; Macs are now on their third
CPU architecture.

~~~
bsder
> The usefulness for mobile applications came later.

True, but not quite the whole story.

Some enterprising engineers wrote a cellular protocol stack in ARM assembly
language that let everybody use a far cheaper core than anything else.

Very quickly, ARM became entrenched in feature phones. Then the evolution to
smartphones occurred, and ARM was already _in the phone_.

------
ChuckMcM
Waaaaay back when I worked at Intel it was pretty clear they didn't stop
doing things that worked. And when the going got tough they stuck with what
worked. In the '80s Intel had a really remarkable set of computing products,
from high-integration "SoC"-type x86 chips (80186), high-end graphics chips
(8276x), embedded chips (8051), and "server" chips (the 432 series). Plus a
memory business and a whole passel of support chips.

But the chips in the PC had the best margin _by far_. So the more of those
they made, the more profitable they became, and when the chip recession was
in full swing in the late '80s and early '90s that is what they kept,
shedding all the rest.

In the early 2000s, when Moore's Law ran right smack into the power wall,
Intel was betting they could have an "enterprise" line (Itanium), a
"desktop" line (Pentium), and an embedded line (8051). They guessed wrong,
and for a brief time AMD's Opteron was kicking their butt. But once they saw
the writing on the wall they realigned around 64-bit in the Pentium line and
got back on track.

The problem with the ARM assault is that unlike AMD, which could be killed
by messing with other users of the chipset, patent attacks, and contract
shenanigans, killing off someone making an ARM chip does nothing but make
the other ARM chip vendors stronger. And they can't kill all of them at
once. And worse, to compete with them they have to sacrifice margin on their
x86 line, and that is something they have never done; it breaks their
business model.

It's a real conundrum for them: they don't have a low-power,
reasonable-performance SoC architecture to compete with these guys. And that
is what's driving volumes these days. Further, the A53 (64-bit ARM) killed
off the chance of using 32-bit-only Atom microarchitecture chips in that
niche without impacting the value of the higher-end Pentiums.

One of the things Web 2.0 taught us was that it doesn't matter how "big" the
implementation of a node is if you're going to put 50,000 of them in a data
center to run your "cloud." Ethernet as an interconnect is fast enough for a
lot of things.

It definitely makes for an interesting future.

~~~
dba7dba
_The problem with the ARM assault is that unlike AMD, which could be killed
by messing with other users of the chipset, patent attacks, and contract
shenanigans, killing off someone making an ARM chip does nothing but make
the other ARM chip vendors stronger. And they can't kill all of them at
once. And worse, to compete with them they have to sacrifice margin on their
x86 line, and that is something they have never done; it breaks their
business model._

Someone who was in IT before MS and Intel showed up said MS and Intel got
big even though they did not have the best technologies. They got big
essentially through smart alliances and shrewd business practices (which
included messing with competitors, some of it in ways that could be called
dirty).

~~~
ralfd
Recently I talked to a 20-year-old kid who didn't know that Apple existed
before the iPhone...

Microsoft and Intel got big for one and only one reason: IBM chose them as
suppliers for the IBM PC. Had IBM used its own in-house chip or licensed
CP/M as the operating system, computer history would have been different
(the Kildall link is especially interesting; Microsoft got really lucky
here):

[https://en.wikipedia.org/wiki/IBM_801](https://en.wikipedia.org/wiki/IBM_801)
[https://en.wikipedia.org/wiki/Gary_Kildall](https://en.wikipedia.org/wiki/Gary_Kildall)

For much of the '80s the Intel chips were inferior to other designs like the
Motorola 68K. That is why the Macintosh, Atari (today only known for
Pac-Man, but yes, Atari made computers rivaling Apple's) and the Commodore
Amiga used the more powerful Motorolas.

[http://www.skepticfiles.org/cowtext/comput~1/486vs040.htm](http://www.skepticfiles.org/cowtext/comput~1/486vs040.htm)

But the "IBM-compatible" architecture won despite its inferiority through path
dependency and the clones driving price down.

~~~
sitkack
Sadly, Motorola came out with the expensive part (the 16-bit-bus 68k) late.
Had they shipped the 8-bit-bus version earlier, I think things would have
looked a lot different. Had they followed up with the 68000 and the 020
earlier, things would be a whole lot different.

[https://en.wikipedia.org/wiki/Motorola_68008](https://en.wikipedia.org/wiki/Motorola_68008)

~~~
kjs3
Motorola did have an inexpensive next-gen 8/16-bit chip. It was called the
MC6809. That said, the whole _point_ of the 68000 was that it's 16 bits
wide, so I'm not sure a cheaper, narrow-bus version would have made any
difference.

------
btilly
The most important part of the article is easily missed unless you've read
_The Innovator's Solution_, the follow-up book, which spends a lot of time
looking inside organizations to see why it is so darned hard to catch the
disruptive train.

A company with a profitable niche and a profitable technology will wind up
with high internal costs. That's fine in their main business because they
have a profit margin to play with. But it is surprisingly hard to trim back
that "fat" to go after much lower-margin revenue with a cheaper technology.
(Fat is in quotes because it isn't really fat; it is necessary for the
high-margin business.) It is common to try, and to conclude that it is a
failure.

That is why Intel made this mistake.

~~~
bsder
As long as Intel is producing chips at 100% capacity with _way_ more profit
margin than any ARM chip, it's not a mistake.

ARM has volume, but x86 has _profit_.

~~~
btilly
It can be a good business decision to run a business for maximum profit and
a quick exit. However, when you do that with a major corporation like Intel,
it tends to make people unhappy.

Intel will continue with the high profit margins right up until the chips
don't sell. And then Intel gets to go out of business.

~~~
coldtea
So, just like Apple, with the same "high margin" strategy, their 19th year
of record revenues and $600 billion in store?

And how exactly would selling commoditized, low-margin products make things
better for them (Intel in this case)?

Isn't it even worse, and even quicker, for low-margin players when their
stuff doesn't sell?

~~~
btilly
The challenge of disruption comes when you have a clear value proposition
that everyone agrees on. The genius of Steve Jobs was that every year or
three he'd introduce a new product line with a new value proposition and
cannibalize his existing products in the process.

He's gone, and Apple has stopped doing that. Apple is now losing market
share. (Android is estimated at 82%.) Their app store is globally under half
of the market. They are projecting a year-over-year decline this quarter.

It probably won't be visible to the untrained eye in the next 5 years. It
won't be missable by anyone in the next 10. But Apple's best days are behind
it.

~~~
coldtea
> _He's gone, and Apple has stopped doing that_

Actually they did just that with the Apple Watch -- which added ~$6 billion
to their revenues and, even as an early v1.0, eclipsed all "wearables" to
date.
[http://www.cnet.com/news/thanks-to-apple-watch-smartwatch-
sa...](http://www.cnet.com/news/thanks-to-apple-watch-smartwatch-sales-could-
hit-11-5-billion-this-year/) [http://www.macrumors.com/2016/01/26/apple-watch-
apple-tv-rec...](http://www.macrumors.com/2016/01/26/apple-watch-apple-tv-
record-sales/)

And their services dept isn't doing that badly either:
[http://appleinsider.com/articles/16/04/20/as-a-standalone-
co...](http://appleinsider.com/articles/16/04/20/as-a-standalone-company-
apples-services-business-could-be-worth-as-much-as-260b-piper-jaffray-says)

They've also kept improving the Apple TV (people who haven't followed Apple
forget how slow and incremental the rise of the iPod was -- from 2002 to
2007 people gathered at keynotes to cheer if it got silly features like
Wi-Fi or a color screen or some smaller sibling), they're working on a car,
and other things besides.

> _Apple is now losing market share._

Barely -- from April 2015 to now, they've gone up and down, some quarters
winning over Android, others losing. [http://www.comscore.com/Insights/Market-
Rankings/comScore-Re...](http://www.comscore.com/Insights/Market-
Rankings/comScore-Reports-June-2015-US-Smartphone-Subscriber-Market-Share)

Besides, I've never understood this "let's pit Apple, a single company,
against the whole of the industry put together". They've never had the
"most" market share -- just the most of the most lucrative (higher-end,
high-margin) segment of the market.

> _It probably won't be visible to the untrained eye in the next 5 years.
> It won't be missable by anyone in the next 10. But Apple's best days are
> behind it._

I think I've read that before, in 1997, 2000, 2002, 2004, 2006, 2007, 2009,
2012, 2015, etc. A.k.a. "Apple is doomed".

~~~
HappyTypist
The iPhone meant you no longer needed an iPod. The iPad meant grandma no
longer needed a Mac. The Apple Watch requires an iPhone.

~~~
coldtea
What kind of bizarro metric is that?

The iPod never meant you don't need a Mac.

The iPad never meant you don't need an iPhone (and hardly ever meant you don't
need a Mac/PC).

The idea that a device should replace previous devices was never much of a
concern. The only thing that qualifies 100% in that story is that the iPhone
was by nature also a portable music player -- and if you had one, obviously
you didn't need the iPod. Apart from that, they were all individual lines,
with their own strengths and limitations -- not supposed to replace one
another.

------
FullyFunctional
I would agree with the analysis, but I think it's missing an interesting
fact: the ARM threat was nonexistent until DEC Alpha engineers created
StrongARM and showed the world that you could make a fast ARM. StrongARM was
effectively renamed XScale around the time Intel got hold of the IP.

~~~
jakub_h
> and showed the world that you could make a fast ARM.

My memory may be a bit hazy, but wasn't one of the demonstrations of the
original ARM a program in interpreted BASIC that did in twenty seconds
something for which a compiled C program on an 80386 needed thirty seconds?
Or something like that?

~~~
FullyFunctional
That's demonstrating you can make a fast BASIC on an ARM. What I'm talking
about is making a faster ARM CPU. According to Wikipedia, StrongARM debuted
at 233 MHz. I can't remember how fast the contemporary implementations were
at the time, but I remember that 233 MHz was a _lot_ faster. (For the
record: I'm a fan of neither ARM nor x86.)

~~~
eonwe
It seems to have been introduced the same year as the Pentium II, which had
clock rates of 233-300 MHz. I don't know about the architectural differences
at that time, but at least currently Intel processors smoke ARMs at the same
clock rate. The difference was probably a lot less pronounced at that time,
as the architectures were simpler.

~~~
rjsw
The Pentium II was an out-of-order CPU; StrongARM was in-order.

------
jbb555
" The PC era was about to end."

Not bothering to read the rest. This is entirely 100% wrong. The PC era has
not "ended". It's just that we only upgrade every few years instead of every
year. And grandma now reads her email on a tablet instead but that was never
what PCs were really for.

PCs are still just as much used as ever. We just use other things too, and
don't buy a new one every year.

If they can't get this basic fact right then I have no hope for the rest of
the article.

~~~
lovemenot
I disagree. The PC era ended when the PC was no longer the _dominant_
computing platform. We can argue about dates, specific markets, etc.

Eras are ill-defined, but if you can assert that we once were in a PC era,
you must also accept a definition that allows, in principle, for an end to
that era.

The dominant platform of its era is the one with the greatest user and
developer person-hours, sales volume, zeitgeist and so on.

We are in the mobile era.

~~~
arielb1
PCs are still _used_ more than mobile. It's just that the average person
bought a Pentium 4 PC running XP a decade ago and is not going to upgrade it
until it breaks.

PCs are now a mature technology, just like cars, and have the resulting long
life cycle.

Smartphones are also reaching this stage.

------
amist
>Now 12,000 workers are paying the price

I guess there are 12,000 other workers somewhere else in the world who now
have a job because they get to create what Intel doesn't. BTW, according to
past statistics, most of the workers who are now "paying the price" weren't
even Intel employees 10 years ago.

------
asah
Sigh, the author forgets that Intel tried to leave x86 with Itanium, which
was an expensive disaster, and they vowed never to make that mistake again.

------
tdsamardzhiev
Ugh, in 2005 AMD X2s were wiping the floor with any desktop processor Intel
had. The only reason Intel stayed in business was that, being a much, much
bigger company than AMD, they could 1) outsell AMD on an availability basis
and 2) ditch NetBurst and come up with a newer architecture (which was a
glorified version of their mobile/older architecture).

~~~
mrpippy
Don't forget the giant (possibly illegal) payouts to keep big customers
(Dell in particular) Intel-only. $4.3 _billion_ to Dell between 2003 and
2006.

[http://money.cnn.com/2010/07/23/technology/dell_intel/](http://money.cnn.com/2010/07/23/technology/dell_intel/)

------
woodandsteel
The article uses Clayton Christensen's theory of disruption to explain why
Intel missed the mobile phone market and gave it away to ARM. I would just add
that I think the same is happening in the Internet of Things.

~~~
zeeshanm
Would be interested in hearing your analysis on IoT.

~~~
woodandsteel
What I meant is that I think Intel is focused on producing high-margin
chips, so it is missing the boat on IoT because its chips cost too much, and
ARM is dominant there. But I'm not an expert in this area, so I might be
wrong.

------
fauria
According to the article, back in the 1990s DEC was forced out of business
because they underestimated the impact that PCs would later have on the
market, leaving Intel as a leader.

In the 2000s, smartphones and mobile devices came to outnumber PCs. Intel
missed that, and so ARM dominates the business.

Maybe that same pattern will repeat with the rise of IoT and wearables,
where smaller and cheaper chips become ubiquitous.

The development of the Edison and Curie processors might indicate that Intel
is betting on this, gearing up for the next "disruptive innovation".

~~~
TheOtherHobbes
In which case we should all buy shares in Broadcom and learn to program
Raspberry Pis.

Or possibly more likely - the next transition which will be much messier, with
no single winner.

The mainframe -> mini -> desktop -> laptop -> mobile -> SBC path was about
miniaturisation. The devices all provide general computing facilities, but
they get smaller and faster and use less power, while the UI becomes more
accessible to non-expert users.

I'm not seeing how that translates into wearables, because IoT and wearables
aren't general-purpose computing devices. They're more like embedded
hardware and/or thin network clients.

So I don't think that's where the disruption will happen. It's maybe more
likely that the disruption will happen in consumer AI, where the UI gets
simpler still through speech recognition and NLP, and the
screen/mouse/keyboard start to disappear.

My guess is traditional processors will become front ends and glue for
hardware AI systems, and the companies to bet on are the ones producing the
subsystems. The hardware will follow the same path as general-purpose
processors, but it'll move from "mainframe" to "SBC" much more quickly.

------
peterclary
Ironic, given that Intel's long-time leader famously said "Success breeds
complacency. Complacency breeds failure. Only the paranoid survive."

------
pj_mukh
Doesn't this mean Qualcomm should be doing splendidly? But it's not [1] :(.
What gives?

[1]
[http://www.sandiegouniontribune.com/news/2015/sep/17/Qualcom...](http://www.sandiegouniontribune.com/news/2015/sep/17/Qualcomm-
layoffs-workers-samsung-cost-cutting/)

~~~
mtgx
Qualcomm did poorly last year because of its failed Snapdragon 810 chip; it
lost a ton of sales. Otherwise, Qualcomm was and still is dominant in the
mobile chip market, with close to 50% market share.

~~~
Grazester
Yep, those things were throttling like mad due to heat issues, and
performance was not on par with the competition, the A9 and the Exynos
processors.

------
corford
Early on, Microsoft missed the internet revolution but were big enough and
good enough to survive that early misstep. Intel is big enough and good enough
to survive their mobile blunder (though, admittedly, they're taking more time
than MS did to get back on the horse).

~~~
twoodfin
_Early on, Microsoft missed the internet revolution but were big enough and
good enough to survive that early misstep._

This has become a popular myth, but I don't think it's true. Sure, Microsoft
wasn't far ahead of the curve on the internet, or they wouldn't have been
developing proprietary online services designed to compete with AOL and
CompuServe.

But the Bill Gates "Tidal Wave" internet memo was sent in May of 1995!
Netscape's IPO was still months in the future, Amazon had maybe a few
million dollars' worth of revenue, and essentially every other web-based
company still in business today hadn't been founded yet.

IMHO, billg's memo should be seen as an example of great prescience, rather
than a belated corrective maneuver.

~~~
corford
I guess I'm looking at it from where I was back then (a spotty young
teenager). I remember Windows 3.1 needed Trumpet Winsock; then came IE 1, 2,
and 3, which were awful; and when IIS appeared it took about a decade to
become any good versus Apache. From my point of view things started changing
with IE4 and then Windows 98 getting automatic updates. For me that's when
MS turned it around (i.e. ~4 years after Mosaic).

------
skynetv2
Everyone is a pundit with the benefit of hindsight. Intel made the best
decision with the best available information. Moreover, Apple is a
notoriously difficult partner that will extract every penny from its
suppliers. What if Intel had invested a few billion to support Apple and
then Apple went ahead and did its own chip, as it does now, leaving Intel
with costs that could not be recouped? The same pundit would say, "Intel was
stupid to spend so much on Apple."

Love it or hate it, Intel still has the right technology and products to
appeal to a broad market and make good money. One cannot expect to win every
market; you can try when it makes sense, and you should know when to walk
away.

------
bobwaycott
$2 billion in profits is a lot. 12K jobs cut is a lot. I can't help but find
it a bit crazy that 12K people who helped make $2 billion in profits are
suddenly extraneous. Are that many people really redundant within Intel?

------
ricksplat
I think the culture of Intel is such that they'll turn this around; sadly,
it had to come to job losses first. But it was the same (if less severe) in
the late '90s when AMD nearly stole their lunch.

------
spiritomb
Don't forget that Intel also made a big bet at a critical time on a
partnership with Nokia. That was another couple of years wasted, and Intel
fell further behind.

~~~
rasz_pl
Intel? What about Nokia making miserable 'smartphones' running 24MHz i386???

------
xiphias
I don't think 10,000 people will have problems finding jobs... if you worked
at Intel, I'm pretty sure that looks good on your resumé.

------
hinkley
I think in the long run this is one of those mistakes that we will all benefit
from.

"Monopolist Missteps and Loses Monopoly Position"

------
alanh
A very long-winded article with a clickbait title saying… nothing more than
that they missed the mobile revolution.

~~~
mkehrt
I disagree. This is an interesting article explaining what is happening to
Intel and why. But you're not the target audience if you know this already;
every piece of information in this article would be novel and interesting to
someone outside of tech.

------
a_imho
Is there any new information on the 3DXP technology?

~~~
touristtam
Do you mean this -> [http://www.anandtech.com/show/9470/intel-and-micron-
announce...](http://www.anandtech.com/show/9470/intel-and-micron-
announce-3d-xpoint-nonvolatile-memory-technology-1000x-higher-performance-
endurance-than-nand) ?

~~~
a_imho
yes, I know it is a bit OT, but there were other Intel related discussions,
saw no harm in asking.

------
Retric
Intel's largest mistake was integrating graphics onto their CPUs. This cost
them more than the entire cellphone CPU market is worth.

It ate up valuable chip real estate, RAM bandwidth, thermal overhead, etc.
Worse, it cemented the idea that Intel was crap at graphics while slowing
down the PC upgrade cycle.

~~~
dragontamer
What the hell else do you think Intel should put on those chips?

Note that ~15% of Steam players are gaming on Intel iGPUs, as awful as they
are.

[http://store.steampowered.com/hwsurvey/](http://store.steampowered.com/hwsurvey/)

~~~
Retric
A pure blank space would work as they save on manufacturing costs and get
faster CPU's due to heat limits and more RAM bandwidth.

~~~
chadgeidel
No, that's not how it works.

Intel had the ability to put more transistors on a die of the same size with
the same power requirements. This was long after they reached
thermal/clock-speed limits (with the P4). They started putting additional
cores in there and bumped up the L2 and L3 caches, but there was still space
left on the die.

What do you do with those extra transistors? It would be absurd to "leave
them blank", as you would basically be throwing money down the drain.

~~~
im3w1l
Even more cores?

~~~
masklinn
Most non-server systems are barely able to use 2 cores let alone 4, what would
they do with 8 or 16? More efficient & powerful graphics built-in are a much
better use of the die space. Even more so considering the rise of GPGPU and
hardware decoding.

