
How Intel missed the smartphone market - JeremyMorgan
https://mondaynote.com/intel-culture-just-ate-12-000-jobs-305674fb1274#.8xjcs7g8b
======
vtail
Expect the same fate for many more tech companies.

Although not many people realize it, tech is now a _mature_ sector of the U.S.
economy - it grows at about 3% a year, pretty much in line with GDP. But the
perception is different - just ask your friends - and the reason is that there
are highly visible _pockets of growth_, like Uber/SpaceX/name-your-favorite-
unicorn, and people often extrapolate their growth to the rest of the
industry.

Now, what happens with many tech companies is that they have a product with
exceptional margins and a double-digit growth rate, and it makes sense to
invest all available resources into it - better from an ROIC perspective -
ignoring all the alternatives, which lack either the volume or the margin to
look attractive. This inevitably leads to problems once your exceptional
product stops growing and you realize you have barely invested in anything
else.
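To make the ROIC logic concrete, here is a toy comparison with invented
numbers (not any real company's figures):

    # ROIC = operating profit / invested capital.
    def roic(margin, revenue, invested_capital):
        return margin * revenue / invested_capital

    # Star product: high margin, big volume.
    star = roic(margin=0.60, revenue=10000, invested_capital=8000)  # 0.75
    # Alternative product: lower margin, small volume today.
    alt = roic(margin=0.20, revenue=1000, invested_capital=2000)    # 0.10

    print(f"star: {star:.0%}, alternative: {alt:.0%}")
    # Every incremental dollar "rationally" goes to the star product -
    # right up until its growth stops and nothing else has been funded.

Viewed through that lens, starving the alternatives is the locally optimal
move every single quarter.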

Much like Intel with x86 versus ARM chips, or Qualcomm, or EMC, or RIM, ... -
the list goes on and on.

Even when you look at Google, most of their resources are invested into the
search/ads business, so when that stops growing - or rather, once they've
taken all the share from TV and start growing at GDP rate - they will be in
the same boat.

Edit: typos.

~~~
PaulHoule
That's not entirely true.

Intel has invested in plenty of things that haven't worked out, for instance
Itanic, Ultrabooks, RealSense, etc.

Also companies like Intel and Microsoft have bought into the Clayton
Christensen theory of "disruption" to the point where they really are throwing
their existing customers under the bus. That is, Windows 8 was perceived as a
tablet operating system, Intel only sells high-end chips under the Xeon brand,
etc.

Had Intel stuck more pigheadedly to making better PCs, we might have mainstream
water-cooled desktops running at 5 GHz+ and more reasons for people to upgrade.

~~~
minikites
> Had Intel stuck more pigheadedly to making better PCs, we might have
> mainstream water-cooled desktops running at 5 GHz+ and more reasons for
> people to upgrade.

Can you think of what those reasons might be? I can't. My office computer is
an i5-3570 and it sits idle most of the time, because it's just not that hard
to have Outlook and Firefox open. A/V work and AAA gaming drive upgrades now
like they always have; what other reasons would there be for Intel to push
beyond their current efforts like you suggest?

~~~
yourapostasy
>> we might have mainstream water-cooled desktops running at 5 GHz+ and more
reasons for people to upgrade.

> Can you think of what those reasons might be?

Some applications that I would want, which might need to wait for high-CPU
(and some high-memory and/or high-disk) machines to become more mainstream,
since I want them all to run locally until FTTH reaches my part of the
boonies:

A scanner I can literally wave a piece of paper at, have it process all of the
ultra-high speed video frame grabs into a high-dpi scan, then reconstitute the
scan into an appropriate LibreOffice document or PDF, with better OCR and
better vectorization of raster pictures. Then I auto-file all my invoices,
bills, statements, receipts, _etc._
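The capture hardware is the hypothetical part; the frame-selection and OCR
stages are already doable. A minimal sketch, assuming OpenCV and the
pytesseract Tesseract bindings, with a made-up input file name:

    import cv2          # OpenCV, for frame scoring
    import pytesseract  # Tesseract OCR bindings

    def sharpest_frame(video_path):
        # Pick the least blurry frame, using variance of the
        # Laplacian as a simple focus measure.
        cap = cv2.VideoCapture(video_path)
        best, best_score = None, -1.0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            score = cv2.Laplacian(gray, cv2.CV_64F).var()
            if score > best_score:
                best, best_score = frame, score
        cap.release()
        return best

    frame = sharpest_frame("wave_capture.mp4")  # hypothetical burst capture
    print(pytesseract.image_to_string(frame))   # OCR the chosen frame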

A personal digital assistant that transcribes, annotates, indexes and
categorizes all of my conversations across all mediums (voice, text, chat,
social, _etc._ ). This doesn't need to be ML-fancy at first (except for the
voice recognition piece), just lots of memory+cpu+disk to drive intensive,
continuous indexing.
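The indexing piece, at least, needs no ML, just cycles and storage; a toy
inverted index over transcripts (stdlib only, with the speech-to-text stage
assumed to exist upstream):

    from collections import defaultdict

    index = defaultdict(set)  # word -> conversation ids mentioning it

    def ingest(conv_id, transcript):
        # Add one transcript (voice, text, chat, ...) to the index.
        for word in transcript.lower().split():
            index[word.strip(".,!?")].add(conv_id)

    ingest("call-0516", "remind me to renew the insurance policy")
    ingest("chat-0515", "the policy number is in my email")

    print(sorted(index["policy"]))  # ['call-0516', 'chat-0515']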

A millimeter-accurate 3D positioning system, using differential GPS off of a
geodetic mark on my property, with relays of signals to indoors. This would
drive my robotic chicken tractor, curb edger, and vacuum cleaner. I could keep
a detailed inventory of household items with this, then query the house
computer over a voice link the next time I forgot where I set down $thing
(some RFID and 3D positioning integration needed here). Outside, it keeps
track of exactly where various pipes, outlets, irrigation heads, _etc._ are
located.

A software-defined radio that scans for chatter on interesting bands like
police and fire, performs voice recognition on them, and pings me when it
hears keywords about my neighborhood, street, nearby streets, or region when
combined with other keywords (like "tornado").
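The SDR capture and speech-to-text stages aside, the alerting rule itself is
simple; a sketch with hypothetical place names:

    LOCAL_TERMS = {"elm street", "oak avenue"}   # hypothetical
    REGION_TERMS = {"county", "northside"}       # hypothetical
    SEVERE_TERMS = {"tornado", "evacuation", "gas leak"}

    def should_ping(transcript):
        text = transcript.lower()
        if any(t in text for t in LOCAL_TERMS):
            return True  # anything about my street: always ping
        # Region mentions only matter combined with a severity keyword.
        return (any(t in text for t in REGION_TERMS)
                and any(t in text for t in SEVERE_TERMS))

    assert should_ping("structure fire reported on Elm Street")
    assert should_ping("tornado warning issued for the county")
    assert not should_ping("routine traffic stop downtown")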

A house computer that can tell whether or not I'm in the house, if I'm
asleep, getting ready to sleep, or still undergoing my morning routine, at my
computer, at my tablet, on my phone, _etc._ And do this for all occupants and
visitors. Lots of visual recognition from lots of camera input. For myself, I
want this information to drive what information is presented to me, when, and
in what context. Bloomberg Radio after I've been up for 10 minutes, and after
I've put on my headset, before my first call (incoming or outgoing).

A process that continuously combs through my Emacs Org agenda, and compares
against my current "state". If I'm driving away from the house, and confirm a
query from the computer that I'm going out to buy groceries and run errands
for the day (the query is launched by continuously-computed probability cones
narrowing possible agenda matches against my GPS position until a threshold
is met), and no one else is in the house, then automatically put the house
HVAC system into energy-conserving mode.
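A crude stand-in for those probability cones, scoring hypothetical agenda
destinations by how well the recent GPS track points at them:

    import math

    # Hypothetical destinations pulled from Org agenda: (name, lat, lon).
    AGENDA = [("grocery store", 30.10, -97.80),
              ("hardware store", 29.90, -97.40)]
    THRESHOLD = 0.8

    def bearing(a, b):
        # Flat-earth approximation, fine at neighborhood scale.
        return math.atan2(b[1] - a[1], b[0] - a[0])

    def scores(track):
        # track: (lat, lon) fixes, oldest first -> {name: score in [0, 1]}.
        heading = bearing(track[0], track[-1])
        return {name: max(0.0, 1.0 - abs(bearing(track[-1], (lat, lon))
                                         - heading) / math.pi)
                for name, lat, lon in AGENDA}

    track = [(30.00, -97.90), (30.02, -97.88), (30.05, -97.85)]
    s = scores(track)
    best = max(s, key=s.get)
    if s[best] > THRESHOLD:
        print(f"Confirm: headed to the {best}? If yes and the house is "
              f"empty, switch HVAC to energy-conserving mode.")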

Dumb robotic mechanics, driven by updatable software on my desktop computer,
to fold my laundry, turn over my compost bed, rotate stop valves
monthly/quarterly (otherwise they freeze in place, the primary challenge the
automatic flood sensors face today), change HVAC filters, open and file
incoming snail mail, _etc._

Intensive AR to turn nearly all parts of my house into smart surfaces. "What's
inside that drawer?" "What's behind that wall?" "What's inside that freezer?"

If we are talking only about pure desktop applications constrained to the
physical desktop/laptop computer itself, then yes, you may have a point.
However, when we include extending the reach of those systems to our
environment, and there is an explosion of data to crunch, then today's typical
consumer desktop computer (4-8 GB RAM, 2-3 GHz x86_64, 2-4 TB at the top end)
doesn't have the capacity to manage all of those demands simultaneously.

~~~
lowbloodsugar
I love this. This is all great. But with the exception of the scanner, all of
this _can_ be done across an internet connection, and therefore all of it
_will_ be done across an internet connection. For entrepreneurs building an
ANI (or AGI), it makes little sense to let anyone else get their hands on the
code, even in binary form, and then have to support that code running on
arbitrary PC configurations.

~~~
yourapostasy
I hear you, and I hope to see these applications arrive in _both_ cloud and
on-prem versions, perhaps from different origins/vendors. That would signal a
software ecosystem strong, vigorous, and healthy enough to support many
different solution angles. Let users choose what best fits their needs, and
let the diversity seed further innovation in a virtuous spiral.

I sometimes muse whether on-prem will get a second look as containerization,
self-hosted AWS Lambda-like microservices, and similar advances become more
mainstream on servers, and whether we'll see more hybrid solution ecosystems
evolve. I strongly suspect we will look back decades hence and lament that
governments are Why We Can't Have Nice Things In The Cloud; while the US has
been a particularly egregious public example, we shouldn't be surprised by
other nation-state-actor reveals in the future, either. If I'm on point about
that, then we'll probably see hybrid cloud+on-prem (COP?) instead of an
overwhelming dominance of one or the other, for the near future at least.

------
Rezo
If you don't cannibalize yourself, someone else will.

Intentionally limiting Atom performance, selling off their ARM division, etc.
was all done in order not to harm their main cash cow. By the time they woke
up and really tried to push for x86 on Android, it was too little, too late.

Just from an engineering perspective, it was always going to be a monumental
task. Because guess what: that small dev shop with the hit-of-the-month mobile
game is not going to bother cross-compiling for, or testing on, the 1% of
non-ARM devices. And if "Ridiculous Fishing" or whatever doesn't work
flawlessly, your device is broken from the consumer's perspective.

But what should really have Intel pissing their pants is the recent AMD x86
license deal with Chinese manufacturers to pump out x86 server-class chips.
I'd love to hear if they're taking it seriously at all, or dismissing it as
usual.

~~~
BooneJS
Paul Otellini came up on the x86 marketing side. He fully believed that Intel
Architecture was the ultimate, therefore everything else was no better than
2nd place.

BK came up through the fabs, and therefore is more open to fabbing others'
designs (Intel Custom Foundry) because it drives volume. However, he has
overseen one hell of an awkward reduction in force (I'm ex-Intel, but still
have many friends there). Small offices were told they were closing, and a few
weeks later they were told where they should relocate to keep their jobs, and
a few weeks later they might learn about the relocation package. It's almost
as if they drew a line on a spreadsheet filled with numbers and now they're
struggling to figure out how it affects the stuff they still want to keep.
Odd.

~~~
AndyNemmity
I wish it were odd; if anything, it sounds pretty normal when companies change
top-line strategy on a broad scale and then have to spend the next year
figuring out what the hell that even means, and how to do it.

~~~
r00fus
I think this is more symptomatic of larger, more conglomerated organizations.
I doubt this would apply to, say, Apple or Google, which have, respectively,
95% and 50%+ of their revenues coming from one product line.

------
sp332
Looks like the article just got deleted? Here's a cache:
[http://webcache.googleusercontent.com/search?strip=1&q=cache...](http://webcache.googleusercontent.com/search?strip=1&q=cache:https%3A%2F%2Fmondaynote.com%2Fintel-culture-just-ate-12-000-jobs-305674fb1274)
(It's not available in the Internet Archive because of robots.txt, which makes
me wonder how Google justifies caching it, but whatever.)

~~~
StringyBob
Deletion was accidental according to Jean-Louis Gassée (author) Twitter
feed...

~~~
kyrra
link for context:
[https://twitter.com/gassee/status/732280308049936384](https://twitter.com/gassee/status/732280308049936384)

> @sebbrochet @fdevillamil Embarrassing mistake, don’t know what to do other
> than grovel and hope Medium can undo.

~~~
fjarlq
Happy followup:
[https://twitter.com/gassee/status/732368830190620673](https://twitter.com/gassee/status/732368830190620673)

> Big Thanks to @Medium. My fat fingers killed 9 yrs of Monday Notes this am.
> Awful feeling. They fixed it quickly. Much appreciated.

------
samfisher83
Let's look at QCT, which I think is one of the biggest if not the biggest ARM
company out there.

[http://marketrealist.com/2016/04/qualcomms-chipset-margin-hi...](http://marketrealist.com/2016/04/qualcomms-chipset-margin-higher-end-guidance/)

QCT’s operating margin fell from 16.9% in fiscal 1Q16 to 5% in fiscal 2Q16.
The margin was on the higher end of the low- to mid-single-digit guidance as
the ASP (average selling price) of 3G and 4G handsets equipped with Qualcomm
chipsets rose by 6% YoY to $205–$211. The price rose due to a favorable
product mix and higher content per device.

The margins on cell phone chips are terrible. QCT made $2.5 billion on $17
billion in revenue.

[http://investor.qualcomm.com/secfiling.cfm?filingID=1234452-...](http://investor.qualcomm.com/secfiling.cfm?filingID=1234452-15-271&CIK=804328)

Would it really make sense to invest in the cell phone business when every
dollar you put in gets you less ROI compared to what you have now? From a
finance perspective, it would make more sense to return it to the shareholders
and let them invest in QCOM if they want to.
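For what it's worth, the claim checks out on the figures above:

    qct_profit = 2.5e9   # dollars, from the filing cited above
    qct_revenue = 17e9
    print(f"QCT operating margin: {qct_profit / qct_revenue:.1%}")  # 14.7%

Roughly 15 cents of operating profit per revenue dollar, against (as a rough,
assumed figure) the 25-30% Intel's core business was earning in that era;
every dollar redeployed into phone chips would have earned perhaps half as
much.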

~~~
santoshalper
The existing ARM fabrication market is very balkanized. Presumably if Intel
had dived in with both feet, they would have been able to dominate it using
their incredible aptitude for iterating process technology.

There is certainly that third path though, where Intel embraced fabrication,
then regretted it.

~~~
makomk
I'm not sure Intel's process lead is what it once was. For the last year or
so, Samsung has been shipping phones that use its own 14nm FinFET process,
very similar on paper to Intel's best. It's entirely possible that Samsung
will actually beat Intel to 10nm.

~~~
CountSessine
A point that's been made is that Samsung's 14nm process isn't really
equivalent to Intel's - many gate features on Samsung's 14nm process are much
larger than 14nm.

Still, just the fact that anyone is in a race with Intel at all would have
been unthinkable 10 years ago.

~~~
petra
On the one hand that's true, but on the other hand, Intel opened its fabs to
others and nobody accepted, so Intel's process might be more expensive.

------
markbnj
Take any company that once rode atop a huge market and then suffered through
declines as that market changed and you could filter back through the history
of their decision making and say the exact same thing Gassee is saying here
about Intel. So I'm not sure what point he is trying to make when he says
their "culture ate 12,000 jobs." Does he mean their culture of focusing on the
market they were currently very successful in, and their culture of not seeing
huge paradigm shifts happening just under the surface of it? I suspect he just
had a dull axe, and decided the time was finally ripe to sharpen it.

~~~
tonyedgecombe
I'm not sure they even made the wrong decision. Take a look at Kodak, who were
pioneers in early digital photography. You could argue that it was a
distraction for them and that they should have concentrated on milking film
for as long as possible.

~~~
lallysingh
No. Any shareholder would have wanted (in retrospect) Kodak to lead any
technology that had the potential of killing their primary business.

~~~
mark-r
Kodak is the perfect example. They managed to collect for every single
_picture_ you took, between the film, processing, and printing. There was no
revenue stream in digital with the potential to replace that, so they were
doomed from the start. They certainly did see it coming.

~~~
lallysingh
With that kind of money, they could've been investing in consumer storage
(e.g., NAS and even flash for phones) and point & shoot digital camera
components. With perfect hindsight, of course.

~~~
ghaff
>they could've been investing in consumer storage

Which has also been a razor-thin margin commodity business. Fujifilm does
provide some insights into how Kodak could have moved forward but they
struggled too and were a smaller company.

The high-end camera makers have done OK in digital photography (although
there's been churn in that space as well) but Kodak had long ceded that market
to the Japanese by the time digital photography was much more than a glimmer.

The print consumables market was also pretty good for a while but that ended
up being relatively ephemeral.

Even with the advantage of 20-20 hindsight, Kodak was in a tough position to
directly leverage either their existing tech or their channels.

------
Mendenhall
I found this quote to be one of particular interest.

“I couldn’t see it. It wasn’t one of these things you can make up on volume.
And in hindsight, the forecasted cost was wrong and the volume was 100x what
anyone thought.”

…and, perhaps more importantly:

“The lesson I took away from that was, while we like to speak with data around
here, so many times in my career I’ve ended up making decisions with my gut,
and I should have followed my gut. My gut told me to say yes.”

I personally love data, facts, and hard science, but I find that too often
many people ignore gut feelings. I almost always follow my gut instinct; it
has proven itself to me over and over again, even while taking what is often
perceived as a long shot where somehow my gut tells me "you got this".

In particular, I would say gather your own data on how often your gut instinct
is correct, and use that as a data point in addition to the hard science,
facts, etc.

Instinct evolved to keep you alive, it is often wise to not ignore it.

~~~
lotsofpulp
Thinking that your gut makes more optimal decisions than a random process
might just be feeding one's ego.

[https://en.wikipedia.org/wiki/Confirmation_bias](https://en.wikipedia.org/wiki/Confirmation_bias)

~~~
wallacoloo
And regardless, while instincts _are_ useful, I would rather analyze them and
extract the underlying reasons for my instinctual reactions, rather than
blindly acting on them at the emotional level. And at that point, I'm back to
acting on data (though potentially drawing it from more sources, now).

In any case, to believe that your [survival] instincts are equally applicable
to making sound business decisions is a bit silly, as that's pretty far
removed from what they've been optimized to do.

------
skynetv2
Many are focusing on the 12,000 number. Many forget Intel has acquired a whole
bunch of companies over the past few years, including McAfee, Altera, etc. Now
mix in a change in strategy, and you find yourself with a bunch of people you
no longer need: redundancy in services like HR, IT, test, design, and
management, along with another class of engineers whose skill set you no
longer need in your new direction.

Organizations continue to evolve and change direction. I am not saying
leadership has not failed spectacularly in mobile, but some of the 12,000 jobs
are a natural progression of acquisitions.

~~~
Grishnakh
>Many forget Intel has acquired a whole bunch of companies over the past few
years including McAfee,

They should just fire everyone from that acquisition and close down the whole
division. It's nothing more than a huge stain on the rest of the company. It's
really sad that the world's largest chipmaker reduced itself to being a
malware purveyor.

~~~
MichaelGG
Don't know, but I'd assume Intel's McAfee idea was more for the enterprise
market vs selling junk to consumers. At the enterprise level it's somewhat
more defensible, plus it opens up paths to desktop management capabilities.

------
voiceclonr
Intel's problems are a long time coming. They lost track of real customer
needs and have been operating in their own x86-centric world. It was always
"let me build it first and then try to look for a good usage". I am ex-Intel
and used to do software things. None of the leadership had a real sense of
what customers really wanted, and the h/w architects just built things that
they were really good at. And they justified it in unnatural ways ("You know
why you need such a fast processor? You can run this flash game at 10K fps").

Things have been getting very efficient on both the client and the server
side. With the cloud, they will have some momentum behind them - but long
term, I think the glory days are gone when they could just produce chips and
someone would take them.

------
kazinator
This goes back farther than 2005!

At one point at the height of the bubble, Intel was involved in mobile
devices.

I worked for a company that developed secure, efficient wireless communication
middleware for that space. Our client hardware was mainly Pocket PC and
Windows CE at that time.

We partnered with Intel to port our stack to the Linux device they were
developing (codenamed "PAWS"). This was around 2000-2001, if I recall.

Things were going very well when, practically overnight, Intel decided to pull
out of this market entirely. They shut down the project, and that was that.

It didn't bode very well for that little company; we gambled on this Intel
partnership and put a lot of resources into that project in hopes that there
would be revenue, of course. Oops!

------
udkl
The post was deleted.

Here is the Google cache link for quick access:
[http://webcache.googleusercontent.com/search?q=cache:https:/...](http://webcache.googleusercontent.com/search?q=cache:https://mondaynote.com/intel-culture-just-ate-12-000-jobs-305674fb1274)

And here is the raw text (without any links):
[http://pastebin.com/e10Yw0zi](http://pastebin.com/e10Yw0zi)

------
jonstokes
I hate this piece. I hate that in 2016, people still believe in Magical ARM
ISA Performance Elves. I hate that this is like my sixth or seventh post-Ars
comment on HN in as many years debunking the ARM Performance Elves theory.

Back when the PPro was launched, a low-double-digit percentage of the die was
taken up by the x86 translation hardware (for turning x86 instructions into
internal, RISC-like micro-ops). Now that number is some small fraction of a
percent. It kills me that I still read this nonsense.

The reason Intel didn't get into cell phone chips is that margins were (and
are) crap, and Intel is addicted to margins. The reason everyone else went ARM
is because a) licenses are dirt cheap, and b) everyone knows that Intel is
addicted to margins, and they know about all of Intel's dirty tricks and the
way Intel leveraged Windows to keep its margins up in the PC space and screw
other component vendors. Everyone had seen what happens when you tie yourself
to x86, and was like "no thanks".

Of course Intel didn't want to make ARM chips for Apple (or anyone else),
because even if they had correctly forecast the volume they'd have no control
over the margins because ARM gives them no leverage the way x86 does. If Intel
decides it wants to make more money per chip and it starts trying to squeeze
Apple, Apple can just bail and go to another ARM vendor. But if Apple went x86
in the iPhone, then Intel could start ratcheting up the unit cost per CPU and
all of the other ICs in that phone would have to get cheaper (i.e. shrink
their margins) for Apple to keep its own overall unit cost the same (either
that or give up its own margin to Intel). Again, this is what Intel did with
the PC and how they screwed Nvidia, ATI, etc. -- they just ask for a bigger
share of the overall PC unit cost by jacking up their CPU price, and they get
it because while you can always pit GPU vendors against each other to see who
will take less money, you're stuck with Intel.

(What about AMD? Hahaha... Intel would actually share some of that money back
with you via a little kickback scheme called the "Intel Inside" program. So
Intel gives up a little margin back to the PC vendor, but they still get to
starve their competitors anyway while keeping AMD locked out. So the game was,
"jack up the cost of the CPU, all the other PC components take a hit, and then
share some of the loot back with the PC vendor in exchange for exclusivity.")
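A toy bill-of-materials (all figures invented) makes the squeeze mechanical:
the device maker holds total unit cost fixed, so every extra dollar the CPU
vendor takes comes out of the other components' share:

    bom = {"cpu": 20.0, "modem": 15.0, "display": 30.0, "memory": 15.0}
    target = sum(bom.values())  # 80.0, held fixed by the device maker

    bom["cpu"] += 5.0           # the CPU vendor jacks up its price
    overrun = sum(bom.values()) - target

    # The cut lands pro-rata on everyone except the CPU.
    others = [k for k in bom if k != "cpu"]
    other_total = sum(bom[k] for k in others)
    for k in others:
        bom[k] -= overrun * bom[k] / other_total

    print(bom)  # cpu up to 25.0, every other line item ~8% cheaper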

Anyway, the mobile guys had seen this whole movie before, and weren't eager to
see the sequel play out in the phone space. So, ARM it was.

The only mystery to me was why Intel never just sucked it up and made a low-
margin x86 chip to compete directly with ARM. I assure you there was no
technical or ISA-related reason that this didn't happen, because as I said
above that is flat-earth nonsense. More likely, Intel just didn't want to get
into a low-margin business at all (Wall St. would clobber them, and low-margin
x86 would cannibalize high-margin x86), and by the time it was clear that they
had missed the smartphone boat, ARM was so entrenched that there was no clear
route to enough volume to make it worthwhile - again, especially given that
any phone maker Intel approaches with an x86 phone chip is going to run the
other way, because they don't want x86 lock-in.

~~~
adekok
> The only mystery to me was why Intel never just sucked it up and made a low-
> margin x86 chip to compete directly with ARM.

Culture. Some companies are willing to sacrifice today's revenue for
tomorrow's growth. Intel isn't.

A low-end, low-power, low-margin x86 system would cut a _huge_ swath through
Intel's margins. The trend of "upgrade to newer! better! faster!" CPUs would
be hard-hit.

Even Apple is seeing that now with slowing iPhone sales. The phones are
powerful enough that upgrading doesn't get a lot of benefit. So people aren't.

~~~
gfody
> A low-end, low-power, low-margin x86 system

AMD A-Series?

~~~
r00fus
Intel was complicit for a decade or longer in essentially undercutting AMD by
bribing manufacturers like Dell. There were quarters where Intel's "marketing
budget" was all that kept Dell and other manufacturers in the black. [1]

AMD has no impact on the overall market because it's not a serious option for
the big-box manufacturers.

[1]
[http://www.theatlantic.com/technology/archive/2010/07/dells-...](http://www.theatlantic.com/technology/archive/2010/07/dells-shame-intel-payola-propped-up-companys-earnings-sec-says/60447/)

------
petra
The main issue here is: Intel doesn't want to work on stuff with lower margins
(even though it can make money and is strategically valuable). That's common
to many businesses.

Isn't there any financial engineering trick to enable companies to solve this
dilemma, which so often leads to their death?

~~~
lmm
Every innovation eventually becomes commoditized. So if you want to remain
high-margin you have to continually innovate.

(But to be fair to Intel they did, and do - they keep producing better and
better CPUs. It's just that demand for that whole category has shifted)

~~~
pm90
Exactly. Intel invests ginormous amounts into R&D. If they had adopted ARM
earlier, they could have captured the mobile market and... then who knows
what? Innovation is unpredictable, especially in tech.

They started out making DRAMs, then moved to making microprocessors... I
wonder why they didn't realize they couldn't rely on the same business model
forever.

~~~
Grishnakh
They _did_ adopt ARM. Did you forget about StrongARM? (I believe they acquired
it from DEC as part of a legal settlement.) They ran with that for a while,
made an updated version called "XScale", made a bunch of mobile processors
with it, and then sold it all off to Marvell when they decided to refocus and
shed a lot of other things that weren't core to the business.

~~~
thrownaway2424
History is not the strong suit of this site's readers. Yes, Intel was selling
ARM CPUs into the mobile market (Compaq iPAQ, Sharp Zaurus, etc.) when Apple
was still making pink iMacs.

------
adekok
Intel also bet big on WiMAX, which (mostly) ended up going nowhere. That was
another multi-billion-dollar mistake, and one which cost thousands of jobs at
Intel.

------
chillingeffect
FWIW, AMD also struck out HARD in the mobile market. When I interviewed there
(just before the release of the iPhone 3G), I was jazzed about the "imminent"
opportunity for mobile CPUs. I expected AMD to scale right down and into the
phone and tablet market. We watched the world tumble around us as if on a
ferry boat.

When I voiced the words "mobile CPU" to anyone there, people were oblivious
and silent. The company was just out of touch, thinking everyone was going to
keep buying tower PCs and laptops. It seemed the only variable they thought
customers cared about was performance: people would buy AMD if it was just a
little faster. They didn't realize it was simply the wrong product/market.
Performance wasn't nearly as important, sigh.

~~~
petecox
One manufacturer that did anticipate the "death of the desktop" was Nvidia,
which powered the "iPad killer", the Pixel C.

------
shmerl
It is sad, because in comparison with the sickening ARM scene, Intel actually
put effort into making open GPU drivers.

------
frik
I was shocked the other day that Intel discontinued the Atom. So if one needs
a cheap SoC board, there are now only ARM boards like the Raspberry Pi, and no
cheap ($50) x86/x64-compatible boards with Windows drivers. The Intel Atom 330
(1.6 GHz dual-core with HT) of 2008 was cheap ($50) and fast (Pentium M
based). And on the high-end side there is still no 5 GHz, and there has been
little single-core improvement since 2004 and the 3 GHz Pentium 4. We need
more competition in x86/x64 to get lower prices and faster cores.

~~~
thrownaway2424
What do you mean discontinued? What about the Atom that was just introduced
earlier in 2016? e.g.
[http://www.newegg.com/Product/Product.aspx?Item=N82E16813138...](http://www.newegg.com/Product/Product.aspx?Item=N82E16813138429&cm_re=j3060-_-13-138-429-_-Product)
?

~~~
frik
It's not an Intel Atom but a low-spec, low-power Celeron. The J3060 has no
hyperthreading, and without the boost it's as slow as the Atom 330 (which has
HT). So after 8 years there is no Atom brand anymore, and the next best option
is as slow as back then and costs about the same.

[http://www.cpu-world.com/Compare/979/Intel_Atom_330_vs_Intel...](http://www.cpu-world.com/Compare/979/Intel_Atom_330_vs_Intel_Celeron_J3060.html)

------
tracker1
I'm not sure the "Just You Wait" is without merit, just taken longer than
thought... I'm not sure why... given the transitions that have happened
through process advancements the past 6 years or so, we now have CPUs that
burst to higher speeds very transparently, and at really low power compared to
a decade ago. While not suitable for phones, Intel is a pretty great choice
for laptops and tablets (OS concerns aside).

I do think that ChromeOS (or whatever it's called) offers most users a
distinct difference from what Windows is. That changes things up a lot.

I feel that within 4 years Intel will have some competitive x86 offerings
compared to ARM... On the flip side, by then ARM will be much more competitive
in the server/desktop space than it is today. It's kind of weird, but it will
be another round of competition between different vendors all around. I'm not
sure what other competitors will come around again.

That's not even mentioning AMD's work at their hybrid CPUs... also, some more
competition in the GPU space would be nice.

It really reminds me of the mid-to-late '90s, when you had half a dozen
choices of server/workstation architectures to target. These days, thanks to
Linux's dominance on the server and its flexibility, and even MS's refactoring
of Windows, it's easy to target different platforms in higher-level languages.

There will be some very interesting times in the next few years.

------
protomyth
I do wonder what would have happened if Intel had concentrated on being a
custom chip maker. With continuing advancement in a low-power x86 core plus
custom options, it would have been interesting. I guess when both game console
manufacturers skip Intel to get AMD, you really have to wonder where they
thought things were going. A company so against customization that it closed
its bus to third parties probably isn't going to do well in a custom world.

------
hoi
The mobile space was already heating up. Nokia alone was selling over 250M
devices in 2005, at around 35-40% market share, and they were already using
ARM chips. The forecast for smartphones was already high, since the whole
mobile industry was positioning itself to upgrade. It was always a question of
getting the telcos to upgrade their 2G networks to 3G before the expected
smartphone explosion. Intel should have known this.

------
dredmorbius
Article error: "In 1985, fresh from moving the Macintosh to the x86 processor
family, Steve Jobs asked Intel to fabricate the processor that would inspirit
the future iPhone. The catch: This wouldn’t be an Intel design, but a smaller,
less expensive variant of the emerging ARM architecture, with the moderate
power appetite required for a phone."

That should be 2005.

------
bsder
Intel didn't _miss_ the smartphone market. It simply made no sense (and still
makes no sense) to go after a less profitable market.

When the margins on x86 cross below that of ARM chips, Intel will come in and
destroy all the ARM manufacturers.

~~~
wklauss
It did. Even Otellini said so:

[http://www.theatlantic.com/technology/archive/2013/05/paul-o...](http://www.theatlantic.com/technology/archive/2013/05/paul-otellinis-intel-can-the-company-that-built-the-future-survive-it/275825/)

And I'm not so sure they'd be able to go into ARM fabbing that easily. Margins
on x86 are high, but they are selling fewer of them, so expanding to a new
market would have been a wise move.

To their credit, they tried twice and couldn't make it work. I'm not sure
things could have been different, considering ARM is a licensable architecture
other people can use.

------
stephenmm
They made several attempts
([http://www.cnet.com/news/intel-sells-off-communications-chip...](http://www.cnet.com/news/intel-sells-off-communications-chip-unit/)),
but I think Intel has a huge cultural problem that does not allow for true
innovation outside of its core competencies. I worked for the Marvell group,
and while Intel did not know how to be nimble, Marvell had their own issues,
which is in part (IMHO) why BlackBerry lost its leadership position. But the
biggest problem with Intel is the culture, as you are truly just a cog in the
wheel there. Just my $.02.

------
inlined
I'd love help understanding how Intel's failure in the smartphone market
didn't also doom the IoT market. The x86 chips never took off because they
were expensive and took too much energy. If IoT is to be ubiquitous, those
barriers seem even more important.

Intel's comment about IoT makes me wonder: do they think they just lost the
first mover advantage for mobile & the industry got hooked on ARM ISAs? Do
they still believe the story that their chips will become cheaper and more
powerful than ARM if they change nothing?

~~~
makomk
I reckon it already has doomed Intel in the IoT market, investors just haven't
realized it yet. Their answer to the IoT involved buggy shrunk-down 486s with
poor performance and power consumption and very limited onboard peripherals
compared to their ARM competitors. (I can't even find power consumption or
instruction set information for the newest ones, which is worrying.) They've
put a lot of money into high-profile publicity stunts like an Intel-funded TV
series ("America's Greatest Makers"), BMX bikers, Arduino boards, etc but
apparently if anyone wants to integrate their chips into the kind of IoT
product they're being promoted for, they can't actually get them unless
they're a big-name brand. They mostly seem to be relying on the press
regurgitating their PR and not looking into the details.

------
cowardlydragon
Article gone?

Well, I'd guess fundamentally it was about tying yourself to the Microsoft
sociopath mothership.

Intel could have taken Linux by the reins and made an OS X equivalent, and
certainly a Windows-beater, decades ago.

But they didn't.

So they missed the boat.

~~~
klodolph
> Intel could have taken Linux by the reins and made an OS X equivalent, and
> certainly a Windows-beater, decades ago.

I disagree. Hardware companies have a long history of trying and failing to
make good software. It's not in their core business, so the company culture
and talent pool isn't right. The same is true the other way around, just look
at how rough Microsoft's entry into the console market was.

On top of that, the "last mile" of usability between the current state of
Linux and something that can realistically compete with Windows requires
experienced UX designers and an infusion of design knowledge into the software
engineering team. This isn't realistic for most companies.

Just to be clear, I'm not saying it would have been impossible, but I would
bet against it.

~~~
cowardlydragon
I'd be really interested to know the size of the original OS X team. Granted,
they were probably all NeXTSTEPpers who really knew the OS X core inside and
out to begin with. But still, Apple at the time had 1/100 the resources of
Microsoft and Intel. Intel could have bought BeOS after Apple passed.

Think about that: BeOS with real backing.

------
tmaly
Intel has some amazing research. I was at an internal fair back in 1999, when
I was doing an internship there in the product dev group that did the Itanium
chipset.

They had this handheld touchscreen device that resembled a modern smartphone
and was driven by voice recognition. This was 1999, way before the iPhone came
around.

In my opinion, it is the internal politics and the leadership that has caused
them to lose out.

Chip design books in academia were already moving in the direction of low
power designs during the late 90s. They just did not take any action.

~~~
shas3
Just saying, working touch screens have been around on devices for 30 years or
more. Most PDAs, including <$100 phonebooks, had decent resistive touch
screens as far back as the mid-90s. Likewise, speech recognition has been
around for at least 25 years. What's new in today's smartphones is the cloud-
and deep-learning-driven high accuracy. You could probably have easily found a
handheld with a touchscreen and speech recognition at Circuit City in 1999.

~~~
slapresta
Working touch screens are useless without usable touch UIs.

------
oldmanjay
I guess the point of the article is that this is a failure of some sort, but
it never really makes a strong enough case. The closest it comes is asserting
that the executives at Intel should have been able to predict the future. If I
were inclined to hate the company for some reason, I would find this
emotionally satisfying, but I'm not, so I just find it to be a recap of things
everyone already knew, expressed with an unjustified tone of smug
finger-wagging.

------
payne92
I wonder if Intel saying "no" to Apple re: ARM will go down as a decision on
the same scale as Digital Research saying "no" to IBM for an operating system.

------
darklajid
Time to get out my Maemo (RIP, Nokia) and MeeGo (too bad, Intel) merchandise
and go cry in a corner.

    Maemo/MeeGo
    WebOS
    Firefox OS

All of these were interesting; all failed. I'm still convinced that there's a
market for a non-Android/iOS solution.

------
brisance
Article is back up. He accidentally deleted his 9 years' worth of posts.
[https://twitter.com/gassee/status/732368830190620673](https://twitter.com/gassee/status/732368830190620673)

------
dhimes
_“The lesson I took away from that was, while we like to speak with data
around here, so many times in my career I’ve ended up making decisions with my
gut, and I should have followed my gut. My gut told me to say yes.”_

Killer, for us fans of rational strategy.

------
draw_down
This case is a real problem for the "data-driven thinking" set. Or rather, the
problem with that idea is you have no way to tell which data you should be
looking at. Just because you have numbers doesn't make them the right ones.

------
lquist
A friend of mine at Intel told me that Intel doesn't care that much about not
owning this market because the size of the profit pool around server chips is
orders of magnitude greater and they have monopolistic control of that market.

~~~
ryao
That is the same thing the mainframe vendors said about minicomputers, and the
same thing the minicomputer vendors said about microcomputers. Intel's failure
to recognize itself as the next link in the cycle was a colossal mistake. They
should have gone to Steve Jobs and offered to rectify their mistake the moment
they realized how popular the iPhone was. Instead, they tried the Itanium
strategy of promising things they could not deliver. What made them think that
would work after the first failure?

------
elmar
They just forgot that only the paranoid survive.

------
mastazi
OT but somehow related:

The article contains the phrases "Jobs asked Intel to fabricate the processor"
and "Intel styled itself as a designer of microprocessors, not mere
fabricator".

Question: according to your own experience, is it common in English to use
"fabricate" and "fabricator" as synonyms of "manufacture" and "maker"?

I am much more familiar with the negative meaning (e.g. "fabricated lies") but
since I'm not a native speaker my vocabulary might be limited.

~~~
dragonwriter
Specifically in the case of chips, "fabricate" seems to be fairly common.

------
ricardobeat
Why was the title editorialized by mods?

------
nkjoep
Unable to read the story. Seems deleted from Medium.

------
albasha
> The author deleted this Medium story

------
EGreg
The author deleted this Medium story.

------
blhack
...what?

Intel didn't miss anything, they sell the hardware that powers the
infrastructure behind these new, always-connected devices.

~~~
shortsightedsid
They were in that even before smartphones were the norm. And considering they
had to compete a lot harder in the server space (SPARC comes to mind), winning
that market doesn't mean they won the smartphone market.

AFAIK, Paul Otellini also made the same argument about servers; however, the
point is really about processors that run inside consumer products, not server
infrastructure. There, Intel Inside just isn't the case.

------
crorella
He just deleted the post :/

~~~
smrtinsert
Why would he do that?

~~~
reiichiroh
ALL of his posts are gone from MondayNote now strangely.

~~~
erbo
And his entire Medium account is gone, too.

------
Agathos
Hey, it's the BeOS guy!

------
visarga
... and also the GPU market.

------
known
The analogy is
[https://en.wikipedia.org/wiki/Glass_ceiling](https://en.wikipedia.org/wiki/Glass_ceiling)

