
Intel to cut IoT jobs - rbanffy
https://www.electronicsweekly.com/news/business/intel-cut-iot-jobs-2017-07/
======
jd75
It's been said already, but these products were destroyed by support-forum-as-
documentation. You can get a light blinking on an Arduino in five minutes. On
the Edison, it took all my free time for a couple days to even figure out what
pins were what. During that time, the excitement of building something more
complicated on it evaporated. We are programmers, not documentation detectives
on a scavenger hunt. I hope anyone who threw a bureaucratic monkey wrench into
a proper documentation hub uses their unemployment to reflect.

~~~
laydn
I completely agree. Unfortunately, so many semiconductor vendors are going
this way (for ICs) as well.

For example, there are many parts on the Texas Instruments web site for
which you cannot get any support unless you are a high-volume customer. They
have even turned some of their product forums read-only, so you can't even
post a public question anymore.

Similarly, Xilinx had an official support channel (WebCase). A few years ago,
they made the decision to open that support channel only to their Tier 1
(or whatever they call them) customers. Despite the ~100K USD we spend on
their FPGAs every year, we can no longer ask Xilinx for direct support.

... and don't get me started on Qualcomm.

~~~
tachyonbeam
Not sure what these companies are thinking. They might not realize it, but
some small companies get big. Some of today's young engineers will be
tomorrow's CEOs. The people getting shitty support from Xilinx and Qualcomm
right now are unlikely to want to do business with them later. Furthermore,
some big companies actually do listen to what their engineers have to say.
IMO, big tech companies are failing to understand why you should have the
technical people on your side.

------
kogepathic
_> IoT accounts for less than 5% of Intel’s sales. The group had revenues of
$721 million last year_

This isn't a surprise to anyone who's encountered any Intel IoT product. I
think for most of us, the biggest question was why it took Intel so long to
pull the plug.

Intel released products for IoT, then didn't update them forever (look at how
many times they announced successors [0] [1] to the Quark), and then they
wonder why no one was interested in building products with their solutions.

As an aside, this article is terribly written (grammatical errors everywhere,
spelling mistakes like "msrket").

[0] [http://www.cpu-world.com/news_2014/2014070801__Dublin_Bay_So...](http://www.cpu-world.com/news_2014/2014070801__Dublin_Bay_SoCs_to_replace_Quark_processors_in_2015.html)

[1] [http://wccftech.com/intel-quark-liffy-island-seal-beach-proc...](http://wccftech.com/intel-quark-liffy-island-seal-beach-processors-roadmap-details-leaked/)

~~~
m-j-fox
What surprises me is that $721m seems like a lot for a product no one buys.
I'm also surprised it's not worth employing 100 people bringing in
$7m/yr/head. I would employ such people.

~~~
baybal2
Not a big sum for a semiconductor business. There are quite a few
semiconductor companies with fewer than 100 people and billion-dollar
revenues.

~~~
baybal2
One wafer will cost around $9k for a tapeout of fewer than 100 wafers, if we
are talking about 14nm.

It is very likely that they were actually making losses.

~~~
throwawaybbq1
How many useful die per wafer? Is there a ballpark yield for 14nm?

I too suspect this was not a profit-in-the-now division. They were probably
playing a long-term game. I think they ceded that game to someone else... but
who?

~~~
baybal2
>How many useful die per wafer? Is there a ballpark yield for 14nm?

~550/580
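Taking those rough figures at face value (~$9k per small-run 14nm wafer, ~550 good dies per wafer), the raw silicon cost per die is easy to ballpark. All numbers here come from the comments above, not from any official source:

```python
# Back-of-the-envelope silicon cost per die, using the rough numbers
# from this thread (~$9k per small-run 14nm wafer, ~550 good dies per
# wafer). Ignores packaging, test, and NRE; not Intel's actual costs.

def cost_per_die(wafer_cost_usd, good_dies_per_wafer):
    """Raw wafer cost amortized over the good dies."""
    return wafer_cost_usd / good_dies_per_wafer

print(round(cost_per_die(9_000, 550), 2))  # roughly $16 of silicon per die
```

At $16+ of bare silicon before packaging and test, it is hard to see much margin in a sub-$40 maker board.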

------
heisenbit
Intel rode too long on x86, and the horse is now tired. It has been tired for
a while, but there were deep moats around Intel. Manufacturing technology was
one, but others have been catching up as mobile provided the market and cash
for sustained investment. Wintel was another, but Microsoft and Apple are
both looking at ARM. On the server side, ARM, graphics chips, and FPGAs are
taking away business and threatening to take even more.

Intel seems to know they need to break into new businesses. But doing that
with the mindset, and within the constraints, of a quasi-monopolist is very,
very hard. Building something new requires not just resources but calendar
time, commitment, and freedom on the business-model side as well as the
technical one. Focusing on your core strength is an excellent mantra until
one is backed into a corner, and by then it is hard to think different.

~~~
calafrax
Intel's entire business is based on high margins. Trying to take that
orientation and apply it to low-margin markets like mobile and IOT has been a
complete failure for them.

I don't think they actually care (yet), because a few billion is peanuts to
them as long as they have their high-margin business, but the wolves are at
the door.

AMD is actually much better positioned because they have gone through a very
painful restructuring and reorientation toward a business model that works
with low-margins and they now seem to be coming out the other side.

Intel will eventually have to go through the same restructuring and it will be
even more painful for them because they have a lot further to fall.

~~~
godzillabrennus
The only other high margin semiconductor business I see as a competitor to
them is not AMD, it's Nvidia.

GPU technology from Nvidia is continuing to unlock new capabilities that make
upgrading a CPU less important, and as more code becomes GPU-dependent, it'll
matter less which CPU architecture you're running.

That opens the door for ARM to enter the server market.

That is doom for Intel.

~~~
baybal2
Yeah, even now Nvidia could simply put an average general-purpose core into
the GPU to run Windows, and release something like libGPU to let devs access
computation on the GPU directly.

I believe they are acutely aware of that, but at the moment they have chosen
not to slaughter the hen that lays the golden eggs.

------
pjmlp
Intel's main problem is that in the age of commodity OSes, bytecode
executable formats, bare-metal runtimes, and SoCs that rival a Pentium-class
processor, the processors doing the work are kind of irrelevant.

Sure someone still needs to write those AOT/JIT compiler backends and
virtualization layers, but it has become a thin layer in the overall stack.

~~~
dom0
Even without bytecode and VMs, an x86 machine has virtually no advantage over
$otherArchMachine in these niches.

~~~
candiodari
In practice that's just not true. There are a lot of manufactured products
using $otherArchMachine, but good luck making a decent self-built video
player using anything but x86 (Kodi on a Raspberry Pi is simply too slow;
hell, even RetroPie can't actually run N64 games on anything but the most
recent Pi, whereas they're perfectly smooth on a 10-year-old x86 laptop).

Having a mini-ITX box for everything, including NAS storage (I hate people
calling this cloud storage), is so much better it's ridiculous.

~~~
pjc50
> decent self-built video player using anything but x86

The Pi chip was originally designed for set-top-box usage. Much of the die is
video decoder, I believe. So if you have the right drivers, and maybe an
MPEG-2 license, it should be fast.
[https://www.raspberrypi.org/blog/new-video-features/](https://www.raspberrypi.org/blog/new-video-features/)

~~~
raverbashing
Correct.

Your average set-top box (cable or whatever) doesn't have an x86 and it runs
fine. But accelerated video playback requires proprietary software.

Not sure how it works on Android, but I guess it's more or less integrated
there (so you can have accelerated video playback easily and "in the system").

~~~
pjmlp
The majority of users don't care what kind of drivers are on their IoT
gadgets, as long as they fulfil their purpose.

~~~
throwaway2048
They will care when their devices get hacked because the kernel is impossible
to update for anyone, including the vendor.

~~~
pjmlp
Android, home routers and webcams are the living proof they don't care.

~~~
throwaway2048
Only because we haven't had a Sasser-level event yet; Android is far worse on
the security front than XP ever was.

~~~
pjmlp
Stagefright comes to mind.

~~~
bonzini
Not even close to Sasser or SQL slammer. Imagine every other person not being
able to turn on their phone all of a sudden tomorrow morning.

------
threepipeproblm
I blame
[https://twitter.com/InternetOfShit](https://twitter.com/InternetOfShit)

~~~
TeMPOraL
They're fighting the good fight.

------
Jedi72
People in this thread are flaming Intel, but I suspect this is just the first
of many major vendors/companies quietly dropping IoT (whilst proudly
announcing their new AI/VR/TLA offerings!).

~~~
fellellor
What is TLA?

~~~
majewsky
I guess TLA is supposed to expand to "three-letter acronym", meaning that it
does not stand for any specific buzzword technology, but the whole buzzword-
powered innovation cycle itself.

------
cctan
From what I heard at Intel's sites in Malaysia, IoT was used as a pretext for
some second-class managers to open new departments and new turf. No wonder
the group was missing the mark in what it was doing.

~~~
MrBuddyCasino
That would explain a lot. But why does Intel allow it? They already lost
mobile.

------
reacweb
IMHO, the market is evolving toward tinkering. Many Arduinos and Raspberry
Pis have been sold. Many people are buying 3D printers. Many Android phones
have been produced at small scale. We love the idea that we can build (or
fix) stuff ourselves.

This would mean an open market with fierce competition (and low margins).

I hope this is the future, but Intel, like many chip makers, is fighting
against it by locking the documentation of their products behind NDAs and/or
huge prices.

~~~
delazeur
I love that stuff, but I have a hard time seeing it move beyond early
adopters. I just don't think that more than 5-10% of the population wants to
mess around with the guts of their tech.

Some versions of those things may go mainstream, but only after they get to
the point where people don't have to do the technical and creative work (e.g.,
click a button and your 3D printer makes the new iPhone case you want, so you
don't have to wait for it to be delivered).

~~~
Animats
_I just don't think that more than 5-10% of the population wants to mess
around with the guts of their tech._

Maybe 1%-2%.

Most "Internet of Things" devices just don't _do_ very much. They lack any
actuators.

There are lots of things that can be done to make heating, ventilation, and
air conditioning more efficient and comfortable. But the good stuff requires
installation. Mechanisms to open and close windows. Dampers. Sensors for
humidity, CO2, CO, temperature, rain, and fire. Wired connections to
everything. (Battery replacement is unacceptable in commercial environments.)
All this is available from major HVAC vendors. Usually for too much money.
(Window openers seem to start at $500 per window, so nobody buys them.)

A few years ago I went to an IoT meetup in SF, in the Dogpatch area. This was
in an old industrial building. Skylights overhead had openable windows on
manual chain falls. Outside windows overlooking the bay were openable. There
were overhead fans and a standard HVAC system. None of this was powered and
controlled. So the place overheated and became uncomfortable with too many
people inside.

In a space like that, as more people come in and the CO2 level goes up, the
overhead windows and side windows should open and the overhead fans should go
into reverse to bring down the CO2 level. Then the side windows should be
adjusted to maintain temperature. As evening comes and the outside temperature
drops, the side windows should close more. At some point, heat may be
required. As people start to leave, the CO2 level will drop, the overhead
windows start to close and the fan speed drops. When everyone leaves, and the
motion detectors see no movement, everything closes up and the temperature is
allowed to drop to 60F or so for the night.
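The policy in that paragraph can be sketched as a simple control function. The thresholds and proportions below are invented for illustration; they are not from any real building controller:

```python
# Toy sketch of the CO2/temperature policy described above.
# Thresholds and proportions are made up for illustration.

def hvac_step(co2_ppm, inside_c, outside_c, occupied):
    """Return (window_open_fraction, fan_speed_fraction) for one step."""
    if not occupied:
        # Everyone has left: close up and let the space cool overnight.
        return 0.0, 0.0
    # Open windows proportionally once CO2 climbs past a comfort level.
    window = min(1.0, max(0.0, (co2_ppm - 800) / 600))
    # As the evening cools, keep side windows mostly closed to hold heat.
    if outside_c < inside_c - 10:
        window = min(window, 0.25)
    # Fans track the windows, with a boost when the room itself is warm.
    fan = min(1.0, window + (0.5 if inside_c > 26 else 0.0))
    return window, fan

print(hvac_step(1400, 27, 20, True))   # crowded, warm room: fully open
print(hvac_step(600, 21, 5, True))     # cool evening, low CO2: closed
print(hvac_step(1200, 24, 22, False))  # empty space: everything shut
```

The hard part, as noted above, is not this logic but the actuators and wiring it needs to drive.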

Some hotels have systems like that in function rooms. Any space with a widely
variable people load, such as a classroom or conference room, should be
equipped with CO2, humidity, and temperature sensors connected to actuators
which can do something about it.

------
bhouston
So Intel is now confirmed to have missed the mark in both the mobile and the
IoT market. Interesting.

------
pjmlp
Ironically, I just saw an ad for this webcast:

"Get into IoT with Intel" -
[https://register.gotowebinar.com/register/108829149494036582...](https://register.gotowebinar.com/register/1088291494940365825)

------
gruturo
The IoT market is going to grow significantly and Intel drops it? Why not
reboot it, after a "lessons learned" exercise and some serious study of,
e.g., the BeagleBoard, Pi, and Arduino ecosystems, to check what works?
(Hint: price matters. Lots of easily driven 3.3V GPIOs matter. Battery life
matters. Sometimes integrated connectivity matters, if it doesn't kill price
or battery life.)

Have they not learned from the Xscale fiasco that dropping a soon-to-be hugely
important market is not the best strategic move?

And it's not like either prediction was at all difficult to make.

They have the technical chops to do it right, their mainstream CPUs are ample
proof of that, and this sector _will_ get a lot bigger. Their top management
either lacks the balls/vision to drive the idea to success, or is acutely
aware that they would never, ever be profitable in a market where the end
product is under $40 (sometimes under $5).

I don't know which alternative is sadder.

~~~
pjc50
Many big corporates, and indeed people, when faced with a choice between "give
up" and "admit mistakes then learn from them", will choose "give up".

~~~
rubyfan
Why is it a given that Intel has to be there? It may be a market that they
are in fact not geared toward. Just because there is a market doesn't mean
they need to be in it, doesn't mean they can make money there, and doesn't
necessarily mean their core market will go away.

Many times companies will try to get into new markets or channels as a hedge,
just in case their cash cow goes tits up. I give them credit for seeing that
it was right to bail.

Closing a $700m unit and laying people off is a pretty big admission of a
mistake - they just see there's no viable course correction. Giving up is
sometimes the best option.

~~~
gruturo
> Why is it a given that Intel have to be there?

Because their name stands for INTegrated ELectronics and they just failed at
it.

> Closing a $700m unit and laying people off is a pretty big admission of a
> mistake - they just see there's no viable course correction. Giving up is
> sometimes the best option.

Agreed, not disputing that, just trying to analyze the reasons behind the
failure.

------
ausjke
Strategically, selling its ARM division was a big mistake; x86-based IoT
obviously cannot make up for that and take off.

Intel has tried the embedded market multiple times, so far without much
success.

I'm now watching the Yocto project, which Intel is the main force behind;
it's a bad idea (over-engineered, to say the least) run by a few good people.

~~~
versale
I'd love to know more about why Yocto is a bad idea and what an alternative
could be. No sarcasm.

~~~
Zigurd
Yocto is not a bad idea in a vacuum. But Yocto doesn't have a good hardware
platform for low cost IoT, so it is limited to a narrow category where Intel's
mainstream CPUs can play. And to the slice of that market that won't go to
Microsoft's IoT variant of Windows, nor to Android Things. I doubt there will
be, for example, a Yocto-based car infotainment system.

~~~
sriram_sun
Automotive Grade Linux is a Yocto based distro for exactly that. I even saw
job postings on LinkedIn for "AGL experience!" After I dug in a little bit, I
found that it was just built on top of Yocto/OE.

[http://docs.automotivelinux.org/docs/getting_started/en/dev/...](http://docs.automotivelinux.org/docs/getting_started/en/dev/reference/source-code.html)

------
microcolonel
The hilarious thing is, Intel probably still has enough top talent to compete
in an open ISA market, and they still have some edge in process technology.
Then again, AMD can probably beat them in the medium to long run.

If they want to tend to these markets, they are going to have to give up on
being the single source.

------
patrickg_zill
I seem to recall that IBM would not even try to enter a new market unless they
could count on a sustainable 1 billion per year in sales.

It seems that Intel, with sales of $721 million after several years of work,
has a similar threshold for new business opportunities.

------
mtgx
Another major Intel "experiment", following the failure in the mobile market,
bites the dust. Next up: Intel's $16 billion FPGA bet? Machine learning chips
seem to be becoming _more_ specialized, not less.

~~~
neurotech1
Designing, prototyping, and manufacturing a machine-learning ASIC with decent
performance costs $10m+, which is why FPGAs still have a large share of the
low-volume and/or high-end market.
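The tradeoff is easy to sketch: the ASIC's fixed development (NRE) cost has to be amortized over enough units to beat the FPGA's higher unit price. The $10m figure is from the comment above; the unit prices are illustrative guesses, not vendor quotes:

```python
# Break-even between a custom ASIC (big up-front NRE, cheap units) and
# an off-the-shelf FPGA (no NRE, expensive units). The $10M NRE figure
# is from the comment above; unit prices are illustrative guesses.

def breakeven_units(asic_nre_usd, asic_unit_usd, fpga_unit_usd):
    """Volume at which total ASIC cost drops below total FPGA cost."""
    if fpga_unit_usd <= asic_unit_usd:
        return None  # the ASIC's per-unit saving never repays the NRE
    return asic_nre_usd / (fpga_unit_usd - asic_unit_usd)

# With hypothetical $50/unit ASIC vs $500/unit FPGA pricing, you need
# on the order of ~22k units before the ASIC pays for itself.
print(int(breakeven_units(10_000_000, 50, 500)))
```

Below that volume, eating the FPGA's unit price is simply cheaper, which is exactly the low-volume niche the comment describes.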

------
xchip
They are late to the party, and their stuff is too complicated and consumes
too much power. Also, BLE devices built around an 8051 (a CPU core from 1980)
won the game.

And despite the hype, IoT was not such a big thing (except for the hobbyist
market, where people, me included, get quite excited about building a
temperature logger).

------
oneplane
More like "Intel to keep shooting itself in the foot, as it has continuously
for 10 years".

We had a comparable theme on the Curie/Edison/Quark post a few weeks back.
They just don't seem to get it.

------
seanwilson
> I'm not sure what AMP is, but it's missed its mark.

AMP pages load super fast for me on mobile, so I'm not sure how you can say
that if you're just basing it on not understanding the design choices behind
AMP.

It would have been nice if the author had investigated why Google recommends
so many JavaScript libraries that seem so large, instead of just assuming it
was a very bad idea. For one, AMP loads the JavaScript in a non-blocking
manner, so the page appears to load fast, and if you've visited an AMP page
before, the JavaScript will be cached (unless there are so many different
versions...?).

------
brooklyntribe
That's all right. They don't get it. Just the way life rolls. The creative
people are gone. Just the nature of innovation.

------
JustFinishedBSG
Except in x86 CPUs, it really seems Intel has absolutely no idea what they're
doing.

