
Intel discontinues Joule, Galileo, and Edison product lines - rbanffy
http://hackaday.com/2017/06/19/intel-discontinues-joule-galileo-and-edison-product-lines/
======
nickpeterson
I'm calling it: Intel is going to be a shadow of its former self in 5 years.
They have had tremendous trouble competing in almost anything outside of x86
processors over their entire lifespan. They consistently get outmaneuvered in
GPUs, SSDs, low-power SoCs, machine learning, et cetera. The x86 platform that
they currently own is mostly theirs due to inertia and fab-facility advantages.

AMD was able to launch a pretty competitive CPU despite massive delays
because Intel has barely improved the IPC of its processors over the last 5
years.

Meanwhile, Apple is betting on iPads being the future computer of the everyman,
and they make their own chips. Microsoft recently acknowledged that Windows
basically has to run on ARM to future-proof their platform. I
guarantee you'll start seeing more ARM-based Windows computers soon.

Intel recently told everyone they're willing to sue for patent money, the last
desperate act.

Intel better have a leapfrog cpu in the pipeline or it's over.

~~~
Twirrim
Intel has been "dead" before. AMD has "beaten it" before.

AMD knocked it out of the park with x86_64, which allowed a seamless
transition to 64-bit. Intel ended up having to license the x86_64
implementation from AMD.

AMD beat Intel with the K6 and similar series of chips where, just like this
time around, they were able to get way more performance per tick out of the
CPU. Intel was supposedly dead in the water due to its toaster-era P4 chips
that ran hot as hell and consumed way more power to get the same job done.
AMD started making some serious inroads in the server CPU market with the
early Opteron processors. Following that era, out came the Centrino era of
mobile processors, which took a different approach to CPU architecture from
the P4 and set things up for the Core 2 Duo series and on into the i7s and
the like.

I'm highly skeptical that Intel is any more dead now than it was then. It has
a track record of coming back and completely changing the whole story all over
again, and it has the financial resources to keep on doing so.

~~~
kbenson
> AMD beat Intel with their K6 and similar series of chips where, just like
> this time around, they were able to get way more performance per tick out of
> the CPU.

There was an interesting submission the other day about performance of the
Ryzen vs. the i7, and how Ryzen's AVX2 instruction support isn't what it's
cracked up to be[1]. I'm not really qualified to assess the source or claims
accurately, so I'll let others read it themselves and come to their own
conclusions, but it was interesting.

1:
[https://hashcat.net/forum/thread-6534-post-35415.html](https://hashcat.net/forum/thread-6534-post-35415.html)

~~~
adrian_b
Those benchmarks mean absolutely nothing, because they obviously knew almost
nothing about the processors they were trying to use.

1\. "Ryzen's AVX2 support is a bold-faced lie" To say this only shows complete
ignorance. It was publicly known for many years that Ryzen would have only
128-bit AVX units, compared to the 256-bit AVX units of Haswell and its
successors.

Nevertheless, using 256-bit AVX is still preferable on Ryzen, to reduce the
number of instructions, even if the top speed per core is half of that reached
by Intel.

2\. The benchmark results just show incompetence. While the top speed per core
is half, the number of cores is double, so you just need to run twice as many
threads for a Ryzen to match the speed of the Intel.
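That scaling point is just throughput math. As an illustrative sketch (not a Ryzen benchmark), here is how a hashing workload can be split across more workers in Python; CPython's hashlib releases the GIL when hashing large buffers, so threads genuinely spread across cores:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

BLOCK = b"\x00" * 65536  # large enough that hashlib releases the GIL

def hash_chunk(n_blocks):
    """CPU-bound work: SHA-256 over n_blocks large buffers."""
    h = hashlib.sha256()
    for _ in range(n_blocks):
        h.update(BLOCK)
    return n_blocks

def run(workers, total_blocks=512):
    """Split total_blocks of hashing evenly across `workers` threads."""
    per_worker = total_blocks // workers
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(hash_chunk, [per_worker] * workers))

if __name__ == "__main__":
    # Half-speed cores, twice the workers: same aggregate throughput,
    # provided the workload splits cleanly like this one does.
    print(run(2), run(4))  # prints "512 512"
```

Whether doubling workers actually doubles throughput depends on the workload being embarrassingly parallel, which hash cracking is.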

It is true that an i7 7700K will retain a small advantage, because of higher
IPC and higher clock frequency, but the advantage for correct programs is
small, not like the large advantages shown by those incompetent benchmarks. I
have both a 3.6 GHz / 4.0 GHz Ryzen and a 3.6 GHz / 4.0 GHz Skylake Xeon, so I
know their behavior from direct experience.

While the 4-core Intel retains a small advantage in AVX2 computations over the
8-core Ryzen, there are a lot of other tasks, e.g. source program compilation,
where Ryzen has almost double the speed, so you should choose your processor
depending on what is important to you.

3\. The most stupid benchmark results are for SHA-1 and SHA-256. Ryzen already
implements the SHA instructions that are also implemented in Intel Apollo Lake
processors (to boost the GeekBench results against ARM) and will also be
implemented in the future Intel Cannonlake processors (whose 2-core version is
expected to be introduced this year).

If they had benchmarked a correct program that uses the SHA instructions,
Ryzen would have trounced any Kaby Lake processor.
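For a rough sense of what such a benchmark measures, here is a minimal SHA-256 throughput sketch using Python's hashlib, which delegates to OpenSSL; whether hardware SHA extensions are actually used underneath depends on the OpenSSL build and the CPU, so treat the number as whatever your stack delivers, not a statement about any particular chip:

```python
import hashlib
import time

def sha256_throughput(total_mib=64, block_kib=64):
    """Hash total_mib MiB of zeros in block_kib KiB chunks.
    Returns (MiB per second, final hex digest)."""
    block = b"\x00" * (block_kib * 1024)
    iters = (total_mib * 1024) // block_kib
    h = hashlib.sha256()
    start = time.perf_counter()
    for _ in range(iters):
        h.update(block)
    elapsed = time.perf_counter() - start
    return total_mib / elapsed, h.hexdigest()

if __name__ == "__main__":
    rate, digest = sha256_throughput()
    print(f"{rate:.0f} MiB/s  sha256={digest[:16]}...")
```

Comparing the reported rate across machines (or across OpenSSL builds with and without SHA extensions) is the honest way to test the claim above.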

------
kardianos
FYI, as someone who worked with the Edison, I can't say I'm surprised. Flashing
the Edison was nearly impossible and a big pain. The hardware routinely
crashed. Much of the low-power "specs" came from (overly) aggressive power
management, which introduced momentary delays and pauses. The GPIO stopped
working reliably at higher speeds, despite their spec claims. I'm happy to see
them go and never want to work on such a platform again.

By contrast, the Raspberry Pis and even the Ci20 are significantly more stable
and easier to work with. Their specs are far more truthful.

~~~
mwambua
I tried using the Galileo when it came out and was similarly disappointed. It
claimed Arduino compatibility but had IO so slow that it couldn't interface
with a DHT11 temperature/humidity sensor. In the end I got the feeling that
Intel hadn't really thought things through :/

~~~
diabeetusman
You understate the issue: the GPIO pins had a default throughput of 230 Hz
[[https://communities.intel.com/message/207904#207904](https://communities.intel.com/message/207904#207904)]
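To put a number like 230 Hz in perspective, measuring an effective toggle rate is straightforward. The sketch below stubs out the pin object; a real one would be something like libmraa's `Gpio`, whose `write()` goes through sysfs or memory-mapped registers, so only the measurement harness is demonstrated here:

```python
import time

class StubPin:
    """Stand-in for a real GPIO pin object (e.g. mraa.Gpio). A real
    pin's write() would hit sysfs or memory-mapped registers, which
    is exactly where the Galileo's slowness came from."""
    def __init__(self):
        self.state = 0
    def write(self, value):
        self.state = value

def toggle_rate_hz(pin, duration=0.25):
    """Count full high/low cycles per second on `pin`."""
    cycles = 0
    end = time.perf_counter() + duration
    while time.perf_counter() < end:
        pin.write(1)
        pin.write(0)
        cycles += 1
    return cycles / duration

if __name__ == "__main__":
    print(f"{toggle_rate_hz(StubPin()):.0f} cycles/s")
```

Run the same harness against a real pin object and the default sysfs-backed Galileo pins land in the hundreds of cycles per second, per the linked thread.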

------
edmundhuber
Why Edison failed:

    
    
      - it was too expensive compared to other BLE- and WiFi-capable SoCs or combinations of chips.
      - x86 compatibility doesn't matter.
      - power draw (~1W) is too high for the places where one would want to use this SoC.
      - the Yocto-based SDK was a mess. Every feature had a caveat and it was a pain to build.
      - there was never a clear commitment from Intel that they would make these in bulk for manufacturing.

The new hotness is the Espressif (ESP32) and MediaTek (MT7697) SoCs.

    
    
      - low power draw (~300mW), even lower in sleep (50mA - nA depending on what kind of sleep),
      - the SDK is FreeRTOS-based,
      - the "MCU features" like GPIO, PWM, etc., actually work all the time.

~~~
userbinator
_- x86 compatibility doesn't matter._

On the contrary, I'll say that it _does_ matter, and that's why the Edison
failed. It was x86, but not truly "IBM PC-compatible". Those who didn't care
about PC compatibility were unlikely to choose x86 over something like ARM,
and for those who did, the Edison was useless.

If Intel had chosen to put an entire "real" PC on the SoC, with plenty of
legacy peripherals and such, so that it could (with suitable I/O interfaces
attached) basically act as a lower-powered desktop or laptop, I'm almost
willing to bet it could've turned out very differently. They could've found
applications in things like this now-dead product, for example:
[http://www.pcworld.com/article/2873118/mouse-box-wants-to-
st...](http://www.pcworld.com/article/2873118/mouse-box-wants-to-stuff-a-full-
pc-into-a-mouse.html) (discussed at
[https://news.ycombinator.com/item?id=8931999](https://news.ycombinator.com/item?id=8931999)
)

Intel's strength is the immense backwards-compatibility of x86 and the PC
architecture, but in trying to make a not-quite-PC platform, they basically
threw away their competitive advantage.

~~~
pawadu
> not truly "IBM PC-compatible"

It's an embedded system, not a low-power desktop.

If you are using it like an ordinary PC then you are probably using it wrong.

~~~
vertex-four
Well that's the point. Nobody wants an x86 IoT device. They _might_ want an
x86 embedded device - where embedded is defined in the large, i.e. systems
that control lots of integrated peripherals with user interactivity built into
the device, and lots of complex built-in functionality.

But at that point you're usually looking at proper Linux-based boards with
lots of standard IO - even HDMI output - in the ARM space, and Edison provides
absolutely nothing over those. It might've provided something if you could
reasonably treat it as a bog-standard x86 computer with some extra
functionality attached.

~~~
pawadu
> Nobody wants an x86 IoT device.

No. You can't define other people's use cases.

For myself, as long as the device meets my requirements (peripherals, power,
tools, size, price) it can use POWER8 or a PDP-11 for all I care.

~~~
vertex-four
> For myself, as long as the device meets my requirements (peripherals, power,
> tools, size, price) it can use POWER8 or a PDP-11 for all I care.

In which case, you'll likely be using a _significantly_ cheaper ARM board
which does all the same stuff, uses less power, probably uses a more standard
distro, etc. etc. Which brings us back to: nobody wants an x86 IoT device.
They don't fit anywhere ARM doesn't fit better.

~~~
pawadu
Please read my comment again.

~~~
jokr004
He did a perfectly fine job of reading your comment... you implied that you
only cared about certain variables: "peripherals, power, tools, size, price".
vertex-four points out that "x86 IoT devices" (i.e. these Intel chips we are
talking about) are a bad choice in regard to those things that you care
about...

~~~
pawadu
I am not sure about that...

First of all, you don't know my requirements, yet you declare ARM the winner.
What if, for my particular use case, Tensilica is the best choice?

Furthermore, you generally cannot define what IoT means for other people. For
one person it could mean an 8-bit garage door opener; for someone else it
could be an octa-core 64-bit monster.

Finally, Intel has a simplified x86 design with very good power usage for use
in IoT. This CPU is not used in the Galileo and the like today, but it exists.

~~~
jokr004
I didn't declare anything... I have no opinion here, I was just explaining
what the other guy was saying.

------
JonathonW
Here's the PCN for discontinuation of Edison:
[http://qdms.intel.com/dm/i.aspx/C5E58142-4E04-4CBD-A7A6-BF33...](http://qdms.intel.com/dm/i.aspx/C5E58142-4E04-4CBD-A7A6-BF330573055D/PCN115579-00.pdf)

And Joule:
[http://qdms.intel.com/dm/i.aspx/C3391A8F-693F-418B-B9B5-03A7...](http://qdms.intel.com/dm/i.aspx/C3391A8F-693F-418B-B9B5-03A75113F08B/PCN115580-00.pdf)

------
crusso
This is unsurprising. My experience with the Edison: Cool little product with
a lot of potential, but the stability problems, lack of timely releases of
updates, lack of support for common libraries in their package management
system, etc. were all bad signs. I never got the feeling that it would be safe
to build a product around the Edison.

------
SEJeff
That's too bad; the Edison was a nice small developer board. The NUC is quite
a few steps up and not quite for the same target market (I've got a few). I
wonder if they're going to continue attempting to compete with ARM, or if they
have just realized they lost the low-end battle with x86.

~~~
pjmlp
Intel's problem is that there are lots of boards to choose from, and x86
compatibility is irrelevant in the IoT space.

Especially when talking about CPUs good enough for high-level languages, like
the ESP32 (hello PCW 1512).

~~~
djsumdog
I don't think x86/PC compatibility is irrelevant. ARM is not a standard
platform. It's just a SoC where manufacturers hook random crap to random pins
and make patched-to-hell, non-upstreamable kernels:

[http://penguindreams.org/blog/android-
fragmentation/](http://penguindreams.org/blog/android-fragmentation/)

Windows Mobile ARM at least required UEFI, but their bootloaders are locked.
Most mobile phones don't support device tree. Even on ARM boards that support
device tree, hardware support is still hit and miss:

[http://penguindreams.org/blog/review-clearfog-
pro/](http://penguindreams.org/blog/review-clearfog-pro/)

I think there is a space for x86/UEFI embedded devices. Maybe AMD should
try to jump back into this space. A newer Geode?

~~~
pjmlp
My remark about it being irrelevant is due to the fact that, for IoT
applications, one doesn't care about backwards compatibility with existing
applications.

IoT deployments are usually software developed for a special use case.

Then if the target platform is powerful enough to allow C, C++, Rust, Java,
Lua, MicroPython, Pascal, Basic, <whatever language with rich library>, then
the actual OS is also kind of irrelevant.

I am not thinking of boards to run GNU/Linux or Windows, mimicking a desktop
experience.

Maybe it shows my 80's background, but for many use cases Arduino-like
bare-metal development is more than good enough, hence x86 being irrelevant
when one has a high-level language with a nice abstraction SDK.

------
sbierwagen
Fortunately this doesn't mention anything of the Euclid line, like their
moderately cool single-box CV thing: [https://click.intel.com/intelr-euclidtm-
development-kit.html](https://click.intel.com/intelr-euclidtm-development-
kit.html)

We can only hope that someone at Intel has realized IoT is a total tarpit, and
is getting out of the product segment entirely.

~~~
lowglow
We had a chance to play with the Euclid yesterday! That thing is really
exciting, but gets quite hot. One of our participants was hacking an
autonomous agent with it:
[https://www.facebook.com/radbotsapp/videos/310382552738200/?...](https://www.facebook.com/radbotsapp/videos/310382552738200/?fref=mentions)

------
quickben
And by the look of it, when the server and HEDT CPUs hit the market, they'll
discontinue a lot more.

They got cozy with the monopoly; it seems the bills have arrived.

~~~
sitkack
They should have spun this line out to a new company. Intel's behavior around
x86 mirrors Microsoft's around Win32. A classic innovator's dilemma. This move
only hastens the exodus.

~~~
pjmlp
At least Microsoft seems to have a plan with .NET, Azure and to certain extent
UWP.

Intel on the other hand is still searching apparently.

~~~
sitkack
The new Microsoft has made amazing strides in transforming itself.

------
baybal2
How ironic: the most successful Intel MCU was the 8051. It is 40 years old,
yet still rockin'.

~~~
boznz
In the 1980s the Intel 8031, 8051, and 8052 were the #1 embedded processors.
(The 8051 is still in the keyboard you're probably typing on.) However, the
patents expired and it's now made by everyone else, but ironically not Intel!

Intel currently has nothing for the smaller (non-operating-system) embedded
market, which is still mostly 8-bit and low pin-count, and as everyone has
stated, ARM has already won the fight for 32-bit (though I do also use the
PIC32, which is MIPS).

~~~
leggomylibro
What about a 64-bit architecture for the Cortex-M market? They could pull off
some wizardry focused on energy efficiency, and maybe target peripheral
functions involving GPU-like parallel processors for small-scale ML/AI/etc.
purposes.

I dunno, it seems like there might be a market for that sort of thing. You
train your model, pop it on a chip that consumes microwatts per megahertz?
Something like that could be appealing.

It might also be impossible. I don't design chips. But I do think targeting
both mobility and parallel processing could be cool. Maybe something like what
Parallella is doing.

~~~
boznz
64-bit architectures and GPUs are totally different beasts, requiring MMUs and
OSes and whole development teams.

My last project used a 14-pin processor with 195 lines of bare-metal C code
compiling to 486 bytes of flash memory and running on an internal 32 kHz
clock. This is more the target 8051 market, though I must admit some Cortex-M0
processors are getting as cheap to use here.

The Propeller chip is awesome (no interrupts and 8 processors is a really cool
concept), but at $8 it is going against the big boys (Freescale/ST/Microchip)
with their more flexible memory, power management, and rich peripheral sets. I
would love for one of the big players to license the Propeller core, but it
won't happen.

~~~
boznz
...Forgot that the Propeller 1 core IS open source now, and still no takers; a
real lack of vision out there.

------
franciscop
Ah, I am so lucky I never switched to the Galileo, though I was tempted at its
release. For some reason I thought it wouldn't work out. Something felt off,
and I could get by well enough with the Raspberry Pi.

Reading the Hackaday comments, the failure probably stems from the
documentation and Intel's doing, not the technology itself. I am guessing that
open source OR community > closed source or company (as in Raspberry Pi with a
great community vs. Galileo, or Arduino vs. anything else) for these kinds of
things.

------
oneplane
Intel just doesn't seem to get how this works. You can't just make a platform
and then throw it away and expect people to like your brand...

~~~
FlorianRappl
Not the first time they've done it. Now they are on "IoT" and "Machine
Learning". Beforehand it was "NUI" and the "AppStore". Before that they tried
other areas, too. What happened to their cross-platform endeavor with the XDK?

~~~
CamperBob2
B....b....but _FPGAs!_

Intel is flopping around on the beach like a dying fish. They rested on their
laurels for far too long.

~~~
yaschobob
They did $59bn in revenue last year lol

~~~
CamperBob2
The first derivative of that number is what's important.

[https://media.ycharts.com/charts/62ec44ed2571caa3dbba144b0c7...](https://media.ycharts.com/charts/62ec44ed2571caa3dbba144b0c7f6a9f.png)

Watch it over the next few years, let's see what happens.

------
etqwzutewzu
What does this mean for the Android Things (formerly Brillo) project? The
Intel Edison and Intel Joule were two of the few supported boards.

~~~
pjmlp
Android Things is cool to reuse Android knowledge, but one is better off with
a board that supports GNU/Linux directly, as there is support for whatever
programming language one feels like using.

~~~
naikrovek
Ehh, not always. It is often best to avoid a full OS in favor of something
with less complexity. Android Things and Windows 10 IoT are attractive for
this reason, among others.

~~~
pjmlp
Android Things and Windows 10 IoT are full OSes.

Given that you mention it, from a hobby developer perspective, I would rather
pick W10 IoT, because at least Microsoft does offer proper support for C++,
including easy integration with .NET, unlike the dev experience with the NDK.

------
thrillgore
I will never understand why Intel sold off StrongARM/XScale, it seemed like it
could have been pivoted into their own IoT offering.

~~~
duskwuff
That would have required Intel to predict the IoT craze in 2006.

~~~
thrillgore
What's there to predict? ARM has always had a stronger low-power presence than
x86. Instead of betting everything on Atom, they should have kept XScale on
hand as a second option. They still have a license to make ARM chips, but the
capability that XScale represented never should have been sold.

------
noen
I worked on a handful of products using Edison, and was speaking to Intel less
than a month ago about their Joule line at a conference, where they assured me
Joule was the future.

All of these chipsets had (and still have) huge promise, but have been mired
in really puzzling and terrible board design issues.

You can tell that there are two different groups at Intel: the "Core" group
and the "IoT" group.

The Edison was super powerful, price-competitive, and an honestly wonderful
platform to dev on. Yocto, while a weird choice, was a pretty vanilla Linux
flavor and easy to pick up.

With all that promise, though, they botched the silicon. The 2nd CPU on the
Edison, the 100 MHz Quark, never actually worked. It was shut off in firmware
from day 1 because of presumed hardware issues.

Even worse (and the reason we stopped using the Edison), the SPI bus had so
much electrical crosstalk from not being properly routed or shielded that you
couldn't use it at anything over 25 Hz with a SINGLE bus endpoint. This
removed 90% of the real-world uses for the Edison to drive displays, sensor
and motor arrays, et al. Intel knew it was a problem and consciously decided
not to rev the board to fix it.

The Galileo and Joule are both underpowered and incredibly overpriced devices.
Today, the Raspberry Pi 3 is the hobby standard, and in nearly every real-
world use case it is orders of magnitude more performant at 10% or less of
the cost.

Intel IS in trouble, because this is their third botched attempt to enter the
world of embedded and mobile computing.

First was the Atom, which isn't bad but couldn't compete with ARM on power.
They made some good efforts here, but the cost was higher and perf/watt
significantly lower than ARM's.

Second was their foray into mobile, trying to branch out from the Atom. Anyone
here ever use an Intel-powered phone? They spent billions on it, never to have
a mass-market device actually appear. Same problems: while it had equivalent
performance to ARM, prices were 30-50% higher and performance per watt was
significantly worse.

Now here we are with attempt 3, with the same issues. Intel fundamentally
doesn't know how to design, manufacture, or sell embedded chips.

It's a completely different market motion: different customers, different
constraints, shorter cycles, and a much, much different competitive landscape.

AMD isn't going to "beat" Intel. They have fundamentally the same problems.
Neither AMD nor Intel is going to go bankrupt, but they are going to continue
the slide into much smaller-scale manufacturing.

They are both being eaten by the dozens of ARM vendors, by the FPGA movement,
and by public cloud data centers. It's death by a thousand cuts, making it
that much more difficult to do anything about.

~~~
canada_dry
> assured me Joule was the future

It's certain that the decision came from the finance dept, not from the
sales/marketing folks. Those folks were there because they truly wanted Intel
to be a leader in IoT.

Just like Texas Instruments' failed attempts, Intel got into this game
thinking they could make decent margins and that their brand would clobber the
little guys (e.g. Eben and Massimo).

Turns out properly supporting the IoT community actually requires passion and
expensive commitment.

On a side note, my take is that Arduino is quickly heading toward irrelevance.
With their myriad of products, they are spread too thin. New products (going
as far back as the Yun) get very little in the way of proper
support/documentation, and the company infighting is a terrible distraction
that is hurting the brand.

------
rocky1138
Boo! I really liked my time working with the Edison. A great platform and so
tiny!

------
georgeburdell
Long overdue. I had hope that Galileo might be an Arduino/Raspberry Pi
competitor, but Joule was blatantly an attempt to recoup some of the cost of
their already-cancelled smartphone SoC program.

Why now? They just announced that they're cutting spending down to 30% of
revenue by 2020: [https://www.fool.com/investing/2017/05/12/intel-
corporation-...](https://www.fool.com/investing/2017/05/12/intel-corporation-
puts-out-long-term-spending-targ.aspx)

------
wastedhours
All in on Compute Cards, then? Or at least, on the next attempt to get into
the IoT sphere with an overpriced and poorly community-supported ecosystem...

~~~
5ilv3r
You mean the EOMA68 knockoff Intel announced right after the CrowdSupply
campaign was fully funded?

------
jaboutboul
This document only talks about the Galileo boards. Where did you see mention
of Edison and Joule?

That would totally suck as we are pretty heavily invested in Edisons.

~~~
protomyth
A better link [http://hackaday.com/2017/06/19/intel-discontinues-joule-
gali...](http://hackaday.com/2017/06/19/intel-discontinues-joule-galileo-and-
edison-product-lines/) with the links to all the cancellations.

~~~
kleiba
From the relevant document:

Intel Corporation will discontinue manufacturing and selling all skus of the
Intel® Edison compute modules and developer kits. Shipment of all Intel®
Edison product skus ordered before the last order date will continue to be
available from Intel until December 16, 2017. Last time orders (LTO) for any
Intel® Edison products must be placed with Intel by September 16, 2017. All
orders placed with Intel for Intel® Edison products are non-cancelable and
non-returnable after September 16, 2017.

~~~
gravypod
Will this ruin the usability for some people? What happens to the thrown away
stock?

~~~
rbanffy
I'd buy a dozen ;-)

------
zwieback
It's not like Intel hasn't tried new things over the years. Before becoming a
huge ARM CPU vendor (XScale), they had the i960 MCU, which was pretty good,
and the failed i860 VLIW, which was super promising for graphics and image
processing, but the compilers never delivered.

It's just that x86 was always so huge that all the other projects never got
traction.

------
Dylan16807
What ever happened to that original idea for an SD card form factor for
Edison?

~~~
digi_owl
[https://en.wikipedia.org/wiki/Intel_Edison](https://en.wikipedia.org/wiki/Intel_Edison)

~~~
Dylan16807
That just says "they changed it", with no elaboration?

~~~
kbumsik
Yeah. They changed their mind without any explanation.

------
aflam
IoT does not yet seem to stand for Intel of Tomorrow! At least not when it
comes to boards or processors. Maybe they are just more convinced by the
potential of vertical solutions like Mobileye's?

------
kodfodrasz
Not surprised: these products only ever saw press releases. In my bubble they
were not used in any products, not to mention the costs.

------
hellofunk
The Edison was quite a neat little machine. Fun to work with and super
powerful despite its tiny size. Sorry to see Intel abandon it.

------
signa11
here is a weird thought: how easy or hard would it be for intel to just buy
out arm from softbank? iirc, softbank bought arm for around 32b usd; intel,
with around 160b of market cap, can 'easily' buy it from softbank, no?

~~~
lunchables
I don't think they could "easily" buy a 32B company, no. I also think it might
run afoul of regulators, i.e. antitrust.

------
0xakhil
I wonder what happens to those who built products with these boards inside.

~~~
svens_
Obviously they're fucked. Luckily we are still in the hardware design phase,
otherwise we'd be even more pissed.

I don't think anyone built a product with enough volume that Intel would
reconsider the discontinuation.

Long-term component availability is a major issue for hardware products.
Discontinuing a product basically overnight is not a nice move from Intel, and
I hope people will remember this when Intel launches their next IoT/robotics
product.

------
bayofpigs
I enjoy experimenting with IoT boards but have never understood the pricing of
Intel's offerings. Joule, Galileo, & Edison were many times the price of their
ARM brethren. The only reason to pay so much was if you were stuck in Windows.
The Curie, however, is a powerful board at a decent price.

------
mspokoiny
they lost their position forever

------
deepnotderp
Surprising exactly 0 people.

