
Apple Is Working on a Dedicated Chip to Power AI on Devices - coloneltcb
https://www.bloomberg.com/news/articles/2017-05-26/apple-said-to-plan-dedicated-chip-to-power-ai-on-devices
======
highd
This is probably going to be a hyper-parallel fixed-point/integer engine
like TPU gen1. Doing fast matrix multiply over really small fields is very
subpar on CPUs and GPUs. That was the initial reasoning behind TPU gen1:
improving runtime performance.
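A rough sketch of the kind of op such an engine would accelerate, with NumPy as a stand-in (the symmetric quantization scheme here is purely illustrative, not any vendor's actual scheme):

```python
import numpy as np

# Symmetric int8 quantization: map floats onto [-127, 127] with one scale.
def quantize_int8(x):
    scale = np.abs(x).max() / 127.0
    q = np.round(x / scale).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 8)).astype(np.float32)  # weights
a = rng.standard_normal((8, 3)).astype(np.float32)  # activations

qw, sw = quantize_int8(w)
qa, sa = quantize_int8(a)

# The hot loop: int8 x int8 matmul accumulated in int32 (int8 would
# overflow), then rescaled back to float. A fixed-point engine does
# exactly this in bulk, far more cheaply than float32 hardware.
acc = qw.astype(np.int32) @ qa.astype(np.int32)
result = acc * (sw * sa)
```

The quantized product closely tracks the float32 product, which is why inference tolerates the precision loss.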

One question is if it will architecturally be closer to a GPU or an FPGA. The
field moves so fast that it might make sense to "future-proof" a bit with a
live-reconfigurable FPGA.

~~~
jacquesm
I'd bet on this being an ASIC; doing this on an FPGA with any serious-size
matrix would require a very expensive FPGA, whereas an ASIC would allow more
gates in a smaller volume and would consume less power to boot.

From the manufacturer's point of view, phones not being future-proof is a
feature, not a bug; that way you'll upgrade to that new shiny item, which
will keep the profits rolling in.

~~~
wklauss
> From the manufacturer's point of view, phones not being future-proof is a
> feature, not a bug; that way you'll upgrade to that new shiny item, which
> will keep the profits rolling in.

I don't think a better AI chip will be a convincing argument to change a phone
one year later.

~~~
RandomBK
Not if you put it that way. Apple can simply make new AI features exclusive to
newer phones with updated versions of the chips. If they open the chip up to
developers, this effect can spill out to the app store. That would then
provide the impetus to push consumers to upgrade.

~~~
LrnByTeach
You nailed the Apple plan; they did the exact same thing starting from the
introduction of Siri (4S?), ... then Force Touch, etc.

With each iteration of the phone, they add a shiny new feature that is ONLY
available on the newer model.

~~~
moonforeshot
I don't see how Force Touch can be added to older hardware...

~~~
ferbivore
It can't be. Night Shift, on the other hand...

------
JumpCrisscross
Could this mean on-device ANI? My deal breaker with Amazon, Google, Microsoft
and even Siri is their role in normalising the hoovering up of sensitive data.

~~~
yellow_postit
More work needs to be done on training models with less data, differential
privacy, and unsupervised learning, but as long as supervised learning
remains the main path forward for the current set of "AI", centralizing the
data into ginormous datasets will continue to be the norm.

~~~
halflings
I don't see how unsupervised learning makes this any better? That data you're
training on in an unsupervised manner is still collected somewhere, and could
contain as much private information as a labeled dataset.

~~~
visarga
Yep, whether it's supervised or not has no bearing on privacy. What counts is
where the data is processed and who has it.

~~~
yellow_postit
Labels for supervised training tend to come from humans in the loop. I think
many would consider another human looking at their photos, searches, etc. to
be a loss of privacy, albeit with a small surface area.

------
jchw
Why is Bloomberg not mentioning that Google announced it was working on the
same thing? They mentioned vaguely that Amazon and Google both were working on
AI, but nothing about the seemingly similar TPU and how Google announced they
were going to bring it to phones at I/O just a bit ago. Am I wrong to be
thinking that's pretty relevant here?

~~~
ex3ndr
Because Google's TPU is for servers, not for mobile.

~~~
jchw
Pretty sure they announced during I/O that they were working on making a TPU
for mobile.

~~~
hn_throwaway_99
Do you have a link? I don't remember this, and I didn't find anything through
googling.

~~~
TechRomancer
I'm definitely doubting myself now. I remember watching the keynote and
thinking how they didn't make a bigger deal out of the on device chip.

The only thing similar I can find is at about 1:22:00 in the keynote here
[https://youtu.be/Y2VF8tmLFHw](https://youtu.be/Y2VF8tmLFHw) but all he
actually says is "silicon specific accelerators".

So honestly at this point it could mean anything.

~~~
xxgreg
Related:

"Google has clearly committed to this vision of AI on the phone. At I/O, the
company also unveiled a custom-built chip for both training and running neural
networks in its data centers. I asked Google CEO Sundar Pichai if the company
might build its own mobile chip along the same lines. He said the company had
no plans to do so, but he didn’t rule it out either. “If we don’t find the
state-of-the-art available on the outside,” he said, “we try to push it
there.”"

"Companies such as Intel are already working on this kind of mobile AI
processor."

[https://www.wired.com/2017/05/google-really-wants-put-ai-pocket/](https://www.wired.com/2017/05/google-really-wants-put-ai-pocket/)

Edit - also:

"There’s already one mobile processor with a machine learning-specific DSP on
the market today. The Qualcomm Snapdragon 835 system-on-a-chip sports the
Hexagon DSP that supports TensorFlow. DSPs are also used for providing
functionality like recognizing the “OK, Google” wake phrase for the Google
Assistant, according to Moorhead."

[http://www.pcworld.com/article/3197412/mobile/heres-how-google-is-preparing-android-for-the-ai-laden-future.html](http://www.pcworld.com/article/3197412/mobile/heres-how-google-is-preparing-android-for-the-ai-laden-future.html)

------
shauder
Knowing them, this will be pretty good. The A10 is a beast.

~~~
johansch
Knowing Apple's (software) prowess in AI the end-result will still likely be
shit compared to Google.

(I think what we are seeing here is the usual thing where Apple's
software/product/design people decide the iPhone hardware roadmap, rather than
the hardware people.)

~~~
yalogin
What is your basis for this? Am curious.

~~~
midhunsezhi
Siri vs. Google Assistant. Google Assistant is way better at understanding
voice and performing tasks. I've used both an iPhone and a Pixel, and I can
confirm that Google Assistant is way smarter than Siri at this point.

~~~
mathieuh
Don't know why you're being downvoted, this has been my experience too. Siri
feels more like a toy with its infuriating jokes and stunted capabilities,
Google Assistant feels much more polished.

I have confidence that attempting a command I've never tried before with
Google Assistant will work, with Siri it's potluck.

~~~
Grazester
Yeah, it's not even up for debate. Google is way ahead of Apple in AI and
just about anything else we could name.

------
hackuser
My assumption has been that Google, Amazon, and Microsoft run the heavy-duty
AI in the cloud when possible, benefiting from huge scale and easier updates.
Maybe that assumption is wrong?

If it's right, is Apple adopting a more decentralized model, with AI (or more
AI) running locally? Could that compete with cloud-based AI's advantages?
Obviously it would be better for offline usage, for responsiveness when
transmission to the cloud is a significant part of the latency, and for
confidentiality.

~~~
happycube
Google's been working on distributed training as well.

~~~
hackuser
Why? What is the benefit to Google?

Also, are they doing training for the local user or for Google's 'general'
systems or for both?

~~~
bitmapbrother
So they can do on-device AI/ML with TensorFlow Lite, through the use of
specialized neural network DSPs, as discussed during the keynote at I/O 2017.

[https://youtu.be/Y2VF8tmLFHw?t=1h22m8s](https://youtu.be/Y2VF8tmLFHw?t=1h22m8s)

------
deepnotderp
Well....

This is interesting indeed, although I suppose it was somewhat inevitable.

I'm definitely interested in the architectural details of the chip, but I
doubt Apple will open up. Apple has control of the software stack and by
extension, what models will run on this chip, so I expect that it will be a
little bit more special purpose than general purpose.

------
cft
I have been worried about this trend: if they don't open it up, things like
this introduce a disparity between startups that can only have access to GPUs
and big companies that make their own proprietary ASICs for their proprietary
software, such that startups cannot compete.

~~~
spott
The flip side is that there is pretty obviously a market for such a product.
If it isn't released by Google or Apple, it will be released by someone else.
If it isn't, then that's a pretty good idea for a startup.

~~~
cft
Only well-funded startups will make ASICs, and most of them will fail. This
is very different from many small startups programming general-purpose
computers.

~~~
asimpletune
So then maybe the key is a start up that's in the business of raising the
chance of success that other players having in this endeavor?

------
wyldfire
> The chip, known internally as the Apple Neural Engine,

Is this a real IC/processor for arbitrary software or an abstraction of an
underlying GPU/DSP?

~~~
mtgx
Most likely some kind of dedicated deep learning accelerator. This is coming
with or without Apple:

> _Exynos 8895 features VPU (Vision Processing Unit) which is designed for
> machine vision technology. This technology improves the recognition of an
> item or its movements by analyzing the visual information coming through the
> camera. Furthermore, it enables advanced features such as corner detection
> that is frequently used in motion detection, image registration, video
> tracking and object recognition._

[http://www.samsung.com/semiconductor/minisite/Exynos/w/solut...](http://www.samsung.com/semiconductor/minisite/Exynos/w/solution/mod_ap/8895/)

> _New Vision Processing Unit (VPU) paired to the Image Signal Processors
> (ISP) that provides a dedicated processing platform for numerous camera
> features, freeing up the CPU and GPU and saving power._

[https://www.mediatek.com/products/smartphones/mediatek-helio-x30](https://www.mediatek.com/products/smartphones/mediatek-helio-x30)

I was also hoping that, given the TPU's high efficiency, Google would make a
version for mobile as well, at least for their Pixel phones. I guess they
still might, but the fact that the TPU2 does both training and inference
makes me think they won't do that anytime soon.

The biggest reason I like this "mobile AI chips" trend is that they can
give you back some privacy, if the data can be analyzed _locally_ without
going to the vendors' servers, and I think they will also boost the
capabilities of computational photography. No more spying toys for kids, etc.

~~~
protomyth
What does the instruction set look like on that? Is there another example out
of this type of chip?

~~~
MBCook
Google has their TensorFlow chips (the TPUs); I would imagine it is similar.

~~~
protomyth
Is there released documentation on the instruction set of the Google Cloud
TPU?

~~~
pjscott
They list the main ones on pages 3-4 of this paper:

[https://drive.google.com/file/d/0Bx4hafXDDq2EMzRNcy1vSUxtcEk...](https://drive.google.com/file/d/0Bx4hafXDDq2EMzRNcy1vSUxtcEk/view)

It's mainly moving memory around, matrix multiplication, convolution, and
applying activation functions (sigmoid, tanh, relu, etc.). Very simple, high-
level stuff. This has the handy side-effect of making timing very predictable,
which makes the latency a lot more deterministic.
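In toy Python terms, the data flow those instructions express for one dense layer looks roughly like this (the instruction names in the comments follow the paper; the code itself is only an illustrative sketch, not the real ISA):

```python
import numpy as np

# Toy model of the TPU's per-layer flow: read operands, matrix multiply,
# apply a hardwired activation function, write results back out.
def tpu_layer(activations, weights, fn="relu"):
    # Read_Host_Memory / Read_Weights: operands land in on-chip buffers.
    acts = np.asarray(activations, dtype=np.float32)
    w = np.asarray(weights, dtype=np.float32)
    # MatrixMultiply: the systolic array's bread and butter.
    acc = acts @ w
    # Activate: one of a small fixed set of nonlinearities.
    nonlinearities = {
        "relu": lambda x: np.maximum(x, 0.0),
        "sigmoid": lambda x: 1.0 / (1.0 + np.exp(-x)),
        "tanh": np.tanh,
    }
    # Write_Host_Memory: result goes back to the host.
    return nonlinearities[fn](acc)
```

Because every layer reduces to this fixed sequence with known operand sizes, the cycle count is easy to predict up front, which is where the deterministic latency comes from.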

------
strin
Neural Engine = General Matrix Multiplier?

------
philplckthun
It's nice that the article tries to deliver an intro explaining that Apple
clearly has some catching up to do.

Except that now I'm pretty baffled, since I saw an article a few months ago
that said Apple is massively investing in AI and is already using it in
several places in their products.

So what am I supposed to believe now? :/

~~~
luhn
Believe them both.

Apple _is_ investing massively into AI and is using it in their products.
However, Google has been working with AI longer and has much more experience.
(The article praises Amazon's AI chops as well, I dunno about that one.)

------
ojosilva
I think the general availability of TPUs is an important inflection point on
the path to AI popularization and, who knows, some type of singularity event.
Definitely a milestone in 21st-century history.

But I can't resist making a parallel between evolving TPUs and how the CPU
found in the arm of the T800 changed history (negatively) forever in the
Terminator universe.

------
ge96
That's what we need! A bunch of GPUs and computers carrying a car battery,
haha. Man, that would be crazy. Pre-trained before it leaves the factory.
(Don't know what I'm saying.) But I do imagine a man-sized humanoid robot
with a bunch of GPUs and hard drives.

------
bonoetmalo
I'm not too familiar with the concept of ML specific chip designs, but isn't
most AI done on servers and the results returned to the device? What kind of
applications involve local ML code execution?

~~~
msh
If you want to protect user privacy it's a very relevant selling point to be
able to say that the data never leaves the device.

------
faragon
Apple partially offloading Siri from the cloud to client devices in order to
reduce datacenter costs?

------
samfisher83
Why not use a GPU? A lot of AI stuff is linear algebra: multiply-accumulate,
etc.

~~~
Symmetry
Just as going from scalar to vector instructions provides a speedup, so does
going from vector to matrix instructions. If you've got big vectors, the
amount of extra parallelism exposed per unit of hardware execution resources
isn't too big, but the reduction in register file read port usage is pretty
significant.

Also, inference is usually happy with int8s whereas graphics workloads are
mostly float32s. So you can save a lot of hardware that way too.
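The storage side of that trade-off is easy to see (illustrative NumPy):

```python
import numpy as np

# int8 weights take a quarter of the memory (and memory bandwidth) of
# float32 weights -- a big win for an on-device inference engine.
weights_f32 = np.zeros((1024, 1024), dtype=np.float32)
weights_i8 = np.zeros((1024, 1024), dtype=np.int8)

ratio = weights_f32.nbytes // weights_i8.nbytes  # 4x smaller
```

The same factor applies to the multiplier hardware itself: an int8 multiplier is far smaller and cheaper than a float32 one.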

~~~
dnautics
Why are graphics workloads float32? 32-bit "true color" (16 million colors
plus alpha), which is higher color resolution than most eyes can see, is
3 8-bit ints plus an 8-bit alpha channel (sometimes).

~~~
kgwgk
GPUs are not (only) about representing pixels, they are (mostly) about
geometric computation.

------
Geee
Are these kinds of chips used to accelerate the NN training process only?

------
gigatexal
Good. But Google beat them to it already.

~~~
npgatech
How so? Google developed a server-class TPU for datacenters. Apple is trying
to build an on-device, low-powered custom chip.

~~~
bitmapbrother
Google announced it at I/O. They're using DSPs specialized for neural
network processing on the SoC, with TensorFlow Lite on the device.

[https://youtu.be/Y2VF8tmLFHw?t=1h22m8s](https://youtu.be/Y2VF8tmLFHw?t=1h22m8s)

~~~
npgatech
Oh wow, thanks. I feel like Apple is a step behind in pretty much all things
AI.

~~~
gigatexal
Yeah, Siri has been lame compared to Google's "Hey Google" for a long time.

------
ClammyMantis488
Very interesting...

