
How AI and Machine Learning Work at Apple - firloop
https://backchannel.com/an-exclusive-look-at-how-ai-and-machine-learning-work-at-apple-8dbfb131932b
======
Smerity
None of this really answers the overarching question that Jerry Kaplan and Oren
Etzioni raised. The question raised by most in the field isn't whether Apple
use AI/ML internally; the real question is why they avoid the research
community so strongly.

For me, the greatest thing about the ML/AI community is how open it is and how
strong a sense of camaraderie there is between people across the entire field,
regardless of whether they're from industry or academia.

Employees from competing companies will meet at a conference and actually
discuss methods. Papers are released to disseminate new ideas and as a way of
attracting top tier talent. Code is released as a way of pretraining students
in the company's stack before they ever step through the company's doors.
Papers are published on arXiv when the authors feel they're ready - entirely
free to access - without waiting for a conference for their ideas to be
spread.

This culture of camaraderie has accelerated the speed at which research
and implementation have progressed in AI/ML.

... but Apple are not part of that. They publish little and, more broadly, don't
have a good track record. On acquiring FoundationDB, they nixed it, with
little regard for the existing customers. Fascinating pieces of technology
lost. If they aren't using the exact thing internally, why not open source it?
I fear the same is likely to happen to Turi, especially sad given the number
of customers they had and the previous contributions that many of Turi's
researchers made to the community via their published papers.

Apple may change in the future - they may become part of the community - but a
vague article of self-congratulation isn't going to sway me in either direction.

"We have the biggest and baddest GPU farm cranking all the time" ... Really?
¯\_(ツ)_/¯

~~~
IBM
What makes you think this is self-congratulation? This is a response to Marco
Arment, Ben Thompson, and other pundits (largely spurred by the PR efforts of
Google and to a lesser extent Facebook and Microsoft) who claim that Apple is
"behind" in this field or that it somehow poses an existential threat to
Apple. This is Apple saying "uh no".

They'd be perfectly fine with not talking about it as they've done until now
if they weren't trying to counter these media narratives.

As to your point about Apple not engaging with the research community, how is
this a surprise at all? Their M.O. has been secrecy ever since Jobs returned to
Apple.

~~~
Smerity
This article lacks any real evidence they're ahead though, which is
essentially equivalent to "behind" if (a) the majority of the advances are
democratized by your competitors and (b) you lack the ability to attract top
tier talent. If AI/ML is an existential threat to Apple, then this article
doesn't provide any comfort beyond being a PR puff piece.

The talent war for AI/ML is truly insane. Given that the top researchers and
engineers can get similar benefits at other entities, whilst also continuing
to publish code and research out in the open*, Apple aren't that attractive.

They may try compensating for this via acquisitions but that still leaves a
fundamental issue when it comes to long term retention.

* As I mentioned before, this is a fairly universal trait. Many come from
academia or have open source affiliations / rely on open source tools for
their skill set, so the idea of putting their ideas out there to be used, ...

~~~
IBM
If what AI/ML talent wants is to do research and draw a Silicon Valley salary
then you're right, they're not going to work at Apple.

Some people don't want to work at Apple because they can't describe their job
on their LinkedIn pages or tell their friends and family about what they do.
Others don't like that they have to buy their own meals at Caffe Macs. I don't
think Apple worries about this too much.

What they do worry about is finding people that fit Apple's culture and want
to work on products used by hundreds of millions of people.

In any case, being perceived or not to be at the top of the field and being
able/unable to hire the best ML/AI talent isn't an existential threat. ML/AI
isn't magic, it's a technology. If there's one thing that the tech industry
should have learned from Steve Jobs, it's that you make good products by
working backward from the customer experience to the technology. In that
respect, what Federighi mentioned
about not having a ML/AI division was reassuring.

~~~
sangnoir
> If what AI/ML talent wants is to do research and draw a Silicon Valley
> salary then you're right, they're not going to work at Apple.

Let me simplify that further: "If what AI/ML talent wants is to
_publish_...then you're right, they're not going to work at Apple". Money may
or may not be a factor, but Apple's secrecy won't allow the talent to publish
their research. Given that AI is highly driven by research right now (no "if"
about it), what talent would want to work somewhere where they can't publish
research papers?

~~~
fsloth
"what talent would want to work somewhere where they can't publish research
papers?"

There are a lot of people out there who don't particularly subscribe to the
publish-or-perish ethos of academia and would rather publish only after they
have something worthwhile to publish, and not just footnotes to an established
scheme.

E.g.
[https://www.theguardian.com/commentisfree/2014/feb/14/higgs-...](https://www.theguardian.com/commentisfree/2014/feb/14/higgs-boson-publish-or-perish-science-culture)

and so on.

------
throwanem
I like how Apple can't win here.

If they publish, they aren't doing anything new and they haven't innovated
since Steve died, and they should really just give up because there's
obviously no point to anything they do and hasn't been since 1997.

If they don't publish, they're evil secretive bastards who don't contribute to
the ML community and probably drown puppies or something because who knows
what goes on behind closed doors?

I don't really have a dog in this fight, except inasmuch as I'm a generally
satisfied iPhone owner. I just think it would be really neat if people would
settle on one narrative or the other, instead of keeping on with both at once.

~~~
CogDisco
If they were serious, they'd provide objective evidence for their claims.
Papers, source code, numbers. Not "we have the biggest, baddest GPU farm" and
refuse to say any more as policy. Under this veil of secrecy and doubletalk,
all they have is marketing: Trust the brand rather than objective reality.

Many other companies are mature enough to show their cards but Apple keeps
declaring victory. Apple writes its own narrative instead of participating.

There are mature ways for this to shake out and they don't include your
strawman dichotomy.

~~~
BinaryIdiot
> If they were serious, they'd provide objective evidence for their claims.
> Papers, source code, numbers. Not "we have the biggest, baddest GPU farm"
> and refuse to say any more as policy.

But why? Who cares about papers, source code or numbers beyond the tiny
segment that HN caters to? Apple has always been about the user experience.
When they discuss the vast majority of their prior accomplishments, they tend
to do so in layman's terms. Does that mean they can't manufacture? Nope. AI is
simply harder to show off without the papers and source code you mention, but
releasing those wouldn't be typical of Apple.

I honestly don't think they care to go beyond the level of detail outlined
here. I've interviewed with the Siri team twice now, and they certainly have some
incredibly smart people. Whether they're "winning" against Google, Microsoft
or whoever? I say who cares. They want to control their narrative without
divulging too much detail like they have always done.

~~~
Chronic9q
> But why? Who cares about papers, source code or numbers beyond the tiny
> segment that HN caters to?

I don't know... say, the thousands of future engineers and researchers who,
you know, make the product?

~~~
throwanem
Yes. That tiny segment.

~~~
niels_olson
In my mind, this article is squarely aimed at that segment. They are trying to
tell one of the most competitive labor markets in the world, AI developers,
that they're hiring, but more importantly, that they're doing interesting
things.

------
cromwellian
I realize this is going to get voted down, but this piece reads like a Trumpism:
"we have great AI, trust me, you would not believe it, if you could see what
we're doing".

I don't think you get to claim PR credit for advances in ML unless you
publish. In general, for R&D, IMHO, you need to publish. There's product R&D,
and there's fundamental R&D. If you make an advancement in something
fundamental, but that helps your product, then publish it. If it is specific
to your product only and can't be transferred elsewhere, then maybe it's ok to
keep it secret.

Apple and Google's competitive advantage now arises from scale and path
dependency. I think they need to let go of this idea that somehow they derive
a competitive advantage by keeping these things secret. The open AI community
is going to advance at an accelerated rate regardless, and IMHO it's better to
be part of it than to be seen as a kind of parasite that consumes public R&D
but doesn't give back improvements.

~~~
wodenokoto
> If it is specific to your product only and can't be transferred elsewhere,
> then maybe it's ok to keep it secret.

Wouldn't it be the other way around? If the competitors can benefit from your
knowledge, you'd want to keep it secret.

~~~
cromwellian
No, if everyone can benefit, that, IMHO, is precisely why it should be
contributed back to the community. If you want a selfish reason to be
altruistic, well, the likely improvements that the external community makes
will benefit you, including the contributions made by your competitors.

I think a culture of secrecy yields local optima. Only if you believe (and it's
true) that your company has unique geniuses who can't benefit from other people
reviewing their science will secrecy benefit you.

IMHO, only research that is useless to your competitors is research to keep
secret in the sense that it is too specific to your own proprietary
dependencies.

------
theinternetman
I find these heavily curated advertorials Apple has been pushing out (this and
the recent Wired advertorials come to mind) a bit of a sign that not
everything is sunny at One Infinite Loop.

~~~
ocdtrekkie
Probably no more so than Google. Steven Levy is basically the go-to when you
want this sort of stuff. For example, [https://backchannel.com/how-google-is-remaking-itself-as-a-m...](https://backchannel.com/how-google-is-remaking-itself-as-a-machine-learning-first-company-ada63defcb70#.3gjh9zpy1)

------
mikesf888
IMO the most profound quote in the interview: “Our practices tend to reinforce
a natural selection bias — those who are interested in working as a team to
deliver a great product versus those whose primary motivation is publishing,”
says Federighi.

------
tahoeskibum
Nice article, but in my perception Apple is way behind Google in AI on mobile.
Siri's current speech recognition is still far behind Google's on my iPhone.
Half the time Siri doesn't recognize something that Google recognizes right
away. Apple Maps continues to lag in features like biking and public transit
directions, and even things like which lane to take on a big interchange. As a
result I end up using Google (Now) and Google Maps by default.

~~~
m_mueller
Never mind biking. For example in Switzerland Apple Maps doesn't even seem to
take traffic information into account when it gives you ETAs. It shows heavy
traffic as red spots, but it doesn't know the delay. Google meanwhile gets all
of this perfectly.

------
KKKKkkkk1
Why should Apple researchers contribute back to the AI community on their
shareholders' dime? What makes AI special as opposed to any other field of
computer science?

~~~
oblio
The long term goal is loftier than almost all others except for extending
human life and populating other star systems.

~~~
mwfunk
That's an opinion. I don't think you could get a random group of 3 AI
researchers to agree on what the long term goal is, or if there even is a long
term goal as opposed to this simply being a very buzzword- and marketing-
friendly area of research at the moment.

------
vthallam
> If you’re an iPhone user, you’ve come across Apple’s AI, and not just in
> Siri’s improved acumen in figuring out what you ask of her. You see it when
> the phone identifies a caller who isn’t in your contact list (but did email
> you recently). Or when you swipe on your screen to get a shortlist of the
> apps that you are most likely to open next. Or when you get a reminder of an
> appointment that you never got around to putting into your calendar. Or when
> a map location pops up for the hotel you’ve reserved, before you type it in.
> Or when the phone points you to where you parked your car, even though you
> never asked it to. These are all techniques either made possible or greatly
> enhanced by Apple’s adoption of deep learning and neural nets.

This whole paragraph must be a joke. Google has been doing all of this for
ages, and they don't even bill these as their best features.

------
soared
I've never had a single one of these happen to me. Has anyone actually seen
these behaviors out in the wild? Is it because I use Gmail, Chrome, and Google
Maps, and shut off most of the Siri/recommended-apps/etc. functions in favor
of serious battery life gains?

>You see it when the phone identifies a caller who isn’t in your contact list
(but did email you recently). Or when you swipe on your screen to get a
shortlist of the apps that you are most likely to open next. Or when you get a
reminder of an appointment that you never got around to putting into your
calendar. Or when a map location pops up for the hotel you’ve reserved, before
you type it in. Or when the phone points you to where you parked your car,
even though you never asked it to

~~~
Thlom
Nope, but it often tells me how long it will take to drive to work (I never
drive to work), even when I'm on the other side of the country, on holiday,
and have been for two weeks.

~~~
laichzeit0
So glad I'm not the only one. How do you turn this crap off?

~~~
Thlom
It will take 524 minutes to drive to work. Traffic is normal.

------
LeanderK
I am really disappointed by Apple. I respect Apple's wish to develop products
in secrecy, and I understand that you can't just open source your secret sauce.
I also really like their products.

But not publishing your advancements harms the community greatly. It's like
building your product entirely with open-source software (the published work
of other researchers) and not contributing back.

~~~
soperj
That's what they did with OS X.

~~~
posterboy
OSX is much more than the kernel. Darwin is open source, even. What are you
talking about?

~~~
leereeves
How much of Apple's investment in OS X was in Darwin, and how much was in
proprietary technology?

Apple built a proprietary product using open source technology, and are now
building proprietary products using open ML research.

But it seems they're doing so for their own profit, not to benefit the open
source and research communities.

~~~
coldtea
> _Apple built a proprietary product using open source technology_

You say it as if it's somehow contradictory. Open source licenses allow (LGPL
etc.), and some even welcome (BSD, MIT, Apache), creating proprietary software
based on open source technologies.

Besides, Apple also extended open source technologies (as open source) a hell
of a lot. WebKit after Apple is 100x bigger/fancier/better than the puny KHTML
it started from. Tons of LLVM work, especially all the early stuff, was
sponsored by Apple. And Swift was made open source just recently...

~~~
leereeves
Apple's core "product" is the user experience. Darwin is open source, Carbon
and Cocoa are not.

I think that philosophy will continue as they use ML tools. They might share
the ML equivalent of plumbing like Webkit/LLVM/Swift, but probably not
improvements to the user experience like Siri's brain.

~~~
coldtea
> _Apple's core "product" is the user experience. Darwin is open source,
> Carbon and Cocoa are not._

So? Why should they open source and commoditize their core product?

Besides, I don't know any company that did it and got much out of it, except
for some gratitude from OSS fans.

~~~
leereeves
I'm not saying they should, I'm saying they wouldn't (and probably shouldn't).

------
msoad
Apple sends their employees to ML/AI conferences with fake company names on
their badges to avoid leaking a single bit of their knowledge. I don't know
how any AI researcher resists working at Apple!

~~~
thanatropism
So - I don't need a source on this to grill you and score internet rhetoric
points; I need a source to share this juicy factoid with skeptical friends.

------
vonnik
I have a big problem with articles like this.

Apple's PR is notorious for cracking the whip, which means that the "inside
story", if they give it to you, comes with a warning to the journalist to
behave and be nice. Levy's piece is generous with flattery and cautious with
criticism. He quotes Kaplan and Etzioni early and briefly in the piece, and
spends the rest of it refuting them. Apple will give him another inside story
down the road.

Apple has a big question to resolve for itself about the tools it's going to
use to develop this. It can't go with TensorFlow, because TF is from Google.
It's kind of at another turning point, like the one in the mid-90s when it
needed its own operating system and Jobs convinced them to buy NeXT and use
what would become OS X.[0]

The most pointed question to ask is: What are they doing that's new? The use
cases in the Levy story are neat, and I'm sure Apple is executing well, but
they don't take my breath away. None of those applications make me think Apple
is actually on the cutting edge. There's no mention of reinforcement learning,
for example; there is no AlphaGo moment so far where the discipline leaps 10
years ahead. And the deeper question is: Is Apple's AI campaign impelled by
the same vision that clearly drives Demis Hassabis and Larry Page?

We see what's new at Google by reading DeepMind and Google Brain papers.
Everyone else is letting their AI people publish, which is a huge recruiting
draw and leads to stronger teams. Who, among the top researchers, has joined
Apple? Did they do it secretly? (This is plausible, and if someone knows the
answer, please say...) The Turi team is strong, yes, but can they match
DeepMind? If Apple hasn't built that team yet, what are they doing to change
their approach?

Another key distinction between Apple and Google, which Levy points out, is
their approach to data. Google crowdsources the gathering of data and sells it
to advertisers; Apple is so strict about privacy that it doesn't even let
itself see your data, let alone let anyone else see it. I support Apple's
stance, but I
worry that this will have repercussions on the size and accuracy of the models
it is able to build.

> “We keep some of the most sensitive things where the ML is occurring
> entirely local to the device,” Federighi says.

Apple says it's keeping the important data, and therefore the processing of
that data, on the phone. Great, but you need _many_ GPUs to train a large
model in a reasonable amount of time, and you simply can't do that on a phone.
Not yet. It's done in the cloud, on proprietary racks. So when he says they're
keeping it on the phone, does he mean that some other encrypted form of it is
shared in the cloud using differential privacy? Curious...
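For context, the simplest local differential privacy mechanism, randomized response, is easy to sketch. This is a generic illustration only, not Apple's actual mechanism; the Boolean signal and the 0.25 flip probability are arbitrary choices for the example:

```python
import random

def randomized_response(truth: bool, p_flip: float = 0.25) -> bool:
    """Report the true bit most of the time, but flip it with probability
    p_flip, so no single report can be trusted (plausible deniability)."""
    return (not truth) if random.random() < p_flip else truth

def estimate_true_rate(reports, p_flip: float = 0.25) -> float:
    """The aggregator knows the noise rate, so it can invert it:
    observed = p_flip + true_rate * (1 - 2 * p_flip)."""
    observed = sum(reports) / len(reports)
    return (observed - p_flip) / (1 - 2 * p_flip)

random.seed(0)
true_bits = [random.random() < 0.3 for _ in range(100_000)]  # 30% of users say "yes"
reports = [randomized_response(b) for b in true_bits]
print(estimate_true_rate(reports))  # close to 0.3, though every single report is noisy
```

The point being: the server only ever sees deliberately corrupted individual reports, yet population-level statistics survive, which is the trade Apple is gesturing at.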

> "How big is this brain, the dynamic cache that enables machine learning on
> the iPhone? Somewhat to my surprise when I asked Apple, it provided the
> information: about 200 megabytes."

Google's building models with billions of parameters that require much more
than 200MB, and that are really, really good at scoring data. I have to
believe either that a) Apple is not telling us everything, or b) they haven't
figured out a way to bring their customers the most powerful AI yet. (And the
answer could very well be c) that I don't understand what's going on...)
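For a rough sense of scale (my own back-of-envelope arithmetic, not a figure from the article), a 200 MB cache bounds the parameter count by the bytes spent per weight:

```python
# Back-of-envelope: how many weights fit in a 200 MB on-device model cache,
# depending on how each weight is stored?
cache_bytes = 200 * 1024 * 1024

for dtype, bytes_per_weight in [("float32", 4), ("float16", 2), ("int8", 1)]:
    params = cache_bytes // bytes_per_weight
    print(f"{dtype}: ~{params / 1e6:.0f}M parameters")
```

So even quantized to int8, 200 MB holds on the order of 200M parameters, an order of magnitude short of "billions", which is what makes the quoted number surprising.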

[0] If they have a JVM stack, they should consider ours:
[http://deeplearning4j.org/](http://deeplearning4j.org/)

~~~
c0g
Good comment - a few thoughts:

AlphaGo is impressive, no doubt, but has DeepMind done anything really key to
Google's bottom line yet? A lot of the really sweet stuff they do doesn't have
immediate commercial utility that I can see. Apple might be waiting to strike
once Google has found the killer app (remember, they're never really first at
anything, and they focus holistically on the product).

Apple is benefiting from other companies releasing their research. If everyone
but Apple releases, the community is nearly as strong as it would otherwise be,
and Apple gets the pick of external and internal research while not needing to
give up any of its own ideas. I know they send people to conferences, and it
can be a bit weird talking to someone who won't tell you anything about what
they do.

Regarding researchers, I don't know of any top trend-setters who've joined, but
they do have some very good applied ML people through direct hires or acqui-hires.

Tl;dr: Apple are doing what they usually do: keeping their powder dry and
free-loading off others' work until they can execute the product.

~~~
davidcgl
DeepMind reduced Google's data center cooling bill by 40% in an experiment. If
that effect is real, it probably more than justified the $500M price tag Google
paid to acquire DeepMind. Google might not have come up with a killer app, but
perhaps the real killer app just isn't in the consumer space.

[https://deepmind.com/blog](https://deepmind.com/blog)

[https://news.ycombinator.com/item?id=12126298](https://news.ycombinator.com/item?id=12126298)

~~~
j1vms
To be fair, the HN discussion was mostly focused on the lack of substantial
details or an actual research paper to back up that claim. Google would need to
show that only DeepMind, or an equivalent system, could plausibly have found
the energy savings it ended up finding.

------
devy
There are a lot of criticisms here of Apple's secretive AI/ML development
practices. But it's not unusual given Apple's long cultural heritage of secrecy.

From a consumer's perspective, I applaud their firm belief in customer privacy,
as well as their pioneering of consumer products based on AI/ML AND built with
differential privacy in mind.[1][2]

[1] [http://highscalability.com/blog/2016/6/20/the-technology-beh...](http://highscalability.com/blog/2016/6/20/the-technology-behind-apple-photos-and-the-future-of-deep-le.html)

[2] [http://www.imore.com/our-full-transcript-talk-show-wwdc-2016...](http://www.imore.com/our-full-transcript-talk-show-wwdc-2016-phil-schiller-and-craig-federighi)

------
quattrofan
This feels so much like a story done from a marketing/PR angle.

------
castell
Is there a way to opt out of this on iOS and macOS? That's really scary.

ML and AI at the OS level should run decentralized, on the device itself, and
not leak data at all. That was the spirit of the 1990s, and older desktop
software works fine that way (even on Pentium 1 hardware), so it would be a
piece of cake on a modern smartphone.

The "differential privacy" technology may sound good, but without an
independent audit, who knows how well it works.

------
dstaten
Siri has gotten worse over time, or at least that's what friends and I have
noticed.

~~~
k-mcgrady
I've noticed it get drastically better. I never used to be able to use it at
all. It was just a pointless exercise. Now it works for everything I use it
for, which isn't a lot, but I don't get any issues.

------
emehrkay
Off topic a bit, but this article makes me wonder where does one start with
Deep/Machine Learning/AI? I've seen a few posts the past few days talking
about the topic (Deep Learning with Python, etc.), but what are the core
requirements regarding math, statistics, programming, etc? Where should a web
developer start?

~~~
starchild3001
Check out the Stanford MOOC class taught by Andrew Ng... that's where most
people (that I know of) started.
[https://www.google.com/search?client=safari&rls=en&q=Andrew+...](https://www.google.com/search?client=safari&rls=en&q=Andrew+Ng+machine+learning&ie=UTF-8&oe=UTF-8)

Web developer to ML? Doable, but you may have to work harder than folks who
have a background in statistics, probability, signal processing, etc. Plus, you
don't have to be an algorithms developer... doing the backend / compute cluster
work is hugely valuable, too.

~~~
emehrkay
Thank you, I actually just found him on Coursera
([https://www.coursera.org/learn/machine-learning](https://www.coursera.org/learn/machine-learning))
before your reply. I will definitely check it out.

I don't think that I want to make the switch, just know what is what and
dabble a little bit.

------
plg
Has anyone considered the possibility that perhaps Apple has made a judgement
that the recent resurgence in AI/ML is (like all of them before, over the past
50 years) overblown, and Apple would rather spend their time and resources on
other things?

There's no doubt that the recent advances in deep learning have improved ML/AI
in certain specific domains... but it seems like every 15-20 years we see an
advance and an accompanying narrative that "AI is back! The fully automated
future is near!"... which then fizzles out, again.

Also, Apple has a more humanist tradition than Google, FB, etc, and it's my
impression that they value the human element perhaps more.

Sure, there's Siri, but Siri strikes me more as an ongoing experiment than a
fully fledged, whole-hog "let's put all our eggs in this ML/AI basket" effort.

------
caycep
They were certainly at NIPS. I got a nice Apple pen and ski hat...

------
doe88
It isn't clear from the article: do they still use Nuance?

