
Why Apple believes it’s an AI leader–and why it says critics have it all wrong - arunbahl
https://arstechnica.com/gadgets/2020/08/apple-explains-how-it-uses-machine-learning-across-ios-and-soon-macos/
======
jointpdf
Setting aside all the other work Apple is doing in this area, can someone
explain why iMessage autocorrect is unacceptably bad (to put it
diplomatically) in the era of weapons-grade language models?

Its behavior is totally bizarre. It’s like an underpowered chess engine that
makes flagrant blunders: capitalizing random words in the middle of a sentence
(rock -> Rock), the context sensitivity of an actual rock, forcing the same
correction multiple times (i.e. you go back to fix its error, and it defiantly
repeats it), contraction mixups, blindness to off-by-one-keystroke errors
(consentuallt -> _<no action>_), and of course the occasional random word
substitution. Only martial artists want to duck people (and it—no joke—just
now substituted [duck -> suck]. You had one job, autocorrect.)

Is this just me? Is it actually 2020? What is going on?

~~~
nfg
The one that drives me mad the most is when I type a word which it incorrectly
autocorrects to a proper noun (which gets capitalised). I then use backspace
to delete that word and type the - completely different - correct one, but it
insists on upper-casing the first letter to match the original casing of the
incorrect autocorrect. It really should keep track of the fact that it
introduced the case change and invalidate it when the automated action itself
is invalidated. This happens frequently to me.

~~~
strogonoff
I can never make iOS write “Hong Kong” when swipe-typing; I tried setting up a
forced text replacement entry (kong -> Kong) to zero effect. It is always “Hong
kong”.

~~~
jsinai
Similarly I can never get iOS to type “Kings Cross”, it always forces “Kong’s
Cross” even when I delete the word and retype.

PS: iOS just did this twice while typing this reply.

~~~
logicprog
iOS for me always autocorrects king -> kong, and I don't understand why. What
even is "kong," and who says it more often than "king"??

------
throwaway202020
You can't be an AI leader when every AI leader is staying away from you with a
ten-foot pole. When Ian Goodfellow joined Apple, there was literally a rain of
criticism on him, from ending his career as a researcher to bowing down to
money. I don't know of any researcher who wants a continued research career
and is willing to join Apple. They simply don't allow that kind of freedom or
publishing of results. While Apple has some strong points in imaging (thanks
to their 1000+ person team) and wrist rejection, virtually everything else
they do that requires AI sucks and lags behind the competition, including
Siri, maps, autocorrect, spell check, iCloud, search, calendar, spam
detection, recommendations, etc. For most of these things, most people don't
even count them as real competition. Google, on the other hand, is able to
achieve very competitive phone-camera performance through software and AI
without such a large team and with frankly quite pathetic hardware.

These kinds of reality distortion pieces aren't going to help them. They have
$100B+ in cash; they could easily start a reputable open research lab that
rivals FAIR, OpenAI, or DeepMind. Even smaller companies like Intel and Adobe
are starting to realize that this is necessary so they can tap into expertise
on demand. At minimum it would be totally worth it for a talent pipeline that
can be motivated to do "rotations" or "sabbaticals" into product groups from
an open lab.

~~~
linguae
It's a shame that Apple doesn't have an industrial research lab; it certainly
has the funding to create a lab that would rival Microsoft Research or perhaps
even legendary labs like Xerox PARC and Bell Labs. Apple used to have a lab:
during the "interregnum" years of 1985 to 1997, Apple had a fantastic research
group called the Advanced Technology Group, led by the late Larry Tesler. In
some ways this group was a sort of spiritual successor to Xerox PARC; Larry
Tesler was ex-Xerox, and the legendary Alan Kay (of Smalltalk fame) and Don
Norman (who wasn't from Xerox PARC but who is a legend in usability) were
involved in this group. This group worked on many interesting and important
technologies, such as Quicktime, AppleScript, OpenDoc, HyperCard, speech
recognition, and more. Even though the Dylan project did not come from the
Advanced Technology Group, it is another example of interesting work that came
out of Apple during this time period. Even though commercially Apple struggled
during the latter half of the interregnum, Apple created some amazing
technologies during this time period.

Of course, the Sculley/Spindler/Amelio Apple seemed to be far more open
regarding research than the Jobs- and Cook-era Apple. Jobs closed down the
labs in 1997 and helped institute Apple's famous culture of secrecy that
persists today.

~~~
odyssey7
Has the culture of secrecy benefited them at all since the launch of the
iPhone?

~~~
sjg007
I think the answer to that question is yes.

------
lacker
_"Google is an amazing company, and there's some really great technologists
working there," he said. "But fundamentally, their business model is different
and they're not known for shipping consumer experiences that are used by
hundreds of millions of people."_

What does he mean by this? Google search and Android are both used by more
than a billion people. YouTube over 2 billion. These are all bigger than any
Apple product. If anything, Google _is_ known for shipping consumer
experiences that reach a large number of people. Apple by contrast is known
for its high-quality, high-price consumer experiences that reach fewer people.

~~~
shrimpx
I thought that was odd, too. My best guess is that he’s emphasizing “known” as
in notorious. Apple is “that company that ships experiences” whereas Google is
that company that records you while you use their apps, or whatever.

~~~
pas
Is that even true? I mean on HN sure, but in the general population... I have
no idea what the perception of Google is.

------
cromwellian
(Googler here) I feel the article presents a false dichotomy (imho a
borderline disingenuous impression) that Apple is doing on-device and everyone
else is doing on-cloud.

Android also does on-device. A lot of the time, things start out on-cloud
until the machine learning model can be shrunk to run on-device at the right
performance. So you see, for example, text-to-speech and speech-to-text
started out as cloud calls, and now they're on-device. Google Translate ran in
the cloud, but now, for some languages, it happens on-device.

Things like Google's "Live Transcription/Caption" on Android wouldn't work if
it wasn't on-device.

Apple similarly went to the cloud for Siri speech recognition and TTS until
they could run it locally.

For other things which need large models, there is Federated Learning to
preserve privacy. Google Keyboard has been using Federated Learning for some
time now.
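
For readers unfamiliar with the technique, the idea behind federated learning
can be sketched in a few lines. This is a toy simulation of federated
averaging, not Gboard's actual implementation; the names (`local_step`,
`fed_avg_round`) and all the numbers are illustrative:

```python
def local_step(weights, data, lr=0.1):
    """One gradient step of a 1-D least-squares fit y = w*x on a device's private data."""
    w = weights
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg_round(global_w, device_datasets):
    """Server broadcasts global_w, each device trains on its own data,
    and only the updated weights (never the raw data) are averaged."""
    updates = [local_step(global_w, d) for d in device_datasets]
    return sum(updates) / len(updates)

# Three "phones", each holding private samples of roughly y = 2x.
devices = [[(1, 2.0), (2, 4.1)], [(3, 6.0)], [(2, 3.9), (4, 8.2)]]
w = 0.0
for _ in range(50):
    w = fed_avg_round(w, devices)
print(round(w, 1))  # converges near 2.0
```

The key property is in `fed_avg_round`: the server only ever sees model
weights, so the training data stays on the device.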

------
Veedrac
Apple's AI work is decent for what it is, and their AI hardware is fine too,
but so much of this article is just weird.

It starts by talking about how until recently Apple wasn't doing AI work where
it needed to be. Then there's the weird excerpt where he claims Google is “not
known for shipping consumer experiences that are used by hundreds of millions
of people.” The author then raises the legitimate point that AI benefits from
having lots of data to train on, but then quotes an answer by Giannandrea to a
different question, which includes him stating that bigger models aren't more
accurate than smaller ones. The point that on-device inference is more
responsive is valid but not unique to Apple; the article says “Android phones
don't do nearly as wide an array of machine learning tasks locally”, but I
don't think this is true.

~~~
jolux
Apple’s chips are way better, so whether they’re doing more locally or not,
they have the headroom to do a lot more.

I would agree with that characterization of Google. They have a very small
handful of successes and an enormous pile of failures that they’ve discarded.
They give every impression of not really knowing what to do.

I think Giannandrea was also referring specifically to the kinds of
experiences Apple can provide on the iOS and iPadOS platform because of their
high end hardware and deep software integration. Google has yet to replicate
that, and even seem to be bored with Android recently:
[https://daringfireball.net/linked/2020/08/05/wear-os-
music](https://daringfireball.net/linked/2020/08/05/wear-os-music)

~~~
IshKebab
What AI failures have Google had? Google Home is much better than Siri or
Alexa. The camera feature in Google Translate works really well. Their new
auto subtitles on Android work really well. Searching photos for objects works
really well.

The only thing I can think of is that thing they demoed that would call
restaurants to book them for you, but that was clearly highly experimental,
and it's not like Apple has done that.

~~~
tedeh
[https://github.com/elsamuko/Shirt-without-
Stripes](https://github.com/elsamuko/Shirt-without-Stripes)

~~~
IshKebab
Ok what's your point? Does Apple's search engine work for this query?

------
Despegar
This piece is interesting because Apple was saying this all along but no one
really believed them because it sounded like excuse making. But here JG is
basically saying the same thing:

>Yes, I understand this perception of bigger models in data centers somehow
are more accurate, but it's actually wrong. It's actually technically wrong.
It's better to run the model close to the data, rather than moving the data
around. And whether that's location data—like what are you doing— [or]
exercise data—what's the accelerometer doing in your phone—it's just better to
be close to the source of the data, and so it's also privacy preserving.

This narrative was at its peak a few years ago, and I believe that was mostly
because Google (and to a lesser extent Facebook) were talking about machine
learning and AI in basically every public communication. What came of
it? Were all the people who claimed Apple's privacy stance would leave them in
the dust proven right? For one, being "good at machine learning" is like
saying you're good at database technology. It's a building block, not a
product. Maybe Google and Facebook are doing cutting edge research in the
field, but so was Xerox PARC.

~~~
moandcompany
When it comes to machine learning, the subtlety here is that there are at
least two sides or facets to machine learning: (1) training and (2) inference.

It's fair to say that there are multiple areas for AI leadership.

It is generally believed that:

Model Creation

(1) Those with access to the best data (which is not necessarily the most,
though it is often believed to be) have a strong starting point for training
models; because of this, Google, Facebook, and Microsoft are often believed to
have this advantage due to the nature of their businesses.

Model Application

(2) Inference/prediction at the edge, e.g. on-device, is believed to be the
best point for applying those models; this can be for a variety of reasons,
including latency and other costs associated with sending model input data
from edge sensors/devices. Some applications are entirely impractical or
likely impossible to achieve without conducting inference on-device. Privacy-
preservation is also a property of this approach. Depending on how you want to
view this, this property could be a core design principle or a side-effect.
Apple's hardware ecosystem approach and marketshare (i.e. iPhones) provide a
strong starting point for making the technology ubiquitous for consumer
experiences.
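
The latency argument in (2) is easy to make concrete. A back-of-envelope
sketch, where every number is an illustrative assumption rather than a
measurement of any real device or service:

```python
ON_DEVICE_INFERENCE_MS = 15   # assumed: quantized model on a phone's NPU
CLOUD_INFERENCE_MS = 3        # assumed: larger model on server hardware
NETWORK_RTT_MS = 80           # assumed: typical mobile round trip, highly variable

def end_to_end_latency(on_device: bool) -> int:
    """Latency the user actually experiences for one prediction."""
    if on_device:
        return ON_DEVICE_INFERENCE_MS
    # Cloud inference pays the network round trip on every single request.
    return NETWORK_RTT_MS + CLOUD_INFERENCE_MS

print(end_to_end_latency(on_device=True))   # 15
print(end_to_end_latency(on_device=False))  # 83
```

Even a much slower on-device model wins once the round trip dominates, and the
gap only grows on a flaky mobile connection.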

~~~
llampx
Re: Prediction at the edge, I would think that it's better if there aren't
going to be any updates to the model. Or if internet access is limited.
Correct me if I'm wrong, but most of the ML inference actually takes place on
the cloud nowadays, not on-device.

~~~
tialaramex
Here's a nice example that, like a lot of the best things, is always passively
present. My Pixel 2 knows what music it hears.

There's a pop song playing, I kinda like it. I _could_ pay attention to the
lyrics and try to Google them or ask somebody that might know what it is... no
need, I just look at my phone, "Break My Heart by Dua Lipa" it says on the
lock screen. The phone will remember it heard this, so if I get home this
evening and check what was that... oh, "Break My Heart by Dua Lipa".

Google builds a model and sends it to phones that opted in to enable this
service. It's not large, and I actually don't know how often it's updated -
every day? Every week? Every month? No clue. But the actual matching happens
on the device, where it's most useful and least privacy invading.
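
A toy sketch of that ship-the-model-and-match-locally pattern: real systems
like Now Playing use learned audio embeddings, so the `fingerprint`/`identify`
names and the coarse "loudest frequency per time slice" encoding here are
purely illustrative.

```python
def fingerprint(slices, window=3):
    """Turn a sequence of coarse spectral peaks into a set of overlapping windows."""
    return {tuple(slices[i:i + window]) for i in range(len(slices) - window + 1)}

# The "model" built server-side and periodically downloaded to the device.
catalog = {
    "Break My Heart - Dua Lipa": fingerprint([4, 7, 2, 9, 4, 7, 1]),
    "Physical - Dua Lipa": fingerprint([1, 1, 8, 3, 5, 2, 6]),
}

def identify(heard):
    """Match a heard snippet against the local catalog; nothing leaves the phone."""
    heard_fp = fingerprint(heard)
    best = max(catalog, key=lambda song: len(catalog[song] & heard_fp))
    return best if catalog[best] & heard_fp else None

print(identify([2, 9, 4, 7]))  # matches "Break My Heart - Dua Lipa"
```

The design point is that only the catalog travels over the network (server to
phone); the audio the phone hears never goes anywhere.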

------
georgespencer
I don't know anyone who believes that Siri is as good as Alexa. I spent four
months self-isolating in an apartment with small hockey-puck Alexa devices in
random corners, then returned to my big open-plan apartment with two paired
HomePods on the kitchen counter.

The frequency with which Siri shits its pants (can't help, asks me to excuse
it being slow as it tries to set a timer, mishears me, etc.) is honestly
remarkable.

(Not to mention the fact that my phone continues to alert me to text messages
I read on my Mac or iPad whole minutes ago.)

Apple is still working to overcome deep problems in both its cloud
infrastructure and its AI/ML. If they cannot be honest about this, they should
at least not dishonestly present a picture of all being well.

~~~
antipaul
Apple says they do on-device, so yeah, why is Siri "thinking" for 20 seconds
when I ask it to set a timer for 10 minutes for the 1000th time?

------
tialaramex
Show don't tell. You don't lead in AI or anything else by insisting everybody
who noticed you're bad at something "has it wrong".

~~~
cwhiz
Did you read the article? The whole point was that Apple has innovated but
that they don’t make a big deal out of it. Marketing, basically.

~~~
rvz
Yes, it's more marketing than research here. They still push their reality
distortion spells on us to believe that they're an AI leader, but in reality
they innovate on the ideas of others; just like old times.

> Did you read the article?

I really hope you have read the HN guidelines [0]

> Please don't comment on whether someone read an article. "Did you even read
> the article? It mentions that" can be shortened to "The article mentions
> that."

[0]
[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)

~~~
cwhiz
I did not read the guidelines. My mistake. You’re right.

------
bambax
> _Machine learning is used to help the iPad's software distinguish between a
> user accidentally pressing their palm against the screen while drawing with
> the Apple Pencil, and an intentional press meant to provide an input. It's
> used to monitor users' usage habits to optimize device battery life and
> charging, both to improve the time users can spend between charges and to
> protect the battery's long-term viability. It's used to make app
> recommendations._

The problem with machine learning altering the behavior of a device is that it
short-circuits human learning. The human brain is very good at learning deep
insights about things and its environment, and it alters its behavior
accordingly.

If things change while we're learning about them, it confuses and upsets us. A
dumb machine is much easier to use than a "smart" one.

~~~
kumarvvr
This might be true for mostly user interface stuff. But for things that are
happening under the hood, a human mind will get accustomed to a gradually
improving user experience.

Apple's mantra seems to be to let everyone use the device as per their whims
and fancies, and let the device figure out how to deal with it.

Ultimately what this leads to is a _user experience_ lock-in, where other
devices that don't adapt feel clumsy or stupid.

------
jml78
Did the interviewer not know enough about ML to even challenge the bogus
statements?

Yes, Apple’s strategy is more privacy-protecting.

But holy hell, yes, models trained on larger data sets are going to be more
accurate, and the resultant model doesn’t have to run in the cloud; it can run
locally.

Do the AI training in a data center with large data sets, then ship the model
to local devices to execute.

That is ALWAYS going to be better and more accurate than what Apple is doing.
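
That train-centrally, run-locally pattern can be sketched in a few lines; a
linear least-squares fit stands in for a real network here, and all function
names are illustrative:

```python
import json

def train_in_datacenter(samples):
    """Least-squares fit of y = a*x + b over a large aggregated dataset."""
    n = len(samples)
    sx = sum(x for x, _ in samples); sy = sum(y for _, y in samples)
    sxx = sum(x * x for x, _ in samples); sxy = sum(x * y for x, y in samples)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return {"a": a, "b": b}

def ship_model(params):
    """What actually goes over the wire: parameters only, no training data."""
    return json.dumps(params)

def on_device_predict(shipped, x):
    """Inference runs locally; the user's input x never leaves the device."""
    p = json.loads(shipped)
    return p["a"] * x + p["b"]

model = ship_model(train_in_datacenter([(0, 1.0), (1, 3.0), (2, 5.0), (3, 7.0)]))
print(on_device_predict(model, 10))  # 21.0
```

Whether this beats purely on-device training depends on the task, but the
serving pattern itself is exactly what both Google and Apple describe.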

------
Spooky23
I agree that Apple is underrated here, and I think their rigid user interfaces
for things like Photos hide the power of their platform.

It seems absurd to let Google Photos slurp up your data server side when your
iPhone can do 80+% of the photo categorization automatically. It’s equally
absurd that Apple has a glacial pace of change for the user side.

~~~
marmshallow
In my and several of my friends' experience, Google Photos search is simply
much, much better than Apple Photos. Even though I use Apple Photos
personally, I have to admit Google's is just better. My friends always say "if
I can't search for my photos easily, what's the point in amassing a large
collection?".

~~~
dmitriid
It's not just search. Google Photos "slurps" photos at 10-20x the rate of
Photos.app, with a clear indication of what's happening and what's being
uploaded.

Photos... just sits there. To get a new photo onto my desktop, it's faster to
open Google Photos on my desktop and download the photo from there than to
wait for both Photos.apps to finish syncing.

~~~
lowdose
It would be nice to have an option to not delete every photo after backing up
to Google Photos.

------
LeicaLatte
Fluff piece. Apple's PR machine learning that techies read Ars Technica.

~~~
x42n
Absolutely! Crafted to 1) create interest and a talent pipeline, and 2)
identify and funnel user frustrations and comparisons.

------
hepinhei
An interview full of nothing... especially when they say Google has no
experience shipping user experiences used by millions of people.

------
majestik
> After a brief pause, he added: "I guess the biggest problem I have is that
> many of our most ambitious products are the ones we can't talk about and so
> it's a bit of a sales challenge to tell somebody, 'Come and work on the most
> ambitious thing ever but I can't tell you what it is.'"

JG, I don’t think that’s your “biggest problem” - Siri is. Your
privacy-centric on-device strategy limits your view of user feedback. Google
gets a lot of shit wrong, but they know how to transmit user data and
understand their feedback.

------
pcr910303
I like this article. I like Apple’s approach to ML because it blends in. When
applied, the feature should not expose that it’s based on ML — if it does,
that’s a failure. By that measure Siri, Alexa, and Google Assistant are
failures, but Face ID and palm rejection are successes.

If you have to explain to customers that something is ML-based, that’s the
same as asking the customers to understand its unreliability. And unreliable
features are worse than no features, which is why nobody uses Siri, Alexa, or
Google Assistant except for a few reliably-working requests.

------
segmondy
Apple is a fashion house. They will be around for quite a while, and still
have time to turn things around, but if they keep at their current pace, they
will be back to irrelevance in two decades. Apple builds beautiful-looking
hardware. Software-wise? Complete garbage.

------
Syeposxr
It's the decline of Intel and the advancements Apple has made in designing its
own processors that I think are really interesting. Having powerful and
efficient processors in mobile devices - both laptops and phones - allows
Apple to do edge computing in a way other companies haven't been able to, and
to integrate ML in a much more privacy-focused way.

------
m3kw9
Aside from what they say, they still can’t get their iOS AI-based spelling
correction up to par.

------
NicoJuicy
I don't have iOS, but I do wish that Android would detect the current language
you are writing in.

I have 3 languages on my keyboard and sometimes it suggests an autocomplete in
the wrong language.

~~~
Krasnol
I have German, English and Polish and it works perfectly.

~~~
NicoJuicy
Not so great with Dutch, French and English though.

Weird

~~~
Krasnol
I was very surprised with how good it actually was for Polish.

It's a hard language, and usually not as well covered by ML as English or
German.

------
lowdose
It's like saying you are the smartest girl in the city.

When you have to say it yourself you probably aren't.

------
amelius
NVidia thinks they're an AI leader too, and they're probably closer to the
truth.

------
innagadadavida
Funnily, the one kind of data that seems to matter for Siri is web and wiki
data for question answering. Siri still uses Wolfram for many trivial
questions. None of this is about privacy or user data, and Siri is behind the
state of the art here.

------
notsureaboutpg
Apple is behind in AI. Google has Google Lens, real-time transcription of
audio recordings, live captions. And Google Assistant is leaps and bounds
ahead of Siri in nearly every way.

What is there in AI that Google doesn't beat Apple at?

~~~
mr_toad
I haven’t seen an Android phone with good face recognition yet.

------
29athrowaway
What is AI? Your washing machine and some rice cookers use AI (fuzzy logic).

Everything uses AI. A program with an if statement is AI.

------
f2322323ffff3
One thing where Apple is shining is brainwashing. It looks very funny when
caravans of people wait for a store opening to hysterically buy an overpriced
smartphone, even if it's technologically two years behind the cheaper Samsung
S20 (except in chip performance, but that doesn't matter anymore, because
there are no tasks that need so much computing power).

~~~
enjeyw
> because there is no tasks that needs so much computing power

I think your own point explains why people are more than happy to use a more
expensive iPhone despite technological differences.

I'm currently developing with a relatively new android phone, but I'd never
use it as an actual day-to-day phone in lieu of my circa 2014 iPhone (believe
me I've tried).

The extent to which I prefer the overall iPhone experience far outweighs any
technological upper-hand that the android has. In the scheme of things, phone
technology has barely shifted in 6 years; a supposed 2 year difference is
utterly negligible.

Side note: what's the deal with the Google Play store?! I tried to download
Facebook Messenger a few days ago for testing, and accidentally downloaded an
entirely different app with a very similar name and icon, as it somehow
managed to occupy the top listing (ad?). It seems obscene that it's so easy to
mount a phishing attack against Android users.

~~~
canofbars
I just switched from a Pixel 2 to an iPhone 11. To me iOS and Android seem
pretty level quality-wise; they both have minor things they do better. But the
number one thing that makes the iPhone better for me is that it does all the
awesome stuff Android does without sending your data away for processing.
Having image recognition without the privacy loss is worth the price tag to
me.

