
Apple’s announcement on artificial intelligence is a big shift for the company - monsieurpng
https://www.washingtonpost.com/news/the-switch/wp/2016/06/13/apples-big-announcement-on-artificial-intelligence-is-a-massive-change-for-the-company/?hpid=hp_hp-cards_hp-card-technology%3Ahomepage%2Fcard
======
Animats
It's not so much about AI as it is about putting all services behind an Apple
interface. "Users will soon be able to use Slack, Uber, or Skype, by talking
directly to Siri." That doesn't mean launching the vendor's app. It means
bypassing it.

Apple is taking control of the user experience with third parties. It's the
next generation of the "portal" concept. Expect to see Apple standards on what
your API needs to look like.

~~~
sametmax
Yep. First, take control of the distribution with the stores. Then take
control with the payment system. And now take control of the data input.

The golden jail is getting a new door.

~~~
samastur
New room. New door would be fine, golden or not ;)

~~~
marcosdumay
It's a one-way trap-door.

------
abruzzi
> Such technology can make the phone or other device appear smarter because it
> anticipates the types of activities people want to do.

Currently, all the attempts I've seen to anticipate what I want make
applications far more annoying (do you want to send this email to your mother
as well?). It's kind of like an uncanny valley: an application that could
truly anticipate my needs would be good, but when it tries to anticipate and
gets it wrong, it becomes worse than the application that doesn't try and
silently waits for me to give it instructions.

~~~
TheOtherHobbes
Is this even a solvable problem?

Imagine a human PA who knows you really well. You'll still have to clarify
what you want from him/her a good fraction of the time.

I suspect AI lives in a special kind of uncanny valley where we're _less
tolerant_ of mistakes and imperfections than we would be if we were telling
humans to attempt the same tasks.

I'm not sure why this is, but it could be because in spite of all the bugs and
failures, we still expect computers to be far more predictable and reliable
than humans.

If AI doesn't match the expectation, it's perceived as more frustrating and
less useful than perhaps it really is.

------
mklarmann
Google made AI so central at their last conference – as the next big thing –
that I can't help thinking Apple was forced to follow that path. So they used
the terminology of "AI" and "deep learning" in their presentation. Yet it
didn't make me confident that they are really up to it – that they really have
the expertise to turn this into something that works (even on the phone
itself! – all the other deep learning algorithms need huge frameworks with
GPUs!)

Myself, I wasn't able to experience the promised virtues of AI from Google
itself. So I wouldn't go so far as to call it a bluff from Google. But I
definitely would from Apple, as of now.

~~~
simonh
I see it more as a back-and-forth. If anything Apple fired the first shot in
anger in the platform AI wars by integrating Siri into iOS. Before then voice
control was just simple vocal commands and dictation.

It seemed for a while though that Apple was falling behind. Google Now and
Cortana overtook them in capability and now they're fighting for the lead
again.

I wonder what happened. Perhaps the original Siri team didn't properly gel
into the Apple corporate culture? I know the founders left after a few years.
Anyway, it looks like Apple now have a solid internal Siri team able to push
the platform forwards.

------
MaysonL
Note that the words "Artificial Intelligence" were not once uttered during the
keynote.

~~~
ansgri
These are taboo for a significant part of the general population, almost like
formulas in slides.

~~~
pc2g4d
AI has already gone through at least one prior boom and bust cycle.
Expectations in this area have a tendency to get wildly inflated. Does AI mean
general AI? Or just a cleverly applied machine learning algorithm? Seems wise
to me to stay away from the term.

------
comex
> For example, Apple will now scan your photos using facial recognition to
> cluster people together in your photo collection.

Apple's OS X photo apps have done facial recognition since 2009. This new
thing is more advanced as well as a first for iOS, but I'd hardly call it a
"big shift".

~~~
philjohn
The first is doing it on a mobile device, locally, without sending data to the
cloud. That's pretty big, and it has only become possible recently with the
performance increases in mobile SoCs.
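
For what it's worth, the "cluster people together" part doesn't require the
cloud in principle: each detected face can be reduced on-device to an
embedding vector, and faces can then be grouped by similarity. A minimal
sketch of the idea in Python (illustrative only; the synthetic embeddings,
dimensions, and threshold are invented, and this is not Apple's actual
pipeline):

```python
import math
import random

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def cluster_faces(embeddings, threshold=0.8):
    """Greedy clustering: each embedding joins the first cluster whose
    representative (first member) is cosine-similar above `threshold`,
    otherwise it starts a new cluster."""
    clusters = []
    for i, emb in enumerate(embeddings):
        for members in clusters:
            if cosine(emb, embeddings[members[0]]) > threshold:
                members.append(i)
                break
        else:
            clusters.append([i])
    return clusters

# Synthetic 64-dim "face embeddings": two people, three noisy shots each.
random.seed(0)
person_a = [random.gauss(0, 1) for _ in range(64)]
person_b = [random.gauss(0, 1) for _ in range(64)]
shot = lambda person: [x + random.gauss(0, 0.05) for x in person]
faces = [shot(person_a) for _ in range(3)] + [shot(person_b) for _ in range(3)]
print(cluster_faces(faces))  # groups photos 0-2 (person A) and 3-5 (person B)
```

Real systems get the embeddings from a neural network rather than raw
vectors, but the grouping step is essentially this kind of nearest-cluster
comparison, and nothing in it needs a server.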

~~~
NEDM64
Previous versions already did that.

~~~
jdminhbg
Previous versions did not do that on mobile devices, only Macs.

------
baldfat
Good for Apple; they are finally responding to the market and developers by
opening up their walled garden a bit.

------
peter303
Apple led with Graphical User Interfaces for the better part of 30 years:
first with the original Mac bringing Xerox technology to the masses, then with
the NeXTStep-revived iMac, the clever iPod, and the smartphone. Now they
struggle in the post-GUI era of Voice and A.I.

~~~
NEDM64
According to the Google/Android fanboy circle jerk, yes.

------
bitL
Another company going to do something they don't have any clue how to do:
Google+, Bing, ... It seems like we now have 4-5 companies copying each other
in a silly way all the time. Apple/MS simply can't do cloud properly, whereas
Google barely keeps up with Amazon; Amazon's AI is horrible compared to
Google's; Apple is copying MS's Surface and design (!); Google+ can't do
Facebook at all; Facebook can't do ads properly; MS can't do search... Looks
wonderful for Apple's AI.

~~~
riyadparvez
I thought MS's cloud is doing fine in comparison to Google's -- MS is second
while Google is third. However, I totally agree with you on the others,
particularly Apple with its closed culture. How many AI researchers can Apple
hire while researchers are in such high demand? They can go to other
companies, which are more open and let researchers publish their work.

~~~
bitL
Technically they are still well behind both Amazon and Google (which is
shooting itself in the foot by dumbing down the infrastructure it offers to
its clients). What you said about AI holds true for distributed systems too -
not that many top people around. MS is obviously leveraging their Win platform
to get subscribers even if they can't match Amazon.

------
hokkos
No it's not; Apple has had facial recognition on the iPhone for a decade,
voice recognition in Siri (with a bit of AI to generate the answers), and
handwriting recognition too.

------
Oletros
Face recognition and picture processing are considered AI?

~~~
sandstrom
I agree! Six months ago this was called machine learning, now it's all AI.

Sort of like BigData and Web 2.0, vague terms used for almost everything.

~~~
WoodenChair
Machine Learning used to be considered a sub-discipline of AI (open an AI
textbook and you'll see a couple of chapters on it). It's grown so big that it
is now a term used in and of itself outside of AI, and it is starting to be
seen as distinct from "traditional" AI in some circles. So the correct
understanding is either that Machine Learning is a subset of AI or that
Machine Learning is a field that emerged from AI.

------
ams6110
All of those features sound awful to me. I don't want AI scanning my text
messages and "anticipating" what I want to do. That's just way too creepy. The
photo stuff is irrelevant to me; I realized long ago that I never look at my
photos later, so I stopped taking them. But if I did, I don't think I'd want
Apple slurping them all up to run recognition algorithms.

~~~
hrayr
During the keynote, they mentioned that face recognition and other ML
processing happens on the phone hardware to protect your privacy.

~~~
s_q_b
Oh good, well at least it's not being calculated on a device with a constantly
scanning promiscuous radio.

~~~
comex
If someone hacks your phone, they can get your photos regardless of whether
your phone is doing facial recognition on them.

~~~
s_q_b
Yes, that's exactly my point. The article portrays the facial recognition as
"safe" because it occurs on an iPhone. iOS is not a secure enclave, and
pretending that it is can be very dangerous.

