
Apple’s Shortcuts will flip the switch on Siri’s potential - evo_9
https://techcrunch.com/2018/07/08/shortcuts-will-flip-the-switch-for-apple-on-siris-potential-just-not-how-you-think/
======
subroutine
Cognitive science has a term called "Theory of Mind" (ToM), which refers to a
human cognitive ability that typically emerges in four-year-olds (in fact there
is a famous experiment you should look up and try if you have any
kids/nieces/nephews around that age), whereby humans are able to formulate a
model of the mind of another entity (the capacity it's working at; the
information it holds, and doesn't). This is what allows us to, for example,
tell a lie. It also allows us to modify the way we communicate with dogs vs
children vs adults, for better communication efficacy (or with two different
adults where one has high and one has low expertise in some domain, like
programming).

In my opinion the biggest thing preventing me from using Siri is not what it
can and cannot do, but that it has been nearly impossible for me to develop a
ToM for Siri. And since I simply don't know what Siri is capable of, I only use
this bot for a very narrow set of tasks. Furthermore, one of my ToM priors for
Siri is that she has a mind that is incapable of learning. This is a big
turnoff - bigger than I think we acknowledge - since we are used to interacting
with entities that can learn, even if laboriously. For instance, my Australian
shepherd might not be able to bring me the newspaper; but if I really wanted
her to do that task, I could slowly get her to approximate that behavior, and
it would probably be satisfying to see progress in performance. With Siri, I
simply assume there are things she can do and things she cannot, and it'd be
pointless to try and goad her into adding even a trivial task to her
functional repertoire.

~~~
acranox
The lack of learning is what also bugs me about all the new tech. It's being
used for targeted ads and stuff like that, but not for trivial tasks that
would actually improve my interaction with the interface. A few things that
come to mind: I always listen to music as full albums, and yet the
UIs still always try to get me to shuffle various playlists. Every day around
the same time I text my partner nearly the same message, "I'm leaving work."
and yet autocomplete still suggests other phrases. If I click the share
button, I then go through 6 steps to share the article via a text message with
the same person. Imagine if clicking share offered a one-click "Would you
like to send this as a text message to Jane?"

I'm not a fan of the current AI craze, and happily like my devices somewhat
stupid, but if they're going to try and be smart, at least learn my
predictable behaviors, and offer me shortcuts on those.
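For what it's worth, the kind of learning being asked for here doesn't need deep AI - counting is enough. A toy sketch in Python (the `(hour, action)` history format and the `suggest` helper are entirely made up for illustration, not anything iOS actually exposes):

```python
from collections import Counter

def suggest(history, hour):
    """Return the action most often taken at this hour of day, if any.

    history: list of (hour, action) pairs observed in the past.
    """
    counts = Counter(action for h, action in history if h == hour)
    top = counts.most_common(1)
    return top[0][0] if top else None
```

Seeing "text: I'm leaving work" every weekday at 5pm, a scheme like this would surface exactly that shortcut at 5pm.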

It's unclear to me what these new shortcuts really offer, but I'll be
interested to try them. However I have a hunch what I want is actually even
simpler than these will provide.

~~~
Infernal
So I currently use a combination of Launch Center Pro and Workflow (the
developer of which was purchased by Apple last year, and appears to be the
source of what has now become Siri Shortcuts) to do what you're describing.

I have a Workflow that when run, uses Maps to estimate the drive time between
wherever I am and my house, pads the estimate by 5 minutes, then adds that to
the current time to get an ETA, formats it into a text message and sends the
ETA to my wife.

In Launch Center Pro, I have a geo-fenced shortcut that presents a
notification to run my workflow whenever it sees that I have left the area
around my office building.

The UX: halfway between my office building and my car, I get a notification
from LCP. I tap the notification, Workflow thinks for a minute, presents me
with a text message to my wife, and I tap "send".

As nice as this is already, I believe Siri Shortcuts may improve it in several
ways:

1) Currently, apps are not allowed to send messages without explicit user
interaction, so Workflow can't actually text my wife; it just presents me with
an iMessage screen and an already-written message that I must tap "send" on.

2) Workflow does not support geofenced or time-based running of workflows,
which is why I need LCP to launch my workflow based on a geofence. If Siri
Shortcuts supports this, then I won't have to rely on LCP anymore.

3) There is no way (that I'm aware of) to trigger my workflow automatically, I
must tap a notification and unlock my phone, or run an app and select the
workflow within it. An improvement would be a programmable Siri Shortcut ("Hey
Siri, let my wife know my ETA"), and even better would be automatic running of
workflows based on defined conditions, so my workflow would run and send a
message automatically when crossing the geofence without me even needing to be
aware of it (besides maybe a notification that lets me know the workflow has
run).

~~~
mason55
According to this article[0] those annoyances are still there

> _It does look like certain annoyances with Workflow are showing in Shortcuts
> as well. For example, sending a message via a Shortcut still requires you to
> manually hit the send button_

[0] [https://9to5mac.com/2018/07/06/apple-shortcuts-hands-on/](https://9to5mac.com/2018/07/06/apple-shortcuts-hands-on/)

------
jordansmithnz
I’m the developer of a popular timetable/schedule app for college. It seems
like the perfect use case for Siri: “What class do I have next?”.

Up until now it hasn’t been possible due to the restrictive API. I’ve looked
into the new shortcuts and Siri API docs and although the beta documentation
is sparse, I’m confident I’ll be able to develop a natural, first class Siri
integration.

Unfortunately, it won’t be trivial to implement (at least to do well). It
probably won’t be done before the iOS 12 release, and I’m hesitant to start
working on it just yet due to the sparse beta docs.

Over the next year or two, I think there will be some great Siri integrations
built. Hopefully, users discover and use them :)

~~~
berti
I always wanted something like this in my uni days, but was always too lazy to
make it! More important than the "what", though, is the "where": "Hey Siri,
where's my next class?" ;)

-e- heh, quick LinkedIn stalk reveals you went to UC at the same time as me!

~~~
Nullabillity
Isn't that just.. a calendar?

~~~
Kognito
This is what I did. Create a ‘Uni’ calendar and fill it with your class days
and times - with repeats used as necessary.

Irritatingly, Siri doesn’t distinguish between calendars so you can’t reel off
just classes, but you can ask about the day or the next ‘event’.

Personally I found this setup along with Apple’s ‘Up Next’ widget and Siri
Apple Watch face to be way better than any other class management app I tried.

~~~
Nullabillity
Many even publish iCalendar files ready to import. I actually ended up writing
a custom scraper for my HS[1], after which the LMS provider actually wrote
their own exporter fairly quickly.

I guess they weren't too happy one of the kids had to keep a database of all
of their students' passwords... :P

[1]:
[https://github.com/teozkr/schoolsoftsync](https://github.com/teozkr/schoolsoftsync)
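For anyone who wants to try the same thing: the core of such an exporter is tiny. A sketch of rendering one class as a minimal iCalendar VEVENT (a real file also needs the VCALENDAR wrapper plus a UID and DTSTAMP per RFC 5545; this only shows the shape):

```python
from datetime import datetime

def ics_event(summary: str, start: datetime, end: datetime) -> str:
    """Render one class as a minimal iCalendar VEVENT block."""
    fmt = "%Y%m%dT%H%M%S"
    return "\r\n".join([
        "BEGIN:VEVENT",
        f"SUMMARY:{summary}",
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{end.strftime(fmt)}",
        "END:VEVENT",
    ])
```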

------
halflings
> Also, Shortcuts don’t require the web to work – the voice triggers might not
> work, but the suggestions and Shortcuts app give you a place to use your
> assistant voicelessly. And importantly, Shortcuts can use the full power of
> the web when they need to.

> This user-centric approach paired with the technical aspects of how
> Shortcuts works gives Apple’s assistant a leg up for any consumers who find
> privacy important. Essentially, Apple devices are only listening for “Hey
> Siri”, then the available Siri domains + your own custom trigger phrases.

I don't get it. How is this different from Android? Android Actions [0] were
announced before this. I think Assistant also works offline (with most voice
commands + in voiceless mode).

[0]
[https://developer.android.com/guide/actions/](https://developer.android.com/guide/actions/)

~~~
hug
SiriKit and Siri Shortcuts are very different things.

SiriKit allows you to build Siri support into your app a la Android Actions,
but Siri Shortcuts is designed to allow drag-and-drop end-user "programming"
of workflows that can be triggered by Siri.

"Hey Siri, I'm on my way home" could turn your thermostat up, order you a
pizza, remotely trigger your IoT-enabled kettle and start playing your home-
commute playlist.

For the more advanced of us, Workflow _currently_ allows doing things like
calling arbitrary REST APIs and parsing JSON. I've reverse engineered the API
of a local coffee-ordering app so I can one-click order my morning coffee.

Next thing I'm planning is my "I need a coffee" button which will get the
nearest cafe, order me a flat white, and pull up the directions.
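The JSON half of that kind of workflow is the easy part. In plain Python it's roughly this (the `status` and `order_id` fields are invented for illustration - the real shape is whatever the reverse-engineered coffee API returns):

```python
import json

def parse_order_response(body: str) -> str:
    """Extract the order id from a (hypothetical) coffee-API JSON reply."""
    reply = json.loads(body)
    if reply.get("status") != "ok":
        raise RuntimeError(f"order failed: {reply.get('error', 'unknown')}")
    return reply["order_id"]
```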

~~~
icebraining
Yeah, Siri Shortcuts seems more like using Tasker with the AutoVoice plugin.

------
Shank
To me, the biggest thing here is that it takes a completely different approach
than what has been the traditional path for voice assistants. In the past, it
was always the game of waiting either for a custom skill or app, or it was
hoping that Google or Amazon would program in some logic for handling a
particular case.

Shortcuts enables basically any end user with enough devotion and dedication
to short circuit this. It doesn’t require them to be an app developer and it
doesn’t require them to learn code at all. The most basic shortcuts can be
created without any if-then-else logic while enabling so much.

I’ve seen my mom ask her Google Assistant to do things with “and” a lot and
they just don’t work because command chaining hasn’t been implemented. But
with Shortcuts, she could conceivably make a chain and designate a phrase to be
conversationally equivalent with an “and” in the middle.

Instead of having to explain to family why Siri can’t do X or Y, I can just
make a Shortcut or show them how, and I’ve solved the problem rather than
explaining why it can’t be done.

~~~
entropie
> I’ve seen my mom ask her Google Assistant to do things with “and” a lot and
> they just don’t work because command chaining hasn’t been implemented

It is implemented. It just doesn't work often (or at all) for foreign
languages.

I'm pretty sure this is the next big thing Google is updating.

> Instead of having to explain to family why Siri can’t do X or Y, I can just
> make a Shortcut or show them how, and I’ve solved the problem rather than
> explaining why it can’t be done.

At least with GA and IFTTT you can, as of late, make your own phrases (and
responders). Nothing for non-techies, but some progress there.

~~~
joshschreuder
Do you know if it's possible to get the Home to respond to some external
event? Doesn't seem to be possible to use as a Then with IFTTT (for example
"When I arrive home Then say Welcome Home")

I found this:

[https://www.npmjs.com/package/google-home-push](https://www.npmjs.com/package/google-home-push)

Which might let me write my own API to do something but it would be good if it
were built in somehow.

~~~
entropie
> Do you know if it's possible to get the Home to respond to some external
> event?

I'm actually not sure whether this should be a feature of my Google Home
assistant, though of course I see the use cases.

> [https://www.npmjs.com/package/google-home-push](https://www.npmjs.com/package/google-home-push)

The dependencies show that this uses
[https://github.com/thibauts/node-castv2-client](https://github.com/thibauts/node-castv2-client),
so I'm guessing this only allows one to use the Chromecast API of the Google
Assistant device.

I'm pretty sure the Google Assistant API will not be opened anytime soon.

------
walterbell
_> This user-centric approach paired with the technical aspects of how
Shortcuts works gives Apple’s assistant a leg up for any consumers who find
privacy important. Essentially, Apple devices are only listening for “Hey
Siri”, then the available Siri domains + your own custom trigger phrases.
Without exposing your information to the world or teaching a robot to
understand everything, Apple gave Siri a slew of capabilities that in many
ways can’t be matched._

Is there a good doc or video that delineates what data stays on the device vs.
being sent to Apple for processing? E.g. how much of this functionality will
be available if you are not signed into iCloud?

~~~
mark212
From the keynote, no information is sent to Apple for processing. Whether
it'll work if you're not signed into iCloud is a different issue, but from the
way it's described in various places in the WWDC keynote, it ought to work
just fine without any data connection - airplane mode, no Wi-Fi.

~~~
rueynshard
Wouldn't the user's speech data need to be sent to Apple to convert to text,
or identify the intent first?

~~~
peckrob
We were doing speech recognition 20 years ago without any kind of networking.
I dictated part of a term paper in 1995 using Dragon Dictate, I think it was
called. You could even navigate the word processor menus and UI with it, or
say things like “make that bold”, and it usually worked! Just a bit less often
than Siri works, actually.

Sure it was more of a novelty, and had to be trained on your voice. But that
was 20 years ago.

~~~
walterbell
Not only is Dragon still a product, it is owned by Nuance, which helped to
develop Siri and drew on SRI research:
[https://www.forbes.com/sites/rogerkay/2014/03/24/behind-apples-siri-lies-nuances-speech-recognition/](https://www.forbes.com/sites/rogerkay/2014/03/24/behind-apples-siri-lies-nuances-speech-recognition/)

~~~
Angostura
Hence Siri’s name.

------
arthurofbabylon
I agree with this idea for one reason in particular: as a user, truly
leveraging Siri requires a manual engagement - the experience is not
conversational, it is not like speaking with a person. It is like using a tool
- one needs to conscientiously adjust oneself to follow its rules and get it
to work as desired.

Users actively taking control over how they use Siri (as in iOS 12 Siri
shortcuts) will almost certainly encourage them to more conscientiously adjust
their usage behavior patterns.

There exists a general assumption that a voice has a human-intelligence behind
it, but obviously now not all voices do. This poses a tough learning curve, as
evinced by criticism of Siri as ineffective or plain bad. Yeah, Siri won’t
respond the same way your boyfriend will when you ask him to find some good
Chinese food, or express some feeling. But Siri will excel at setting alarms,
or adding an event to the calendar, or starting a meditation with Timeless. It
comes down to matching the language to the tool / following a protocol. It
comes down to manually engaging with precision.

~~~
saagarjha
> Siri won’t respond the same way your boyfriend will when you ask him to find
> some good Chinese food, or express some feeling

I think Siri did pretty well when I asked her these:

Q: Find me some good Chinese food

Siri: I found four Chinese restaurants a little ways from you:

Q: How are you?

Siri: Very well, thank you!

~~~
singularity2001
That's about the only things Siri can handle without completely blowing it.

~~~
scarface74
I use Siri the most when I’m driving. It comes in handy for a lot of things
then...

Directions. - Take me to X.

Reading text messages - “Read my text messages from my wife” or “Read my last
text message”

Reminders - “Remind me to call my X/do X when I get in the car/get out the
car/get home/at Y”

Music/Podcasts - Play X/Play a song by Y.

Taking notes, calendar events, (What do I have to do today?)

------
ehsankia
It's great to see Apple finally add some powerful customization, and for power
users this is fantastic, but to average users this makes no difference. When
it comes to normal queries, Siri is still far behind. Sure, with some extra
work I can get almost any query to work, but most users don't want to manually
code up all their queries.

~~~
adam
Perhaps it's already there, but I wonder if shortcuts will be able to be
shared or there will be a library for them. So someone can pre-define all
sorts of stuff, then I as a "basic" user can just add the shortcut vs. having
to "program" it myself.

~~~
hug
Workflow (the mother of Siri Shortcuts, as I understand it) already has
sharing functionality.

For example, this "get travel time to input destination" Workflow:
[https://workflow.is/workflows/ff987bcf0ad746d496415d7f4c75a8...](https://workflow.is/workflows/ff987bcf0ad746d496415d7f4c75a872)

~~~
saagarjha
Shortcuts has this as well.

------
amelius
Does anyone know of an extensive list of useful things that people typically
ask their voice assistant?

As a non-user, and as someone who types faster than they speak, I find it hard
to come up with compelling use-cases.

~~~
fouc
Haven't looked for a list, that'd be a good idea.

So far all I do is:

"Remind me to wake up at 6am" or "Remind me to get my laundry in 30 minutes"
(sets it in the todo list reminder app, since the alerts are better than the
alarm app)

~~~
akvadrako
I think you have a lot of company in the "only use Siri to set timers" camp. I
also do alarms and appointments, but nothing else is reliable enough.

------
dwaite
Two of the issues with Siri are that Apple does not like to roll out features
unless they work for everyone (across all languages), and the discoverability
of what it can do versus disappointment in what it can't (really, you think
defaulting to a Bing search is the best thing to do in this case?)

Siri Shortcuts lets you define your own local command and control phrases from
presented actions, effectively solving both issues. It _could_ (depending on
per user investment) make Siri way more individually useful.

------
ronnier
iOS 12 beta is amazing. It is extremely fast and feels like perfection.
Hopefully Apple will open the OS a little more for customization and let apps
do more powerful things; then it will truly be perfection.

~~~
Kpourdeilami
How does it hold up for day-to-day use? Is it stable enough that it could be
installed on your main phone without many issues?

~~~
kalleboo
I've had my lock screen flip out a couple times, once requiring a manual
reboot of my phone. The notifications list has graphical glitches. Some third-
party apps are suspiciously crashy. It's probably the least problematic early
iOS beta I've used, but like any beta I still wouldn't rely on it if missing a
call would be a deal-breaker.

------
mgiannopoulos
That’s all good, but I hope that Siri can be improved to understand voices like
mine (I'm a native Greek speaker, so my English accent is not perfect). Siri
understands 40-50% of what I say. Google (on iOS, so same hardware) is more
like 80-90%.

~~~
bitwize
Google's voice recognition is nuts. One time I fired up Google's voice search
and fed it some of the MST3K names for Dave Ryder from _Space Mutiny_ (e.g.,
"Punch Rockgroin", "Blast Hardcheese", "Splint Chesthair", "Big McLargehuge",
etc.) and was surprised at how many of the searches it got right --
capitalization, spacing, and all.

------
sirn
To demonstrate what Shortcuts will be able to do, here's someone writing a C
parser in Workflow (a predecessor to Shortcuts):

[https://twitter.com/uroboro845/status/985169126871224321](https://twitter.com/uroboro845/status/985169126871224321)

(One of the replies is by Ari Weinstein, who is currently on the Siri
Shortcuts team)

------
GeekyBear
Think of Shortcuts as a visual scripting language that leverages app
functionality and ties into Siri.

Either you can use drag and drop to write your own scripts, or you can run
scripts written by others.

You can run your scripts in a variety of ways, including a trigger phrase you
set with Siri.

Here's a short video showing the creation of a very simple script in an older
version of the program.

[https://www.youtube.com/watch?v=uK0sBtF5_1E](https://www.youtube.com/watch?v=uK0sBtF5_1E)

If you are a podcaster, you might create a more complicated script that
converts a source audio file to MP3, adds MP3 tags and artwork to the
resulting file and then uses FTP to upload the result to your podcasting
network.

Another script idea would be to text someone a list of the blocks of free time
you have open during a given workday based on your calendar data to help set
up a meeting.
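The heart of that free-time script is a standard interval-gap sweep over the day's busy blocks - for instance (hours as plain numbers for brevity; a real script would use the calendar's event dates):

```python
def free_blocks(busy, day_start, day_end):
    """Return the gaps between (start, end) busy intervals in a workday."""
    free, cursor = [], day_start
    for start, end in sorted(busy):
        if start > cursor:
            free.append((cursor, start))
        cursor = max(cursor, end)  # handles overlapping meetings
    if cursor < day_end:
        free.append((cursor, day_end))
    return free
```

With meetings at 10-11 and 13-14 in a 9-to-5 day, that yields free blocks 9-10, 11-13 and 14-17, ready to be formatted into a text message.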

The possibilities are wide open. You can even tie directly into web APIs.

------
jtbayly
But does anybody know whether it will finally be possible for me to ask Siri
to give me directions to my next appointment?

I’ve been waiting for this obvious functionality for years.

~~~
mergesort
Yep, you can do that. This weekend I made a shortcut that goes through your
calendar, finds the next appointment, and then picks your maps app of choice
to load up directions into.

I took it another step further and am working on a version where it looks at
the prices on Lyft and Uber, presents you with the respective options, and
calls a ride to your next appointment. Shortcuts in iOS 12 are a real fun
little thing to play with, the most fun I've had programming in a while.

Happy to send it to you. I looked but couldn't find any contact info for you,
so feel free to reach out via my profile if you want.

~~~
jtbayly
I'm not running the beta, so I guess don't bother. I'm sure I'll have fun
creating it myself in a few months. :)

------
mustacheemperor
Looks like this update still won't let you ask Siri to play a specific song on
any service except Apple Music, which is definitely my biggest gripe with Siri
and the iPhone at large since switching from Android. I would switch to Apple
Music, except then there'd be no way to wirelessly cast/play music through my
component stereo. Semi related, if anyone has solutions to this problem I'd be
very appreciative.

------
Tehnix
A lot of people seem to mention Assistant Routines, but the major problem here
is that it’s not available in a lot of countries outside the US (not even the
UK), meaning it’s not actually an alternative for the majority of the world.

The funny bit, though, is that your Google Home will still recommend that you
set up Routines for things, except you just... can’t.

Since I don’t have it available myself, I cannot comment much on it, but while
searching for why it wasn’t appearing in the Home app, I certainly didn’t get
the impression that people are impressed by it :(

------
madrox
This is great for developers, but pointless for users. It doesn't change user
behavior, and the research shows people are using voice assistants at home and
nowhere else... a place better suited to Google Home and Echo.

There have been a lot of attempts by Apple to create richer hooks into apps,
like search integration, but they don't do much for engagement. There are some
good ideas out there, along with anecdotal successes, but the interaction
model isn't great and better app integration won't fix that.

~~~
MBCook
I use Siri on my phone all the time. Do you have a link to that research? I
haven’t heard that before.

Given that iOS will be prompting people with possible Shortcuts based on what
they do frequently I can easily see people starting to adopt this.

I don’t think you have to use Siri to trigger these; it’s just an obvious
easy/fast way. Users could still use the Shortcuts app or widget to do it.

As I’ve been watching some of the Apple community on Twitter since this
started to go into beta they’ve already produced some fun/surprising stuff.
This seems like it’s going to be a BIG deal.

~~~
madrox
This is the best I can do right now. It isn't where I originally read it, but
Google is failing me:
[https://creativestrategies.com/voice-assistant-anyone-yes-please-but-not-in-public/](https://creativestrategies.com/voice-assistant-anyone-yes-please-but-not-in-public/)

What it comes down to is that people on HN don't represent most people using
iPhones.

------
mirceal
I personally turn off all notifications on my iPhone (except for a few select
apps) and do not use Siri.

I can tell you that the Siri suggestions in Search really annoy me (I usually
use search to find... apps, not to be fed all the crap from random apps).

I want a clean, non-intrusive experience. Apple should focus on making the
hardware better and getting the software out of the way (don’t make me think
about it) instead of pulling all this crap on its users in the name of
innovation.

~~~
jknz
> I can tell you that the Siri suggestions in Search really annoyed me
> (usually use search to find... apps, not to be fed all the crap from random
> apps).

In iOS 11 the delay when searching for apps is unacceptable. Instead of
displaying the apps found right away (a substring match over all apps should
be instantaneous), it waits until all the crap from Siri and other random apps
is fetched.

~~~
matt-attack
Agreed. The _only_ way I find & launch apps (besides the 4 on the dock) is by
swipe-down from the middle of the homescreen, to reveal search, then to search
for the app's name. This is very analogous to how I launch apps in OSX, namely
command-SPACE, then start typing the application name.

It's extremely frustrating when iOS takes forever to find an app, or worse,
prioritizes all kinds of other garbage _before_ the app's icon. If I type
"waz" and I have the Waze app installed, I sure as hell expect to see the Waze
icon instantly at the very top. Ideally, after typing just the "W".
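The ranking being asked for here is simple to state - prefix matches first, then other substring matches - something like:

```python
def match_apps(apps, query):
    """Rank app names: prefix matches first, then other substring matches."""
    q = query.lower()
    prefix = [a for a in apps if a.lower().startswith(q)]
    inside = [a for a in apps if q in a.lower() and not a.lower().startswith(q)]
    return prefix + inside
```

Typing "wa" against an app list containing Waze puts Waze first, ahead of any app that merely contains "wa" somewhere in its name.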

~~~
hug
You'll be pleased to know that the results in iOS 12 instantly reveal the
first four matching installed apps. ("Instantly" meaning "I can't really
determine how long it takes, because it's probably somewhere under 100ms".)

All of the other stuff loads very quickly. Mail results are very very quick
(over 3 inboxes containing over 100,000 mail items), and then a-bit-less-quick
for the Siri search suggestions.

I'm actually shocked at how much iOS 12 improved responsiveness.

~~~
dionian
Also was very surprised; it's been one of the most notable perceived
performance jumps between major releases, probably ever.

------
myrandomcomment
I switched to CarPlay and use Siri a lot more than I used to. For most things
it is fine. For others... this could help.

One thing I would like to know is why Siri cannot answer basic questions about
what music is on my phone. For example, “what albums by New Order do I have on
my iPhone?”. That does not work but “play album Substance 1987 by New Order”
does.

~~~
robin_reala
Siri on CarPlay is useless for me. Maybe it’s my situation (RP British
English, driving in Sweden) but it usually takes 10 attempts to get it to
recognise a track I want played, and getting routing information is easily
double that for anything other than ‘drive home’. I once tried to get it to
route me to ‘Malmö’ (third largest city in Sweden) and gave up after 30
attempts. I honestly have no idea what I’m doing wrong, but I suspect nothing,
and it’s just useless software.

~~~
thirdsun
I'm not sure if this has changed yet, but until recently Siri regularly failed
to recognize artists, albums or titles, which often happen to be English, if
the device/Siri was set to a different language. In my case that's German, and
I can't count the number of times I had to pronounce the artists I wanted Siri
to play in a ridiculously German way.

English is so ubiquitous that, in my opinion, all Siri queries should be
checked against both the local language and English.

~~~
myrandomcomment
Yup. Mixed language is a pain. Mine is set to English by default, but when I
ask for something in Japanese or Mandarin it does not work well.

------
gkilmain
To run an Alexa skill on an iPhone, you need to install the Alexa app and the
Amazon app, use the Amazon app to handle the voice part, then open the Alexa
app to view the card. I hope I'm understanding the hype of the article
correctly: that I can build a native app and use Siri to interact with it.

------
mark212
This is a great example of how owning the entire stack from CPU on up to user
interface allows a company to do things that others just can't. Whether it's
any good or not, well, time will tell. But it's not even possible for any
other company (maybe Google with the Pixel, maybe).

~~~
rueynshard
I don't see why this could not be possible on Android. All Google needs to do
is build an API for Assistant that other apps can use to 'donate' frequently
used actions/intents. Assistant can then make suggestions to users based on
its own analysis of users' activities in the same way Siri does.

~~~
Bahamut
Apple could feasibly go further and do integration with macOS Siri as well
down the line I'd imagine - Google can't do that nearly as deeply.

~~~
izacus
Why? Can you explain that reasoning? Google already shipped full integration
of Assistant into their ChromeOS.

~~~
sharcerer
I am seeing a trend in this thread: people think that Google Assistant
doesn't have something, when in reality it has already implemented everything
Apple did this year, and more. Either they are biased and making assumptions,
or they use Siri and are again making assumptions. Another reason might be
that Routines is tucked away in the Assistant settings, so people haven't
discovered it. Still, it seems odd at a site like this. Frustrating.

------
SN76477
I just want macros. I would love to have a single button that performs in-OS
functions, like a button for EQ settings, or one that launches a specific
website in a browser (such as a different weather service).

------
adityapurwa
So basically they exposed an API to add new skills to Siri. I wonder whether
other platforms have already done similar things? CMIIW, but Cortana already
exposed an API for us, right?

~~~
MBCook
Really they’ve combined two things. They took an app called Workflow, which
they bought, and which you can use to automate things on iOS. They extended it
dramatically, then added the magic ingredient of making it possible to have
Siri call the actions that you define. Plus some AI stuff to suggest
shortcuts for things you do frequently.

Alexa is quite popular because it has so many skills that people have created
for it. That’s one of the reasons people claim Siri has been "behind" and
Amazon’s Echo devices have been doing so well.

I don’t know if Cortana or Bixby support anything like this.

~~~
halflings
Android has the same features:

[https://developer.android.com/guide/actions/](https://developer.android.com/guide/actions/)

[https://techcrunch.com/2018/05/08/google-launches-slices/](https://techcrunch.com/2018/05/08/google-launches-slices/)

------
LTL_FTC
"Hey Siri, destroy all evidence!" _phone begins wipe_

~~~
saagarjha
I’m sure you’re trying to be humorous, but just in case you’re not, this isn’t
something that Shortcuts lets you do.

------
threeseed
I also think this is going to revolutionise the TouchBar.

The ability to have contextual shortcuts could be pretty powerful.

~~~
MBCook
I think this is iOS only for now, but it seems like a very obvious thing to
port to the Mac.

~~~
saagarjha
It’s already there in the form of Automator and AppleScript.

~~~
MBCook
That’s a different way of automating things. Automator seems to have been
ignored, and AppleScript is quite old and something people have been worried
about for years.

------
romangibson
So, is it ready to do complex tasks just like Google Assistant and Amazon Echo?

~~~
TeMPOraL
Does Google finally allow users to configure top-level actions, instead of the
silly "Ok Google, ask $someapp to do something" (where what I want is simply
"Ok Google, do something")?

~~~
singularity2001
Or even better just "$act" when I'm alone in the room!

The "Ok Alexa ask $someapp to …" prefix is disgusting.

------
shiburizu
Saw someone else comment on a different TechCrunch article here on HN about
how these articles about Apple read like press releases. I really feel that
now.

~~~
ndynan
As a PM, this article is what I would write as sample press coverage I'd like
for a product release.

Interesting to see that the author is a former PR member of
[http://my.workflow.is/](http://my.workflow.is/) and has one recent article
for TechCrunch.

Oh hey, looks like they were purchased by Apple:
[https://techcrunch.com/2017/03/22/apple-has-acquired-workflow-a-powerful-automation-tool-for-ipad-and-iphone/](https://techcrunch.com/2017/03/22/apple-has-acquired-workflow-a-powerful-automation-tool-for-ipad-and-iphone/)

Quick Edit: I'm not saying that this isn't good technology, rather I'm more
concerned that TechCrunch is taking contributors who have ties to products
being reported on and may have interests that are not being communicated to
readers.

~~~
ianlevesque
Yeah, this is native advertising, and it's everywhere.

~~~
ibeckermayer
"Native advertising" sounds very much like the MBA-term for what I would call
"deception".

(I'm not accusing you of anything, just making commentary)

------
tzahola
Sounds like Automator for dummies.

~~~
saagarjha
A dumbed-down Automator, _maybe_. Please don’t insult the users of Workflow.

~~~
tzahola
I'm not insulting anyone. "X for dummies" is a book series.

~~~
mlang23
Which doesn't make it any less insulting.

