
Show HN: Sesame Lock Screen quick launches anything (learning bot and intents) - philwall192
https://play.google.com/store/apps/details?id=ninja.sesame.app
======
philwall192
The core idea is to break functionality out of native apps, then build a
single interface that can get to things faster.

It uses the Accessibility Service to observe actions, and then Intents and
APIs to replicate them. That way we can automatically make shortcuts to stuff
you actually use. This clip is a pretty good example:
[https://www.youtube.com/watch?v=DqGS1kswIlM](https://www.youtube.com/watch?v=DqGS1kswIlM)

Right now we have it working for:

* All your apps (ranked by how often you use them)

* All contacts w/ 1 touch to text, call, or email (WhatsApp too)

* Websites you visit (but not in incognito)

* Music library and new stuff you listen to (Spotify only atm but expanding soon)

* You can Quick Search: Maps, Waze, Netflix, Spotify, Google Music, Chrome, Play Store, Yelp and YouTube

* It makes shortcuts back to your searches

* Hangouts conversations, including groups

* Alarms you set

* Reddit (shortcuts to your subreddits)

* Uber/Lyft: see live ride options w/ ETA & surge pricing

The interface is a command / GUI hybrid. Its best feature is the keyboard. We
modeled our search on the command palette in Sublime Text. It understands
shorthand. If I want to play music by Bob Marley using Spotify, I can type
“S, B, M” and, out of thousands of choices, "Spotify: Bob Marley" will be on
top. It also learns from what you choose, so “S, B” might work the next time,
then just “S”. Like this:
[https://www.youtube.com/watch?v=ixZ6iB1aRS0](https://www.youtube.com/watch?v=ixZ6iB1aRS0)
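The shorthand matching above can be sketched in a few lines. This is a hypothetical, simplified version of the idea, not our actual matcher: each typed letter has to start a successive word of the candidate, and ranking by what you picked before is left out for brevity.

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

// Sketch of command-palette-style shorthand matching: each letter the user
// types must be the initial of a successive word in the candidate string.
public class PaletteMatch {
    // True if the letters in `shorthand` are word initials of `candidate`, in order.
    static boolean matches(String shorthand, String candidate) {
        String[] words = candidate.toLowerCase().split("[^a-z0-9]+");
        int w = 0;
        for (char c : shorthand.toLowerCase().replaceAll("[^a-z0-9]", "").toCharArray()) {
            boolean found = false;
            while (w < words.length) {
                if (!words[w].isEmpty() && words[w].charAt(0) == c) { found = true; w++; break; }
                w++;
            }
            if (!found) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        List<String> candidates = Arrays.asList(
                "Settings", "Spotify: Bob Marley", "Slack: Bob", "Maps: Boston");
        System.out.println(candidates.stream()
                .filter(c -> matches("S, B, M", c))
                .collect(Collectors.toList())); // [Spotify: Bob Marley]
    }
}
```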

The reason we’re building around the keyboard is that it’s an incredibly
versatile tool that users have already memorized. It really only takes 1-3
taps to get to anything. It’s an adjustment to change your behavior from
“swipe a pretty picture to enter your phone” to “make a decision now using
this power tool!”. But our analytics show that people who take the time to
learn the keyboard tend to love Sesame and keep it.

There’s also a ready-to-tap list of your apps, which is more familiar to most
users. We wanted the ranking to be based on both how often you use an app and
how recently you’ve used it. When you open an app it gets a “vote”. We split
the day into 15-minute buckets and look at 4 weeks’ worth of data. We degrade
how much each vote is worth based on how far its bucket is from the current
time, and we give the current bucket a 10x multiplier so anything you just
opened is likely to be at the top of your list for at least 15 minutes. With
all this learning, it takes a little time to get good. We’ve gotten a few
requests to let people pin their favorite apps, but when I follow up a few
days later they normally say “nah, it’s good now”.
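The bucket scheme above can be sketched roughly like this. The exact decay curve is an assumption on my part (a simple 1/(1+distance) falloff); the 96 buckets and the 10x boost for the current bucket are as described.

```java
// Sketch of the app-ranking scheme: the day is split into 96 fifteen-minute
// buckets, each app launch casts a vote in its bucket, votes decay with
// circular distance from the current bucket, and the current bucket gets a
// 10x "just opened" boost. The decay function here is a stand-in.
public class AppRank {
    static final int BUCKETS = 96; // 24h / 15min

    // Circular distance between two buckets of the day (0..48).
    static int distance(int a, int b) {
        int d = Math.abs(a - b) % BUCKETS;
        return Math.min(d, BUCKETS - d);
    }

    // Score one app from its per-bucket vote counts (summed over ~4 weeks).
    static double score(int[] votesPerBucket, int currentBucket) {
        double total = 0;
        for (int b = 0; b < BUCKETS; b++) {
            double weight = 1.0 / (1 + distance(b, currentBucket)); // decay with distance
            if (b == currentBucket) weight *= 10;                   // "just opened" boost
            total += votesPerBucket[b] * weight;
        }
        return total;
    }

    public static void main(String[] args) {
        int[] morningApp = new int[BUCKETS];
        int[] eveningApp = new int[BUCKETS];
        morningApp[32] = 20; // used often around 08:00
        eveningApp[80] = 20; // used often around 20:00
        int now = 33;        // it's ~08:15, so the morning app ranks far higher
        System.out.printf("morning=%.2f evening=%.2f%n",
                score(morningApp, now), score(eveningApp, now));
    }
}
```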

Building a stable lock screen has been the hardest part. We kept the system
security in place; it seemed like the only way to really protect the phone.
But that brings a huge cost. A custom lock screen isn’t supported by Android,
the OEMs all tweak the relevant code, and different user settings multiply the
possible conflicts. The result is that we can’t consistently control, across
all devices, how Sesame transitions to the system security screen. We
technically sit on top of the real lock screen. I’ve put in every hack I can
find or figure out and am happy to share them. I would love to hear from
people who have dealt with this beast too.

You can open Sesame (see what we did there?) by long pressing the Home button
too. This was easy to implement and gives you a way to use Sesame even if you
have Lock Screen conflicts.
[http://stackoverflow.com/questions/14233330/replace-google-now-gesture](http://stackoverflow.com/questions/14233330/replace-google-now-gesture)
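For anyone curious, the mechanism the linked thread describes is declaring an activity that handles the assist intent. A rough manifest sketch (activity name and attributes are illustrative; the user still has to select the app as the assist handler in system settings):

```xml
<!-- Illustrative sketch: an activity that handles the long-press-Home
     ("assist") gesture by filtering on android.intent.action.ASSIST. -->
<activity android:name=".AssistLaunchActivity"
          android:launchMode="singleTask"
          android:excludeFromRecents="true">
    <intent-filter>
        <action android:name="android.intent.action.ASSIST" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>
```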

We use the Accessibility Service to make new shortcuts for you. Back to the
Bob Marley example: if you don’t have a shortcut for Bob yet, but go into
Spotify and search for Mr. Marley, Sesame will catch that event and be able
to make a new shortcut.
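In outline, catching a search and turning it into a re-playable shortcut looks something like this. This is a hypothetical, Android-free sketch: a plain record stands in for the AccessibilityEvent, and the `spotify:search:` deep-link URI scheme is an assumption about how the shortcut would be replayed.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: observe an in-app search, store a deep link that can replay it.
// On Android this would be driven by AccessibilityEvents; here a minimal
// record stands in for the event so the logic stays self-contained.
public class ShortcutBuilder {
    record SearchEvent(String packageName, String query) {}

    final Map<String, String> shortcuts = new HashMap<>(); // label -> deep link

    // When the service observes a search, store a re-playable deep link.
    void onSearch(SearchEvent e) {
        if ("com.spotify.music".equals(e.packageName())) {
            shortcuts.put("Spotify: " + e.query(),
                          "spotify:search:" + e.query().replace(' ', '+'));
        }
    }

    public static void main(String[] args) {
        ShortcutBuilder b = new ShortcutBuilder();
        b.onSearch(new SearchEvent("com.spotify.music", "Bob Marley"));
        System.out.println(b.shortcuts); // {Spotify: Bob Marley=spotify:search:Bob+Marley}
    }
}
```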

The Accessibility Service is sort of an untapped miracle of user data. Right
now we use it to do a few specific things. But we also tried to build a
program that would parse the stream of accessibility data so we could map
pathways through native apps and automatically make new shortcuts without
writing new code. It was a combination of graph theory, trees and sets, and
Dijkstra’s pathfinding. It was kind of crazy, but I swear we _almost_ got it
working before we decided it was going to be the death of us :)
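To make the pathfinding idea concrete, here's a toy sketch of what that could look like (this is my illustration, not our actual system): each app screen is a graph node, each observed UI action is a weighted edge (cost in taps), and Dijkstra finds the cheapest action sequence that reaches a target screen.

```java
import java.util.*;

// Toy sketch: model app screens as graph nodes and observed UI actions as
// weighted edges, then run Dijkstra to recover the cheapest tap sequence.
public class UiPathfinder {
    record Edge(String to, String action, int taps) {}
    record State(String node, int cost) {}

    static List<String> cheapestActions(Map<String, List<Edge>> graph, String start, String goal) {
        Map<String, Integer> dist = new HashMap<>();
        Map<String, String> from = new HashMap<>(); // node -> predecessor node
        Map<String, String> via  = new HashMap<>(); // node -> action that reached it
        PriorityQueue<State> pq = new PriorityQueue<>(Comparator.comparingInt(State::cost));
        dist.put(start, 0);
        pq.add(new State(start, 0));
        while (!pq.isEmpty()) {
            State s = pq.poll();
            if (s.cost() > dist.getOrDefault(s.node(), Integer.MAX_VALUE)) continue; // stale entry
            for (Edge e : graph.getOrDefault(s.node(), List.of())) {
                int nd = s.cost() + e.taps();
                if (nd < dist.getOrDefault(e.to(), Integer.MAX_VALUE)) {
                    dist.put(e.to(), nd);
                    from.put(e.to(), s.node());
                    via.put(e.to(), e.action());
                    pq.add(new State(e.to(), nd));
                }
            }
        }
        // Walk predecessors back from the goal to recover the action sequence.
        LinkedList<String> actions = new LinkedList<>();
        for (String n = goal; via.containsKey(n); n = from.get(n)) actions.addFirst(via.get(n));
        return actions;
    }

    public static void main(String[] args) {
        Map<String, List<Edge>> spotify = Map.of(
            "home",    List.of(new Edge("search", "tap search icon", 1),
                               new Edge("library", "tap library tab", 1)),
            "search",  List.of(new Edge("artist", "type + tap result", 3)),
            "library", List.of(new Edge("artist", "scroll + tap artist", 4)));
        System.out.println(cheapestActions(spotify, "home", "artist"));
        // [tap search icon, type + tap result]
    }
}
```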

Anyways, we’ve put a lot of work into Sesame and are honestly more comfortable
building than promoting. So we hope y’all try it :)

I’m happy to answer any questions and would love to hear feedback.

~~~
philwall192
Some crickets in here...

Nobody thinks reading data off the Accessibility Service, building a library
of intents, and auto-constructing user-specific shortcuts is cool?

