> One thing I've been wondering about is, could you build an Android app that acts like google assistant in the sense of you pressing a button/swiping gesture whatever and the "assistant" kicks in from any context on the phone (e.g. while you have another app open)
There is literally an API specifically for this, and the settings app has... an option for choosing which assistant should be invoked when you use the assistant gestures.
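For anyone curious what that looks like in practice: the entry point is `android.service.voice.VoiceInteractionService`. A minimal sketch of the manifest declaration such an app makes (the class name `MyAssistantService` and the resource name `@xml/interaction_service` are placeholders; the action and permission strings are the actual framework ones):

```xml
<!-- Inside <application> in AndroidManifest.xml -->
<service
    android:name=".MyAssistantService"
    android:label="My Assistant"
    android:permission="android.permission.BIND_VOICE_INTERACTION"
    android:exported="true">
    <intent-filter>
        <action android:name="android.service.voice.VoiceInteractionService" />
    </intent-filter>
    <!-- Points to an XML resource describing the session service, etc. -->
    <meta-data
        android:name="android.voice_interaction"
        android:resource="@xml/interaction_service" />
</service>
```

Once that's declared, the app shows up in the default assistant picker in Settings, and the system launches it on the assistant gesture regardless of which app is in the foreground.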
Back in the day, you could (ab)use the accessibility API for screen reading. That still works, but now there is a proper “read screen contents” permission that the Google Lens app uses and any other app can request.