
I think Amazon's Echo handles this the right way: it "uses on-device keyword spotting to detect the wake word. When Echo detects the wake word, it lights up and streams audio to the cloud". Not offering similar functionality seems like a technical or design failure on Samsung's part.
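
For anyone curious what that gating looks like in practice, here's a rough Python sketch of the pattern (the microphone source, wake-word detector, and cloud call are all placeholders I made up, not Amazon's actual implementation). The point is simply that nothing leaves the device until the local spotter fires:

    import time
    from collections import deque

    WAKE_WORD = b"alexa"      # placeholder marker standing in for an acoustic-model hit
    RING_FRAMES = 20          # short rolling buffer so the start of the request isn't clipped

    def microphone_frames():
        """Fake audio source: yields one 'frame' of bytes per loop iteration."""
        samples = [b"noise", b"noise", b"alexa", b"turn on the lights", b"noise"]
        for frame in samples:
            yield frame
            time.sleep(0.01)

    def detect_wake_word(frame):
        """On-device keyword spotting stand-in; in reality a small local model."""
        return WAKE_WORD in frame

    def stream_to_cloud(frames):
        """Only called after the wake word fires; before that, audio stays local."""
        print("streaming to cloud:", b" ".join(frames))

    ring = deque(maxlen=RING_FRAMES)   # rolling pre-roll buffer, kept on device
    listening = False
    captured = []

    for frame in microphone_frames():
        ring.append(frame)
        if not listening and detect_wake_word(frame):
            listening = True           # this is where the device would light up / beep
            captured = list(ring)      # include the pre-roll so the command isn't cut off
        elif listening:
            captured.append(frame)

    if listening:
        stream_to_cloud(captured)

The key design choice is that the always-on part is a tiny local detector with no network access, and the cloud only ever sees audio captured after (plus a second or two before) the wake word.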



Google Now does the same thing (on devices that are always listening for the "trigger word"): your phone makes a very distinctive noise and pops up a screen to indicate that it's listening.


Pretty much the same thing everyone else does: "Echo uses on-device keyword spotting to detect the wake word. When Echo detects the wake word, it lights up and streams audio to the cloud, where we leverage the power of Amazon Web Services to recognize and respond to your request." The article describes the same approach; the difference is that Samsung doesn't own an Amazon/Microsoft/Google/IBM-style cloud to run the voice-recognition algorithms on.



