I think this would be really useful to UX researchers! The folks I know in that space often set up screen-sharing sessions with users and watch them interact with the UI in real time. Adding gaze tracking to that process seems like a natural progression: it would let researchers see where a user expects a button or UI feature to be, even when the user can't articulate it in the moment.