
So here’s the other rub with using the keyboard to enter the command: most people, even experienced touch typists, will flit their eyes towards the thing they’re trying to interact with.

Therein lies a big part of the problem with eye-based interaction. Our brains move our eyes for a lot of different tasks: they saccade constantly to build up a read of the scene (outside the fovea the eye has very poor resolving power, so it needs to move a lot), and they also signal what you’re thinking (there’s a body of work in neurolinguistics on eye direction indicating how you’re thinking; at a base level, you tend to look up or away when you’re pondering).

Anyway, not to say it can’t be done. But it’s a fascinating domain at the intersection of UX, technology and neuroscience.
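The usual workaround for the "eyes are always moving" issue (the so-called Midas touch problem) is to require a dwell time before a glance counts as intent. A minimal sketch of that idea in Python, assuming gaze samples arrive as normalized (timestamp, x, y) points; the names, thresholds and target format here are illustrative, not any particular headset's API:

    import math

    DWELL_MS = 600  # how long the gaze must rest on a target before it counts as intent

    def dwell_select(samples, targets):
        """samples: iterable of (t_ms, x, y) gaze points in normalized screen coords.
        targets: dict of name -> (x, y, radius). Yields a target name once the gaze
        has dwelled inside it for DWELL_MS, so ordinary saccades don't trigger it."""
        current, dwell_start = None, None
        for t, x, y in samples:
            # Which target (if any) does this sample land on?
            hit = next((name for name, (tx, ty, r) in targets.items()
                        if math.hypot(x - tx, y - ty) <= r), None)
            if hit != current:
                current, dwell_start = hit, t   # gaze moved on: restart the timer
            elif hit is not None and t - dwell_start >= DWELL_MS:
                yield hit                       # held long enough: treat as deliberate
                dwell_start = t                 # reset so it doesn't fire every sample

The trade-off is baked into DWELL_MS: too short and stray fixations trigger actions, too long and the interface feels sluggish, which is exactly the tension described above.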

For what it’s worth, there are VR headsets with dedicated eye trackers built in (PSVR2, certain Vive Pro models, Varjo, etc.), and there have been cameras from Canon (even back in the film days!) that used eye tracking to pick autofocus targets.

It’ll be interesting to see how things shape up. Meta have their big keynote on Tuesday, where the Quest Pro / Cambria is meant to have eye-tracking sensors.


