
Is eye-tracking tech not good enough these days to position a cursor when one stares at that position for 2+ seconds? I am recalling studies of driving or artwork where the gaze lingers.



Maybe, but a 2+ second pause is way too long to indicate a position on your screen, since you do that a lot while coding.
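The usual workaround is a much shorter, tunable dwell rather than a fixed 2-second stare. Here's a rough sketch of that dwell logic, assuming a tracker that hands you gaze samples; get_gaze_point, move_cursor, and the thresholds are made-up placeholders, not any real SDK:

    import time
    import math

    DWELL_MS = 600      # much shorter than 2 s; tune to taste
    RADIUS_PX = 40      # how far the gaze may wander and still count as lingering

    def dwell_select(get_gaze_point, move_cursor):
        anchor = None    # (x, y) where the current dwell started
        started = None   # timestamp of the dwell start
        while True:
            x, y = get_gaze_point()   # latest gaze sample from the tracker
            if anchor is None or math.hypot(x - anchor[0], y - anchor[1]) > RADIUS_PX:
                # gaze moved outside the radius: restart the dwell timer here
                anchor, started = (x, y), time.monotonic()
            elif (time.monotonic() - started) * 1000 >= DWELL_MS:
                move_cursor(*anchor)          # dwell satisfied: place the cursor
                anchor, started = None, None  # wait for the next dwell
            time.sleep(0.01)                  # ~100 Hz polling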


XR devices with eye tracking have practically no latency. Instead, you would need to tune down the hover-over-time triggers for points of interest, and there are a lot of experimental keyboard ideas in Spatial Computing that could be integrated; starburst wheels, for example, could give you layers of type assist, etc.


I bet it's faster, but it feels slower because it's a passive action.


You can find examples of people using eye trackers for coding/gaming on YouTube and judge the latency yourself; here's one such example: https://youtu.be/FZRgBw8m34c?feature=shared&t=90


Eye tracking with neural "click"

I need extremely responsive and accurate eye tracking with a device that can detect at least 3-5 buttons (a rough sketch of the mapping follows the list):

- click

- click down

- click up

- move right

- move left
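Roughly what I mean, as a sketch: gaze supplies the position, and the neural signal supplies one of those five discrete events. The mouse backend (move/click/press/release) is an assumption here, not any particular library's API:

    STEP_PX = 20  # how far a "move" event nudges the cursor

    def handle_event(event, gaze_xy, mouse):
        x, y = gaze_xy
        if event == "click":
            mouse.move(x, y)
            mouse.click()
        elif event == "click_down":      # start of a drag at the gaze point
            mouse.move(x, y)
            mouse.press()
        elif event == "click_up":        # end of the drag
            mouse.release()
        elif event == "move_right":      # fine adjustment when gaze is too coarse
            mouse.move(x + STEP_PX, y)
        elif event == "move_left":
            mouse.move(x - STEP_PX, y)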



