Just curious, but does anyone who pays attention to this space/technology know if eye tracking (with commodity-ish hardware) has gotten to the point of "replacing the mouse"?

E.g. at the OS level, the pointer basically follows your gaze.




Eyes don't point at the things we look at as accurately as a mouse does (they only aim to within the fovea, about 1 degree of visual angle, which corresponds to roughly 1 cm on the screen at a typical viewing distance), and eye trackers are often far worse.
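A quick back-of-the-envelope check of that 1 degree ≈ 1 cm figure (a minimal sketch; the ~60 cm viewing distance is my assumption, not a number from the comment):

    import math

    def on_screen_span_cm(angle_deg, viewing_distance_cm):
        # Width on the screen subtended by a given visual angle.
        return 2 * viewing_distance_cm * math.tan(math.radians(angle_deg) / 2)

    # ~1 degree of visual angle at a typical ~60 cm viewing distance:
    print(round(on_screen_span_cm(1.0, 60.0), 2))  # about 1.05 cm

So even a perfect tracker only narrows the target down to a roughly centimeter-wide region, before any tracker error is added on top.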

This summer I built a system that uses slight head movements for refinement, allowing hands-free mousing as fast as a trackpad. However, eye trackers good enough for the system to be pleasant cost $10,000+. I have ideas that would allow it to work with a $100 eye tracker, but even then it has the downside that the computer vision for your mouse will peg one CPU core.
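One way such a hybrid scheme could be structured (a sketch of the general idea, not necessarily how the parent's system works; the function names and the head_gain value are assumptions for illustration):

    def refine_cursor(gaze_xy, head_delta_xy, head_gain=3.0):
        # Coarse target from the eye tracker plus a fine head-motion offset,
        # all in screen pixels.
        gx, gy = gaze_xy
        dx, dy = head_delta_xy
        return gx + head_gain * dx, gy + head_gain * dy

    # Gaze lands about a centimeter from the real target; a slight head turn
    # closes the remaining gap.
    print(refine_cursor((840, 410), (-6, 2)))  # (822.0, 416.0)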


As someone who hates mice and hopes one day to have this sort of tech, I'm trying to understand what you mean by "Eyes don't point at the things we look at as accurately as a mouse does" - how is it that I can choose to "look at" an individual pixel on my screen, or an individual serif on a letter, and then choose another one far less than 1 cm away, and my eye focus shifts to it? Is that shift not detectable by a camera?

(Edit update: by "detectable" I mean theoretically detectable by a really good camera and software. In other words, it seems you are arguing the tech is impossible even theoretically due to some aspect of biology. Am I following you correctly there? Thanks)


I'm also not an expert, but I suspect a confounding problem is the fact that the eyes never stop moving, even when it feels to us like we're looking at a single point: https://en.wikipedia.org/wiki/Eye_movement#Saccades


Not an expert, but your focusing on a pixel is your brain doing image processing, not a function of your eyeball.

Your field of vision (and of focus) is much bigger than 1 degree, so singling out the part of the image you are "looking at" is something the brain does as post-processing, not something your eyeball is doing.


Your eye is constantly moving around, even when you think you are staring at the exact same point.


I researched this over a summer once by downloading the latest scientific papers on gaze tracking. The results were pretty disappointing to me. Then I figured I was going about the research all wrong, because I already knew from high school biology that your eye makes micro movements all the time to keep the retina stimulated and to keep a larger area in focus at the same time. So I opened Wikipedia and looked up the smallest micro movements that the eyes make. Based on that angle and the average distance between your eye and the screen, it's easy to see that you can never replace the mouse with gaze for a pixel perfect pointing device.
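The numbers work out roughly like this (a sketch with assumed values: the micro movement amplitude, viewing distance, and screen DPI below are my guesses, not figures from the comment):

    import math

    amplitude_deg = 0.3            # assumed size of a fixational micro movement
    distance_cm = 60.0             # assumed eye-to-screen distance
    pixels_per_cm = 96 / 2.54      # assumed ~96 DPI screen

    shift_cm = distance_cm * math.tan(math.radians(amplitude_deg))
    print(round(shift_cm * pixels_per_cm, 1))  # ~11.9, i.e. roughly 12 pixels of jitter

A pointer that jitters by a dozen pixels even while you "hold still" can't be pixel perfect, whatever the tracker's own accuracy.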

However! If you think outside the box, you might combine a fairly accurate gaze tracker with a different GUI design to get this to work. That vision (no pun intended) is more of a long-term one. An easy short-term use for eye gaze would be automatically setting GUI window focus based on where you're looking. That alone might save you a keyboard-mouse switch. As long as you have no more than four windows on your screen, you can make it work with the current tech already.
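A minimal sketch of that four-window case, assuming the screen is split into quadrants and some window manager exposes a way to focus whichever window you pick (the names here are placeholders, not an existing API):

    def window_for_gaze(x, y, screen_w, screen_h):
        # Map a (noisy) gaze estimate to one of four window quadrants.
        col = 0 if x < screen_w / 2 else 1
        row = 0 if y < screen_h / 2 else 1
        return row * 2 + col  # 0: top-left, 1: top-right, 2: bottom-left, 3: bottom-right

    # Example: a gaze estimate in the lower-right of a 1920x1080 screen
    print(window_for_gaze(1500, 900, 1920, 1080))  # 3

With targets that big, even a cheap tracker's error stays small relative to the target, which is why this seems workable with current tech.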


"you can never replace the mouse with gaze for a pixel perfect pointing device"

And what about getting into the precision range a finger has on a touchscreen? Do you think this is possible? Because touch works pretty well, if the UI is good (big) enough ...


I researched it some years ago and the precision was just too low. Furthermore, I believe that it has to work REALLY well. In particular, it would be interesting to know whether there would be some kind of health risk.

E.g. if the tracking is off (i.e. the predicted x,y coordinate differs from the target you had in mind with your gaze), I guess one would try to auto-correct the error with a slightly different gaze, i.e. by looking slightly beneath the target. I am not sure whether this is a problem or not.

We do not have this problem with the mouse, as we control it via relative movements and do not set specific x,y coords.

Maybe one could solve this by also tracking the head? E.g. make an educated guess via gaze tracking and let the user refine it with her ... nose. ;)


I was wondering if this sort of cursor control would be possible with, say, Google Glass. The software is already able to take input in the form of winking to snap pictures. So, couldn't you have a cursor overlaid on the Glass's screen, line it up with where you want it on your computer screen, and then wink to place it there?


Kind of related, there's a VR headset coming with built-in eye-tracking: http://www.getfove.com/


Even if it could just focus the particular terminal window I'm looking at, it'd be really useful.


BUT BUT BUT!!! I spend so much time moving the cursor out of the way so I can read the letter under the pointer.


You can avoid that easily using the more conservative variant of MAGIC (Manual And Gaze Input Cascaded pointing, a reference to an HCI paper), which only moves the mouse pointer when you want it to.
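The conservative MAGIC idea is roughly: keep tracking gaze, but only warp the cursor near the gaze point once manual mouse motion starts, then let the mouse take over. A rough sketch of that behaviour, with the state handling and names all assumed for illustration:

    def on_new_fixation(gaze_xy, state):
        state["current_gaze"] = gaze_xy
        state["warped"] = False        # the next manual motion will warp again

    def on_mouse_move(dx, dy, state):
        if not state["warped"] and (dx or dy):
            # First manual motion after a fixation: jump near the gaze point.
            state["cursor"] = state["current_gaze"]
            state["warped"] = True
        cx, cy = state["cursor"]
        state["cursor"] = (cx + dx, cy + dy)   # relative motion refines it

    state = {"cursor": (0, 0), "current_gaze": (0, 0), "warped": True}
    on_new_fixation((1200, 300), state)   # user looks at a new target
    on_mouse_move(4, -2, state)           # nudging the mouse warps, then refines
    print(state["cursor"])                # (1204, 298)

So the pointer stays out of your way while you read, and only jumps when you actually reach for the mouse.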



