
As someone who hates mice and hopes one day to have this sort of tech, I'm trying to understand what you mean by "eyes don't point at things we look at as accurately as a mouse does." How is it that I can choose to look at an individual pixel on my screen, or an individual serif on a letter, and then choose another one far less than 1 cm away, and my gaze shifts to it? Is that shift really not detectable by a camera?

(Edit update: by "detectable" I mean theoretically detectable by a really good camera and software. In other words, it seems you are arguing the tech is impossible even theoretically due to some aspect of biology. Am I following you correctly there? Thanks)

I'm also not an expert, but I suspect a confounding problem is the fact that the eyes never stop moving, even when it feels to us like we're looking at a single point: https://en.wikipedia.org/wiki/Eye_movement#Saccades
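A back-of-the-envelope sketch of why that matters. All the numbers below are my own assumptions, not from the thread: roughly half a degree of fixational gaze jitter, a ~60 cm viewing distance, and a 96 DPI screen.

```python
import math

# Assumed (not from the thread): fixational eye movements such as
# microsaccades and drift keep the gaze wandering over roughly half
# a degree of visual angle even during a steady "fixation".
jitter_deg = 0.5            # assumed gaze jitter, degrees of visual angle
viewing_distance_mm = 600   # assumed ~60 cm from the screen
pixel_pitch_mm = 25.4 / 96  # assumed 96 DPI display

# Simple geometry: on-screen offset = viewing distance * tan(angle)
jitter_mm = viewing_distance_mm * math.tan(math.radians(jitter_deg))
jitter_px = jitter_mm / pixel_pitch_mm

print(f"{jitter_mm:.1f} mm of on-screen jitter, about {jitter_px:.0f} pixels")
```

Under those assumptions the "still" eye wanders over roughly 5 mm of screen, a couple dozen pixels wide, so even a perfect camera sees a moving target rather than a single chosen pixel.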

Not an expert, but your focusing on a pixel is your brain doing image processing, not a function of your eyeball.

Your field of vision (and even your zone of sharp focus) is much bigger than 1 degree, so picking out the part of the image you are "looking at" is something the brain does to the image in post-processing, not something your eyeball is doing.

Your eye is constantly moving around, even when you think you are staring at the exact same point.
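To put the 1-degree figure in perspective, here's a rough calculation. The numbers are assumptions on my part: the fovea (the only region of the retina with full acuity) is commonly cited as covering about 1 to 2 degrees of visual angle; I take 1 degree, a ~57 cm viewing distance, and a 96 DPI screen.

```python
import math

# Assumed (not from the thread): the fovea covers ~1 degree of
# visual angle; viewing distance chosen so 1 degree is ~1 cm on screen.
fovea_deg = 1.0
viewing_distance_mm = 573   # assumed arm's-length viewing distance
pixel_pitch_mm = 25.4 / 96  # assumed 96 DPI display

fovea_mm = viewing_distance_mm * math.tan(math.radians(fovea_deg))
fovea_px = fovea_mm / pixel_pitch_mm

print(f"Foveal region spans {fovea_mm:.1f} mm, about {fovea_px:.0f} pixels")
```

So even the sharpest patch of the retina takes in a ~1 cm region containing dozens of pixels at once; selecting one pixel inside that patch is done by the brain, not by where the eyeball points, which is what a camera would have to measure.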
