
Google buys Eyefluence eye-tracking startup - SkarredGhost
https://techcrunch.com/2016/10/24/google-buys-eyefluence-eye-tracking-startup/
======
kbenson
I can't help but feel that most of these companies working on these technologies
are viewing them in isolation, when none of our peripherals are ever used in
isolation. The mouse wasn't developed to replace the keyboard, but to
supplement it. You don't generally use a mouse to select characters from an
on-screen representation of a keyboard (there are cases where you want this,
but it's not the general use case), so why do we always see the equivalent of
that in new peripherals?

The two main cases I'm thinking of are eye tracking, and brainwave tracking
(like EMOTIV). Individually, neither looks like a compelling way to control a
computer when you have a mouse and keyboard, but _together_, I think that
might really yield something interesting (and sooner!).

Instead of using brain waves to move a cursor around the screen, use eye
tracking. Instead of using gaze lingering for click, use brainwave tracking.
Individually they seem cumbersome and annoying to use. Together they could be
a really compelling interface device, IMO.
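A minimal sketch of what that fusion could look like. Everything here is invented for illustration; neither the eye tracker nor the brainwave classifier interface corresponds to a real device API:

```python
# Hypothetical fusion of two input channels: the eye tracker supplies a
# cursor position, and a brainwave classifier supplies a "select" score.
# Both signals are simulated; no real device API is used here.

def fuse_inputs(gaze_xy, select_confidence, threshold=0.8):
    """Turn one gaze sample plus one EEG 'intent' score into a UI event.

    gaze_xy: (x, y) screen coordinates reported by the eye tracker.
    select_confidence: 0..1 score from a (hypothetical) brainwave
        classifier indicating how strongly the user intends to select.
    Returns ('click', x, y) when the intent score crosses the threshold,
    otherwise ('move', x, y) so the cursor just follows the gaze.
    """
    x, y = gaze_xy
    if select_confidence >= threshold:
        return ("click", x, y)
    return ("move", x, y)


# Simulated session: the user looks around, then "thinks click" on a button.
samples = [((120, 80), 0.1), ((400, 300), 0.2), ((400, 300), 0.95)]
events = [fuse_inputs(g, c) for g, c in samples]
```

The point is that neither channel has to be precise enough to do the other's job: the gaze only needs positional accuracy, and the brainwave signal only needs to be a reliable binary trigger.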

~~~
kristianc
Either that, or most of the companies know that the technologies they are
building are likely to be acquired to build part of the feature set of a
larger offering, and are building them for precisely that reason. Too cynical?

~~~
burkaman
Why is that cynical? That seems like a very reasonable thing to do if you want
to start a company with very narrow domain knowledge.

------
jacquesm
For those wondering about the privacy implications of this technology:
Eyefluence's tech requires a VR headset to function, and I'm not aware of them
demonstrating it using the user-facing cameras found in mobile phones
and tablets, so I don't think there is any reason for worry on that front
(yet).

Here is a video of a 2016 demo of Eyefluence's tech:

[https://www.youtube.com/watch?v=hYH8qLvq7rc](https://www.youtube.com/watch?v=hYH8qLvq7rc)

~~~
asafira
Would this be interesting for Google's Daydream efforts?

------
bluedevil2k
10 years ago, while I was at IBM, I submitted a proposed patent that was a
sort-of-joke: Amazon (or other sites) could get rid of 1-click shopping
and go to 0-click shopping by following your eyes and automatically making the
purchase if you looked at something for a given amount of time. Sounds like
this will be a reality in a few years.

~~~
kbenson
Wife: Honey, do you know why Amazon keeps sending us mini-skirts and fishnet
stockings? I mean, they're not even in my size!

Husband: Nope, sorry, no idea!

------
Dowwie
How many people here cover the front-facing camera on their phone when they're
not using it? I haven't, until now.

~~~
wh0rth
Do you also now cover your computer mic?

~~~
chill1
Not trolling, serious question: What is the audio/mic equivalent of covering
your webcam with a sticker?

~~~
icehawk219
One approach I've seen is plugging something into the audio jack, then
breaking it off and taping over it. That way the computer detects that
something is plugged in and turns off the internal mic. Of course, this won't
be possible once Apple, and then everyone else, removes the audio jack from
computers.

~~~
piyush_soni
Isn't that still software-controlled? So, let's say malware is able to gain
privileged access to my Windows machine and can install anything; would it be
able to override the default behavior of turning off the mic when something
is plugged in, especially when external sounds are more audible than anything
coming through the 'plugged-in' dummy audio cable?

------
aresant
Better details from a beta tester over on /r/oculus on Eyefluence's tech:

" . . . I'm still not allowed to say how exactly it works but in my mind it's
the gold standard for eye based user interfaces . . . Their [main focus] is
the UI" (1)

So it sounds like a blend of some good eye-tracking tech and some GREAT user-
experience design and implementation.

(1)
[https://www.reddit.com/r/oculus/comments/598clo/google_buys_...](https://www.reddit.com/r/oculus/comments/598clo/google_buys_eyefluence_eyetracking_startup/)

------
amelius
How difficult is it to track eyes using modern deep learning techniques? Isn't
this a simple, run-of-the-mill task by now?

~~~
chriskanan
I'm a machine learning researcher and I also have a lot of experience
analyzing eye movement data.

There aren't large labeled datasets that are publicly available in this area.
Just to classify eye movements into events (smooth pursuit, fixation, saccade,
etc.), you would need images of the eyes or eye velocities, and someone needs
to label all of those as ground truth. To actually
determine what someone is looking at, you have to go beyond this to have
information about the scene. When looking at a screen, this is usually done
with calibration, but things get more complicated when trying to predict what
someone is looking at with a mobile eye tracker and a fixed scene camera. Body
and head movements complicate things further (e.g., VOR) and many algorithms
ignore them.
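For a sense of what the simplest version of that event classification looks like, here is a velocity-threshold (I-VT) sketch. The threshold value and the 1-D simplification are illustrative, not taken from any particular tracker:

```python
# Sketch of the simplest eye-movement event classifier: the velocity-
# threshold (I-VT) rule. Samples moving faster than a cutoff (commonly a
# few tens of degrees per second) are labeled saccades, the rest fixations.

def classify_ivt(angles_deg, times_s, vel_threshold=30.0):
    """Label each inter-sample interval as 'saccade' or 'fixation'.

    angles_deg: one gaze angle per sample, in degrees (real trackers
        report 2-D positions; one dimension keeps the sketch short).
    times_s: sample timestamps in seconds, strictly increasing.
    """
    labels = []
    for i in range(1, len(angles_deg)):
        dt = times_s[i] - times_s[i - 1]
        velocity = abs(angles_deg[i] - angles_deg[i - 1]) / dt  # deg/s
        labels.append("saccade" if velocity > vel_threshold else "fixation")
    return labels
```

Adding smooth pursuit on top of this takes a second, lower threshold plus evidence that the eye is tracking a moving target, which is exactly where the hand-labeled ground truth described above becomes necessary.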

~~~
charlesdm
If you don't mind me jumping in here, since people with eye-tracking
experience seem to be quite scarce: is it viable these days to build a
working, accurate gaze tracker using, say, just an iPad camera? No extra
hardware whatsoever, just the camera, positioning a dot on the screen somewhat
accurately. Every tracker I've tried is close to absolute garbage when it
comes to accuracy.

~~~
jacquesm
YC invested in a company called 'GazeHawk', but they didn't make it
(acquihired by Facebook); the page is still up, though.

[http://www.businessinsider.com/facebook-acquires-gazehawk-fo...](http://www.businessinsider.com/facebook-acquires-gazehawk-for-talent-2012-3?international=true&r=US&IR=T)

[http://www.gazehawk.com/](http://www.gazehawk.com/)

~~~
charlesdm
Yeah.. I've literally tried 20 solutions, and most cannot detect gaze
location well. GazeHawk included.

~~~
jacquesm
That may have been one of the reasons why they didn't make it.

Eye tracking is tricky. I worked a bit with two guys developing a commercial
eye tracker in the '80s. It was a pretty sophisticated optical set-up that
combined the image being looked at with the image of the person looking,
using a 50% mirror at a 45-degree angle; that made the job considerably
easier.

One of the complicating factors was that eye movements have a lot of noise in
them that our brain has become very adept at filtering out, so what we
_think_ our eyes are doing can be quite different from what they're really
doing.
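A common first step against that kind of noise is a small sliding median over the raw samples, which suppresses single-sample jitter without smearing real saccades the way a long moving average would. A generic sketch, not the set-up described above:

```python
# Suppress single-sample jitter in a gaze signal with a sliding median.
# A median filter removes isolated outliers (micro-jitter, blink spikes)
# while leaving genuine step changes such as saccades sharp.

def median_smooth(samples, window=3):
    """Return samples filtered with a centered median of size `window`."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        neighborhood = sorted(samples[max(0, i - half):i + half + 1])
        out.append(neighborhood[len(neighborhood) // 2])
    return out
```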

------
yalogin
Even though this is about hardware and VR, I have to admit my first thought on
seeing the headline was that Google had doubled down on its advertising
business, even more so after the recent news that it changed its policy to
track users more.

------
legohead
He asks us, "how do you transfer your intent to act without winking and
without waiting", but never tells us how it's done. Does anyone know? That
made the video incredibly frustrating for me.

~~~
kajecounterhack
They use saccades + some fancy algorithms:
[https://en.wikipedia.org/wiki/Saccade](https://en.wikipedia.org/wiki/Saccade)

------
misiti3780
Facebook bought an eye-tracking startup a few years back:
[http://www.businessinsider.com/facebook-acquires-gazehawk-fo...](http://www.businessinsider.com/facebook-acquires-gazehawk-for-talent-2012-3)

At the time, I read it was only an acquihire.

------
microcolonel
By the way, Eyefluence's marketing videos are brilliant. I've seen people's
arm hairs stand on end watching them.

They'd be worth acquiring if only for the marketing department.

------
_pmf_
I can see good eye tracking getting much, much more traction with regular
users than VR. While most applications of VR to everyday tasks seem forced and
gimmicky, eye tracking could revolutionize reading, which is arguably the task
professional and casual users spend at least 80 percent of their online time
on.

Much smarter than investing in VR.

------
mrfusion
Could this be used for vr?

~~~
pmyjavec
Did you read the article at all?

