
Webgazer.js webcam eye tracking on the browser - lazyjeff
https://webgazer.cs.brown.edu/
======
natebleker
Disclaimer, I am the primary engineer of a commercial eye tracking system.

Tools like these pop up every now and then and are nice for rough estimation
of gaze. Fixation tracking is a large, complicated problem that usually requires
some sort of calibration to get more precise results. By sidestepping the
calibration step, you get much higher subject compliance, since there's no
grad student barking confusing orders at you. The downside is the noise you
see in the tracking results. For those interested, a product that produces
similar results is Pupil Labs' ambient gaze tracking "Core" research headset[0].

[0] [https://pupil-labs.com/products/core/](https://pupil-labs.com/products/core/)
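
For anyone curious what "calibration-free" looks like from the integration
side, the basic WebGazer loop is roughly the following (simplified from its
documentation, untested); the model trains itself in the background from
clicks and cursor movement, which is how it sidesteps an explicit calibration
step:

    webgazer.setGazeListener(function (data, elapsedTime) {
      if (data == null) return;      // no prediction for this frame
      console.log(data.x, data.y);   // predicted gaze position in pixels
    }).begin();                      // starts the webcam and the model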

~~~
xamuel
In your estimation, how far away are we from eye-tracking software being able
to detect the start of a microsaccade, estimate where the gaze is headed, and
draw new content on the screen there before the gaze even reaches that point?
I would think that by "hacking" into human brain "vulnerabilities" like
saccadic masking and chronostasis, such software could potentially yield
seriously trippy and mind-altering results!

~~~
natebleker
There are two parts to implementing such a system, and they're both
interesting!

The first part is detecting a microsaccade, which is already possible in
research systems I have personally worked on. You basically crank up the
camera frame rate until you're around the microsaccade range and do some
clustering analysis on the positional data to decouple movement of the
hardware from movement of the face.

The second part is having a fast enough, reliable, commercial-grade display
to present stimuli on. Displays are very fickle in practice, and getting your
hands on one that can run with adequate color, contrast, and brightness at
speed is currently very difficult.

One angle currently under research is the perception of stimuli during
saccades and microsaccades. There's quite a bit of time and effort in the
industry going into neurological assessments through saccades, and the tools
coming out of this work are really starting to come down in price. That opens
the door for a bunch of lower-priced research options, such as the library in
the article, to enable a much more rapid pace of understanding.
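
If you want a feel for the signal-processing side, a common way to flag
saccade samples is the classic velocity-threshold idea (note this is not the
clustering pipeline described above, and the units and thresholds here are
purely illustrative):

    // Flag samples whose gaze velocity leaves a data-driven threshold ellipse.
    // samples: [{t, x, y}, ...] at a fixed, high frame rate; t in ms, x/y in degrees.
    // lambda scales a robust (median-based) estimate of the velocity noise.
    function detectSaccadeSamples(samples, lambda = 6) {
      if (samples.length < 2) return [];
      const vel = [];
      for (let i = 1; i < samples.length; i++) {
        const dt = (samples[i].t - samples[i - 1].t) / 1000; // seconds
        vel.push({
          i,
          vx: (samples[i].x - samples[i - 1].x) / dt,
          vy: (samples[i].y - samples[i - 1].y) / dt,
        });
      }
      const median = a => [...a].sort((p, q) => p - q)[Math.floor(a.length / 2)];
      const spread = vs =>
        Math.sqrt(Math.max(median(vs.map(v => v * v)) - median(vs) ** 2, 1e-9));
      const tx = lambda * spread(vel.map(v => v.vx));
      const ty = lambda * spread(vel.map(v => v.vy));
      // A sample is saccadic when its velocity lies outside the threshold ellipse.
      return vel.filter(v => (v.vx / tx) ** 2 + (v.vy / ty) ** 2 > 1).map(v => v.i);
    }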

~~~
tootie
That's fascinating, but I'm wondering what is the practical value of detecting
microsaccades? Aren't they just involuntary twitching?

~~~
taneq
For the duration of the microsaccade, you're blind. So if something changes
onscreen it's much harder to see.

IIRC people have used small orientation changes during microsaccades for
redirected walking in VR. You feel like you're walking straight but you're
actually curving back on yourself.

Edit: I think that was just detecting full-on saccades but a microsaccade
version would be smoother and harder to detect.

Source: [https://blog.siggraph.org/2018/05/challenge-accepted-infinit...](https://blog.siggraph.org/2018/05/challenge-accepted-infinite-walking-in-vr.html/)
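
The mechanism itself is surprisingly small. A toy sketch (the camera API and
the gain value are made up for illustration; real systems are far more careful
about perceptual limits):

    // While a saccade is in progress the user is effectively blind, so a tiny
    // extra yaw can be folded into the virtual camera without being noticed.
    const YAW_PER_SACCADE = 0.003; // radians; illustrative, not a measured limit
    function applyRedirection(camera, saccadeInProgress, remainingYaw) {
      if (!saccadeInProgress || remainingYaw === 0) return remainingYaw;
      const step = Math.sign(remainingYaw) *
        Math.min(Math.abs(remainingYaw), YAW_PER_SACCADE);
      camera.rotateY(step);          // hypothetical camera API
      return remainingYaw - step;    // redirection still left to apply later
    }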

------
giraj
Had a blast at LauzHack '16 using this library to make a Chrome extension
enabling "no-hands scrolling"[1]. Sadly, reading web articles while eating
with both hands is still a dream. We managed to get okay-ish results on a
MacBook Pro in perfect lighting from time to time, but nothing consistent. In
variable lighting and on lower-end laptops we found it impossible.

I wonder if it would be possible with a better webcam and good, consistent
lighting.
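
The basic idea is little more than a gaze listener with scroll zones. A rough
sketch (not the actual extension code; the thresholds are guesses):

    // Scroll when the gaze estimate sits near the top or bottom of the viewport.
    webgazer.setGazeListener(function (data) {
      if (!data) return;
      const h = window.innerHeight;
      if (data.y > 0.85 * h) window.scrollBy(0, 15);        // looking low: scroll down
      else if (data.y < 0.15 * h) window.scrollBy(0, -15);  // looking high: scroll up
    }).begin();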

[1]
[https://github.com/jarlg/lookmanohands](https://github.com/jarlg/lookmanohands)

------
awake
In the Brown undergraduate computer vision course, one of the homework
assignments (or the final project; my memory is a little hazy) was to improve
WebGazer's tracking. People were pretty successful, from what I remember. As
an undergrad I didn't think much of it, but as a grad student I would be
slightly mortified if undergraduates were improving on my current research
for homework.

~~~
dylan604
Why? Someone smarter will always follow

~~~
yjftsjthsd-h
Not even just smarter; we all stand on the shoulders of giants, and every year
the barrier to entry falls that much more.

------
gruez
Can't wait until this makes its way to adtech. You can have sites that hold
their "premium" content hostage until you allow camera access and prove (via
eye tracking) that you looked at their ad.

~~~
oceliker
“Please drink verification can”

~~~
ketzo
Boy, I love to see an r/lol reference on HN. Gotta be a lot of crossover,
right?

------
stared
I had an idea for an evil-eye game, in the style of Happy Tree Friends or
similar. That is: a lot of cute creatures around, and nothing happens (they
may giggle, jump, or blink occasionally) until you look at them. If you do...

~~~
lecarore
That's a really cool use case for the library! I think the calibration stage
could be made easier with some "click as fast as you can on the randomly
appearing circle" gamification, and by just assuming that people look at what
they click on. It could be a little character that changes color at random and
that you have to click at that moment, to make sure the user keeps their eyes
on it for a few seconds before clicking.
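
A toy version of that game, leaning on the fact that (as far as I understand)
WebGazer already treats clicks as training samples, so all the game has to do
is get people to click targets at known positions; the styling and timing
below are placeholders:

    // Spawn a clickable dot at a random spot; each click doubles as a
    // calibration sample since the model assumes you look where you click.
    function spawnTarget() {
      const dot = document.createElement('div');
      dot.style.cssText =
        'position:fixed;width:24px;height:24px;border-radius:50%;background:crimson;cursor:pointer;';
      dot.style.left = Math.random() * (window.innerWidth - 24) + 'px';
      dot.style.top = Math.random() * (window.innerHeight - 24) + 'px';
      dot.addEventListener('click', () => {
        dot.remove();
        setTimeout(spawnTarget, 500); // brief pause, then the next target
      });
      document.body.appendChild(dot);
    }
    webgazer.begin();
    spawnTarget();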

------
paq85
If you want to see how well webcam eye tracking works, check this out:
[https://www.realeye.io/test/f80dd676-3b55-4d2c-931c-d925b068...](https://www.realeye.io/test/f80dd676-3b55-4d2c-931c-d925b068a740/)
It uses WebGazer at its core, but with many improvements done by us.

BTW, I'm a RealEye co-founder, so AMA :)

------
HomeDeLaPot
I just don't see the point of this. I don't think I've _ever_ lost track of my
eyes.

------
mrburton
Sadly my eyes are too small to actually track the pupils. I remember when I
got my ID, the lady taking the picture said "Sir, please open your eyes". I
looked at her and asked "They are open... you don't see that?" She laughed and
asked if I had been smoking. :/

------
chrisweekly
Whoa. I have a strange mix of reactions to this. It's impressive, and
interesting, and worthwhile, slightly creepy, and potentially hugely useful
for a11y, democratizing cutting-edge HCI, AR, gaming, etc etc. Surprised it's
not getting more attention!

~~~
chrisco255
One good thing about webcam access on the web is that browsers require
explicit permission from the user, and tabs clearly indicate when they are
recording with either the mic or the camera.

~~~
chrisweekly
Yes! Hence the "slightly" modifier. :)

------
evan_
This seems really cool, but it doesn't seem to let me choose which webcam to
use: it defaults to a virtual device I use for streaming that isn't set up
right now, so I can't really try it out.

~~~
nl
I've had this issue with Chrome on OSX where it won't allow camera selection.
Firefox (and I think Safari) both do.

It's unclear why Chrome does this: it has a preference for which camera to
use, but that setting seems to be ignored and it just takes the most recently
installed one.
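
For what it's worth, a page can offer its own camera picker via the
MediaDevices API rather than relying on whatever the browser chooses; whether
WebGazer accepts an externally selected stream depends on the version, so
treat this as a generic sketch (device labels are empty until camera
permission has been granted at least once):

    // List video inputs and request a specific one by deviceId.
    async function pickCamera(labelSubstring) {
      const devices = await navigator.mediaDevices.enumerateDevices();
      const cam = devices.find(d =>
        d.kind === 'videoinput' && d.label.includes(labelSubstring));
      if (!cam) throw new Error('no matching camera');
      return navigator.mediaDevices.getUserMedia({
        video: { deviceId: { exact: cam.deviceId } }
      });
    }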

------
bscphil
The "move the ball around with your eyes" demo doesn't work at all for me in
Safari on a MacBook. I can only move the ball with the mouse. The other demo
works okay, but very low accuracy (like 50%). Firefox on the same computer is
unusably slow, generates a new "data point" (the visible dot) only every few
seconds. Accuracy was about 10%.

------
Meph504
The listed use cases here struck me as odd. I can't imagine why I, or anyone
wanting to read a news article, would allow the site access to my webcam. Nor
do I think most users would turn on their cameras to help with any given
site's analytics.

So aside from games, what is the actual use case here?

~~~
michaelbuckbee
I'm starting to develop RSI in my wrists and hands, and have been looking at
eye tracking systems as a potential way to help reduce the amount of clicking
that I need to do.

I was thinking maybe I could put webgazer.js into a Chrome extension and use
it to navigate through pages for me, but I don't think it has a click
mechanism.

fwiw - I'm a huge Vimium extension user, as keyboarding is less of a strain
than mouse movement.
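
One way I imagine bolting a click mechanism on top of it is dwell-to-click:
if the gaze estimate stays on the same link long enough, synthesize a click.
Completely untested sketch; the dwell time is a guess and the raw signal would
almost certainly need smoothing first:

    let dwellTarget = null;
    let dwellStart = 0;
    const DWELL_MS = 1200; // how long the gaze must rest on a link

    webgazer.setGazeListener(function (data, t) {
      if (!data) return;
      // elementFromPoint expects viewport coordinates; the prediction may
      // need adjusting if the page is scrolled.
      const el = document.elementFromPoint(data.x, data.y);
      const link = el && el.closest('a, button');
      if (link !== dwellTarget) {          // gaze moved to something new
        dwellTarget = link;
        dwellStart = t;
      } else if (link && t - dwellStart > DWELL_MS) {
        link.click();                      // synthesize the click
        dwellTarget = null;                // reset so it doesn't fire repeatedly
      }
    }).begin();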

~~~
yjftsjthsd-h
Have you looked at [https://eviacam.crea-si.com/](https://eviacam.crea-si.com/)?
Supports Windows, GNU/Linux, and Android, works globally, and runs separately
from the browser. (Technically not _eye_ tracking, but close enough.)

~~~
michaelbuckbee
Wow! I had not, but that looks like it might do the trick. Thanks!

------
cbrgm
Can you recommend any literature for getting an introduction to eye tracking?

~~~
natebleker
Here's an old-school, high-level overview of pupil tracking methods[0]. The
recent focus has been on moving these older computer vision methodologies to
ML without introducing too many errors.

[0]
[http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.155...](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.155.5186&rep=rep1&type=pdf)

------
ebg13
I tried my best on the calibration demo page in Safari and consistently got a
0% accuracy result. It calibrates sort of OK in Firefox. I guess this is
heavily browser-dependent? But why?

~~~
lazyjeff
Hmm, not sure -- do you see the gaze dot when you're looking at the center
circle?

~~~
ebg13
I see a gaze dot. It just seems kinda like Safari is accumulating some
tracking error that Firefox doesn't. Both browsers find my face accurately.

(side note: when the gaze dot overlaps the calibration circles, the circles
become unclickable without some very careful fiddling)

------
dsteinman
I tried using this one time but unfortunately found it didn't work when I wore
my reading glasses. The screen reflection off my glasses just whites out both
eyes.

------
pmlnr
Now add this to vidconf systems: figure out which speaker is which, and make
a button to talk only to the one you're looking at.

