

Homebrew Oculus Rift Eye Tracker - modeless
http://jdarpinian.blogspot.com/2014/06/homebrew-oculus-rift-eye-tracker.html

======
maciejgryka
That's pretty cool! It's also possible to put the cameras below the eyes, if
you're comfortable cutting through the lenses:
[http://arstechnica.com/gaming/2014/06/why-eye-tracking-
could...](http://arstechnica.com/gaming/2014/06/why-eye-tracking-could-make-
vr-displays-like-the-oculus-rift-consumer-ready/)

(disclaimer: I work for SMI)

~~~
modeless
Cool, I was excited to see the articles about your eye tracker last week. I
don't have any equipment that would allow me to cut lenses precisely like that
(did you use a waterjet cutter or something?). I really wonder how you guys
are dealing with the calibration issues I mentioned in my post! Are you
somehow tracking the motion of the Rift relative to the head?

------
hyperion2010
This would be ideal. The only issue is that to get the full power of eye
tracking integrated into the rendering system you need a camera that does at
least 1000fps, and even that is too slow for some stuff (researchers who use
saccades as a behavioral readout in monkey research often use an implanted
eye coil because cameras are too slow). Those cameras are not cheap, but maybe
with a little math they can get away with using a slower one that costs less.
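The "little math" could be as simple as extrapolating gaze position forward in time to hide some of a slower camera's latency. A rough sketch (my own illustration; the function name, frame rate, and lookahead are all assumptions, and linear extrapolation will overshoot at the end of a saccade):

```python
# Illustrative sketch: predict the gaze position a few milliseconds ahead
# from the two most recent camera samples, trading accuracy during rapid
# eye movements for lower effective latency with a cheaper, slower camera.

def extrapolate_gaze(samples, dt, lookahead):
    """Linearly extrapolate the latest 2D gaze sample forward in time.

    samples   -- list of (x, y) gaze positions, oldest first
    dt        -- seconds between camera frames (e.g. 1/120 for 120 fps)
    lookahead -- seconds to predict ahead (e.g. the render latency)
    """
    (x0, y0), (x1, y1) = samples[-2], samples[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt   # finite-difference velocity
    return (x1 + vx * lookahead, y1 + vy * lookahead)

# A 120 fps camera predicting 8 ms ahead during a steady rightward sweep:
pred = extrapolate_gaze([(0.0, 0.0), (1.0, 0.0)], dt=1 / 120, lookahead=0.008)
```

Anything fancier (a Kalman filter, a saccade model) builds on the same idea: spend math to buy back the frames the camera can't deliver.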

Ultimately I expect this to happen officially because it is just too powerful
an interface and there is so much useful data to be gained (plus scientists
would love Oculus forever). Probably won't happen in the near future since, as
Palmer pointed out in one of his talks, it turns out avatars only need to look
in a plausible direction to seem realistic.

~~~
sitkack
Low resolution high frame rate CMOS imagers are quite inexpensive.

------
vanderZwan
Very cool - I bet psychologists would also love this: pupil dilation is a
pretty good quantitative proxy for mental effort IIRC, and with this setup you
can really control the circumstances.

How hard/easy was it to get that image processing working in Halide?

~~~
modeless
It was a bit tricky to work with Halide. The Halide code itself is beautiful,
but it depends on LLVM which takes forever to build and link, and I ran into
some snags with Windows support. I originally wanted to write the whole
algorithm in Halide and run it on the GPU, but it turned out that the Halide
language isn't expressive enough to implement a Hough transform. I had to
write that bit in C and I'm currently only using Halide to do some more
straightforward image processing tasks like edge detection filters, blurring,
and resampling. It works really well for that though, and performance-wise it
runs rings around the naive nested C for loops that I would otherwise have to
write.
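For a sense of what those "straightforward" stages look like, here's a rough plain-Python sketch (my own illustration, not the actual Halide code from the post) of a blur stage feeding an edge-detection stage; Halide would express each stage as a pure function of pixel coordinates and then tile, vectorize, and fuse them in a separately specified schedule:

```python
# Two naive pipeline stages of the kind described: a box blur feeding a
# gradient-magnitude edge detector. These are exactly the nested C-style
# loops that Halide generates much faster code for.

def box_blur3(img):
    """3x3 box blur with clamped borders; img is a list of rows of floats."""
    h, w = len(img), len(img[0])
    clamp = lambda v, lo, hi: max(lo, min(hi, v))
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = sum(img[clamp(y + dy, 0, h - 1)][clamp(x + dx, 0, w - 1)]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = s / 9.0
    return out

def gradient_mag(img):
    """Central-difference gradient magnitude, a crude edge detector."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge: the response peaks along the step, not in flat areas.
step = [[0.0] * 4 + [1.0] * 4 for _ in range(8)]
edges = gradient_mag(box_blur3(step))
```

Each output pixel here is a pure gather over a fixed neighborhood of its inputs, which is the shape of computation Halide is built around.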

~~~
vanderZwan
Thanks, been interested in the language from a distance for a while now.

Can't claim to know what a Hough transform is, but do you think the lack of
expressiveness can be fixed without interfering with the tricks that make it
work? It's still a language under development after all.

~~~
modeless
I think the language could be made expressive enough, and the developers
thought so too when I asked them about it. I think they're focusing on other
things for now though.

A Hough transform is kind of a weird beast; not really your typical image
processing problem. For anything that looks like a series of convolutions or
other simple image operations, Halide is going to be the fastest thing around;
even faster than a series of calls to the Intel IPP library (because of the
ability to tune memory access patterns across multiple stages at once).
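To illustrate why it's a weird fit (this sketch is mine, not modeless's implementation): a circle Hough transform *scatters* votes into an accumulator at data-dependent locations, whereas Halide pipelines are written as gathers over fixed neighborhoods.

```python
# Minimal circle Hough transform for a known radius: each edge point votes
# for every center that could have produced it. Note the write location
# acc[cy][cx] depends on the data, not on the loop indices alone -- the
# scatter pattern that a pure per-pixel formulation can't express.

import math

def hough_circles(edge_points, radius, size):
    """Vote for circle centers of a known radius; returns a size x size grid."""
    acc = [[0] * size for _ in range(size)]
    for (x, y) in edge_points:
        for t in range(64):                       # sample the circle of centers
            a = 2 * math.pi * t / 64
            cx = int(round(x - radius * math.cos(a)))
            cy = int(round(y - radius * math.sin(a)))
            if 0 <= cx < size and 0 <= cy < size:
                acc[cy][cx] += 1                  # data-dependent scatter write
    return acc

# Points sampled from a circle of radius 5 centered at (10, 10):
pts = [(10 + 5 * math.cos(a), 10 + 5 * math.sin(a))
       for a in [2 * math.pi * k / 12 for k in range(12)]]
acc = hough_circles(pts, 5, 21)
peak = max(max(row) for row in acc)
```

The accumulator peak lands at the true center, which is why the transform is so robust for pupil detection, and why the voting loop had to be written in C.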

------
jonmrodriguez
That is a brilliant idea to use a hot mirror and IR camera like that!

This is probably today's most accurate technique to use for VR eye tracking,
right? And it looks super cheap for any company or project to produce this
assembly in-house.

------
Nogwater
The patent landscape around eye/gaze tracking must be a minefield.

~~~
modeless
I hope that my public disclosure of all these ideas will, at some point in the
future, ruin a patent lawyer's day.

~~~
fit2rule
I know some people who have the patents on this stuff. You've only just
scratched the surface...

~~~
modeless
You don't need to tell me that companies who've been doing eye tracking for
years are ahead of my little homebrew project. I know that, obviously. But if
I can prevent even one patent that claims "eye tracking like before, BUT IN A
VR HEADSET NOW", I'll consider that a win.

~~~
fit2rule
You don't have a chance. Eye-tracking has been an integrated part of VR
headsets for years, since the '80s, and is well and truly sealed up with
patent protection. That there are no consumer-ready, plug and play products on
the market today is actually a reflection of this fact, not a consequence of
omission. The people with these patents are using them in
industrial/medical/military/marketing applications - but the consumer version
of this tech is well and truly on its way.

Once the glasses-computing wheels start to turn for consumers, expect to see
integrated eye-tracking become a stock-standard feature.

------
gdewilde
Could make a mini map for the living room (with the humans marked on it)
and/or a perimeter alert system to avoid being the subject of practical jokes.

------
sixQuarks
I'm excited to see what Oculus officially has in store for eye tracking. I can
see this being essential for non-gaming VR (pun intended)

------
atmosx
That's impressive. Raises privacy issues[1] but it's awesome and the
possibilities endless.

[1] Would you trust a clear image of your eye, which could probably be used
for identification, flying around the internet?

~~~
taco_john
Hey man, Facebook's gotta put that In-Q-Tel capital to work! /s

But on a more serious note, isn't shining IR directly into your eyes from such
close distances hazardous?

~~~
antimatter15
I'm sure the amount of IR light involved is far less than the exposure you get
from walking outside on a nice sunny day.
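Back-of-envelope version (the solar figures are standard ballpark values; the LED irradiance is purely an assumed number for illustration, and continuous close-range exposure should still be checked against photobiological safety limits like IEC 62471):

```python
# Rough comparison of outdoor IR exposure vs. an assumed eye-tracking LED.
# Clear-day ground-level solar irradiance is about 1000 W/m^2, roughly half
# of it in the infrared; the LED-at-cornea figure below is an assumption.

SUN_TOTAL_W_PER_M2 = 1000.0      # typical clear-day ground-level irradiance
SUN_IR_FRACTION = 0.5            # roughly half of sunlight is infrared
LED_AT_CORNEA_MW_PER_CM2 = 1.0   # assumed LED irradiance at the eye

# Convert W/m^2 to mW/cm^2: x1000 for milliwatts, /10000 for cm^2.
sun_ir_mw_per_cm2 = SUN_TOTAL_W_PER_M2 * SUN_IR_FRACTION * 1000 / 10000
ratio = sun_ir_mw_per_cm2 / LED_AT_CORNEA_MW_PER_CM2
```

Under those assumptions, a sunny day delivers tens of times more IR to the eye than the tracker's illuminator.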

