
GazeHawk (YC S10) Does Eyetracking With Web Cams - bwaldorf
http://techcrunch.com/2010/07/29/y-combinator-backed-gazehawk-heatmaps-with-web-cams/
======
jashmenn
About 2 years ago an acquaintance of mine suddenly had a stroke. When she came
to she was conscious but completely paralyzed. The only way she could
communicate was with her eyes [1].

She's since recovered and is able to talk. But she describes an intense
loneliness and frustration at being a prisoner in her own body. When she was
finally able to use an expensive, specialized computer to communicate by
slightly moving her fingers, it was an amazing encouragement.

Part of the problem right now is that these input systems are extremely
expensive and specialized. Even if the family can afford to pay for one, they
still have to spend time researching which system is the best.

I imagine a free, open-source system that would allow for rapid eyetracking-
based writing. Say, a live CD that contains many commercial webcam drivers and
boots into a system like Dasher
(<http://www.inference.phy.cam.ac.uk/dasher/DasherSummary2.html>).

This would provide instant communication, however crude, to a person who has
recently been through a tragic experience.

Is webcam eye-tracking imperfect? Yes. Do commercial systems exist that solve
this problem better? Yes. But the idea here is to have something someone could
put into hardware they already have to get _something_ working right away.

OpenGazer: <http://www.inference.phy.cam.ac.uk/opengazer/>

A nice list of eye trackers:
<http://www.cogain.org/wiki/Eye_Trackers#Open_source_gaze_tracking.2C_freeware_and_low_cost_eye_tracking>

*Yes, someone would have to help the paralyzed person put the CD into the tray and boot the computer.

[1]: <http://www.caringbridge.org/visit/conniemchaddad/mystory>

~~~
jgershen
Props for linking to OpenGazer and Dasher. AEGIS is a terrific project, and
I've been inspired by the work that Emli-Mari Nel and others are doing on
exactly this problem.

Something to keep in mind when you're thinking about eye tracking for
accessibility is how much custom hardware can enhance the experience.
Even if you're just mounting a webcam on a head bracket and removing the IR
filter, some inexpensive work can really improve the user interface. (Of
course, mounting a camera on your face isn't necessarily conducive to unbiased
usability testing!)

One of my greatest hopes, personally, is that our approach to webcam-based eye
tracking will yield algorithms and methods that we can give back to the
community. Eye tracking has its shortcomings, to be sure, but the potential
benefit is too great to be ignored.

------
davidcann
Is there an example page where we can test the accuracy of the eye tracking
software?

I like the idea, but for $490, I'd like some assurance that the eye tracking
technology actually works and that it's not just tracking the mouse or
something else.

~~~
bkrausz
Haha, a reasonable request. We're working on a good way to let people demo our
technology. Part of the problem is that this is computationally intensive, and
we didn't want all of TC killing our processing servers.

The other part is that we found people demoing the software are less likely to
follow the instructions (since they're not getting paid), and we worried about
having a lot of poor tracks that would look bad. Our testers have been
amazingly good at following instructions :).

~~~
jacquesm
It's only computationally intensive because you're using the processor on your
end to do the work. Why not have a large portion of the work done on the
client's machine?

~~~
bkrausz
Two main reasons:

1) Our processing can't be done in realtime (i.e. it takes more than 1 second
per second of video), at least for now, so there would be a lag that can get
significant if the user has a slow enough computer.

2) We currently don't require any software to be installed client-side (we use
Flash + JS), which we consider a big perk.
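
To make point 1 concrete: if each second of video takes more than a second to process, the backlog behind a live stream grows linearly with session length. A toy illustration (the specific numbers are hypothetical, not GazeHawk's):

```python
def processing_lag(video_seconds, processing_time_per_second):
    """Seconds of backlog after streaming `video_seconds` of live video,
    when each second of footage takes `processing_time_per_second` (> 1)
    seconds to process. The lag grows linearly with session length."""
    return video_seconds * (processing_time_per_second - 1.0)

# e.g. at 1.5 s of processing per second of video, a 60 s session
# ends 30 s behind real time on a slow machine.
lag = processing_lag(60, 1.5)
```

This is why a "slow enough computer" matters: the lag isn't a fixed startup cost but compounds for as long as the session runs.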

~~~
jacquesm
I can see the advantage of not having to install client-side software, but if
that's such a performance penalty that you have to charge so much more, you're
opening yourself up to competition from someone who does the same thing but
finds a way to do it client-side.

Looks like you have a bunch of tough optimisation ahead; good luck with that.

Very nice idea, by the way. Eye tracking used to be the domain of specialised
setups using cameras behind half-transparent mirrors; it's quite impressive
that you've gotten this far with just a cheap webcam as an input device.

------
mceachen
Having done a bunch of user studies in the past, one of the big issues I've
seen is the Hawthorne effect ( <http://en.wikipedia.org/wiki/Hawthorne_effect>
): because users know they are being watched, they change their behavior so
much that the study becomes irrelevant.

I think GazeHawk has the chance to push usability closer to being unbiased --
very promising!

~~~
bkrausz
Thanks! We didn't even realize this was one of the extra benefits of our
technology, but it seems to be a pretty important one: people have reported
completely forgetting that they're being tracked when using our early
prototypes.

------
alttab
You can now do something very similar on the iPhone 4 thanks to the front-
facing camera. Imagine if you could register a hit whenever someone looks at
an ad: you could have an advertising business model priced at cost-per-glance.
This works great if the ads are mainly for branding impact.

~~~
swah
Would it work with such a small screen?

~~~
jgershen
People typically hold the iPhone closer to their face than they hold their
laptop. Since accuracy in eye tracking depends on the angle between the
calculated gaze vector and the true gaze vector, this means that error will be
reduced.

The amazingly high pixel density on the iPhone 4 means that the calculated
gaze focus may be more pixels away from the real point, but we're working on
some ways to compensate for that.
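
For intuition, here is a rough sketch (my own numbers, not GazeHawk's) of how the same angular gaze error converts to on-screen pixels at a given viewing distance and pixel density:

```python
import math

def gaze_error_pixels(angular_error_deg, viewing_distance_mm, pixels_per_mm):
    """On-screen pixel error from an angular gaze-estimation error.

    Assumes the gaze is roughly perpendicular to the display, so the
    linear error on screen is about tan(angle) * viewing distance.
    """
    linear_error_mm = math.tan(math.radians(angular_error_deg)) * viewing_distance_mm
    return linear_error_mm * pixels_per_mm

# Hypothetical comparison for a 1-degree tracking error:
# iPhone 4 at ~300 mm, 326 ppi (~12.8 px/mm); laptop at ~600 mm, 96 ppi.
phone_px = gaze_error_pixels(1.0, 300, 326 / 25.4)
laptop_px = gaze_error_pixels(1.0, 600, 96 / 25.4)
```

Halving the viewing distance halves the linear error, but the Retina display's pixel density more than makes up for it, which is why the same angular error lands more pixels from the true point on the phone.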

------
harrybr
I'm surprised that most commenters here are willing to accept the idea of
GazeHawk without questioning it. No disrespect meant to Brian and Joe, but any
experienced researcher knows that a new research technology or method needs
to be met with healthy cynicism until its value has been proven in the
real world.

Firstly, can it really be as accurate as commercial eye-tracking hardware?
We're talking about the difference between $40,000 and $40 here; a Pepsi
challenge is needed! Some leading figures in the UX industry claim that even
the top-end devices can easily become de-calibrated and provide bogus data.

Secondly, even if the technology is acceptably accurate, is the GazeHawk
research method effective at delivering findings that actually help you
improve the design of your sites?

The idea of a predefined panel of 'users' is rather worrying. A user is
defined as someone who actually uses your site. In the standard GazeHawk
offering, they provide a group of trained testers (who get paid $8 a pop) who
may simply not care about the problem your site tries to solve. For example,
if you have a webapp about mountain biking, you're going to get a panel of
testers who may not care about types of bike or trails, nor understand any of
the terminology you use. Eye-tracking heatmaps are a product of conscious
thought (as well as low-level visual processing); if testers don't care about
or understand the problem your site is trying to solve, the heatmaps you'll
get from them will be almost entirely worthless.

Thirdly, analysis of eye-tracking data is very tricky. There are a lot of
mistakes an untrained analyst will make, which could have major repercussions.
I did a presentation on this at User Experience Lisbon a few months ago:

<http://www.slideshare.net/harrybr/what-you-need-to-know-about-eye-tracking-new-uxlx-version>

I really don't mean Brian and Joe any ill will. All of these concerns can be
dealt with if they share case-study projects and keep an open channel of
discussion with the UX research community.

------
jolan
It's a shame this is for webpages only. I would very much like "focus follows
eyes" in my window manager.

~~~
hugh3
Yes, if it could actually be made to work accurately (down to the few-pixel
level) and in real time then it would be a brilliant user interface element.
Why should I click a mouse when I can just stare at the "reply" button for
half a second?

(Yes, all sorts of potential badness which could happen if implemented
unwisely.)

~~~
lkozma
There is a big problem with eye-controlled interfaces called the "Midas
touch": it gets extremely tiring to force yourself to look, or not look, at
one point.
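
The standard mitigation is dwell-time activation: a target only "clicks" after the gaze has stayed inside it for a threshold. A minimal sketch with a hypothetical sample format (the 0.5 s threshold is arbitrary):

```python
def detect_dwell_clicks(gaze_samples, targets, threshold=0.5):
    """Return (timestamp, target_name) pairs: one per fixation that stays
    inside a target's bounding box for at least `threshold` seconds.

    gaze_samples: time-ordered iterable of (timestamp_s, x, y) tuples.
    targets: dict of name -> (x0, y0, x1, y1) axis-aligned boxes.
    """
    clicks = []
    current = None    # target currently under the gaze, or None
    start = 0.0       # timestamp when the current fixation began
    fired = False     # whether this fixation already produced a click
    for t, x, y in gaze_samples:
        hit = next((name for name, (x0, y0, x1, y1) in targets.items()
                    if x0 <= x <= x1 and y0 <= y <= y1), None)
        if hit != current:                       # gaze moved on: reset timer
            current, start, fired = hit, t, False
        elif current is not None and not fired and t - start >= threshold:
            clicks.append((t, current))          # dwelled long enough: click once
            fired = True
    return clicks
```

Only one click fires per continuous fixation, and glancing away resets the timer, so ordinary reading doesn't trigger anything; the fatigue problem remains, though, since you still can't rest your eyes on a target without committing to it.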

------
jgershen
Looks like you beat us to the punch with the article!

Questions and comments are welcome. Thanks everyone!

~~~
johnrob
Any more info about the actual test users? Are they using their own computers,
or do they come to a test lab?

~~~
bkrausz
They use their own computers and webcams, providing a much more natural
environment for testing. That's one of the reasons we can offer such a low-
cost service.

If anyone is interested in being one of our users, sign up at
<https://www.gazehawk.com/tester/signup/>

------
brent
Sounds like an amazing idea... strikingly similar to the idea I pitched to YC
in an interview in November 2006 :-/. I guess they didn't like our team or
progress. Bummer.

------
aresant
Brilliant.

We spend all day thinking about conversion rate optimization via
conversionvoodoo.com. Some of our larger clients have full hardware rigs, and
I can tell you this data is invaluable WHEN it's accurate.

It's certainly more valuable than plain user testing, given the speed at which
the eye moves and tracks. Killer product, great release, and I can't wait to
test it. Check your site's support email; I'd like to see how we could
leverage this for more of our clients & case studies.

------
eekfuh
Very cool software and I would LOVE LOVE LOVE to use this. HOWEVER, we make a
hardware appliance and we can't let people access it (via the web) to do
testing. I'd rather pay something like $5k for a webcam and a software package
to install and run tests with my customers (who are NOT normal web users; they
are security experts) to see how they view our interface.

~~~
bkrausz
Thanks for the feedback. Right now we're targeting web-based interfaces, but
expanding into different products is definitely something we're considering.
It depends on what kind of demand we see for it.

~~~
eekfuh
The product portion I want to use is a web-based interface, however it can
only be used for internal network use and we wouldn't be allowed to make it
internet-accessible.

I think we should be able to webcam/monitor our users here, have an app gather
data on our computers, then have our computers (which can access the internet
AND our internal network appliance's interface which we are testing) upload
the data to your website to be parsed.

~~~
bkrausz
I see. We may be able to work something out, email me at brian _AT_
gazehawk.com and we can discuss the details. Our upcoming plans to let you use
your own users may solve this.

------
ck2
Very impressive. They deserve to succeed for being so clever.

But are there more samples/examples? I can't find any on the site.

At $5 per site viewed, I hope they qualify the viewers! (Oh wait, it's
"_up to_" $5.)

( <https://www.gazehawk.com/tester/signup/> )

~~~
bkrausz
Good point, we'll add some example links to the site. In the meantime:

<http://www.gazehawk.com/img/techcrunch-heatmap.png>

<http://www.gazehawk.com/img/wakemate-heatmap.png>

Right now we pay $2 for trying, $4 for actually completing, and $5 if you give
us some incredibly insightful feedback or a bug report. We may decrease this
in the future, but we need to figure out our supply/demand curve first.

------
swah
The next step is to reorganize the ads on the page dynamically so you can't
avoid them!

------
handler
It would be extra useful if you could see the order in which people look at
things.

~~~
bkrausz
We have a nifty feature that lets you replay someone's gaze as a dot that
moves around the screen. In the near future we'll have more quantitative
things like gaze order and fixation time.

~~~
handler
Awesome. Once you get enough data like that, you can do really cool analytics
like "what color draws the most attention" or "do drop shadows make it easier
to focus on sections". It would make for a great blog, something along the
lines of <http://blog.okcupid.com/>.

------
il
I would try your service in a heartbeat, but the pricing seems quite high.
There's a big disconnect between charging $49 per user and paying them $5.
Couldn't you significantly lower pricing and still be profitable?

~~~
bkrausz
There's a significant overhead to eye tracking in terms of computation, so our
costs aren't just paying testers. We wanted to price ourselves competitively
compared to sites like UserTesting.com, but of course pricing was/is something
we've discussed a lot.

------
paraschopra
If you can add where a user looks first, that would be great. Something like
the order of gazes on a page?

