
OptiKey – Full computer control and speech with your eyes - aw3c2
http://www.optikey.org
======
kozukumi
From the get started page

    
    
      "If you are unsure which computer/laptop/tablet to purchase
      and are considering spending a lot of money then please email me 
      - I can offer personal advice on how to target the sweet spot 
      between cost and performance (and screen size)."
    

So not only is he giving away this amazing software, he is also offering
free, personal advice on what to buy within your budget!

Truly an inspiring person.

~~~
sorahn
"The way to take over an industry is not to fix the current model, but to
completely destroy it and replace it with a model you know is better"

Open source is the better model for every single industry. It's coming.

~~~
minikites
Do you think open source fine art is better than art from a single artist's
vision?

~~~
cgio
Art has been quite open source in its workings for centuries. Look at how
artists joined masters to learn the craft and how they "forked" their masters'
techniques to create their own. Personal vision is not precluded by open
source; on the contrary, it is encouraged.

~~~
bro-stick
R&B, EDM, pop and so on wouldn't have much left without remixes and sampling.

~~~
robbrown451
But in that case money changes hands. That's pretty different from open
source.

~~~
sangnoir
Nothing about open source precludes money from changing hands. Case in point:
Red Hat.

------
dageshi
[https://www.reddit.com/r/programming/comments/3ke7ug/eye_tra...](https://www.reddit.com/r/programming/comments/3ke7ug/eye_tracking_software_for_sufferers_of_alsmnd_can/)

A lot of interesting information from the author over there.

META: Interesting to note how much cool stuff is turning up on Reddit before
it appears here... I feel like it used to be the other way around.

~~~
dev1n
Re: META

People will stop posting to a community when they feel their posts aren't up
to what they believe the community tends to accept as par for the course.

~~~
dageshi
That's my impression as well.

~~~
fr0styMatt2
The way posts work on HN is a bit puzzling and not immediately obvious - I'll
post something and then refresh the front page and won't even see it. The
Reddit model seems simpler - I post something and it appears.

~~~
prawn
I think you're correct. A Show HN that goes nowhere fast will disappear and be
seen by virtually no one. An equivalent in r/startups or similar will still be
seen by a few people even if it doesn't attract comments or upvotes.

------
exhilaration
Could this be covered by existing patents? Tools like these for the disabled
are a big business, and patent holders have shut down competition in the past:
[http://www.disabilityscoop.com/2012/06/14/dispute-ipad-app-p...](http://www.disabilityscoop.com/2012/06/14/dispute-ipad-app-pulled/15846/)

~~~
_-__---
This is a good point - how do copyrights protect intellectual property from
open source alternatives? I think that this is a really good thing to open-
source, but the law can be tricky to navigate. Almost like you need a degree
in it...

~~~
kefka
Well, good luck shoving this code back in the bottle (read: git archive).

It's out, with 22 public forks already. Who knows how many git clones...

------
mh-cx
I wonder if you can combine this with Vim so that you could use it for
lightning fast cursor movement and selections and still use the keyboard for
anything else.

~~~
intruder
The problem with gaze tracking is that the kind of accuracy you are thinking
about isn't achievable right now.

The trackers he listed have an accuracy of about 0.3-0.5 degrees, which at a
distance of 50 cm from the screen still maps to a fairly large area of about
50x50 pixels (depending on screen resolution and PPI). That's far too coarse to
guess where you want the cursor placed.
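
As a rough sanity check on those numbers, here's a back-of-the-envelope sketch
(reading the accuracy figure as a ± radius and assuming a ~150 PPI screen; both
are illustrative assumptions, not specs of any particular tracker):

    import math

    def gaze_error_px(accuracy_deg=0.5, distance_mm=500, ppi=150):
        # Treat the accuracy figure as a +/- radius, so the uncertain
        # on-screen region is about twice that across.
        radius_mm = distance_mm * math.tan(math.radians(accuracy_deg))
        px_per_mm = ppi / 25.4
        return 2 * radius_mm * px_per_mm

    # ~0.5 deg at 50 cm on a ~150 PPI screen -> about 52 px across,
    # i.e. roughly the 50x50 px figure above.
    print(round(gaze_error_px()))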

Moreover, the fact that you'll see the cursor moving creates a loop in which
you will follow the delayed cursor around with your gaze.

------
mcbuilder
I wonder how well the hardware would work with an input system like Dasher.
Dasher was an experiment in statistical inference that also tries to be an
accessibility tool:
[http://www.inference.phy.cam.ac.uk/dasher/](http://www.inference.phy.cam.ac.uk/dasher/).
I found it really fun to play around with and could get decent, though not
great, speeds. It's true that eye tracking is prohibitively expensive; when we
ordered an eye tracker for the lab, it ended up costing us £10K.

~~~
dsfsdfd
Yeah, I remember Dasher. Quite effective and pretty fun - agreed, a mashup of
eye tracking and Dasher could be worthwhile.

------
melling
In the demo, he was quick and he didn't even seem to use much autocompletion.
However, I wonder if using a different keyboard layout would be helpful.
Dvorak and Colemak, for example, place the more frequently used letters on the
home row so you'll spend more time with your eyes there. You can evaluate
different layouts with a tool like this:

[http://patorjk.com/keyboard-layout-analyzer/](http://patorjk.com/keyboard-layout-analyzer/)
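
For a rough sense of what such an analyzer measures, here's a toy sketch (not
the linked tool): the share of letters in a sample phrase that land on each
layout's home row.

    # Toy comparison only - the linked analyzer does much more than this.
    HOME_ROW = {
        "qwerty":  set("asdfghjkl"),
        "dvorak":  set("aoeuidhtns"),
        "colemak": set("arstdhneio"),
    }

    def home_row_share(text, layout):
        letters = [c for c in text.lower() if c.isalpha()]
        return sum(c in HOME_ROW[layout] for c in letters) / len(letters)

    sample = "meet optikey full computer control and speech using only your eyes"
    for name in HOME_ROW:
        print(name, round(home_row_share(sample, name), 2))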

~~~
pavel_lishin
I wonder if it really matters with your eyes vs. your hands and fingers.
Flicking your gaze around the keyboard seems much easier than moving your
fingers around - you'll never have two fingers hitting each other.

And on the other end of the speed spectrum, it seems unlikely that this will
ever be quite as quick as operating even a QWERTY keyboard with
fully functional hands.

~~~
melling
This tool is for people who can't use their hands, of course.

For the average person, we're pretty close to being able to use voice instead
of typing.

[https://www.extrahop.com/blog/2014/programming-by-voice-stay...](https://www.extrahop.com/blog/2014/programming-by-voice-staying-productive-without-harming-yourself/)

[http://ergoemacs.org/emacs/using_voice_to_code.html](http://ergoemacs.org/emacs/using_voice_to_code.html)

Throw in eye tracking and precise gestures
([http://www.youtube.com/watch?v=0QNiZfSsPc0](http://www.youtube.com/watch?v=0QNiZfSsPc0))
and the keyboard isn't necessary.

~~~
tcdent
Instead of coming home from the office with Carpal Tunnel we'll be coming home
with hoarse voices. Color me skeptical.

~~~
melling
Some people will gladly take a hoarse voice. Ever see someone have to resort
to using their nose?

[http://www.looknohands.me](http://www.looknohands.me)

------
malnourish
This is amazing software and I'm glad something like this is open source.
Accessibility hardware often costs a fortune (for understandable reasons) and
software isn't typically cheap either. This seems to be very high quality and
it has the potential to really improve people's lives. I'm glad things like
this exist.

------
dwiel
I have been using voicecode.io (with Dragon NaturallySpeaking) and IR head
tracking, with pretty good success, so I don't have to use my hands for
anything now. The head-tracking mouse has a much smaller learning curve, but
programming by voice has been rewarding as well.

~~~
melling
Has anyone gotten the other solutions to work? I've been collecting my notes
but I haven't gotten around to it.

[http://thespanishsite.com/public_html/org/ergo/programming_b...](http://thespanishsite.com/public_html/org/ergo/programming_by_voice.html)

~~~
dwiel
voicecode.io is far ahead of anything else in terms of out-of-the-box
programming by voice. Everything else requires a lot of up-front effort and
essentially designing your own language before you can be productive.

------
morley
This could be a really nice keyboard interface for VR too.

~~~
jasondrowley
It's curious that OptiKey was posted today, because yesterday I wrote a post
about the promise of iris scanning technology to deliver a seamless payments
experience in VR headsets.

Unfortunately, none of the VR headsets shipping in the next six months have
iris trackers (much less scanners). The only headset I found was Fove, which
was successfully kickstarted and ships in 2016. Fove did a promotion with a
Japanese school for wheelchair-bound children, in which a child played a piano
using the iris tracking function of the headset.

There are some iris scanner-equipped phones shipping in the near future, but
none with iris trackers as far as I could tell.

I can share the post link if people are interested.

~~~
deutronium
[https://theeyetribe.com/products/](https://theeyetribe.com/products/) looks
kind of interesting for eye tracking; I'm curious how well it performs.

------
samstave
Here is an idea: Tell me if this is stupid:

Assume you have a bunch of HUDs/AACs/Glasses/Whatever that are using this
tracking tech. Assume that they are ONLY looking at the real world, not some
online data/webpage etc...

There is a camera that is either forward-looking or 360-degree.

Use the tech to eye-track exactly what MANY people are looking at, to train an
AI on which items in the real world are important.

i.e. "the ground exists and we know its there, thus its priority in
information is low" however "these signs that are being looked at have a
higher context priority, and require understanding"

By doing this at a fair scale... you could use "what's visually important to
human navigation" to train an AI to navigate. This would augment the other
ML/AI work that's already been going on...

I do not know if this is basically how self-driving cars were developed -- but
now that you have a seed of this tracking tech in open source -- it could
blossom.

~~~
olympus
I don't think this is stupid, but it probably won't happen any time soon. A
project to correlate what humans look at with objects in the real world is
probably possible, but it would take significant effort to get it to work
reliably and accurately. Scaling it to a large population and processing the
data to the point of providing useful insight about how we behave and look at
things would be another hurdle.

Basically this probably won't happen until Google (or Facebook, or Apple, etc)
decides that knowing what people look at is worth the effort/cost.

~~~
JoeAltmaier
I'd be happy if all those time-and-space-stamped photos uploaded to Facebook
could be stitched together to make a photo essay of my vacation. That way I
wouldn't even have to take a camera with me; Facebook could use Other People's
Pictures that I just happened to be in proximity to.

------
neilmovva
Great work. I wonder if the author had heard of the Eyewriter project [1]?
Similar system, fully open source, but I'm not sure how active development is
nowadays. It's been up since ~2011, though, so quite old by software standards
(uses openFrameworks/openCV). Still, it worked impressively well when I built
a derivative system.

The accessibility space needs as much open-source development as possible -
most of the commercial tech, if you can find it, is locked down and outdated.

[1]: [http://www.instructables.com/id/The-EyeWriter-20/](http://www.instructables.com/id/The-EyeWriter-20/)

~~~
OptiKey
One of the main motivators behind OptiKey - fantastic project.

------
aluhut
Can someone elaborate on the use of eye trackers in dark rooms (rooms lit only
by dimmed lights or the monitor itself)? Does it work?

~~~
richman777
My mom has ALS and has an eye tracker. The camera is infrared, so it works in
any room lighting.

From looking at the tracking software, it seems to calculate the contrast
between the retina and the surrounding eye to track where the eye is looking.

~~~
aluhut
Thank you.

------
andhess
You should take a look at EyeFluence - they have developed some incredible
techniques to type with your eyes using non-dwell methods.

~~~
OptiKey
I will, thanks for the heads up

------
daturkel
Readers here may appreciate: I spoke with the dev for a piece on OptiKey:
[http://www.businessinsider.com/an-eye-tracking-interface-hel...](http://www.businessinsider.com/an-eye-tracking-interface-helps-als-patients-use-computers-2015-9)

------
rsmith05
As someone who suffers from RSI, I'd be interested in an Opti-mouse.

Does this exist yet?

~~~
olympus
Tobii bundles some (Windows only) integration software with their EyeX. It
allows you to click on whatever you are looking at with the press of an action
button (left ctrl key for me), along with a few other features (eye tracking
the alt-tab switcher is what I really like). It works okay, but eye tracking
isn't quite as accurate as a mouse and you either need to make your icons big
and widely spaced or you have to zoom in (with an action click) before
selecting the icon precisely.

------
ximeng
[https://www.justgiving.com/Julius-Sweetland](https://www.justgiving.com/Julius-Sweetland) - he's also raising money for a
cancer charity at the same time.

~~~
OptiKey
Thank you for including this. People are being very generous.

------
sanqui
I wonder if a version of this for mobile phones could make typing on a phone
faster than fumbling with a touch screen.

~~~
jerf
Mmmm... that was a great deal slower than me using the latest Android swiping
support, even counting errors. I can't speak to the other mobile tech stacks,
but if your typing on a phone is slower than that, you might want to poke
around; you may be missing a technology already installed on it.

(Not a criticism of the excellent work here. It's a fundamental bandwidth
problem.)

Concretely: In the time the demo took to write "Meet OptiKey. Full computer
control and speech using only your eyes.", I wrote in an Android text area: "I
am just typing words as quickly as I can in the android interface. I'm even
having to think a bit about it more that I've written so much. uh, hello
world? and other stuff. I'm still going and going and going." That includes
the missing "now" between "more" and "that", and, though you can't see it, I
also had to correct 3 words. That's about three times faster on the phone I have in
my pocket. And let me just say... that is impressive performance for a free
eye tracking suite, in my opinion, to still be that fast with just eyes.

~~~
OptiKey
No, you're right. Dexterous fingers beat eye tracking for text input, but my
target audience can't use their hands, so this is (hopefully) a good
alternative.

~~~
jerf
Please allow me to reiterate: I'm _impressed_ with your work. Nothing there
was a criticism. In fact it struck me as well-designed.

------
jparishy
Pretty cool, fellow Julius!

------
gragas
This is amazing software, but will it work for me? My left eye is a glass eye
and it has little to no movement.

~~~
olympus
I have a Tobii EyeX and it has the option to track a single eye. You would
basically need to set it to track just your right eye, which is a piece of
cake in the config (on Windows at least).

~~~
vanderZwan
And if OptiKey doesn't support it yet, open an issue on github! :)

~~~
OptiKey
OptiKey will listen to whatever it's told to! If you're using a Tobii tracker
you should be able to change your configuration to track just one eye. OptiKey
just cares about the coordinates that the Tobii engine spits out so it should
be fine. Best of luck.
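
As a purely hypothetical illustration of that decoupling (not OptiKey's actual
code or the Tobii API; every name below is made up), the idea is that the
application only ever consumes on-screen (x, y) gaze points, so one-eye vs.
two-eye tracking stays the tracker engine's concern:

    # Hypothetical sketch only - not OptiKey's code or the Tobii API.
    from typing import Iterator, Tuple

    GazePoint = Tuple[float, float]  # (x, y) in screen pixels

    class TrackerEngine:
        """Stand-in for whatever engine (Tobii, etc.) emits gaze points."""
        def points(self) -> Iterator[GazePoint]:
            # In reality this would stream coordinates from the device,
            # already reflecting any one-eye/two-eye configuration.
            yield from [(640.0, 512.0), (642.5, 510.0)]

    def run(engine: TrackerEngine) -> None:
        for x, y in engine.points():
            # The app only cares about where on screen the gaze lands.
            print(f"gaze at ({x:.0f}, {y:.0f})")

    run(TrackerEngine())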

------
curiousjorge
This is so mind-blowingly good.

