
Reverse-engineering how the Oculus Rift DK2's tracking system works (2014) - aaronsnoswell
http://doc-ok.org/?p=1095
======
aaronsnoswell
Part 1 of an ongoing 4-part write-up. In part four he has released an alpha
preview of his open source Oculus and VR framework; see
[http://github.com/Doc-Ok/OpticalTracking](http://github.com/Doc-Ok/OpticalTracking).

tl;dr: The Oculus positional tracking system uses 40 IR LEDs, each flashing a
unique 10-bit cyclic code of bright and dim pulses. The tracker camera is
synced to the headset and can identify each LED from its code, which elegantly
simplifies recovering the 3D position of the headset (the 'Perspective-n-Point'
problem in computer vision). This gives very good absolute position and
orientation tracking, but it degrades when only coplanar LEDs are visible.
That limitation can be overcome by fusing in the IMU data from the Oculus,
which will be the subject of part four when it is written up.
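The cyclic-code idea can be sketched in a few lines: because each LED's code
repeats, the camera can start watching at any frame and still recover the
LED's identity after one full code length, provided no two LEDs share any
rotation of their code words. The code words and helper names below are made
up for illustration; the real DK2 code book is different.

```python
# Hypothetical sketch of identifying an LED from its blink pattern.
# Each LED repeats a unique 10-bit cyclic code of bright (1) / dim (0)
# frames; any window of 10 consecutive observed frames is some cyclic
# rotation of exactly one LED's code word.

def rotations(code):
    """All cyclic rotations of a bit string."""
    return {code[i:] + code[:i] for i in range(len(code))}

# Illustrative code book: LED id -> its 10-bit cyclic code (made up).
CODEBOOK = {
    0: "1010011100",
    1: "1111000010",
    2: "1101101101",
}

# Precompute a lookup from every rotation back to the LED id.
LOOKUP = {}
for led_id, code in CODEBOOK.items():
    for rot in rotations(code):
        # The real codes must be chosen so no two LEDs share a rotation.
        assert rot not in LOOKUP, "code book is not rotation-unique"
        LOOKUP[rot] = led_id

def identify(observed_bits):
    """Identify an LED from any 10 consecutive observed frames."""
    return LOOKUP.get(observed_bits)

# Observing LED 1's code starting 3 frames into its cycle:
window = CODEBOOK[1][3:] + CODEBOOK[1][:3]
print(identify(window))  # -> 1
```

Once every blob in the camera image carries a known LED id, the 2D-to-3D
correspondences for the PnP solve come for free, instead of having to be
searched for combinatorially.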

This man (Oliver Kreylos) seems to do a great job of explaining computer
vision problems. I really enjoyed reading these posts, and others on his blog.
He's very well qualified to write about these topics, as well (see his bio
here: [http://doc-ok.org/?page_id=6](http://doc-ok.org/?page_id=6) - he has
been researching in this area for years).

------
TrevorJ
Interesting. I think Valve's solution is probably the most elegant one I've
seen so far: [http://www.hizook.com/blog/2015/05/17/valves-lighthouse-trac...](http://www.hizook.com/blog/2015/05/17/valves-lighthouse-tracking-system-may-be-big-news-robotics)

~~~
telepheron
Which is a copy of the at least 50-year-old head-tracking system used in the
AH-64A (and probably other aircraft).

