
Inside Look at Valve's New Lighthouse Tracking System - beambot
http://www.hizook.com/blog/2015/05/17/valves-lighthouse-tracking-system-may-be-big-news-robotics?ref=HN
======
MrRadar
There's some discussion of this article on r/oculus:
[https://www.reddit.com/r/oculus/comments/36cli8/valves_light...](https://www.reddit.com/r/oculus/comments/36cli8/valves_lighthouse_tracking_system_may_be_big_news/)

------
twic
This is not entirely removed from how light guns work:

[http://en.wikipedia.org/wiki/Light_gun#Cathode_ray_timing](http://en.wikipedia.org/wiki/Light_gun#Cathode_ray_timing)

The basic mechanism in both cases is measuring the time it takes for a
steadily moving patch of light to travel from a known start point until it
becomes visible to a detector, and then using knowledge of the scan rate and
the geometry of the detector to calculate orientation.
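That "known scan rate × elapsed time" idea fits in a few lines of code. A minimal sketch -- the 60 Hz rotor speed here is an assumed illustrative value, not a figure from either system:

```python
# Illustrative sketch of the shared principle: a beam sweeps at a known,
# constant angular rate starting from a sync event; the elapsed time until
# a photodiode fires gives the angle from the sweep's start to that diode.

ROTATION_HZ = 60.0                        # assumed rotor spin rate (made up)
DEGREES_PER_SECOND = 360.0 * ROTATION_HZ  # angular rate of the sweep

def angle_from_timing(t_sync: float, t_hit: float) -> float:
    """Degrees swept between the sync flash and the photodiode hit."""
    return (t_hit - t_sync) * DEGREES_PER_SECOND
```

At this assumed rate, one degree of sweep corresponds to about 46 µs of delay, which is why the timing electronics matter so much.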

Valve's system is considerably more advanced than Nintendo's, of course. But
then they have had twenty years to work on it (I think that's about two and a
half years in Valve Time).

------
CodexArcanum
The physics and engineering of light are not something I'm very knowledgeable
about, so forgive me if this is naive. Couldn't you reduce the latency further
by running the x and y scans in parallel, and using something (e.g. different
modulations) to tell them apart? It seems kind of a waste to run the x and
then the y in sequence.

~~~
hansihe
If I'm not mistaken, most standard IR sensors have built-in demodulators. They
most likely want to keep the parts needed for a receiver as off the shelf as
possible.

------
Nexxxeh
I wonder how it'll compare to the system Technical Illusions uses for castAR's
tracking. Especially given the divergence of the two teams.

(Technical Illusions being the company of Jeri and Rick, who left Valve taking
the Augmented Reality tech with them.

Jeri Ellsworth basically built the Valve hardware division. Valve worked on AR
and VR simultaneously and then canned the AR stuff. Gabe let Jeri take the AR
stuff when that happened.

An interesting story that I've oversimplified. I think it was on an AmpHour
podcast but don't quote me.)

~~~
theschwa
My understanding is that Technical Illusions' tracking system is tied to the
retro-reflective mat. The glasses see some LEDs on the mat and use that to
determine their position and orientation. So the tracking is limited to the
glasses being in view of the mat. If you turn around, you're no longer being
tracked.

The Lighthouse system, on the other hand, is a full volume tracking system
that can be expanded to cover any volume of space. So I could walk around a
room and no matter where I am or where I'm looking, I can determine my
location and orientation.

~~~
Nexxxeh
>My understanding is that technical Illusions tracking system is tied to the
retro-reflective mat. The glasses see some LEDs on the mat and use that to
determine their position and orientation. So the tracking is limited to the
glasses being in view of the mat. If you turn around, you're no longer being
tracked.

Yeah, although to clarify, it's not the mat per se, it's the tracking marker.
(Seen here, bottom picture, as it's the same one for the wand:
[https://www.kickstarter.com/projects/technicalillusions/cast...](https://www.kickstarter.com/projects/technicalillusions/castar-the-most-versatile-ar-and-vr-system/posts/1184549)

There's a wand PCB and a wand on there too, they aren't part of the marker.)

The tracking marker is also covered in retro-reflective material, but it's not
part of the mat itself. Relevant because there are clip-on adapters to turn
the castAR goggles from AR to VR, and you don't need the mat for VR.

It may not be the final version, as the tracking system has gone through a few
upgrades and tweaks already according to the updates.

------
rsp1984
This sounds very cool, but from what I understand from the article it would
require direct emitter-receiver visibility, i.e. turning around and facing
away from the emitter wouldn't work.

Similarly, if you used several emitters to circumvent the above problem you'd
run into aliasing issues and you'd have to engineer the system such that
different emitters can be clearly distinguished from each other.

The low latency is a big deal though. See-through AR is a particularly hard
nut in terms of latency because just the slightest bit of it can completely
destroy the experience (overlay images lagging behind real world), unlike VR
where all the photons come from one source and some latency is tolerable.

~~~
cma
The lighthouses seem to have a 90° sweep. The sensors on the Vive headset (but
for some reason not the Valve controllers) seem to be in 90° occlusion pits.
This means that if the two lighthouses are at opposite corners of a (square)
room, for the most part no single sensor will ever pick up both lighthouses at
once, no matter where the headset is in the room or how it is oriented.

I think that was the initial strategy, but now they are going with a timing
based or modulation based solution, and that's why the controllers, which were
developed later, don't have occlusion pits. In this interview with Alan Yates
of Valve, he mentions timing, modulation, the role of the LED array, and
getting rid of the wire between lighthouses:
[https://soundcloud.com/hackertrips/alan-yates-of-valve-talks...](https://soundcloud.com/hackertrips/alan-yates-of-valve-talks-non-vr-applications-of-lighthouse-and-more)

~~~
cma
(actually, from a talk today, it's a 120° sweep)

------
IshKebab
That's actually brilliant. I assume you then need two lighthouses so you can
do triangulation and get the position of your sensor rather than just the
angle to the lighthouse.
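A toy 2D version of that triangulation: each lighthouse reports only a bearing angle to the sensor, and intersecting the two bearing rays recovers the position. The positions and angles here are made-up example values, not anything from Valve's system:

```python
import math

def triangulate(p1, a1, p2, a2):
    """Intersect the ray from p1 at angle a1 with the ray from p2 at
    angle a2 (angles in radians, measured from the +x axis)."""
    x1, y1 = p1
    x2, y2 = p2
    d1 = (math.cos(a1), math.sin(a1))  # direction of ray 1
    d2 = (math.cos(a2), math.sin(a2))  # direction of ray 2
    # Solve p1 + t*d1 == p2 + s*d2 for t using a 2x2 cross product.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
    return (x1 + t * d1[0], y1 + t * d1[1])
```

For example, a lighthouse at the origin seeing the sensor at 45° and one at (4, 0) seeing it at 135° place the sensor at (2, 2).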

~~~
gallamine
You would be able to get some distance information via the modulation on the
signal. If your photodiode response were able to pick up phase changes, you
could estimate distance. The light arriving at closer diodes will have a
different phase than at the further diodes. The article mentions "MHz"
intensity modulation on the diodes, which gives you a wavelength of several
meters.

~~~
beambot
I think your conceptualization is a bit off.

Time of flight as you describe could work at the transmitter, but could not
work at the Lighthouse receiving photodiode. To measure time of flight, you
need to know (very precisely) when the light was initially transmitted. The
synchronization flash is insufficient for this, as an error of just 1ns
results in a 1ft error -- modulation or not. Even with modulation, the
receiving photodiode doesn't have another signal with which to compare for ToF
measurements.

Normal laser rangefinders transmit the light (modulated) and look for the
reflected return (modulated) at the same location as the transmitter. They
use a PLL to determine the phase difference between the TX and RX (modulated)
signals -- since the transmitter has both versions readily at hand. The phase
difference corresponds to a time difference => distance. Note that the
"reflector" (where the photodiode sits in Lighthouse) is not part of the
equation.

In other words, laser rangefinder ToF measurements require coherent
demodulation at the location of transmission.
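For concreteness, here is the phase-to-distance arithmetic those rangefinders perform, as a rough sketch; the 10 MHz modulation frequency in the example is an assumed value, not Lighthouse's:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_phase(phase_diff_rad: float, f_mod_hz: float) -> float:
    """One-way distance from the TX/RX phase lag of the modulation envelope.

    The answer is ambiguous modulo half the modulation wavelength, as with
    real phase-comparison rangefinders.
    """
    wavelength = C / f_mod_hz                                # modulation wavelength, m
    round_trip = (phase_diff_rad / (2 * math.pi)) * wavelength
    return round_trip / 2.0                                  # out and back -> one way
```

At an assumed 10 MHz modulation (~30 m wavelength), a half-cycle (pi radians) phase lag corresponds to roughly 7.5 m -- and, as above, measuring that lag requires both signals at the transmitter.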

~~~
nomel
This doesn't use time of flight, it's simple "lighthouse was at this angle
when diode triggered".

~~~
beambot
I know how Lighthouse works -- I wrote the article. :) I was responding to the
GP: "You would be able to get some distance information via the modulation on
the signal."

I was telling GP, (1) how Lighthouse cannot use time of flight, and (2) how
time of flight works for laser rangefinders.

