
Realsense T265 tracking camera - tlarkworthy
https://markku.ai/post/realsense-t265/
======
mathgaron
Nice project! Your 3D printed case for the stereo pair blocks a lot of
potentially useful information from the feature extraction algorithm. If it
helps, I have an old project where we mounted a RealSense (an R300, if I
remember correctly) on a HoloLens.

You can take inspiration from our 3D printed mount; the CAD is available here:
[http://vision.gel.ulaval.ca/~jflalonde/projects/hololens3d/index.html](http://vision.gel.ulaval.ca/~jflalonde/projects/hololens3d/index.html)

~~~
paronianttila
This looks great! In reality, our casing only covers a really small area, but
we are always looking to improve :).

------
mhb
In case you're wondering - it's $209 from Mouser:
[https://www.mouser.com/new/Intel/intel-realsense-camera-t265/](https://www.mouser.com/new/Intel/intel-realsense-camera-t265/)

~~~
onedognight
It's also available for $199 right from Intel's site[0].

[0] [https://www.intelrealsense.com/tracking-camera-t265/](https://www.intelrealsense.com/tracking-camera-t265/)

------
TaylorAlexander
Interesting. I wonder how it compares in capability to these algorithms, which
are in Nvidia’s Redtail SDK:

[https://youtu.be/T7-IxCW1UlA](https://youtu.be/T7-IxCW1UlA)

~~~
meatmanek
Neat, thanks for linking this, I hadn't heard about Redtail before.

This camera only tracks its own pose - it doesn't output a full depth map,
according to the FAQ.[1]

The visual odometry algorithm ("Elbrus") in that video can supposedly be run
on an NVIDIA Jetson board (unclear which one) at 30fps.[2] It seems like this
runs on the CPU, which is pretty impressive.

The Elbrus demo app uses the $450 ZED stereo camera[3], compared to the $200
T265[4], though there's probably nothing stopping someone from using Elbrus
with cheaper cameras, including possibly this one.

[1] https://www.intelrealsense.com/tracking-camera-t265/#fl-accordion-5c70b553f2112-tab-0, "Is this a depth camera?"
[2] https://docs.nvidia.com/isaac/isaac/packages/perception/doc/visual_odometry.html
[3] https://store.stereolabs.com/products/zed/
[4] https://store.intelrealsense.com/buy-intel-realsense-tracking-camera-t265.html
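
Since the T265 reports only its own pose rather than depth, reading it looks quite different from a depth-camera loop. A minimal sketch of polling the pose stream with pyrealsense2 (librealsense's Python bindings; assumes a T265 is attached for the `main()` part, while the quaternion helper is pure Python):

```python
# Sketch: reading 6-DoF pose from a T265 via pyrealsense2 (librealsense).
# The hardware part runs only if the library and a device are present.
import math


def quat_to_yaw(w, x, y, z):
    """Yaw (rotation about the vertical axis) of a unit quaternion, in radians."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))


def main():
    import pyrealsense2 as rs  # requires librealsense with T265 support
    pipe = rs.pipeline()
    cfg = rs.config()
    cfg.enable_stream(rs.stream.pose)  # pose stream only -- the T265 has no depth
    pipe.start(cfg)
    try:
        for _ in range(100):
            frames = pipe.wait_for_frames()
            pose = frames.get_pose_frame()
            if pose:
                d = pose.get_pose_data()  # translation, velocity, acceleration, rotation
                yaw = quat_to_yaw(d.rotation.w, d.rotation.x, d.rotation.y, d.rotation.z)
                print(f"x={d.translation.x:.3f} z={d.translation.z:.3f} yaw={yaw:.3f}")
    finally:
        pipe.stop()


if __name__ == "__main__":
    try:
        main()
    except ImportError:
        print("pyrealsense2 not installed; skipping live pose read")
```

The camera fuses its stereo pair and IMU on-board, so the host just consumes ready-made poses - no GPU or CPU-heavy VO like Elbrus is needed on the host side.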

~~~
TaylorAlexander
I have a Jetson Xavier and dual $50 PS4 stereo cameras on my robot and will be
trying the Nvidia algorithms.

[https://youtu.be/crHUGLr-JMI](https://youtu.be/crHUGLr-JMI)

------
mtw
I bought an Intel RealSense last year (two generations ago) and this is much
more powerful, especially in localization. If you buy this, expect it to be
obsolete in about six months.

------
mrguyorama
Is there a "hobby scale" way to do SLAM?

------
rsync
Why does Intel make a product like this?

~~~
mathgaron
Those cameras were developed by an Israeli startup that Intel bought a few
years ago (Omek Interactive). Making these sensors low power could give
devices 3D reconstruction and tracking, which opens the door to more robust
AR/VR applications and cheap devices for robotics. Not only is Intel in a
good position to produce and sell those sensors, they probably sense that it
could be a good market to conquer.

~~~
PaulHoule
That's half of it. The other half is that Intel has pushed the RealSense API
as something that is independent of the exact technology used.

So this could be something that uses a projected dot pattern (like the Xbox
360 Kinect), a time-of-flight sensor (like the Xbox One Kinect), multiple
conventional cameras, or...

There are many neat things you could do with depth cameras in principle, in
particular making computers into "sessile robots" that appear to understand
the space we live in the way we do. Unfortunately, the market hasn't made
this a reality yet.

------
franciscop
Does that look like micro-USB? Like, is it really 2019 and new Intel tech is
being released with micro-USB and not USB-C?

------
14
This reminded me of the Microsoft Kinect, with its two cameras for tracking.

------
tomc1985
Back up a bit... what exactly is a tracking camera?

~~~
visarga
It's a 'Simultaneous Localization and Mapping' (SLAM) device that will give
you its position, velocity, and acceleration in 3D. There's quite a large
literature on the subject.[1]

[1] [http://www.arxiv-sanity.com/search?q=slam](http://www.arxiv-sanity.com/search?q=slam)
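
What "localization" means in practice can be shown with a toy dead-reckoning example (an illustration only, not the T265's actual algorithm): chain per-step relative motions into a global 2D pose, which is the localization half of SLAM without the mapping.

```python
# Toy dead-reckoning sketch: accumulate (forward_distance, yaw_change) steps
# into a global 2D pose. Real SLAM additionally builds a map and corrects
# this drifting estimate against it.
import math


def integrate(steps, x=0.0, y=0.0, yaw=0.0):
    """steps: iterable of (forward_distance, yaw_change_in_radians) pairs."""
    for dist, dyaw in steps:
        yaw += dyaw                 # turn first...
        x += dist * math.cos(yaw)   # ...then move along the new heading
        y += dist * math.sin(yaw)
    return x, y, yaw


# Drive a 1 m square: one straight leg, then three legs with 90-degree left turns.
square = [(1.0, 0.0)] + [(1.0, math.pi / 2)] * 3
print(integrate(square))  # ends back near the origin
```

In a real system each step's estimate carries noise, so pure integration drifts; this is exactly why SLAM closes loops against a map, and why the T265 doing this robustly on-device is notable.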

