
Show HN: fast visual-inertial odometry/SLAM for AR/VR/Robotics - xfei91
https://github.com/ucla-vision/xivo
======
xfei91
This is part of my research as a graduate student at UCLA Vision Lab. The SLAM
system is Extended Kalman Filter (EKF) based, has features (landmarks) in the
state, and jointly estimates the pose of the camera and the location of the
landmarks. It runs at 140 Hz on a PC and is much faster than some (if not all)
existing open-source VIO systems.
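
If you're curious about the structure, here is a minimal sketch (in
Python/NumPy, purely illustrative, not XIVO's actual C++ implementation) of
what "features in the state" means: the filter state stacks the camera pose
with the landmark positions, and every frame runs a standard EKF
predict/update cycle over that joint state.

    import numpy as np

    # Toy EKF-SLAM state: camera pose (6-dof error state here) stacked with
    # N 3D landmarks, all estimated jointly in one filter.
    N = 10                       # number of landmarks in the state
    dim = 6 + 3 * N              # total state dimension

    x = np.zeros(dim)            # [pose | landmark_1 | ... | landmark_N]
    P = np.eye(dim)              # joint covariance of pose and landmarks

    def predict(x, P, F, Q):
        """Propagate state and covariance with a linearized motion model F."""
        return F @ x, F @ P @ F.T + Q

    def update(x, P, z, z_pred, H, R):
        """Fuse a feature measurement z (linearized model H, noise R)."""
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ (z - z_pred)
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P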

~~~
hlieberman
Awesome; this is great work!

One little nit, however: it's important to note that this is not an
open-source VIO system. It's licensed for research purposes only; anything
else requires a commercial license from UCLA.

~~~
xfei91
Thanks for pointing that out. This is my first time doing "open source" (well,
it seems it's not really open source by the modern definition). I'd like to
use a more permissive license, but that's up to UCLA.

~~~
tuukkah
A middle ground might be to license as open source the code needed to
replicate your scientific findings, while UCLA keeps proprietary the tooling
required for commercial applications.

------
AndrewKemendo
My team and I worked with Eagle and his team from RealityCap back in 2015 on
the monocular-SLAM iOS implementation of this for our AR application. Great
people, and we were glad when they were picked up by the RealSense team.

Glad to see they were able to push some of the code open source.

------
msadowski
That looks awesome! I have a RealSense D435i and would love to test it out.
I'll have an in-depth look at it later, but I'd love to share it in my
mostly-open-source list:
[https://github.com/msadowski/awesome-weekly-robotics](https://github.com/msadowski/awesome-weekly-robotics)

~~~
xfei91
That would be great!

------
blensor
I am very interested in using this for one of my projects. I'm just curious
why you are using the D435. Is it because it has an IMU on board? You are not
using the depth information from the sensor, right? That would be important
for my use case.

~~~
xfei91
The original D435 does not have an IMU, but the D435i version does. We use it
for other projects of ours which require dense depth. The SLAM system itself
should work with only RGB and IMU after some calibration and parameter tuning.
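
To give a rough idea of what "calibration and parameter tuning" involves (this
is illustrative only, not XIVO's actual config format or values), these are
the kinds of quantities you typically have to provide for an RGB + IMU setup:

    # Illustrative sketch -- not XIVO's actual config format or values.
    calib = {
        "camera": {
            "resolution": [640, 480],
            "intrinsics": [615.0, 615.0, 320.0, 240.0],  # fx, fy, cx, cy
            "distortion": [0.0, 0.0, 0.0, 0.0],
        },
        "imu": {
            "gyro_noise_density": 1e-4,    # rad/s/sqrt(Hz)
            "accel_noise_density": 1e-3,   # m/s^2/sqrt(Hz)
            "gyro_bias_walk": 1e-6,
            "accel_bias_walk": 1e-5,
        },
        # camera-IMU extrinsics and time offset; these can be refined online
        "T_imu_cam": {"rotation": [1.0, 0.0, 0.0, 0.0],
                      "translation": [0.0, 0.0, 0.0]},
        "time_offset": 0.0,
    }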

~~~
blensor
Ah ok. I've got a T265, which I hope will work since it has an IMU as well
and, as far as I know, can output the image frames. And while the T265 already
does the tracking, I need a global reference frame since I would like to drive
on a predefined track.

------
awinter-py
I love all sensor fusion; thanks for sharing this.

What are the pros & cons of using ROS as the basis of systems like this?

~~~
xfei91
ROS makes inter-process communication much easier if the SLAM system is
incorporated as one component of a much bigger system, but you don't have to
use ROS for that; we actually provide the ability to run it without ROS. Also,
with ROS, it's easier to communicate with sensors, since the sensor drivers
have already been wrapped into ROS nodes.
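
As a rough illustration of that split (the topic names and the Estimator API
below are made up for the example, not XIVO's actual interface): the estimator
itself knows nothing about ROS, a thin node just forwards messages to it, and
without ROS you call the same methods from your own data loader.

    import rospy
    from sensor_msgs.msg import Image, Imu

    class Estimator:
        # Stand-in for the ROS-agnostic SLAM core.
        def feed_imu(self, t, gyro, accel): ...
        def feed_image(self, t, raw_bytes): ...

    est = Estimator()

    def on_imu(msg):
        w, a = msg.angular_velocity, msg.linear_acceleration
        est.feed_imu(msg.header.stamp.to_sec(),
                     (w.x, w.y, w.z), (a.x, a.y, a.z))

    def on_image(msg):
        est.feed_image(msg.header.stamp.to_sec(), msg.data)

    rospy.init_node("vio_node")
    rospy.Subscriber("/camera/image_raw", Image, on_image)
    rospy.Subscriber("/imu", Imu, on_imu)
    rospy.spin()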

------
awinter-py
What happens to the autocalibration process when the IMU & video sources don't
agree, i.e. if they get misaligned?

~~~
yodon
That's the whole point of the autocalibration process: it figures out how the
IMU & video sources are aligned.

~~~
awinter-py
or if the IMU breaks, for example, and starts outputting bad data -- will the
autocalibration algo know to discard it?

~~~
xfei91
The auto-calibration simply finds the spatial alignment between the camera and
the IMU. If bad data are present, one needs some outlier-rejection mechanism to
filter them out; auto-calibration alone does not provide that ability.
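
For a concrete example, one common outlier-rejection mechanism (illustrative,
not necessarily what XIVO implements) is a chi-square gate on the measurement
innovation: a broken IMU or a bad feature track produces innovations far
outside the gate and simply gets dropped.

    import numpy as np
    from scipy.stats import chi2

    def passes_gate(innovation, S, p=0.997):
        """Accept a measurement only if its Mahalanobis distance is plausible."""
        d2 = innovation.T @ np.linalg.inv(S) @ innovation
        return d2 < chi2.ppf(p, df=len(innovation))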

