
Intel Takes Lidar Indoors - stambros
https://www.zdnet.com/article/intels-new-realsense-camera-brings-hi-res-lidar-to-a-small-form-factor/
======
noen
I'm done being excited by anything out of Intel that isn't a desktop/laptop
CPU.

I've been burned personally and professionally with every single Intel IoT
device I've touched.

Remember the Edison platform? Huge promises and possibilities that turned into
fatally flawed silicon that took Intel 2 years to admit.

The Compute Stick? The whole Atom ecosystem?

I was so excited by the RealSense cameras that we got a bunch of them; the
hardware was so bad compared to similarly priced machine vision cameras that
we assumed we must have gotten a bad batch. It was astounding.

The SDK was great at first glance, a really easy OOBE for multi camera setup
and PCL processing. Then you discover over a few weeks how flaky everything
is, how brittle the SDK and drivers are (like every other Intel dev platform
it seems) and after spending thousands of dollars on hardware and hundreds of
hours of dev time, you finally chuck it all in a bin and say "I will never buy
Intel crap again" for the third time.

Hopefully this lidar device will buck that trend, but I doubt it. They keep
making random IoT hardware platforms with seemingly no long-term strategy and
no path to commercial implementation.

~~~
Reventlov
My main problem with non-"classic" Intel products is the shit user experience
that comes with them: I don't want to use Ubuntu 16.04, just package software
in a maintainable way, ffs.

~~~
rubicks
When Intel touts a device "On Linux!(tm)", I have to lower my already meager
expectations. So long as you expect the drivers to be very thin open-source
wrappers around very brittle proprietary blobs, you won't be unpleasantly
surprised.

------
stefan_
I love the Intel RealSense stuff, you get very cutting edge sensor silicon at
consumer gear prices with open-source software to go along with it.

The only real problem is that I have no clue why Intel is in this business,
and I suspect they won't be for much longer.

~~~
rubicks
The Realsense hardware might be great. The software that goes with it is not:

* DKMS kernel module for what should be a plain vanilla USB 3.0 device

* firmware updates require closed-source libraries

* breaking API and ABI changes that do not respect semver or SOVERSION

The worst part of my dayjob is wrangling the Realsense software suite.

~~~
TaylorAlexander
I’ve met someone who is constantly asking me “why haven’t you tried
realsense?” and you just confirmed my suspicions. When the first realsense
products came out, they only supported windows. This is madness for a robotics
focused product. Finally my friend tells me now they support Linux. But for me
the damage has already been done. They have proven that they don’t understand
me as a robotics engineer. And you’ve just confirmed that for me. So I stick
with trying to use high resolution cameras and structure from motion
algorithms to understand the world. No need for a specific proprietary piece
of hardware. Since I'm mostly doing research into what is possible, I prefer
this non-proprietary approach.

This little lidar looks nice but the last thing I need is another weird kernel
module and some closed source library to support my hardware. No thanks.

~~~
rubicks
I mean... your friend's not wrong, if by "Linux" she/he means "Ubuntu 16.04
LTS" with the caveats:

* disable Secure Boot xor create your own efi signing key pair, get friendly with `mokutil`, and pray your firmware's UEFI implementation supports that complicated custom KEK

* Ubuntu 18.04 is supported, but you have to forcibly install at least one 16.04 package they couldn't be bothered to build for the latest stable release of their chosen distro --- or `patchelf` the shared object and, again, pray.

* accept that the debugging symbols they provide still bear the source paths from the Jenkins instance that packaged them

* Oh, yeah; sometimes the device is detected as USB 2.1. That's fun when it happens 2 hours into a calibration run

They're good if all you need is a flaky proof of concept. It sounds to me
like you require something better.

~~~
TaylorAlexander
Damn that sounds awful. Yeah, I don’t need another headache. I can imagine
times where the hardware is the right tool for the job, but with all those
hoops you have to jump through to make it work I’d avoid that at all costs.

------
BubRoss
[https://newsroom.intel.com/news/intel-realsense-lidar-camera...](https://newsroom.intel.com/news/intel-realsense-lidar-camera-technology-redefines-computer-vision/#gs.log2df)

Here is the actual press release instead of the zdnet rehash.

[https://www.intelrealsense.com/lidar-camera-l515/](https://www.intelrealsense.com/lidar-camera-l515/)

This is the actual page for this camera.

------
opwieurposiu
Previously the realsense stereo depth cameras suffered from a lot of depth
noise compared to TOF cameras like kinect. I had to use a lot of filtering
which limited the usable frame rate. Hopefully this new lidar cam has less
noise.

The RealSense API is pretty good; I found it much easier to use than the
Kinect API.
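For reference, the kind of temporal filtering described above can be sketched in a few lines of numpy. This is a generic per-pixel exponential moving average, not the librealsense filter; the function name and `alpha` are my own:

```python
import numpy as np

def temporal_filter(frames, alpha=0.4):
    """Per-pixel exponential moving average over a list of depth
    frames. Pixels reported as 0 (dropouts) keep the last good
    estimate, which doubles as crude hole filling."""
    smoothed = frames[0].astype(np.float32)
    for frame in frames[1:]:
        f = frame.astype(np.float32)
        valid = f > 0  # depth cameras report 0 where depth is invalid
        smoothed = np.where(valid,
                            alpha * f + (1.0 - alpha) * smoothed,
                            smoothed)
    return smoothed
```

The more history you blend in, the less noise but the more motion lag, which is exactly the frame-rate trade-off mentioned above.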

~~~
echelon
Can you use multiple sensors with a spherically overlapping FOV?

Kinect for Azure purports to support overlapping FOV (whereas Kinect 1 for
Xbox did not).

~~~
opwieurposiu
The stereo realsense cameras like D415 support overlapping FOV. They also have
a way to use a sync cable to sync the shutters.

------
Animats
There have been lots of little indoor LIDAR units. The SwissRanger, around
2005, was one of the early ones. The Kinect, version 2, is one. The Kinect,
version 1, was a random dot pattern projector and two cameras for
triangulation. Intel made something similar, the RealSense.

So far, the most popular use for these things is video background removal,
allowing "green screen" type effects without needing an actual green screen.
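At its simplest, a depth-based matte is just a threshold on the depth image. A minimal numpy sketch (function name and thresholds are hypothetical, assuming a 16-bit depth image in millimeters):

```python
import numpy as np

def depth_matte(color, depth, near_mm=300, far_mm=1500):
    """Zero out any color pixel whose depth falls outside
    [near_mm, far_mm]; depth dropouts (reported as 0) count as
    background."""
    keep = (depth >= near_mm) & (depth <= far_mm)
    return color * keep[..., None]  # broadcast mask over RGB channels
```

The edge quality of such a matte is limited by the depth map's resolution and noise, so real pipelines add heavy filtering on top.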

~~~
BubRoss
Is that actually common? Basing a matte off of the depth map would be
extremely noisy and low resolution without some big time filtering.

~~~
Animats
Yes.[1]

[1] [https://youtu.be/RoeXGiWO9dU](https://youtu.be/RoeXGiWO9dU)

~~~
hmottestad
BubRoss asked if it would be low res. This video shows that it is extremely
low res.

Lidar instead of green screen is not for professional grade background removal
like you see on TV.

------
swiley
I really wonder how safe lidar really is for humans. Our retinas are
sensitive enough to detect single photons (when healthy), and lidar is known
to damage digital camera sensors.

~~~
bdamm
In addition to safety I wonder about interference. Wouldn't lidar become
ineffective if there's so much lidar around that all the lidar sources start
interfering with each other and effectively blinding all receivers with noise?
I really wonder how lidar-based autonomous agents plan to deal with this
problem. It seems fundamental.

~~~
namibj
Usually not. They require resilience against ambient light already, so they
are either very dim and use coding gains or they use short pulses which only
yield a short time window for valid returns. You basically don't get non-
malicious interference issues, except for e.g. the dot projector systems.

Real ToF sensors can easily filter any accidental noise. You can often spoof
them, however, and there's not much one can do against that, considering that
a blinding DoS is often technically easier (track the LIDAR with a camera to
keep the laser pointer on target).
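One way to picture the short-pulse argument: a pulsed ToF receiver only accepts echoes inside the narrow time window a real reflection could occupy, so a randomly timed pulse from another unit is almost always rejected. A toy sketch, with all names and values illustrative rather than taken from any real sensor:

```python
# Toy model of time-gating in a pulsed ToF receiver.

C_M_PER_S = 299_792_458.0  # speed of light

def accept_return(t_arrival_ns, t_emit_ns, max_range_m=9.0):
    """Accept an echo only if it arrives within the round-trip
    window a target inside max_range_m could produce (~60 ns here)."""
    window_ns = 2 * max_range_m / C_M_PER_S * 1e9
    dt = t_arrival_ns - t_emit_ns
    return 0.0 <= dt <= window_ns
```

An interfering pulse fired at a random time within, say, a 10 ms frame has only a ~60 ns / 10 ms ≈ 6e-6 chance of landing inside the accept window for any given emitted pulse.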

------
echelon
I wonder if multiple sensors can be used with overlapping FOV. The website
claims,

>> Can multiple L515 cameras be used simultaneously?

> Multiple cameras can share the same field of view utilizing our hardware
> sync feature.

I really want to get accurate 3D spherical volumes in real time. (30fps is
sufficient, 60fps would be ideal)

I've thought about using Kinect "for Azure", because I think it satisfies this
use case and does hardware clock syncing between devices:

[https://azure.microsoft.com/en-us/services/kinect-dk/](https://azure.microsoft.com/en-us/services/kinect-dk/)

Edit: It looks like their RealSense cameras can be set up in an inward-facing
configuration:

[https://dev.intelrealsense.com/docs/multiple-depth-cameras-c...](https://dev.intelrealsense.com/docs/multiple-depth-cameras-configuration)

~~~
kypro
I'm working on a RealSense project at the moment. You won't be able to do it
out of the box, but their SDK does come with a lot of sample code, including
one sample that makes use of the RGB sensor on the D400 series to calibrate
the cameras in world space. With just depth data it's a bit trickier.
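For what it's worth, once calibration has recovered each camera's extrinsics, fusing inward-facing cameras reduces to back-projecting pixels and mapping each cloud into a shared world frame. A sketch under a plain pinhole model (function names and parameters are my own, not the RealSense SDK's):

```python
import numpy as np

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with metric depth into a 3D point
    in the camera frame, using a simple pinhole model."""
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    return np.array([x, y, depth_m])

def to_world(points_cam, R, t):
    """Map an Nx3 point cloud from a camera frame into the shared
    world frame using that camera's calibrated extrinsics (R, t)."""
    return points_cam @ R.T + t

# Fusing two cameras is then just concatenation in the world frame:
#   cloud = np.vstack([to_world(p0, R0, t0), to_world(p1, R1, t1)])
```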

------
fnord77
$349 for pre-order on their store. Shipping next April.

------
bluegreyred
Am I cynical to expect one of these in every Echo/Home/Portal "assistant"
within a decade? You know, strictly for 3D-avatar VR communication purposes
only.

------
melling
Direct link to Intel’s announcement with a video.

[https://newsroom.intel.com/news/intel-realsense-lidar-camera...](https://newsroom.intel.com/news/intel-realsense-lidar-camera-technology-redefines-computer-vision/)

The camera is the size of a tennis ball.

There are probably lots of industrial uses.

~~~
ganzuul
~~Ooh, it's the solid-state LIDAR tech I heard about a couple of years ago!
They must have bought the company that invented it.~~

~~The price is also just around where they expected it to be. They talked
about going down to 100 eurodollars per unit when they hit mass
manufacturing.~~

ED: No this is a MEMS device. The device I'm talking about is actually solid-
state, scanning the laser by way of, IIRC, acousto-optic modulation. Car
companies were interested in it.

------
justinclift
Wish I had the spare time to try hooking some of these into some kind of a
machine vision system, for automatically verifying that an object being
created (3D printer / CNC) was created as intended.

It'd help with automating production, but I'm not sure it'd be worth the
effort.

------
georgeburdell
I don’t know why they are in the business, but a cheap Lidar camera is very
interesting to me from a computer vision/home robotics standpoint. Here’s to
hoping for a long life for this product line.

~~~
kypro
It's because of the vision processing chip. My understanding is that many of
the Windows facial unlock cameras are powered by Intel vision chips.

------
azinman2
Is this doing 360 plus Y as well?

------
huffmsa
I don't think it has enough resolution to help them find their 10nm design.

