
22-Year-Old Lidar Whiz Claims Breakthrough - deepnotderp
http://spectrum.ieee.org/cars-that-think/transportation/sensors/22yearold-lidar-whiz-claims-breakthrough
======
6stringmerc
It will be interesting, to me, whether there are any observable effects on
wildlife exposed to 200-meter blasts of near-IR light.

I mean, maybe it'll scare off deer, or pull them in closer to see what's up?
Any number of species could see unintended consequences. Insects in the
desert. Pigeons in the big city. Gulls on the coast.

Maybe it's totally harmless, but I'd like a study that verifies mass adoption
would not have ecological consequences.

~~~
philipkglass
Your comment spurred me to search for information about near-infrared vision
systems in animals. NIR vision appears to be entirely absent from vertebrates.
(I am not claiming that it is _present_ in invertebrates, only that I found
stronger claims about absence from vertebrates.)

See "The Verriest Lecture 2009: Recent progress in understanding mammalian
color vision"

[https://pdfs.semanticscholar.org/c9df/0b61e4a45e10577c001513...](https://pdfs.semanticscholar.org/c9df/0b61e4a45e10577c001513a4fc1432696b3c.pdf)

Particularly Figure 1. Vertebrate photopigment sensitivity drops off rapidly
toward the near IR (~700 nm).

I found this fascinating related (but not peer reviewed) document on arXiv,
"Did Evolution get it right? An evaluation of near-infrared imaging in
semantic scene segmentation"

[https://arxiv.org/pdf/1611.08815.pdf](https://arxiv.org/pdf/1611.08815.pdf)

The author performs semantic segmentation with convnets using conventional
visual-spectrum images and compares performance with images spectrally
extended into the near-infrared. He finds that adding the NIR band did not
improve task performance -- hence that evolution "got it right" by not
expanding vision into a wavelength region that fails to improve semantic
segmentation. I'm not sure how good this work is from a professional
biologist's perspective, but I found it damn interesting.

EDIT: note that this laser operating at 1550 nm is far from the long wave IR
(~8000 nm and up) that various animals sense to detect prey body heat.

~~~
stenl
Vampire bats use modified capsaicin (chili pepper) receptors in their nose to
detect infrared light:

[https://www.nature.com/nature/journal/v476/n7358/full/nature...](https://www.nature.com/nature/journal/v476/n7358/full/nature10245.html)

They use this to find their warm-bodied prey, but of course it's only
detection, not vision.

I learned this today, and now I'm happy I did.

~~~
vanderZwan
I was about to ask if you work where I do, then I saw your username, then I
remembered that you hired me through HN..

~~~
nsebban
You guys seem to have amazingly interesting jobs. Please hire me, I'm your
future-colleague.

------
givemefive
There's a video here:
[https://www.youtube.com/watch?v=AasQG3fR8Mo](https://www.youtube.com/watch?v=AasQG3fR8Mo)

------
joshvm
So it's a LIDAR + galvanometer scanner in a box? This is nothing particularly
novel in itself, except they seem to have solved the scanning-speed problem.
If you look at the videos of the scan, you can see that rather than the ring-
like scans you'd get from a Velodyne system, you get a kind of snake-path.
They're scanning (extremely quickly, by the looks of it) with a mirror and
doing a kind of raster pattern.

From a skim of their patent it's a multiplexed system -
[http://www.freepatentsonline.com/y2017/0131388.html](http://www.freepatentsonline.com/y2017/0131388.html)

Seems like they're using a galvo to scan the LIDAR and they're beamsplitting
the laser to get multiple returns.

~~~
indolering
> except they seem to have solved the scanning-speed problem

By way of a GaN chip, which is the standard solution for anything requiring
super high clock speeds.

Considering another company is pitching a $50 solid-state LIDAR for autonomous
vehicles, I don't think this qualifies as a breakthrough[0]. It's just the
market pushing prices down.

[0]: [http://spectrum.ieee.org/cars-that-think/transportation/sens...](http://spectrum.ieee.org/cars-that-think/transportation/sensors/osrams-laser-chip-for-lidar-promises-supershort-pulses-in-a-smaller-package)

~~~
joshvm
I was thinking more of the mirrors. I've used galvo systems at work before,
and the off-the-shelf ones couldn't come close to the sorts of speeds they're
showing off, especially not in a raster pattern. 3D LIDAR using galvos has
been done before for space applications (one was used on the space shuttle,
or at least a ground-based model). That system used a Lissajous scanning
pattern, which is much kinder on the motors because you can make continuous
movements rather than forcing the mirror into a rapid turn at the end of each
row. It's called TriDAR:

[http://adsabs.harvard.edu/abs/2006SPIE.6220E...7R](http://adsabs.harvard.edu/abs/2006SPIE.6220E...7R)

------
c517402
Back when the only really powerful lasers that could be found for optical
tables were IR, I know a couple of guys who lost enough of their retina to be
considered legally blind from light they couldn't see. One of them told me,
"You know you're in trouble when your eye starts to hurt and you realize your
safety glasses aren't on."

~~~
Kenji
As someone who works with lasers on a daily basis - this is my nightmare.
Luckily they're almost always overlaid with visible light (except during some
special production steps), making things much safer.

------
tyingq
The actual candor around the limitations, from a potential beneficiary of the
Lidar boom, is refreshing.

~~~
jobu
My thoughts exactly. Issues with rain/snow are a common concern, but I had
never heard of the "dark car problem":

 _“Current lidar systems can’t see a black tire, or a person like me wearing
black—Velodyne’s [Puck] wouldn’t see a 5- to 10-percent reflective object [30
meters away]. It’s the dark car problem—no one else talks about it!”_

~~~
macintux
Considering nearly every pedestrian walking along the streets near me at night
is wearing dark clothes (and there are typically no sidewalks), that's a
terrifying statement.

~~~
hwillis
It's not a problem at night. It's a problem during the day, when everything is
washed out. The lidar can still see something there, but not how far away it
is.

~~~
wyldfire
I'd speculate that the biggest hazard would be a stationary object (tire in
the road). If so, pedestrians crossing traffic seem less likely to cause
problems.

~~~
hwillis
That would show up as a kind of "hole" in the road. Clever software would be
required to see it as a danger.

~~~
sriram_sun
Isn't the asphalt (without lane markings) black as well? How does it recognize
a black object against a black background? I don't see it showing up as a hole
in that scenario.

~~~
hwillis
Asphalt is a few times more reflective than rubber.
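
The "dark car problem" can be sketched with a simple inverse-square link
budget: the return power of a direct-detection lidar scales with target
reflectivity over range squared, so maximum detection range scales with the
square root of reflectivity. The rated-range numbers below are illustrative
assumptions, not Velodyne's published figures:

```python
import math

def max_range(ref_target, ref_spec=0.80, range_spec=100.0):
    """Range at which a target of reflectivity `ref_target` returns the same
    power as a reference `ref_spec` target at the rated `range_spec` meters.
    Follows from return power ~ reflectivity / range**2."""
    return range_spec * math.sqrt(ref_target / ref_spec)

# A 10%-reflective target drops from a (hypothetical) 100 m rated range:
print(round(max_range(0.10)))  # 35 m
print(round(max_range(0.05)))  # 25 m
```

Those numbers land right around the "5- to 10-percent reflective object [30
meters away]" figure quoted from the article.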

------
deepnotderp
It's interesting that he uses InGaAs, that's a relatively expensive material
and closes off the possibility of silicon photonics.

~~~
spott
Not necessarily. Bonding III-V materials onto silicon is a thing[0].

Not to say it would be easy, but it should be doable.

[0][https://optoelectronics.ece.ucsb.edu/sites/default/files/201...](https://optoelectronics.ece.ucsb.edu/sites/default/files/2017-02/Davenport%20Heterogeneous%20Silicon%20III–V%20Semiconductor%20Optical%20Amplifiers_0.pdf)

------
bogomipz
The article states:

“Current lidar systems can’t see a black tire, or a person like me wearing
black—Velodyne’s [Puck] wouldn’t see a 5- to 10-percent reflective object [30
meters away]. It’s the dark car problem—no one else talks about it!”

How are the current autonomous vehicles from Google, Uber, et al.
compensating for this shortcoming?

If anyone has recommendations for learning more about lidar that they could
share, I'd appreciate it.

~~~
babo
Using more than one sensor helps, radar could see a black car, while lidar
not.

------
jpitz
The article mentions that the primary advantage of this approach is that it
provides the long distance coverage necessary to handle highway speeds. It
also states that, because of the device design, you'd need 4, one at each
corner.

It seems to me that you don't need 200m coverage to the rear, and maybe the
rear of the vehicle could be handled by a less expensive 360-degree lidar
unit?

------
51Cards
Can someone clarify this: "...particularly at 200 meters and beyond. That’s
how far cars will have to see at highway speeds if they want to give
themselves more than half a second to react to events."

200 meters, half second? That would mean the car is covering 1km in 2.5
seconds, or 1440km/h. Not sure what type of car is going that fast.

~~~
dtparr
My reading is that the half second only accounts for reaction time, not
stopping distance. Two cars head on at highway speeds would barely be able to
stop from a 200m detection.
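
A rough sanity check of that claim (vehicle speed, reaction time, and braking
deceleration below are assumed round numbers, not figures from the article):

```python
def stopping_distance(v, reaction_t=0.5, decel=7.0):
    """Meters covered during the reaction time plus hard braking,
    for initial speed v in m/s and deceleration in m/s^2."""
    return v * reaction_t + v**2 / (2 * decel)

v = 30.0                            # m/s, ~108 km/h per car (assumed)
needed = 2 * stopping_distance(v)   # two cars closing head-on
print(round(needed, 1))             # 158.6 m -- uncomfortably close to 200 m
```

So with a half-second reaction and firm braking on both sides, two cars
meeting head-on at highway speed eat up most of a 200 m detection range.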

------
partisan
I wonder if this is a problem that could be solved by public-private
partnerships. What if we had a public utility that maintained a network of
cameras throughout the city, used to create a real-time self-driving guidance
system? Users would pay fees to patch into the data feed so that their cars
could use it. The utility would turn the data into anonymized object data.
Since we would want standards around this type of infrastructure data, it
would be regulated in the early days and could later move to deregulation,
where third-party providers could layer other data and analysis into the
stream. This could be cheaper and more effective than the current attempts at
per-car LIDAR with all of its limitations. Alternatively, you could aggregate
the data collected by all of the cars and provide a single vision of the local
area that is not limited by your single LIDAR unit.

~~~
Fjolsvith
Think of the nightmare situations if that infrastructure data was hacked...

~~~
indolering
In the US, most local governments capable of rolling out such infrastructure
already record a lot of information. Seattle public transit records
everything (including audio!). Many have license-plate scanners on busy
streets. And then there are roving bands of police with dash and body
cameras.

~~~
Fjolsvith
Yeah, but I'm thinking about a situation where the data is modified to cause
your vehicle to drive to the wrong place. Like that fellow who was following
his car navigation system directions and drove up a bike path and crashed.

------
spuz
It's interesting that the perspective in this article is that the high price
is unimportant. Isn't the problem with existing spinning-mirror lidar that
it's expensive? It makes me think solid-state lidars must have some other
advantage to attract so much investment.

~~~
joshvm
Moving parts and economy of scale. If you want to buy an industrial 1D LIDAR
off the shelf today, you're looking at a few thousand dollars. A galvanometer
scanner like these guys use is, again, several thousand dollars (just look at
Thorlabs for a basic system). InGaAs array sensors are also not cheap.

Frankly I'm skeptical that a galvo based system will last long in the real
world.

In principle, flash LIDAR solves all of the problems. You can image an area at
long distance in real-time. It has no moving parts and it's presumably
amenable to MEMS production. Basic (non-MEMS) units from people like ASC cost
tens of thousands. We just need someone with a serious bankroll to develop
one. We saw exactly the same thing with time-of-flight cameras. Microsoft
bought Canesta and suddenly a $5k+ camera cost $100.

~~~
deepnotderp
You seem to know a lot about LIDAR, I'd love to chat sometime if you'd be
interested. I couldn't find your email anywhere. Care to give me a ping at
sixsamuraisoldier[at]gmail[dot]com?

------
iaw
I wonder how the tradeoff between precision, speed of movement, and robustness
of the system was handled. Will a moving car on a bumpy road suffer from
corrupted signals?

------
smoyer
As a cautionary note, a laser with a 1550 nm wavelength can still damage the
human eye. I don't know how collimated the laser in this system is, but when
we piped this color of light down single-mode fiber, looking in the other end
could leave you with a dead spot in your retina.

~~~
rubidium
They said in the article it was low mW.

Yes, looking into the other end of a laser fiber is stupid, particularly when
your eye can't see the wavelength. If you're working with 1550 nm light and
single-mode fibers you should know that, and if you don't, you should have
received better laser safety training.

------
bitmapbrother
>But, because it hasn’t got the 360-degree coverage of a rooftop tower, you’d
need four units, one for each corner of the car.

The technology is certainly interesting and it does have its advantages, but
needing four of these units for complete coverage isn't exactly going to make
it cheap.

------
ptero
A pencil beam scanned quickly from a moving, unstabilized platform can
introduce stitching/alignment problems that must be solved to maintain high
accuracy.

I wonder how well this system does it -- it is not trivial.

------
Hasz
Maybe I missed it, but I don't see an actual power output, just multiples of
some standard. Is this thing putting out 5 mW or 1 W?

~~~
hwillis
Probably tens of milliwatts; normal lidars are fractions of a milliwatt
average output. The HDL-64 has a max laser power of 1 mW.

------
Lord_Zero
This website is really frustrating to use.

~~~
0xCMP
It's fine if all JS is disabled using uMatrix

------
deepnet
What is the output of a LIDAR? A point cloud or stripes?

What are the specs of the output of a LIDAR such as a Velodyne or this one?

~~~
joshvm
It's a point cloud, but you get stripes because of the way the systems scan.

Unless you're using a flash LIDAR (very expensive), all LIDARs are
single-point sensors which have to be scanned. That is, you send out a pulse
of light and a single photodiode detects the return signal.

A Velodyne LIDAR has up to 64 rx/tx pairs arranged into a grid which is
rotated rapidly, so you end up imaging a series of rings around the sensor.

Specs: the best LIDAR systems scan at around 1 Mpt/second over a full
hemisphere. The Velodyne system gives you around 20-30k measurements in the
field of view of a typical car camera. Accuracy is centimeter-level.

Stripes are produced by laser triangulation systems, but they're not true
LIDAR. You project a line and look at how the image of it shifts compared to a
known distance.
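
Those Velodyne numbers roughly check out on the back of an envelope (channel
count and angular resolution below are ballpark HDL-64-style assumptions, not
exact spec-sheet values):

```python
# A spinning multi-channel lidar images rings: each of the vertically
# stacked rx/tx channels sweeps a full circle per revolution.
channels = 64                 # HDL-64-style channel count (assumed)
azimuth_step_deg = 0.17       # horizontal angular resolution (assumed)
points_per_rev = channels * 360 / azimuth_step_deg

# Restrict the full 360-degree sweep to a typical car camera's
# horizontal field of view (assumed ~60 degrees):
camera_fov_deg = 60
points_in_fov = points_per_rev * camera_fov_deg / 360
print(int(points_in_fov))  # 22588 -- inside the quoted 20-30k range
```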

------
theprop
You can be super-human with just cameras. Lidar is not required.

------
lerie
The article is unbearable to read.

------
Sephr
Just because I can't see infrared doesn't mean it won't damage my retinas
through my fully dilated pupils at night when you pump up the power by 40x.

Edit: Emphasis on _fully dilated_. The amount of infrared energy that reaches
my retinas in daylight is reduced due to pupil contraction.

Additionally, this has moving parts, so what if it suddenly fails to spin and
you are exposed to a continuous IR beam with fully dilated pupils at night?
I'm sure that it's possible for it to heat and damage your retina in this
failure scenario.

~~~
hwillis
Nah, you'll be fine. You receive a couple hundred watts of near-infrared
radiation (including the laser's 1550 nm) just by being in direct sunlight.
That's spread over your entire body, of course.

The laser in this lidar has a power of ~40 mW, which technically puts it just
outside class 1[1] (a class 1 laser being one you could shine directly into
your pupil _through a magnifying glass_ without fear). However, the class is
just a guideline.

Using a maximum permissible exposure chart gives a better sense of the danger.
Lidar pulses are on the order of 10 nanoseconds, or 1/100th the smallest
division on this chart[2]. Even so, you can see that the maximum safe power
for 1550 nm is somewhere between 8 and 10,000 watts, at least 800x more than
the lidar emits.

1550 nm doesn't chemically damage biological receptors. The damage would have
to be thermal, but the laser is pulsed for an incredibly short time, and even
if it weren't, it's well under what you can be exposed to indefinitely.

[1]
[https://en.wikipedia.org/wiki/Laser_safety#/media/File:Laser...](https://en.wikipedia.org/wiki/Laser_safety#/media/File:Laser_class_EN_60825-1.en.png)

[2]
[https://en.wikipedia.org/wiki/File:IEC60825_MPE_W_s.png](https://en.wikipedia.org/wiki/File:IEC60825_MPE_W_s.png)
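
Rough numbers for the pulsed-exposure point (the pulse width is from the
figures above; the repetition rate is an assumed, hypothetical figure, since
the article doesn't give one):

```python
avg_power = 0.040      # W, ~40 mW average, per the comment above
pulse_width = 10e-9    # s, ~10 ns pulses
rep_rate = 100e3       # Hz -- ASSUMED for illustration only

duty_cycle = pulse_width * rep_rate       # fraction of time the beam is on
energy_per_pulse = avg_power / rep_rate   # joules delivered per pulse

print(duty_cycle)        # ~0.001, i.e. the beam is on ~0.1% of the time
print(energy_per_pulse)  # ~4e-7 J, i.e. ~0.4 microjoules per pulse
```

Whatever the real repetition rate is, the point stands: each pulse carries a
tiny amount of energy, and the beam is dark almost all of the time.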

~~~
Sephr
> You receive a couple hundred watts of near infrared radiation (ie the 1550
> nm in the laser) just by being in direct sunlight.

The visible light in sunlight makes pupils contract, decreasing the amount of
infrared energy that reaches and heats the retina.

~~~
valuearb
You don't think the safety charts don't already understand that effect?

