22-Year-Old Lidar Whiz Claims Breakthrough (ieee.org)
280 points by deepnotderp 205 days ago | 151 comments



It will be interesting, to me, whether there are any observable effects on wildlife exposed to 200-meter blasts of IR-wavelength light.

I mean, maybe it'll scare off deer, or pull them in closer to see what's up? Any number of species could see unintended consequences. Insects in the desert. Pigeons in the big city. Gulls on the coast.

Maybe it's totally harmless, but I'd like a study that verifies mass adoption would not have ecological consequences.


Your comment spurred me to search for information about near-infrared vision systems in animals. NIR vision appears to be entirely absent from vertebrates. (I am not claiming that it is present in invertebrates, only that I found stronger claims about absence from vertebrates.)

See "The Verriest Lecture 2009: Recent progress in understanding mammalian color vision"

https://pdfs.semanticscholar.org/c9df/0b61e4a45e10577c001513...

Particularly Figure 1. Vertebrate photopigment sensitivity drops off rapidly toward the near IR (~700 nm).

I found this fascinating related (but not peer reviewed) document on arXiv, "Did Evolution get it right? An evaluation of near-infrared imaging in semantic scene segmentation"

https://arxiv.org/pdf/1611.08815.pdf

The author performs semantic segmentation with convnets using conventional visual-spectrum images and compares performance with images spectrally extended into the near-infrared. He finds that adding the additional NIR band did not improve task performance -- hence that evolution "got it right" by not expanding vision into a wavelength region that fails to improve semantic segmentation. I'm not sure how good this work is from a professional biologist's perspective, but I found it damn interesting.

EDIT: note that this laser operating at 1550 nm is far from the long wave IR (~8000 nm and up) that various animals sense to detect prey body heat.


Vampire bats use modified capsaicin (chili pepper) receptors in their nose to detect infrared light:

https://www.nature.com/nature/journal/v476/n7358/full/nature...

They use this to find their warm-bodied prey, but of course it's only detection, not vision.

I learned this today, and now I'm happy I did.


I was about to ask if you work where I do, then I saw your username, then I remembered that you hired me through HN.


You guys seem to have amazingly interesting jobs. Please hire me, I'm your future colleague.


Pit vipers too. The pits are used to 'see' infrared.


Thank you. Definitely the coolest thing I will learn today.



One thing is that water has absorption bands at 970nm, 1200nm, 1450nm, and 1950nm, and eyes have a lot of water in them. So they simply aren't very transparent in the near infrared.


Also, why would you evolve eyes tuned to wavelengths that don't penetrate water well? That seems maladaptive in an aquatic environment.

https://en.m.wikipedia.org/wiki/Electromagnetic_absorption_b...


Why would you evolve eyes that could see wavelengths that don't penetrate your environment?


Yeah, why would you?


I'm not sure if this exchange means that there is no good reason to do that, or if it means that the reason why one would want to do that is blindingly obvious to everyone except me. :)


The reason you want to see wavelengths that don't penetrate your environment is so that you can see your environment.


None of that light would ever reach your eyes for you to see it, though, right? Or am I missing something really obvious?


Maybe I should have elaborated a bit more. My original point was that it makes sense that the visible spectrum was colocated with the frequencies not readily absorbed by water.

Which is to say, eyes were made for seeing the light that was bouncing around in the sea (and not the frequencies that had already been absorbed).

So yes, the simplest answer you can imagine.


Also to the same point: eyes evolved to see the wavelengths where the Sun outputs the majority of its energy, which is a significant factor in its ability to penetrate water to any depth.


Right, that makes perfect sense. I'm mostly unsure of what tempestn said.


Yeah, if the water is the surroundings.

I was answering the broader question of why you wouldn't want light to penetrate "your surroundings" - i.e. the desk in front of me or the walls around me. Life would be difficult if everything were transparent. Normally when one talks about "surroundings" it means that stuff, not the air (analogous to the water for marine life), both of which obviously need to be at least somewhat transparent for vision to be useful.


In that case, how well will this new lidar work in the rain?


Rain is actually very little water along any one sight line. A drop falls at about 10 meters per second and heavy rain is ~1 cm per hour. So, to ballpark it, the air is around 1 part in 3.5 million water on average, and assuming a 1 mm drop size your line of sight travels ~3.6 km on average before it encounters a drop.

Note this is instantaneous; human vision also deals with afterimages, which add blur.
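
If you want to replay that ballpark, here's a minimal Python sketch. The rain rate, fall speed, and drop size are the assumed values above, not measurements, and it lands in the same few-kilometer range:

  import math

  rain_rate = 0.01 / 3600   # heavy rain, ~1 cm of water per hour, in m/s
  fall_speed = 10.0         # m/s, terminal velocity of a large drop
  drop_radius = 0.5e-3      # m, assuming ~1 mm drop diameter

  # Fraction of the air occupied by water in flight
  volume_fraction = rain_rate / fall_speed            # ~2.8e-7

  # Treat drops as randomly placed spheres and compute the mean free
  # path of a thin sight line before it hits one
  drop_volume = (4 / 3) * math.pi * drop_radius ** 3
  number_density = volume_fraction / drop_volume      # drops per m^3
  cross_section = math.pi * drop_radius ** 2
  mean_free_path = 1 / (number_density * cross_section)

  print(f"water fraction: 1 part in {1 / volume_fraction:,.0f}")
  print(f"sight line before a drop: {mean_free_path / 1000:.1f} km")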


That's a good calculation. http://hypertextbook.com/facts/2007/EvanKaplan.shtml backs up your facts and claims that a raindrop is actually more like 6 mm, which makes the straight line view much longer.

However, in the rain, on a highway, we have water falling, water splashed up from the ground, a mist formed by cars, high humidity, and a wet layer on the outside of the car. These factors could reasonably impact this device if the laser emits near one of water's absorption bands.


I currently work on a self-driving lorry, we get around this issue by installing over a dozen lidars.


I believe that I can't see that wavelength, but the energy has to go somewhere (similarly, I can't see microwave radiation, but don't want to put my head in a running microwave, a la Infinite Jest). I think the better question is: Will it blind you/me/animals?


Indeed, I find it mildly painful to look at IR emitters, for example, on a "night vision" security camera. I don't believe it is dangerous or will blind anyone, or anything, but it is plausible to me that there could be an unanticipated effect.

EDIT: It appears that this is given a passing mention in the article:

. . . it can receive at a wavelength of 1550 nanometers, deep in the infrared part of the spectrum. That makes this wavelength much safer for human eyes than today’s standard wavelength, 905 nm.


The laser in the article is on the order of tens of milliwatts. Even shining it directly into your eye would not damage it. It would be painful, especially if it were at a visible wavelength, but the laser spot from a lidar stays in one location for less than a microsecond.

In terms of direct heating, tens of milliwatts isn't even enough to produce a human-detectable temperature rise.


> is in one location for less than a microsecond

When it is working properly.


Same with elevators, microwaves, airbags, and in fact literally anything that plugs into the wall. 120 volts can kill you if you contact it.

The laser is eye-safe, meaning you can look directly at it continuously and be fine. Though, if you did, you would deserve injury. Kind of like sticking your hand in a blender.

The device will not manage to malfunction and miraculously shoot you in the eye from meters away.


And self-diagnostics will know when it isn't.


Would it even be painful at that wavelength?


No, although lasers can sometimes be dangerous without being painful. And even when they are painful, by the time it hurts it is too late.


>similarly, I can't see microwave radiation, but don't want to put my head in a running microwave, a la Infinite Jest

Yet wireless access points spew microwave radiation at you all day.


Not at 1,000W.


However, you do receive a couple hundred watts of near infrared radiation (ie the 1550 nm in the laser) just by being in direct sunlight:

https://en.wikipedia.org/wiki/Sunlight#/media/File:Solar_spe...


But even though I can't see it, I can definitely feel it.


The premise of the parent comment is that even invisible lasers could be harmful. You pointed out that microwaves may not be as harmful at lower levels of power. I pointed out that the output of a lidar beam is far below the level that you get from sunlight, which doesn't even cause cancer and certainly doesn't blind people.

Just because I like running numbers, here's the comparison: the laser in this lidar has a power of ~40 mW, ~1000x less than sunlight. Lidar pulses are on the order of 10 nanoseconds, 100,000,000x shorter than a second. The energy reaching your eyes is therefore ~100 billion times less than from a second in bright sunlight. There is certainly no mechanism for thermal damage to occur.
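
For anyone who wants to replay that arithmetic, a tiny sketch using the two factors exactly as stated above (assumed ratios, not measured values):

  power_ratio = 1000        # sunlight on you vs the ~40 mW laser, per above
  time_ratio = 1 / 10e-9    # one second vs one ~10 ns pulse = 1e8

  energy_ratio = power_ratio * time_ratio   # ~1e11, "100 billion"
  print(f"one pulse: ~{energy_ratio:.0e}x less energy than 1 s of sun")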


> the output of a lidar beam is far below the level that you get from sunlight, which doesn't even cause cancer and certainly doesn't blind people

Sunlight certainly does cause cancer, although we generally attribute that to the UV in sunlight. Sunlight is also well-known for blinding people.


The near-1550 nm wavelengths do not cause cancer or blind people.


Just to add to your comment: even though the energy in each pulse is ~100 billion times less than the energy bright sunlight delivers in a second, it is still significant, and the number of laser pulses per second is key to knowing whether the (peak) intensity of the laser is dangerous to humans, since the pulses accumulate over that second. If you only had one pulse per second at 40 mW average power you would be looking at 40 mJ of energy per pulse, capable of making a nice plasma in air if focused, or machining metal. However, they are probably working at 100s of kHz in order to measure an image fast enough, so the energy per pulse is much lower. Key here is the fact that our eye's lens doesn't transmit 1550 nm so well, so the interaction with the eye doesn't involve focused light on the retina; the interaction intensity is much lower because the light is spread over most of the surface of the eye. Hence they can use much more power and still be eye-safe/class 1 or 2, because whatever light gets to the eye never gets focused down to a tiny spot on the retina, unlike visible lasers.

Also, sunlight is specified as an intensity (power/area), and here we only have laser power, not laser spot size, so it is hard to make a fair comparison. Nevertheless, sunlight is roughly ~1 kW/m^2, so if the laser had a spot size of 2 mm (it needs to be about this big so that diffraction over hundreds of meters of propagation doesn't reduce its intensity too much) we can calculate the power of sunlight over that area: 3 mW. So the laser would actually be ~10x more intense. (If the laser were 7.2 mm wide, the intensity would be the same as sunlight.) But we are comparing totally different wavelengths, so the laser safety rules are different. Two intros to laser safety for whoever is interested: https://www.rp-photonics.com/laser_safety.html https://spie.org/Documents/Publications/00%20STEP%20Module%2...

About thermal damage: funnily enough, there is plenty of time within nanoseconds to transmit heat and cause thermal damage. Heat transfer is actually reasonably fast at the sub-microscale. As a curiosity, see https://www.semrock.com/Data/Sites/1/semrockimages/technote_... for the difference between punching a hole with a femtosecond laser (no time for thermal diffusion to happen) and a nanosecond pulsed laser. The area around the laser just completely melts in the nanosecond case. Nevertheless, this was done at much higher energies per pulse than the lidar lasers (thankfully!).
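
A small Python sketch of both calculations above. All the numbers are this thread's assumptions, not Luminar's published specs:

  import math

  avg_power = 40e-3   # W, assumed average power

  # 1) Energy per pulse = average power / repetition rate
  for rep_rate in (1.0, 100e3):   # Hz: one pulse/s vs ~100 kHz scanning
      energy = avg_power / rep_rate
      print(f"{rep_rate:>8.0f} Hz -> {energy * 1e3:g} mJ/pulse")
  # 1 Hz gives 40 mJ/pulse; 100 kHz gives 0.0004 mJ/pulse

  # 2) Sunlight (~1 kW/m^2) over a 2 mm spot vs the whole laser there
  spot_area = math.pi * (1e-3) ** 2        # m^2, 2 mm diameter
  sun_in_spot = 1000.0 * spot_area         # ~3.1 mW
  print(f"sun in spot: {sun_in_spot * 1e3:.1f} mW vs laser {avg_power * 1e3:.0f} mW")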


>If you only had one pulse per second at 40 mW average power you would be looking at 40 mJ of energy per pulse

40 mW peak power, not average power. Average power is somewhat less, maybe 20 mW.

>Also, sunlight is specified in intensity (power/area), and here we only have laser power, not laser spot size, so it is hard to make a fair comparison.

With sunlight and with a laser scanner, the entire body will be illuminated with that power, very roughly. If you stand directly in front of the laser scanner that 40 mW will fall over your entire body, and a similar fraction of the energy will go into your eyes as with sunlight.


40mW would be dangerous for a constant laser (class 3b), but I'm not sure about a short pulsed one. I think you should look at the amount of energy per pulse.


It would be class 3r, for which direct retina exposure is "low risk". In general for pulsed lasers you can use the duty cycle. I did more analysis in another comment:

Using a maximum permissible exposure chart gives a better sense of danger. Lidar pulses are on the order of 10 nanoseconds, or 1/100th the smallest division on this chart[2]. Even so, you can see that the maximum safe power for 1550 nm is somewhere between 8 and 10,000 watts, at least 800x more than the lidar emits.

1550 nm doesn't chemically damage biological receptors. The damage would have to be thermal, but the laser is pulsed for an incredibly short time and even if it wasn't it's well under what you can be exposed to indefinitely.

[2] https://en.wikipedia.org/wiki/File:IEC60825_MPE_W_s.png


It depends on the power density. I have an IR laser that is invisible and it would burn your eyes out of your head.


~40 mW peak, ~10 ns pulses


It's worth pointing out that there are aircraft mounted LiDAR systems that already operate at 1550nm, such as [0]. Another company has an interesting paper about which wavelength is best for their activities[1].

[0] http://www.teledyneoptech.com/index.php/product/titan/

[1] (PDF!) http://www.riegl.com/uploads/tx_pxpriegldownloads/Paper_ILMF...


Pit Vipers can sense IR with a secondary sight organ called a fossa. https://en.wikipedia.org/wiki/Pit_viper


>> He finds that adding the additional NIR band did not improve task performance

But did they try in low light? It seems that seeing in IR should help with detection of predators or prey at night.


IR near the visible range is associated with night vision because we use active night vision goggles that use an infrared spotlight at 850 or 940 nm wavelengths (visible red ends around 700 nm). This is not particularly useful without an illuminator; you're basically just getting visible-light amplification then.

Passive thermal vision that might pick out a warm object in the dark requires sensing ~10,000 nm/10 um infrared, a very different problem! And one that wouldn't be easy to solve biologically, as it basically requires being colder than the object you are sensing. Edit: but not impossible! https://en.m.wikipedia.org/wiki/Infrared_sensing_in_snakes


Are you referring to sensing body heat of other animals? That's a long wave IR emission, ~8000 nm and up. Some animals do indeed sense that and it is quite useful at night. This laser, and the paper I linked to, are operating with much shorter IR wavelengths (~800 to 1550 nm). This spectral region overwhelmingly comes from sunlight under natural conditions, and its intensity drops sharply at night like the ordinary visible range.


Nocturnal animals function by having a large proportion of rods. This was true of our anthropoid ancestors, resulting in humans having very limited high-acuity color vision. We get round this by rapidly scanning and stitching the world together (the inability to do this is a sore problem for visual prostheses). That the world seems sharp and rich in color is a testament to the feats of predictive inference our brains regularly perform.


Considering ~55% of the sun's energy that hits the earth is in the form of IR, my guess is IR systems would be marginal. I think it bears testing though to be sure.

As you say other creatures are much more sensitive to IR differences.


That's a good point, but also reminds me of moths circling a flame or porch light. The vast majority of light we see in the world comes from the sun, but when the sun is elsewhere, our artificial light triggers changes in how insects behave.

I wonder if ground-source lidar at night would have similar effects on animals that are "used to" IR coming from above.


This wavelength of light is used for long-range telecoms. We would know if it did anything. Animals can't see this wavelength, not even snakes or fish. In addition, the laser spot is swept rapidly over a very wide area, so it won't actually be visible except to extremely fast sensors, which eyes are not.


1500nm is way outside the range of usable solar radiation in nature.

https://upload.wikimedia.org/wikipedia/commons/e/e7/Solar_sp...

Snakes pick back up around 5000nm, since that's roughly the wavelength at which other living animals radiate at body temperature, which is helpful in hunting. https://en.wikipedia.org/wiki/Infrared_sensing_in_snakes


This is a good point. But the same goes for other sensors in cars. I always wondered how annoying AF ultrasonic sensors are for dogs. It looks like most ultrasonic sensors are out of range for dogs, but they for sure mess with bats. I wonder how bat populations will be affected when all of these ultrasonic sensors mess with their ability to hunt.


Ultrasound transducers are not the only producers of ultrasound, nor the loudest. Most things that vibrate create ultrasound, including almost anything that moves. Noise pollution is mostly produced accidentally.

Also, nocturnal bats will largely be unaffected by cars in no small part because they don't cohabitate with humans very much.


Cops in my city installed subsonic sirens on their cars to alert people who are playing loud music etc. It's always bothered me a bit that we don't know the potential impact on animals.


Can I direct your attention to the street lights? Those death traps, causing millions of insects to fly in circles and possibly die of exhaustion.

You would save many more insects by raising awareness of that problem.


Exactly. We are already blasting so much radiation everywhere; why should we start caring now about IR?


The best way to start fixing a problem is to stop making things worse.

LIDAR may or may not have an adverse impact on wildlife, but we should at least find out and see if there are simple ways to mitigate any possible side effects.


Deer cannot see at 1550 nm. I'm not aware of any mammals that can.


What you care about is the maximum permissible exposure at close range - because it's entirely possible that a kid is going to walk up to your car and stick their face up against the glass. The laser beam will already be fairly wide (on the order of mm) as it exits the system, but it needs reasonably low divergence to give you accurate spatial information 200 m away. To be pessimistic, we should assume you get the full beam into your eye (a pupil can be up to 7 mm).

This is really back of the envelope stuff, but there are lots of factors to consider. You could probably get away with even a 100mW laser with the right setup. Obvious disclaimer that you should respect laser safety and not trust someone's internet napkin calculations.

According to this you're allowed 1000 W/m^2 at > 1400nm for an accidental exposure (100 seconds): https://workspace.imperial.ac.uk/physics/Public/physicsdocs/...

Equivalently that's 1 mW/mm^2. So the question is, what power is the laser kicking out? A Faro scanner can put out as much as 20mW/mm^2 at 1m and their systems are thus Class 3R. So far, so bad.

However... laser safety ratings are given as "Under normal operating conditions". The laser is being scanned, and very quickly at that, so you'd only be getting very short bursts of light (I don't know if it's truly a pulsed laser). If you stared into the beam continuously you would have problems, but since the beam is only going to hit you for a fraction of a second it's unlikely to do any damage. And indeed Faro say that for practical purposes their system is eye-safe, even though it's got a stonking laser in it.

The tables say you can go up to 10^4 J/m^2 for short exposures. Suppose the beam is only in your eye for 10^-3 seconds (and the pulse length will be shorter still); then you're allowed a light power perhaps 1000 times higher. Maybe divide by a factor of 100 or so, because you'd get several repeat exposures if you stood in the same place or on a street filled with these things.

Have a look at Faro's manual, Appendix E for a real example of this kind of analysis: https://doarch332.files.wordpress.com/2013/11/e866_faro_lase...
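
Here's the same reasoning in a few lines of Python, with everything marked as an assumption: the 100 mW worst case and the dwell fraction are made up for illustration, and only the >1400 nm CW limit comes from the tables above.

  import math

  beam_power = 100e-3     # W, hypothetical worst case from above
  beam_diameter = 7e-3    # m, pessimistic: the full beam fills the pupil
  mpe_cw = 1000.0         # W/m^2, the accidental-exposure limit above

  beam_area = math.pi * (beam_diameter / 2) ** 2
  irradiance = beam_power / beam_area      # ~2600 W/m^2
  print(f"static beam: {irradiance:.0f} W/m^2 (limit {mpe_cw:.0f})")

  # The beam is scanned, so your eye only intercepts it briefly
  dwell_fraction = 1e-3   # assumed: beam crosses the eye 0.1% of the time
  print(f"scanned average: {irradiance * dwell_fraction:.1f} W/m^2")

So a static beam would exceed the limit, but scanning knocks the average exposure down by orders of magnitude, which is the same conclusion Faro reaches.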


I don't know about IR, but kestrels (the bird) are UV sensitive. AFAIK all organics emit IR, so it's probably not going to be a problem.


Many types of common headlights, and other light sources, emit tons of IR radiation already so we'd probably know.


Fish are sensitive to IR; it penetrates water better than visible or UV.


Some fish see some IR. Carp etc. would be able to see the 900 nm light in a normal lidar, but even they can't see 1550 nm.


> IR, it penetrates water better than visible or UV.

What? IR is lower energy than visible red and that's the first color to go when you get a few feet under water (with no local light source).


Water does absorb infrared well, but it's not because infrared has lower energy. Infrared is absorbed well due to the spectrum of vibrational transitions. Ultraviolet is also absorbed well, due to electronic transitions. There is a "window" in the visible range that is not absorbed well by either kind of transition.


> but it's not because it has lower energy...There is a "window" in the visible light

fair enough, I got my physics a bit wrong there, but I was under the impression that that window was in the blue/green part of the spectrum since I do know that water is good at blocking UV and even cosmic radiation.

I also know that it's good at blocking visible red, because if you go further than about 20-30 feet down on a sunny day and cut yourself, your blood looks green.

Is there an additional window in the infrared part of the spectrum that supports GP's point?



So it's a LIDAR + galvanometer scanner in a box? This is nothing particularly novel in itself, except they seem to have solved the scanning-speed problem. If you look at the videos of the scan, you can see that rather than the ring-like scans you'd get from a Velodyne system, you get a kind of snake-path. They're scanning (extremely quickly, by the looks of it) with a mirror and doing a kind of raster pattern.

From a skim of their patent it's a multiplexed system - http://www.freepatentsonline.com/y2017/0131388.html

Seems like they're using a galvo to scan the LIDAR and they're beamsplitting the laser to get multiple returns.


> except they seem to have solved the scanning-speed problem

By way of a GaN chip, which is the standard solution for anything requiring super high clock speeds.

Considering another company is pitching a $50 solid-state LIDAR for autonomous vehicles, I don't think this qualifies as a breakthrough[0]. It's just the market pushing prices down.

[0]: http://spectrum.ieee.org/cars-that-think/transportation/sens...


I was thinking more of the mirrors. I've used galvo systems at work before, and the off-the-shelf ones couldn't come close to the sorts of speeds they're showing off, especially not in a raster pattern. 3D LIDAR using galvos has been done before for space applications (one was used on the space shuttle, or at least a ground-based model). They used a Lissajous scanning pattern, which is much kinder on the motors because you can make continuous movements rather than having the mirror do a rapid turn at the end of each row. It's called TriDAR:

http://adsabs.harvard.edu/abs/2006SPIE.6220E...7R


Back when the only really powerful lasers you could find on optical tables were IR, I knew a couple of guys who lost enough of their retina to be considered legally blind from light they couldn't see. One of them told me, "You know you're in trouble when your eye starts to hurt and you realize your safety glasses aren't on."


As someone who works with lasers on a daily basis - this is my nightmare. Luckily they're almost always overlaid with visible light (except during some special production steps), making things much safer.


They wouldn't be in that trouble if their quote was shorter:

  "You know you're in trouble when ... your safety glasses aren't on."


The actual candor around the limitations, from a potential beneficiary of the Lidar boom, is refreshing.


My thoughts exactly. Issues with rain/snow are a common concern, but I had never heard of the "dark car problem":

“Current lidar systems can’t see a black tire, or a person like me wearing black—Velodyne’s [Puck] wouldn’t see a 5- to 10-percent reflective object [30 meters away]. It’s the dark car problem—no one else talks about it!”


Known problem. I first saw it in 2003, when working on a DARPA Grand Challenge vehicle. Some materials are very low reflectivity in IR. Charcoal grey furniture upholstery material, as often seen on office chairs, is one. A SICK LMS LIDAR can't range that material at point-blank range.

The safe solution to this is not to look for "obstacles", but to profile the road ahead with a high-mounted laser. If you're not getting a return, either it's not very reflective, or there's a cliff ahead. In either case, you don't want to go there at speed.

As you get closer to the trouble spot, you're more likely to get a bounce from even a low-reflectivity material, because the returned light increases. So you can deal with this by slowing down until the data improves. Off-road vehicles must do this. On-road vehicles can use other data sources for assurance there's a road there, and radars are capable of detecting anything as big and metallic as a car.


So even if this thing can handle very low reflectivity in IR, are there other problems that you would need radar to solve anyway?


Radar gets you range rate. It's also a good backup to the primary system. A very simple computation from radar returns will answer "is a collision imminent"? This should apply the brakes even if the "AI system" doing the driving (or the human driver) do not.


Considering nearly every pedestrian walking along the streets near me at night is wearing dark clothes (and there are typically no sidewalks) that's a terrifying statement.


It's not a problem at night. It's a problem during the day, when everything is washed out. The lidar can still see something there, but not how far away it is.


I'd speculate that the biggest hazard would be a stationary object (tire in the road). If so, pedestrians crossing traffic seem less likely to cause problems.


That would show up as a kind of "hole" in the road. Clever software would be required to see it as a danger.


Isn't the asphalt (without lane markings) black as well? How does it recognize a black object in a black background? I don't see it showing up as a hole in that scenario.


Asphalt is a few times more reflective than rubber.


Ok. Still not remotely good.


Note that it's a matte black car problem. Shiny black cars are visible. It's also not a problem at short distances, or if there are significant parts of the car that are visible.

It's a similar thing to not being able to see men carrying a glass windowpane. It's a somewhat niche problem.


LIDAR is not the only sensor input that self-driving cars will need to operate safely. Cars (on the road anyway) will be warm enough to show up on passive IR cameras. Their tires alone should emit enough heat to detect.

Parked cars, on the other hand, may be a tougher challenge. If I were building a self-driving car, my nightmare scenario would probably be a tire lying in the middle of the road, or a piece of matte-textured furniture.


A big rolling tire, complete with wheel, once was hurtling towards me on the highway.

I honked the horn at it.

Thankfully, it missed me despite that brilliant decision.


As the owner of a flat black car, I feel strongly about this 'niche'.


Glass is probably a bigger issue: take glass storefronts and glass bus stops. Niches are still important, but there are a hundred other problems of this level that also need to be solved. This problem is not particularly more important than any other, though I don't mean it's unimportant.


Are you already terrified, then? Because human drivers are also terrible at perceiving people dressed like this at night.


Every time I see it I cringe, yes. I don't want to see the process of mowing down pedestrians get automated, however.


Not a Death Race 2000 fan, I'm guessing.


Even then, it was sport, not automated. :)


At night that same pedestrian probably is a quite significant infrared emitter.

Matte black cars, not so much.


I'd expect the exhaust in rear and the engine compartment in front to strongly radiate infrared on most of the cars on the road.


Car parked on side of road?


I think license plates and tail lights are reflective


It seems increasingly like autonomous vehicles will rely on numerous types of sensing including lidar, radar, ir and the visual spectrum. So this doesn't seem like a huge obstacle. Also, the bar is not "perfection" but rather just doing a lot better than humans.


Is it? I personally agree, but I'm not sure the population at large will. People like being in control, and they always overestimate their own competence.


95% of cars in the US are automatics. The reason manuals are more popular in other countries is that they are cheaper, more fuel efficient, and (at least in Europe) learning how to drive a stick is (basically) required.

There is a major price difference right now, but the switch to electric and market forces will drive those costs down. Tack on insurance discounts and the price delta will be smaller than that of automatic vs manual.


Those people will be able to drive themselves. There will be plenty of desire to not drive regardless of those people.


I'm not surprised, it's probably the same reason my Roomba keeps bumping into black objects at full speed yet manages to slow down before touching bright objects / walls.


I had to cover the cliff detectors on my Roomba because it interprets the black patterns on my carpet as holes in the ground. Fortunately I live in a 1 bedroom that doesn't have any actual cliffs.


I wonder if this will put a new twist on the 'personal injury lawsuit' scam where people intentionally get hit by a car in order to sue.


It does seem like those limitations are brought up as competitive advantages, so I'm not sure I'd consider it candor.


He talks about why it won't be cheap as well, demos raw data from the sensor, etc.


It's interesting that he uses InGaAs; that's a relatively expensive material, and it closes off the possibility of silicon photonics.


Not necessarily. Bonding III/V materials onto Silicon is a thing[0].

Not to say it would be easy, but it should be doable.

[0] https://optoelectronics.ece.ucsb.edu/sites/default/files/201...


The article states:

“Current lidar systems can’t see a black tire, or a person like me wearing black—Velodyne’s [Puck] wouldn’t see a 5- to 10-percent reflective object [30 meters away]. It’s the dark car problem—no one else talks about it!”

How are the current autonomous vehicles from Google, Uber, et al. compensating for this shortcoming?

If anyone has recommendations for learning more about lidar that they could share, I'd appreciate it.


Using more than one sensor helps; radar can see a black car where lidar can't.


The article mentions that the primary advantage of this approach is that it provides the long distance coverage necessary to handle highway speeds. It also states that, because of the device design, you'd need 4, one at each corner.

It seems to me that you don't need 200m coverage to the rear, and maybe the rear of the vehicle could be handled by a less expensive 360-degree lidar unit?


Can someone clarify this: "...particularly at 200 meters and beyond. That’s how far cars will have to see at highway speeds if they want to give themselves more than half a second to react to events."

200 meters, half second? That would mean the car is covering 1km in 2.5 seconds, or 1440km/h. Not sure what type of car is going that fast.


My reading is that the half second only accounts for reaction time, not stopping distance. Two cars head on at highway speeds would barely be able to stop from a 200m detection.


I think the author is slightly exaggerating here; not that the larger range wouldn't be useful.

Assume there's a stationary obstruction for which you must come to a complete stop.

According to [1] the stopping distance of a car travelling at 70mph is 245ft (75m).

A car travelling at 70mph is doing approx 31m/s.

With a 0.5 s reaction time, the car would need only ~90 m in total; with a 200 m range, the car would have ~4 s to initiate an emergency braking manoeuvre.

[1] http://www.government-fleet.com/content/driver-care-know-you...
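
The same numbers in Python, using the figures quoted above (70 mph, the 245 ft stopping distance from the linked table, a 0.5 s reaction time, and a 200 m sensor range):

  MPH_TO_MS = 0.44704
  FT_TO_M = 0.3048

  speed = 70 * MPH_TO_MS             # ~31.3 m/s
  stopping_dist = 245 * FT_TO_M      # ~74.7 m
  reaction_time = 0.5                # s
  sensor_range = 200.0               # m

  # Distance covered while reacting, plus the braking distance
  total_dist = speed * reaction_time + stopping_dist    # ~90 m

  # Time available to start braking after a 200 m detection
  time_margin = (sensor_range - stopping_dist) / speed  # ~4 s

  print(f"distance needed: {total_dist:.0f} m")
  print(f"reaction budget: {time_margin:.1f} s")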


Processing lag probably contributes


I wonder if this is a problem that can be solved by public-private partnerships. What if we have a public utility that maintains a network of cameras throughout the city that can be used to create a real-time self driving guidance system. Users of the system would pay fees to patch into the data feed so that their cars can use that data. The utility would turn the data into anonymized object data. Since we would want to have standards around this type of infrastructure data, it would be regulated in the early days and then can move to deregulation where third party providers can layer other data and analysis into the stream. This could be cheaper and more effective than the current attempts at per-car LIDAR with all of its limitations. Alternatively, you could aggregate the data collected by all of the cars and provide a single vision of the local area that is not limited by your single LIDAR unit.


Think of the nightmare situations if that infrastructure data was hacked...


In the US, most local governments capable of rolling out such infrastructure already record a lot of information. Seattle public transit records everything (including audio!). Many have license plate scanners on busy streets. Then they have a roving band of police with dash and body cameras.


Yeah, but I'm thinking about a situation where the data is modified to cause your vehicle to drive to the wrong place. Like that fellow who was following his car navigation system directions and drove up a bike path and crashed.


As opposed to all the car companies having the data?

You don't think the car companies will record, collect, and analyse the data for advertising purposes?


That's why telematics are already in most new vehicles.

Chevrolet wants to be a data company.

https://www.bloomberg.com/news/articles/2016-07-12/your-car-...


It's interesting that the perspective in this article is that the high price is unimportant. Isn't the problem with existing spinning mirror Lidar that it's expensive? It makes me think there is some other advantage that solid state Lidars have to generate so much investment.


Moving parts and economy of scale. If you want to buy an industrial 1D LIDAR off the shelf today, you're looking at a few thousand dollars. A galvanometer scanner like these guys use is, again, several thousand dollars (just look at Thorlabs for a basic system). InGaAs array sensors are also not cheap.

Frankly I'm skeptical that a galvo based system will last long in the real world.

In principle, flash LIDAR solves all of the problems. You can image an area at long distance in real-time. It has no moving parts and it's presumably amenable to MEMS production. Basic (non-MEMS) units from people like ASC cost tens of thousands. We just need someone with a serious bankroll to develop one. We saw exactly the same with Time of Flight cameras. Microsoft bought Canesta and suddenly a $5k+ camera cost $100.


You seem to know a lot about LIDAR; I'd love to chat sometime if you'd be interested. I couldn't find your email anywhere. Care to give me a ping at sixsamuraisoldier[at]gmail[dot]com?


Also that it is relatively fragile. It is one thing to have expensive precision instruments, but in cars they need to be able to take the bumps, shakes, and any other forces the car takes. That can be tough on tiny bearings with low intended duty cycles. Of course, you can beef all of that up to make it last, but if you can avoid mechanical components altogether it is definitely better.


Plenty of moving parts in a car, but the vast majority of them are metal and most with pretty forgiving tolerances.

Fast spinning mirrors aren't particularly hard, but the tolerances are pretty unforgiving. Being able to tell if a car is partially in your lane at 200 meters is a pretty small angle.

Cars take quite a bit of abuse over their 100-200k mile lifetime. Vibration (at numerous frequencies), huge temperature variations (120F to -20F or so).

Not sure I buy that Lidar is better than cameras for cars, especially since cameras are cheap enough for you to have a dozen of them: a better price point, and more likely to have the same limitations as eyes do (to fit social expectations). Sadly, society seems more focused on failures than on a strictly rational decision about whether a technology is worthwhile based on the number of lives saved vs. lost.

Sure, if someone can make a solid state lidar using the lower-frequency light (so they can use a 40x stronger laser) it might compete. I'd still worry about blind spots because of fog, smog, snow, sleet, rain, blowing dust, sand storms, etc.


I wonder how the tradeoff between precision, speed of movement, and robustness of the system was handled. Will a moving car on a bumpy road suffer from corrupted signals?


As a cautionary warning, a laser with a 1550 nm wavelength can still cause damage to the human eye. I don't know how collimated the laser in this system is but when we piped this color of light down single-mode fiber, looking in the other end could leave you with a dead spot in your retina.


They said in the article it was low mW.

Yes, looking at the other end of a laser fiber is stupid, particularly when your eye can't see the wavelength. If you're working with 1550nm light and single-mode fibers you should know that, and if you don't you should've received better laser safety training.


>But, because it hasn’t got the 360-degree coverage of a rooftop tower, you’d need four units, one for each corner of the car.

The technology is certainly interesting and it does have its advantages, but the need to use 4 of these units, for complete coverage, isn't exactly going to make it cheap.


A pencil beam scanned quickly from a moving, unstabilized platform can introduce stitching/alignment problems that must be solved to maintain high accuracy.

I wonder how well this system does it -- it is not trivial.


Maybe I missed it, but I see no actual power output, just multiples of some standard. Is this thing putting out 5 mW or 1 W?


Probably tens of milliwatts; normal lidars are fractions of a milliwatt average output. The HDL-64 has a max laser power of 1 mW.


This website is really frustrating to use.


It's fine if all JS is disabled using uMatrix


What is the output of a LIDAR? A point cloud, or stripes?

What are the specs of the output of a LIDAR such as a Velodyne or this one?


It's a point cloud, but you get stripes because of the way the systems scan.

Unless you're using a flash LIDAR (mucho dinero), all LIDAR are single point sensors which have to be scanned. That is, you send out a pulse of light and a single photodiode detects the return signal.

A Velodyne LIDAR has up to 64 rx/tx pairs arranged into a grid which is rotated rapidly, so you end up imaging a series of rings around the sensor.

Specs - the best LIDAR systems scan at around 1 Mpt/second over a full hemisphere. The Velodyne system gives you around 20-30k measurements in the field of view of a typical car camera. Accuracy is centimeter level.

Stripes are produced by laser triangulation systems, but they're not true LIDAR. You project a line and look at how the image of it shifts compared to a known distance.
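
If it helps to visualize why the data comes out ring-shaped, here's a toy geometric sketch of a spinning multi-beam unit. The mounting height and beam angles are made-up numbers, not Velodyne specs:

  import math

  sensor_height = 1.8   # m, assumed mounting height
  # One fixed laser per elevation angle, all pointing slightly downward
  elevation_angles_deg = (-15, -13, -11, -9, -7, -5, -3, -1)

  rings = []
  for elev_deg in elevation_angles_deg:
      elev = math.radians(elev_deg)
      # Where this beam intersects flat ground as the head spins
      ground_range = sensor_height / math.tan(-elev)
      ring = [(ground_range * math.cos(math.radians(az)),
               ground_range * math.sin(math.radians(az)))
              for az in range(0, 360, 2)]
      rings.append(ring)
  # Each entry of 'rings' is one circle of points around the sensor:
  # exactly the stripey structure you see in a spinning-lidar cloud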


Generally LIDAR outputs a point cloud.

Velodynes tend to output very stripey point clouds.

Edit to add more info: the Puck's data sheet [0] claims a range of 100m and 300,000 points per second.

[0] http://velodynelidar.com/docs/datasheet/63-9229_Rev-F_Puck%2...


The front page video on their site gives a good visual on their output: https://www.luminartech.com

The rainbow colors are, of course, optional.


You can be super-human with just cameras. Lidar is not required.


The article is unbearable to read.


Just because I can't see infrared doesn't mean it won't damage my retinas through my fully dilated pupils at night when you pump up the power by 40x.

Edit: Emphasis on fully dilated. The amount of infrared energy that reaches my retinas in daylight is reduced due to pupil contraction.

Additionally, this has moving parts so what if it suddenly fails to spin and you are exposed to a continuous IR beam with fully dilated pupils at night? I'm sure that it's possible for it to heat and damage your retina in this failure scenario.


Nah, you'll be fine. You receive a couple hundred watts of near infrared radiation (ie the 1550 nm in the laser) just by being in direct sunlight. That's spread over your entire body, of course.

The laser in this lidar has a power of ~40 mW, which technically puts it outside class 1[1], the class you could shine directly into your pupil through a magnifying glass without fear. However, the class is just a guideline.

Using a maximum permissible exposure chart gives a better sense of danger. Lidar pulses are on the order of 10 nanoseconds, or 1/100th the smallest division on this chart[2]. Even so, you can see that the maximum safe power for 1550 nm is somewhere between 8 and 10,000 watts, at least 800x more than the lidar emits.

1550 nm doesn't chemically damage biological receptors. The damage would have to be thermal, but the laser is pulsed for an incredibly short time and even if it wasn't it's well under what you can be exposed to indefinitely.

[1] https://en.wikipedia.org/wiki/Laser_safety#/media/File:Laser...

[2] https://en.wikipedia.org/wiki/File:IEC60825_MPE_W_s.png


> You receive a couple hundred watts of near infrared radiation (ie the 1550 nm in the laser) just by being in direct sunlight.

The visible light in sunlight makes pupils dilate, decreasing the amount of infrared energy that reaches and heats the retina.


You don't think the safety charts already account for that effect?


That's irrelevant; the analysis assumes the spot is smaller than the pupil. The pupil would have to grow to 800x the area for it to even matter in the first place.


No, it doesn't just mean that. But it's rather a quick judgement to suggest that's their line of thinking. It may well be fine to do this if increasing the wavelength to over 900nm makes a beam far less damaging.

A more useful comment would be to ask what the wavelength-damage relationship is for different power levels.



