OpenSimpleLidar (github.com)
605 points by lovelearning 10 months ago | 107 comments

Hi everyone. I am the author of OpenSimpleLidar. If you have any questions about this project, you can ask them here.

Here is another of my Lidar projects: https://github.com/iliasam/OpenLIDAR

That project has a wiki: https://github.com/iliasam/OpenLIDAR/wiki

A big article in Russian about it: https://geektimes.ru/post/275442/

This is the same low-cost triangulation approach used in other inexpensive laser rangefinders, like the one on a Neato vacuum: https://www.diva-portal.org/smash/get/diva2:995686/FULLTEXT0...

It's dramatically different from the $20k+ LIDAR units you'd find on a self-driving car, which generally use time-of-flight.

It's also dramatically different from the low-cost $13.95 time-of-flight sensors. (https://www.adafruit.com/product/3316)

Now that's a nice device, and is a true time of flight sensor. There's a similar model with 1.2m range.[1] Don't know whether it's a pulse LIDAR or an RF-modulated beam. The ones that use a pulse laser can have much more range and can work outdoors. They must far outshine the sun, but only at one wavelength and only for a nanosecond. That's quite possible.

[1] https://www.adafruit.com/product/3317

According to the VL53L0X datasheet:

> During the ranging operation, several VCSEL infrared pulses are emitted, then reflected back by the target object, and detected by the receiving array.

VCSEL is a vertical-cavity surface-emitting laser. Therefore I would say a pulse laser.

That one detects phase difference of the reflected signal.

>The VL6180X can detect the "time of flight", or how long the laser light has taken to bounce back to the sensor.

Sounds like no?

I am suspicious about this statement for the following reasons:

* Some ST patents [0] say they are doing time of flight but have actually come up with a slight variation on reflected-signal phase-shift measurement (see reply by Animats).

* Seeing internal API functions like this [1] in APIs they provide for sensors of this family. Of course, this might be something else (e.g. phase shift of internal PLLs/whatever).

* The shortest time in which this sensor performs a measurement is on the order of a few tens of milliseconds, and the high-accuracy modes take up to 100ms. True time-of-flight systems (e.g. [2]) should have the answer ready on the order of nanoseconds (for the distances this sensor works at), and I haven't yet seen designs with post-processing significant enough to explain this latency.

Don't get me wrong, I've worked with the VL6180X and VL53L0X (and look forward to working with the VL53L1X), and these sensors are the best at this size, but I suspect they are not directly measuring the time for the signal to bounce back, and are instead inferring it from some other measurement.

Actually, this made me think that I probably have photodiodes with wide enough bandwidth lying around, so I could check the transmitted signal on an oscilloscope.

0: https://patents.google.com/patent/US20160047904A1/en?q=time&...

1: https://linx.wot.lv/selif/s5ylbs51.jpg

2: http://www.ti.com/product/TDC7200

Short range low cost time of flight devices are usually modulated-beam things. You modulate the outgoing light with an RF carrier around 20MHz or so, and then detect that carrier on the receive side. Measure the phase difference between the two to get distance. There's a neat trick borrowed from FM radio to do this - down-convert both input and output signals with the same local oscillator. The resulting down-converted signals have the same phase difference but at a lower frequency, where you can count it easily.
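To make that arithmetic concrete, here's a rough sketch (in Python, with made-up numbers) of the phase-to-distance conversion for a 20MHz-modulated beam. The down-conversion trick changes the frequency at which you count, but not the phase difference itself, so the math below applies either way:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_for_distance(d_m, f_mod_hz):
    """Round-trip phase shift (radians) of an RF-modulated beam."""
    return (2 * math.pi * f_mod_hz * (2 * d_m / C)) % (2 * math.pi)

def distance_for_phase(phase_rad, f_mod_hz):
    """Invert the measured phase back to a distance in metres."""
    return phase_rad * C / (4 * math.pi * f_mod_hz)

f_mod = 20e6                   # 20 MHz modulation, as above
unambiguous = C / (2 * f_mod)  # ~7.5 m before the phase wraps around

d = 3.0
ph = phase_for_distance(d, f_mod)
print(distance_for_phase(ph, f_mod))  # recovers ~3.0 m
print(unambiguous)
```

Note the ambiguity interval: beyond c / (2 · f_mod), roughly 7.5 m at 20MHz, the phase wraps and two distances become indistinguishable, which is one reason these devices are short-range.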

Are those actually referred to as "time of flight" devices though? Phase-difference measurements are cheap and extremely effective, but I usually see them labeled as phase-shift devices rather than ToF.

100% this.

Any laser scanner you'd put on a vehicle is going to be ToF, as far as I know. While triangulation is useful for indoor sensing, the laser illumination would be completely washed out once it has to compete with sunlight.

This would be better described as a structured light scanner. The principle is more similar to what something like a Kinect uses than a Velodyne/SICK/Hokuyo etc.

Some out there are pursuing non-ToF systems: https://blackmoreinc.com/

Also, your accuracy is very dependent on the baseline of the laser vs. the image sensor and it's very sensitive to proper calibration.

Plus expect some scale changes in the output data when the thing heats up or cools down.

I guess you also need time-of-flight to counter "aliasing" problems (e.g. laser light bouncing off of objects and illuminating nearby objects).

The scan rate and accuracy may not be the best, but the fact that they are using a linear photodiode array impresses me. That they were able to do all this for $35 is even more impressive, and I await the day I can purchase this from Chinese electronics hobbyist websites.

Me too! I've been looking for something like this, but wasn't able to find anything. Very nice to see this here; perhaps I'll build one if I have time at some point, or order one if it becomes available.

Thanks for this. When I first started working in LiDAR nearly ten years ago the cheapest hardware you could acquire was $25,000 and completely proprietary.

It’s nice to see a simple hardware hack for playing with the technology. Maybe this will encourage some intrepid engineers to build the next wave of hobbyist point cloud collection devices.

There are lots of fun applications, from home surveying to 3D model generation.

10 years ago was 2008. You could buy a Hokuyo for under $3k back then.

Now, depth cameras were a different story. The SwissRanger was $10-15k, but it was usurped by the sub-$200 Kinect just a few years later.


Personally, anywhere I can't see a price without contacting a sales rep from the company I hesitate to call hobbyist.

I agree, definitely not for hobbyists. List price seems to be $8,000 even if you're just buying one of their cheapest products.


Eh, some motors I needed for building an R2 droid came from such a site.

But yes, on the whole I concur.

At the time Velodyne brought out their HDL-64E for ~$75,000 you could buy a SICK LMS-200 for about 10% of that.

That's why loads of the DARPA Grand Challenge vehicles were plastered in SICK LIDARs [1] back when Velodyne only had one prototype, and it was on a truck they were putting in the competition themselves [2].

Course, Velodyne's product may be 10x the cost, but it has 64x the number of lasers, and good range and direct sunlight performance, so it's understandable that it took off.

[1] https://www.alamy.com/stock-photo-sep-28-2005-fontana-ca-usa... [2] https://www.popsci.com/scitech/article/2005-10/popscis-darpa...

Seems... dangerous? The project states it uses a "3mW 780nm Infrared IR Diode Laser". 3mW is enough to damage your eye if you don't blink or turn away. Since it's infrared, you won't blink, or possibly even notice anything (there are no pain receptors in the retina), until it's too late.

Wikipedia lists "780–1400 nm (near-IR) - Pathological effect: cataract, retinal burn"

The laser rotates fast enough that it is less dangerous. Also, the laser current is modulated, so the laser's average power is halved.

Which safeguards are implemented to detect that rotation and modulation are taking place as intended?

I was just going to mention the THERAC 25: https://en.wikipedia.org/wiki/Therac-25

Now that is a question only a true engineer would ask.

(If you didn't ask the same question then demote yourself down to "Computer Nerd")

In the US, 5mW is the legal limit for laser pointers. I think that can be used as a guideline: 3mW IR is much less energetic than a 5mW green/blue laser, so perhaps it's not powerful enough to cause huge concern? But I agree, no amount of laser is safe for the eye. Police LIDAR guns are 905nm at 50mW; those can technically damage you too.


The limit for visible light is as high as it is because of the blink reflex. An IR laser, having a beam you don't perceive as being bright, won't make you blink.

Wouldn't it matter how narrow the laser beam is? 50mW spread out over a larger surface area sounds like it should be safer.

Of course, then the question is whether that also comes with an equal reduction in sensitivity, or whether a larger sensor can compensate for that.
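As a back-of-the-envelope illustration (all numbers invented), spreading the same power over a wider spot drops the average irradiance with the square of the beam diameter:

```python
import math

def irradiance_mw_per_mm2(power_mw, beam_diameter_mm):
    """Average irradiance of a circular beam spot."""
    area = math.pi * (beam_diameter_mm / 2) ** 2  # spot area, mm^2
    return power_mw / area

# A tight 1 mm beam vs. the same 50 mW spread over a 10 mm spot:
print(irradiance_mw_per_mm2(50, 1))   # ~63.7 mW/mm^2
print(irradiance_mw_per_mm2(50, 10))  # ~0.64 mW/mm^2, 100x lower
```

So yes, beam diameter matters a lot: a 10x wider beam puts 100x less power on any given patch of retina.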

Well, actually it's near infrared, so at this "high" power (3mW) you will definitely see it. You can actually see past 800nm if the source is bright, but I don't recommend it.

The wavelength range where optical radiation is visible does not have sharp borders. Here, the wavelength band of 380 nm to 780 nm is used.


Agreed - I don't know enough to say about this particular project, but the lasers in LiDAR sensors do have the potential to be powerful enough to harm eyesight, so I would advise caution. This is not a concern for most commercial sensors, as they are constrained by regulations to a certain power for this same reason. I'm guessing there are no rules for DIY, but it's something to keep in mind if a product came out of this.

Joe Grand developed a similar product for Parallax a while back, with a lot of interesting technical details on his site: http://www.grandideastudio.com/laser-range-finder/

The laser range finder you linked to is different than the OP's 360 degree scanning LIDAR (... as the former doesn't scan!) :-)

Pretty cool. This could be a good primer for Uber devs.

In the last article about Uber, I asked why LIDAR wouldn't have lit that lady up. The response was "LIDAR might have been disabled"... as if that were any sort of acceptable scenario.

Running without LIDAR, at night especially? If that is true, I can imagine that's why they paid off the family so quickly.

It could be related to the fact that LIDAR tech was supposedly stolen from Waymo.

Source: http://money.cnn.com/2018/02/07/technology/waymo-v-uber-tech...

"Because I was driving with my eyes closed" or "Because I wasn't looking" would be the human equivalent. Imagine if the driver used the same excuse, when asked why she didn't see the pedestrian and take over.

So motorcyclists here have an acronym, "SMIDSY" - it stands for "Sorry mate, I didn't see you" - which is almost universally the first thing out of a car driver's mouth after they've driven into a motorcycle.

Sounds believable to me. The human eye doesn't have as broad of a field of view as the brain's post-processing would lead you to believe, so it's very easy for its search pattern to miss small objects.

Which is because motorbikes almost universally move faster than traffic and between lanes. So unless you stare at your rear-view mirror constantly, you may well miss them approaching. Alas, at least here, seeing a motorcycle driving at/under the speed limit is the exception, not the rule.

Parent poster was talking about car drivers hitting motorcyclists.

You're saying that car drivers don't check their mirrors correctly when maneuvering, which to me sounds dangerous.

Not saying I agree, but he leaves room open for other alternatives such as hitting a motorcycle while weaving within a lane. You won't hit a car that way, but you might take out a lane splitting bike.

I sneezed.

They have one (compared to Waymo's six): https://www.reuters.com/article/us-uber-selfdriving-sensors-...

I still don't quite understand how this works. Main components are:

- Laser

- Lens

- TSL1401 line image sensor

I was always under the impression that LIDAR includes a time-of-flight measurement, which does not appear to be the case here - the TSL1401 sensor has integration times and pixel transfer times in the range of dozens of microseconds, a timespan in which light travels dozens of kilometers.
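A quick sanity check of those timescales (the 50 µs integration time is a representative figure, not taken from the TSL1401 datasheet):

```python
C = 299_792_458.0            # speed of light, m/s

integration_s = 50e-6        # "dozens of microseconds" of sensor integration
print(C * integration_s)     # ~15 km of light travel per exposure

round_trip_2m = 2 * 2.0 / C  # round trip to a 2 m target
print(round_trip_2m * 1e9)   # ~13 ns -- the scale a true ToF sensor must resolve
```

The six-orders-of-magnitude gap between those two numbers is why this sensor can't be doing time of flight.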

Based on the diagram, I believe they're estimating distance based on parallax rather than time of flight

I agree.

The wikipedia article on LIDAR says

"Lidar [...] combine[s] laser-focused imaging with the ability to calculate distances by measuring the time for a signal to return using appropriate sensors and data acquisition electronics."

So this is not LIDAR. Still impressive though, I love the simplicity!

That's probably an overly specific definition. Their source for the lidar definition is a NOAA webpage [0] with a broader definition that doesn't include specifics about implementation:

> LIDAR, which stands for Light Detection and Ranging, is a remote sensing method that uses light in the form of a pulsed laser to measure ranges (variable distances) to the Earth.

Most lidar does use timing but I'd argue that's a type of implementation instead of a necessary part of the broader category of lidar devices.

[0] https://www.webcitation.org/6H82i1Gfx?url=http://oceanservic...

The LIDAR acronym doesn't specify the specific measurement technique, although Time Of Flight is by far the most common. There are other designs with interesting differences, for example Strobe[0] which varies the frequency of the transmitted light and measures the freq and phase delay of the returning light rather than straight delay.

I believe this design still qualifies as LIDAR, just with significantly worse performance than a typical system.

[0] https://spectrum.ieee.org/cars-that-think/transportation/sel...

As far as I know, "varies the frequency of the transmitted light and measures the freq and phase delay" is the standard way of measuring ToF with lasers. You modulate the laser diode with a signal over a wide range of frequencies and compare it with the returned signal (after cleanup). The observation is done on a spectrum. Low frequencies give you wide range but low precision; high frequencies give higher precision but lower range, and you can use them to enhance the precision of the former.
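A toy sketch of that coarse/fine combination (all numbers invented): the low tone, whose phase doesn't wrap over the working range, picks which ambiguity interval you're in, and the high tone's wrapped phase refines the estimate within it:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def wrapped_phase(d, f):
    """Round-trip phase (radians) of modulation frequency f at distance d."""
    return (4 * math.pi * f * d / C) % (2 * math.pi)

def two_tone_distance(ph_lo, f_lo, ph_hi, f_hi):
    """Coarse estimate from the low tone, refined by the high tone."""
    d_coarse = ph_lo * C / (4 * math.pi * f_lo)    # unambiguous but imprecise
    interval = C / (2 * f_hi)                      # ambiguity interval of high tone
    d_fine = ph_hi * C / (4 * math.pi * f_hi)      # precise but wraps every interval
    n = round((d_coarse - d_fine) / interval)      # which wrap are we in?
    return n * interval + d_fine

d_true = 11.30
f_lo, f_hi = 10e6, 100e6   # 10 MHz coarse tone, 100 MHz fine tone
est = two_tone_distance(wrapped_phase(d_true, f_lo), f_lo,
                        wrapped_phase(d_true, f_hi), f_hi)
print(est)  # ~11.30 m
```

In a real sensor the coarse estimate is noisy rather than exact, but as long as its error stays under half the fine tone's ambiguity interval (~0.75 m here), the unwrapping lands on the right integer.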

For me, parallax-based sensors (in the sense of sweeping a laser line and looking at it with some kind of imager) are the first thing I imagine when hearing "LIDAR". For a long time, most industrial lidar sensors used that principle (the $25k+ ones ten years ago).

This approach based on trigonometry is called "structured light".

About 10 years ago, I worked on a project that used triangulation to measure distances with a laser. The specular reflection falls on a pair of photodiodes. The relative amount of light on each diode is indicative of the distance to the reflection point.

A good visual aid that helped me is to imagine a measuring stick laid horizontally across your field of vision, but at a 45-degree angle, such that the left side is closer to you and the right side is further away.

If you abstract your vision to a 2D projection of a 3D volume, you can easily see that as your eyes follow the measuring stick to the left, the distances get smaller, because they actually are closer to you. Conversely, as your eyes follow the measuring stick to the right, the measurements get larger, because they actually are further away.

Your brain is just really good at trig.

What kind of precision can you get with a device like this? The image on the GitHub page looks very imprecise, like on the order of centimeters of error. If one were to scan a 3D model, could one get sub-millimeter resolution, for example?

According to the hackaday page [0], the precision is indeed in the order of centimeters: 5 cm at 2 m and 10 cm at 3 m.

[0] https://hackaday.io/project/20628-open-simple-lidar

The Neato sensor does a little better, about 2cm at 5m.

I did the exact same thing back in 2015 using the OmniVision sensor from Wiimotes, which has the benefit that the sensor readily reports the position of detected infrared spots, so no CPU-heavy postprocessing is needed. The 20MHz clock for the sensor was taken from CLKOUT on the controlling AVR CPU, so there were only three parts on the rotating section (laser, CPU, sensor).

I would recommend adding a slit and a bandpass filter to reduce the amount of non-laser light, because I had big problems with false positives from stray light (sunlight etc.).

This is super awesome! I've been taking Udacity's SDC course and I wanted to play with some LIDAR mapping on my own. I've been debating for the longest time whether to get the Neato XV-21 LIDAR, but it doesn't really make sense to get obsolete hardware that people salvage from old vacuums. I wish the author would sell the PCB in a crowd-funded run on Crowd Supply or Tindie.

I was going to buy the RPLidar when they did a fire sale on the first generation. But I read there was some refresh rate error so I didn't pull the trigger. Do you know if the Sweep (by Scanse) is comparable to the RPLidar (Dev version)?

It is not, AFAIK. The Sweep has greater range in theory, but its data is much noisier.

Having only played with simple breadboard stuff years ago, could someone explain how the wiring works when connecting to something that's spinning?

edit: found it in case anyone was curious: https://en.wikipedia.org/wiki/Slip_ring

They use a slip ring. In some ways it's kind of ironic, because slip rings work in the same fashion as the brushes in brushed motors, which brushless DC motors were designed to eliminate. To achieve continuous rotation in a brushed motor, carbon brushes close different circuits as the rotor spins. This keeps the phase of the electricity constantly ahead of the phase of the motor, producing torque. In a brushless DC motor, the same effect is achieved through a feedback loop, obviating the need for the brushes, which wear over time and leave carbon deposits.


another option, if your power and signals aren't too low frequency: https://en.wikipedia.org/wiki/Rotary_transformer

Slip rings are noisy and wear out. A lot of scanning LIDAR systems use a rotating mirror with stationary electronics to avoid them. You could also use stationary magnets with coils on the rotating part to transmit power (basically a mini-alternator on the spinning side), and IrDA or something to communicate.

Exactly what the Neato 2nd gen does. Coils for power and IrDA for data.

It's basically how an electric train does it.

What happens when someone looks into the laser? Is the damage cumulative?

The thing you're looking for is the class of the laser. Class 1 is completely harmless, class 2 is harmless unless you intentionally stare at it, class 3 is immediately harmful and class 4 is the kind of laser that cuts bone and metal.

I'm not sure what class this is, but it should be mentioned somewhere.

The BOM[0] specifies that this is a "3mw 780nm Infrared IR Diode Laser" with a link to eBay[1].

I'm finding 780nm 2.5mw and 3mw laser diodes rated at classes 3B and 3R. See [2] and [3]. The 3B class rating is given for a high end Edmund Optics laser which most probably gives out the "full" 3mW. A 3B class laser is "hazardous for eye exposure" [4].

So... hard to say, but not great?

I fondly remember a sticker that was on a lab's (very scary) laser: "Do not look into laser with remaining eye"...

[0] https://github.com/iliasam/OpenSimpleLidar/blob/master/Total... (XLS file)

[1] https://www.ebay.com/itm/322300023463

[2] https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=14...

[3] https://www.edmundoptics.com/lasers/laser-sources/780nm-3.0m...

[4] http://www.lasersafetyfacts.com/3B/

I've been told retinal damage from exposure to IR usually is cumulative.

As somebody with software engineering experience but little hardware experience, how do I get started with building my own Lidar? I notice he provides STL files for base plates and a component list https://hackaday.io/project/20628/components. Still not too sure how to start though.

There's a bit of work, especially on the PCB side (there are 3 separate ones). You would have to order PCBs based on the Gerber files, and all of the components, and have a way of soldering them (they are "surface mount", which adds to the complexity).

Overall, it would be much easier if the "final" PCBs were sold as a kit. Assembling this kit would be much easier (and fun) than starting from scratch.

> Assembling this kit would be much easier (and fun) than starting from scratch.

What is so fun about assembling a kit? Populating the PCB is probably the most mind-numbing part of the whole hardware engineering business. Design and debugging is where it's at.

It's kinda surprising to find comments in Russian in code :) https://github.com/iliasam/OpenSimpleLidar/blob/master/Firmw...

It just translates the code above it into Russian (or perhaps the one above translates it into English?).

Shipped with a ROS driver?? I love you. I've been stuck on v1 Xbox 360 Kinects.

Ima build this immediately.

The new Intel Realsense D435 is giving us fantastic data, and I've got two of them plugged into ROS.

This is great to hear. The new D400 devices look quite promising on paper but I have seen virtually no 3rd party reports on how well they perform in the real world.

Are you using the D435 outdoors as well?

We've only done some quick tests outdoors for now, as our device will only operate indoors. But it works well, even when pointed into the sun. Email in profile, send me a message if you want some sample video.

Dig a little deeper on YouTube: https://www.youtube.com/user/iliasam3/videos


A cool project would be an autonomous LIDAR-enabled vacuum cleaner, that would actually map out the room it's cleaning to optimize its path.

Neato and Xiaomi vacuum cleaners have integrated Lidars.

Can Lidar provide the topography of a sidewalk?

Yes, but this system won't. It's an infrared triangulation system, not a LIDAR and at 780nm it probably won't work outdoors.

My flatmate has a robot and he has a laser that does 3d scanning of a room, pretty much like this but 3d.

What is the maximum range of this?

Well, do you want an eye-safe laser? And want to operate outdoors?

The Neato is about 5m indoors, 2m outdoors with a sun shade. That reflects a lot of effort fighting the "sun blindness" problem that having a powerful IR source shining in through your picture window causes for the vacuum cleaner.

Practically, you want to modulate the laser so that you can filter for an AC signal.
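A minimal lock-in-style sketch of that idea (all numbers are toy values): multiply the received signal by the modulation reference and average, and the synchronous laser echo survives while the constant sunlight term cancels:

```python
f_mod, fs, n = 10_000, 1_000_000, 100_000   # 10 kHz modulation, 1 MS/s, 0.1 s
period = fs // f_mod                        # 100 samples per modulation cycle
sun_dc   = 5.0      # constant ambient (sunlight) level, arbitrary units
echo_amp = 0.01     # weak modulated laser return, buried under the DC

detected = 0.0
for i in range(n):
    laser_on = (i % period) < period // 2               # square-wave laser drive
    sample = sun_dc + (echo_amp if laser_on else 0.0)   # what the photodiode sees
    ref = 1.0 if laser_on else -1.0                     # in-phase reference
    detected += sample * ref                            # lock-in multiply
detected /= n
print(detected)   # 0.005 = echo_amp / 2; the 5.0 sunlight term averages out
```

The echo here is 500x weaker than the ambient light, yet it comes out cleanly because only the component synchronous with the modulation survives the average.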

> "sun blindness" problem that having a powerful IR source.

Just to point out, it's not as bad as one would think, since sunlight has much less IR energy than visible-light energy. The deeper into IR you can go, and the narrower your filters on the receiving side, the better.

The other issue I had, dealing with an IR comm link 30 years ago, is that it sucks compared to RF because your sensor aperture is very small compared to an antenna. And as you mentioned, the noise floor is much higher with IR than RF.

Oh man this is cool! Now we just need OpenSimpleLaserWindow and we can listen to conversations and track people's movements just like the CIA!

I forget where I read about this, but essentially, as you speak, your sound waves vibrate nearby windows, and these vibrations can be picked up with a laser and translated back into sound.

There's a few ways you can do this. The cheapest is to just get a laser and a photo diode. It's rough, but you can get sounds. An interferometer works better. It's harder to make and more expensive, but in the hobby range, at least to get decent quality. There's tons of videos and documents that you can find showing you how to make either. You can also do a lot of cool measurements with these setups

I had never heard of an interferometer, it actually sounds like something out of a movie haha. Thanks for sharing!

The design is extremely simple. The hard part is aligning things - that's actually the hard part of most optics. Even if you're really OCD about it, you need to be even more meticulous.

Here's a little animation of one: https://www.youtube.com/watch?v=UA1qG7Fjc2A

I really want to know the math behind the triangulation process

You know the angle of light entering the lens. The tangent of that angle gives you the ratio of (the distance between the lens and the laser) and (distance between the laser and the target). Which lets you work out the distance between the laser and target.

Trigonometry rocks!
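A sketch of that geometry with invented numbers (the 5 cm baseline and 8 mm focal length are assumptions, not the project's actual values; 63.5 µm is the TSL1401's nominal 400 DPI pixel pitch):

```python
import math

baseline_m = 0.05      # assumed spacing between laser and lens, metres
focal_mm   = 8.0       # assumed lens focal length
pixel_mm   = 0.0635    # TSL1401 nominal pixel pitch (63.5 um)

def distance_from_pixel(pixel_offset):
    """Spot offset on the linear sensor -> ray angle -> target distance."""
    angle = math.atan2(pixel_offset * pixel_mm, focal_mm)  # angle of incoming ray
    return baseline_m / math.tan(angle)                    # d = b / tan(theta)

for px in (2, 8, 32):
    print(px, round(distance_from_pixel(px), 3))  # distance shrinks as offset grows
```

Since tan(atan2(x, f)) = x/f, this reduces to d = baseline · focal / (offset · pitch): distance is inversely proportional to pixel offset, which is why a one-pixel error costs centimetres at long range but almost nothing up close.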

Nice. High-quality LIDARs are still quite expensive for any robot.

This is not LIDAR.

It isn't time of flight, but I'd say that it is lidar.

No. I'm sure you could argue that technically, going by the literal acronym, it is "LIDAR", but I really don't think that flies. When you use the term LIDAR, you imply a certain amount of robustness and error rejection that you only get using ToF.

Sticking the same parallax technology people have been using since the 80s on a board and calling it "LIDAR" isn't really honest.

With the latest Tesla crash in the news, seeing the point cloud got me thinking.

Since mirrors produce specular reflections rather than the diffuse reflections LIDAR relies on, does this mean a box truck covered in mirrors would be able to render at least the LIDAR portion of an autonomous vehicle useless?

Sounds like an attack vector to me.

It would only apply to a mirror that's offset to point your LiDAR to the sky or at some odd angle. If it's a flat mirror, it will just see itself approaching from 2x the distance / speed between the car and mirror.

Do I interpret your comment correctly as saying there is some redundant system that is specifically looking for anomalies in sensor readings? Would a top of the line lidar system be able to "understand" that its current inputs were resulting in erroneous outputs?

I am reminded of one of the recent Japanese satellites that was effectively an infant mortality, because of some quirk in its redundant systems. I've forgotten the details, but more or less, there were cascading failures across a primary system, its secondary redundancy, and its tertiary redundancy. So the feedback loops designed by the engineers to be negative, and mitigating, ended up being positive, and therefore aggravating, in the cruel vacuum of space. It came down to some error in an orientation sensor, and somehow, the redundant system actually ended up relying on information from the primary system. The boosters designed to slow down the rotation relied on the notion of the satellite's actual rotation, according to the original sensor.

As the story goes, "Does the machine ever give the right answer given the wrong inputs, Mr. Babbage?" Nearly 200 years later, we may finally be coming around to seeing that the woman who asked the question had a point, and Charles simply dismissed it out of hand.

I'm only suggesting that mirrors work both ways.

If you stand 1m away from a mirror, you will see "yourself" 2m away as a reflection. (1m you:mirror + 1m mirror:reflection) If you move 1m backward, you will see the reflection now 4m away (2m you:mirror + 2m mirror:reflection). Speed works the same way, if you step toward the mirror at 1m/s your reflection will approach you at 2m/s.

The same thing applies to LiDAR, either it sees the mirror and calculates the proper distance, or it sees the things reflected in the mirror at 2x actual distance and/or 2x relative speed.

There's also a company called Scanse that has a very affordable LIDAR unit, and a ROS driver. Not open source hardware, but at $350, a pretty good deal. They use the Garmin Lidar Lite v3, which has an I2C interface, and all measurements are actually computed in the sensor. The sensor itself is around $120. They also use a shaftless gimbal motor, with a slip ring, which allows for continuous rotation.


They were also generous enough to leave an exploded CAD model of their design on their kickstarter page, about halfway down, just in case you wanted to see whats going on inside, or perhaps make your own, as its mostly commercial off the shelf parts, with a custom PCB to control the motor.

