Here is another Lidar project of mine:
That project has a wiki: https://github.com/iliasam/OpenLIDAR/wiki
A big article in Russian about it: https://geektimes.ru/post/275442/
It's dramatically different from the $20k+ LIDAR units you'd find on a self-driving car, which generally use time-of-flight.
> During the ranging operation, several VCSEL infrared pulses are emitted, then reflected back by the target object, and detected by the receiving array.
VCSEL stands for vertical-cavity surface-emitting laser. Therefore I would say it's a pulsed laser.
Sounds like no?
* Some ST patents say they are doing time of flight but have actually come up with a slight variation on reflected-signal phase-shift measurement (see reply by Animats).
* There are internal API functions like this  in the APIs they provide for sensors of this family. Of course, this might be something else (e.g. the phase shift of internal PLLs/whatever).
* The shortest time in which this sensor performs a measurement is on the order of a few tens of milliseconds, and the high-accuracy modes take up to 100 ms. True time-of-flight systems (e.g. ) should have the answer ready on the order of nanoseconds (for the distances this sensor works at), and I haven't yet seen designs with post-processing significant enough to explain this latency.
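For illustration, here is roughly how phase-shift ranging recovers distance without nanosecond timing: modulate the light, measure the phase lag of the return, and integrate over many cycles (which could explain millisecond-scale measurement times). The 10 MHz modulation frequency below is an assumption for the sketch, not anything from ST's documentation.

```python
import math

C = 299_792_458  # speed of light, m/s
F_MOD = 10e6     # assumed modulation frequency, Hz

def distance_from_phase(phase_rad):
    """One-way distance from the measured phase lag of the return.

    Round-trip delay = phase / (2*pi*F_MOD); the one-way distance is
    half of C times that, hence the 4*pi in the denominator.
    """
    return C * phase_rad / (4 * math.pi * F_MOD)

# A target 2 m away delays the return by 2 * 2 / C seconds,
# which at F_MOD shows up as this phase lag:
phase = 2 * math.pi * F_MOD * (2 * 2 / C)
print(distance_from_phase(phase))  # recovers 2.0 m
```

Note the ambiguity: phases wrap every 2*pi, so a single modulation frequency only resolves distances unambiguously within C / (2 * F_MOD), about 15 m here, which happens to match the short-range character of these sensors.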
Don't get me wrong, I've worked with the VL6180X and VL53L0X (and look forward to working with the VL53L1X), and these sensors are the best in this size class. I'm just suspicious that they are not directly measuring the time for the signal to bounce back, but are instead inferring it from some other measurement.
Actually, this made me think that I probably have photodiodes with wide enough bandwidth lying around, so I could check the transmitted signal on an oscilloscope.
Any laser scanner you'd put on a vehicle is going to be ToF, as far as I know. While this is useful for indoor sensing, you'd completely wash out the laser illumination once you have to compete with sunlight.
This would be better described as a structured light scanner. The principle is more similar to what something like a Kinect uses than a Velodyne/SICK/Hokuyo etc.
Plus expect some scale changes in the output data when the thing heats up or cools down.
It’s nice to see a simple hardware hack for playing with the technology. Maybe this will encourage some intrepid engineers to build the next wave of hobbyist point cloud collection devices.
There are lots of fun applications, from home surveying to 3D model generation.
Now depth cameras were a different story. The SwissRanger was $10-15k, but was usurped by the sub-$200 Kinect just a few years later.
But yes, on the whole I concur.
That's why loads of the DARPA Grand Challenge vehicles were plastered in SICK LIDARs back when Velodyne only had one prototype, and it was on a truck they were putting in the competition themselves.
Of course, Velodyne's product may be 10x the cost, but it has 64x the number of lasers, plus good range and direct-sunlight performance, so it's understandable that it took off.
Wikipedia lists "780–1400 nm (near-IR) - Pathological effect: cataract, retinal burn"
(If you didn't ask the same question then demote yourself down to "Computer Nerd")
Of course, then the question is whether that also comes with an equal reduction in sensitivity, or whether a larger sensor can compensate for that.
The wavelength range where optical radiation is visible does not have sharp borders. Here, the wavelength band of 380 nm to 780 nm is used.
You're saying that car drivers don't check their mirrors correctly when maneuvering, which to me sounds dangerous.
- TSL1401 line image sensor
I was always under the impression that LIDAR includes a time-of-flight measurement, which does not appear to be the case here - the TSL1401 sensor has integration times and pixel transfer times in the range of dozens of microseconds, a timespan in which light travels dozens of kilometers.
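To put numbers on that claim (30 µs is just an illustrative value in the "dozens of microseconds" range, not a datasheet figure):

```python
# Sanity check on the scales involved: distance light covers during one
# sensor integration window vs. the round trip a true ToF sensor would time.
C = 299_792_458            # speed of light, m/s

integration_time = 30e-6   # s, illustrative TSL1401-class integration time
print(C * integration_time)  # ~9000 m: light covers kilometers per window

# Round trip for a 5 m target, in nanoseconds:
print(2 * 5 / C * 1e9)       # ~33 ns, far below what this sensor can resolve
```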
The wikipedia article on LIDAR says
"Lidar [...] combine[s] laser-focused imaging with the ability to calculate distances by measuring the time for a signal to return using appropriate sensors and data acquisition electronics."
So this is not LIDAR. Still impressive though, I love the simplicity!
> LIDAR, which stands for Light Detection and Ranging, is a remote sensing method that uses light in the form of a pulsed laser to measure ranges (variable distances) to the Earth.
Most lidar does use timing but I'd argue that's a type of implementation instead of a necessary part of the broader category of lidar devices.
I believe this design still qualifies as LIDAR, just with significantly worse performance than a typical system.
A good visual aid that helped me is to imagine a measuring stick laid horizontally across your field of vision, but at a 45 degree angle, such that the left side is closer to you, and the right side is further away.
If you abstract your vision to a 2D projection of a 3D volume, you can easily see that as your eyes follow the measuring stick to the left, the distances get smaller, because they actually are closer to you. Conversely, as your eyes follow the measuring stick to the right, the measurements get larger, because they actually are further away.
Your brain is just really good at trig.
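That same trig can be written down for a laser triangulation rangefinder like this one. A minimal sketch: the optics numbers (baseline, focal length) are made up for illustration, and only the 63.5 µm pitch comes from the TSL1401's nominal 400 DPI spacing.

```python
BASELINE = 0.06        # m, laser-to-lens separation (assumed)
FOCAL_LENGTH = 0.016   # m, lens focal length (assumed)
PIXEL_PITCH = 63.5e-6  # m, nominal TSL1401 pixel spacing (400 DPI)

def distance_from_pixel(pixel_offset):
    """Target distance from the laser spot's offset (in pixels) on the sensor.

    Similar triangles: the spot's displacement x from the optical axis
    relates to target distance d by d = BASELINE * FOCAL_LENGTH / x.
    """
    x = pixel_offset * PIXEL_PITCH
    return BASELINE * FOCAL_LENGTH / x

# A spot near the axis means a far target; a large offset means a close one:
print(distance_from_pixel(4))   # ~3.78 m
print(distance_from_pixel(40))  # exactly 10x closer, ~0.38 m
```

The inverse relationship is also why triangulation accuracy degrades with range: at long distances a whole meter of depth moves the spot by a fraction of a pixel.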
I would recommend adding a slit and a narrow band-pass filter to reduce the amount of non-laser light, because I had big problems with false positives from stray light (sunlight etc.).
edit: found it in case anyone was curious: https://en.wikipedia.org/wiki/Slip_ring
I'm not sure what class this is, but it should be mentioned somewhere.
I'm finding 780 nm 2.5 mW and 3 mW laser diodes rated at classes 3B and 3R. See  and . The 3B rating is given for a high-end Edmund Optics laser which most probably puts out the "full" 3 mW. A class 3B laser is "hazardous for eye exposure".
So... hard to say, but not great?
I fondly remember a sticker that was on a lab's (very scary) laser: "Do not look into laser with remaining eye"...
 https://github.com/iliasam/OpenSimpleLidar/blob/master/Total... (XLS file)
Overall, it would be much easier if the "final" PCBs were sold as a kit. Assembling a kit would be much easier (and more fun) than starting from scratch.
What is so fun about assembling a kit? Populating the PCB is probably the most mind-numbing part of the whole hardware-engineering business. Design and debugging are where it's at.
Ima build this immediately.
Are you using the D435 outdoors as well?
The Neato is about 5m indoors, 2m outdoors with a sun shade. That reflects a lot of effort fighting the "sun blindness" problem that having a powerful IR source shining in through your picture window causes for the vacuum cleaner.
Practically, you want to modulate the laser so that you can filter for an AC signal.
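A sketch of why that works: sunlight shows up as a large, essentially DC offset, while the modulated return is AC. Correlating the received signal with the modulation reference and averaging (a software lock-in) rejects the DC part. All numbers below are illustrative.

```python
import numpy as np

fs = 100_000      # ADC sample rate, Hz
f_mod = 1_000     # laser modulation frequency, Hz
# 0.1 s of sample times, centered mid-sample to avoid edge effects:
t = (np.arange(10_000) + 0.5) / fs

# Square-wave drive: +1 when the laser is on, -1 when off.
reference = np.where((t * f_mod) % 1.0 < 0.5, 1.0, -1.0)
laser_return = 0.02 * reference   # weak reflected signal
sunlight = 1.5                    # large DC background
received = laser_return + sunlight

naive = np.mean(received)                 # dominated by sunlight, ~1.5
lock_in = np.mean(received * reference)   # recovers the 0.02 amplitude
print(naive, lock_in)
```

The multiply-and-average step is the whole trick: the DC term multiplied by a zero-mean reference averages to zero, while the in-phase return multiplied by the reference becomes a constant that survives the average.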
Just pointing out that it's not as bad as one would think, since sunlight has much less IR than visible-light energy. The deeper into the IR you can go, and the narrower your filters on the receiving side, the better.
Another issue I had, dealing with an IR comm link 30 years ago: it sucks compared to RF because your sensor aperture is very small compared to an antenna. And as you mentioned, the noise floor is much higher with IR than RF.
I forget where I read about this, but essentially, as you speak, your sound waves vibrate nearby windows, and these vibrations can be picked up with a laser and translated back into sound.
Here's a little animation of one: https://www.youtube.com/watch?v=UA1qG7Fjc2A
Sticking the same parallax technology people have been using since the 80s on a board and calling it "LIDAR" isn't really honest.
Since mirrors produce specular reflections rather than the diffuse reflections LIDAR relies on, does this mean a box truck covered in mirrors would be able to render at least the LIDAR portion of an autonomous vehicle useless?
Sounds like an attack vector to me.
I am reminded of one of the recent Japanese satellites that was effectively an infant mortality because of a quirk in its redundant systems. I've forgotten the details, but more or less, there were cascading failures across a primary system, its secondary redundancy, and its tertiary redundancy. So the feedback loops designed by the engineers to be negative, and mitigating, ended up being positive, and therefore aggravating, in the cruel vacuum of space. It came down to some error in an orientation sensor, and somehow the redundant system actually ended up relying on information from the primary system. The thrusters designed to slow down the rotation relied on the faulty sensor's notion of the satellite's actual rotation.
As the story goes, "Does the machine ever give the right answer given the wrong inputs, Mr Babbage?" Nearly 200 years later, we may finally be coming around to seeing that the woman who asked the question had a point, and Charles simply dismissed it out of hand.
If you stand 1m away from a mirror, you will see "yourself" 2m away as a reflection. (1m you:mirror + 1m mirror:reflection) If you move 1m backward, you will see the reflection now 4m away (2m you:mirror + 2m mirror:reflection). Speed works the same way, if you step toward the mirror at 1m/s your reflection will approach you at 2m/s.
The same thing applies to LiDAR, either it sees the mirror and calculates the proper distance, or it sees the things reflected in the mirror at 2x actual distance and/or 2x relative speed.
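The doubling is just path length; a trivial sketch of the geometry described above:

```python
def apparent_distance(lidar_to_mirror, mirror_to_object):
    """Range a LiDAR would report for an object seen *in* a mirror:
    the full one-way optical path, sensor -> mirror -> object."""
    return lidar_to_mirror + mirror_to_object

# Standing 1 m from the mirror, your own reflection (mirror_to_object
# equals your own distance) appears 2 m away; step back to 2 m and it
# appears 4 m away.
print(apparent_distance(1, 1))  # 2
print(apparent_distance(2, 2))  # 4

# Speeds double the same way: closing on the mirror at 1 m/s, the image
# closes at 2 m/s, since both legs of the path shrink at once.
```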
They were also generous enough to leave an exploded CAD model of their design on their Kickstarter page, about halfway down, in case you wanted to see what's going on inside, or perhaps make your own, as it's mostly commercial off-the-shelf parts with a custom PCB to control the motor.