They try to spin the scanning pattern as an advantage, since you get denser sampling if you point the device in one direction, but I'm not convinced. That is only an advantage if your device sits on a tripod. On a moving car, more predictable, structured patterns such as the Ouster OS-1's make it much easier not only for deep learning, but also for SLAM (e.g. the LOAM algorithm extracts feature points row by row, i.e. by ring). For a moving car or drone, any scanning pattern will sweep into a dense 3D point cloud anyway.
The pricing seems to be competitive if you only need a small field of view.
For 360 degrees, it takes four Mid-100s, which would cost $6,000, or twelve Mid-40s, which would cost $7,200, to obtain the same field-of-view coverage and the same number of points (1.2 M points per second) as a single Ouster OS-1, which is available to university researchers for $8,000. However, buying a single Ouster OS-1 saves you the headache of extrinsic calibration between many lidar units, and the Ouster OS-1 only draws 14 W whereas four Mid-100s would draw a whopping 480 W. Four Mid-100s also weigh twenty times as much as one Ouster OS-1. For high-density drone mapping, the Ouster OS-1 seems like a much better choice.
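Spelling out the arithmetic (per-unit prices here are back-computed from the totals above, so treat them as assumptions rather than vendor list prices):

```python
# The 360-degree coverage options above, at ~1.2 M points/s total.
options = [
    # (name, units needed, inferred price per unit ($), quoted power (W) or None)
    ("Livox Mid-100", 4, 1500, 480),
    ("Livox Mid-40", 12, 600, None),
    ("Ouster OS-1", 1, 8000, 14),
]
totals = {name: units * price for name, units, price, _ in options}
# -> {'Livox Mid-100': 6000, 'Livox Mid-40': 7200, 'Ouster OS-1': 8000}
```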
That said, the Livox does have a range advantage over the Ouster OS-1.
(PDF) https://www.thorlabs.com/images/tabimages/Risley_Prism_Scann...

Zhang, J., & Singh, S. (2014, July). LOAM: Lidar Odometry and Mapping in Real-time. In Robotics: Science and Systems (Vol. 2, p. 9).
If they had a working, quality product, you'd expect them to show a clear image. Warning!
(aka don't steal our stuff and don't publish benchmarks) "(j) disclose to the public the results of any internal performance testing or benchmarking studies of or about the Products without first sending the results and related study(ies) to LIVOX, and obtaining LIVOX's written approval;"
There is a picture of it at checkout though!
Someone, I forget who exactly, tried to turn this kind of restriction around. Their license said something like: you could not publish benchmarks of their product against your products that carry benchmark restrictions, unless (1) you published full specifications and configuration information so others could reproduce the results, and (2) you gave everyone permission to do so.
Hokuyo 2D systems start from about $1k and are well regarded by researchers. SICK systems were historically used on research autonomous vehicles (e.g. the DARPA Grand Challenges), but they're far too heavy for drone use.
Edit - I thought it was 1D initially; apparently the scan pattern is circular, think spirograph (or a Lissajous figure). You'd normally do this with galvos, but this seems to be solid state.
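A quick sketch of why two rotating prisms trace a spirograph-like rosette, using a first-order model with made-up deflection angles and rotation rates (not Livox's actual parameters):

```python
import numpy as np

# First-order model of a two-prism Risley scanner: each prism deflects
# the beam by a fixed angle, and the two deflections rotate at different
# rates, so the beam direction is the sum of two rotating vectors.
d1, d2 = 1.0, 1.0                          # per-prism deflection (degrees)
w1, w2 = 2 * np.pi * 50, -2 * np.pi * 36   # prism rotation rates (rad/s)
t = np.linspace(0.0, 1.0, 100_000)         # one second of samples

x = d1 * np.cos(w1 * t) + d2 * np.cos(w2 * t)
y = d1 * np.sin(w1 * t) + d2 * np.sin(w2 * t)
# (x, y) stays inside a disc of radius d1 + d2 and repeatedly passes
# near the centre, so dwell time (and point density) is non-uniform.
```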
Of course, they could be lying... er... overly optimistic, hoping to get them ready after signing a contract, à la Bill Gates with DOS.
Some of the footage either shows real sensors or is extremely convincing CG :)
Unfortunately there are no standards (yet) for benchmarking different LIDAR solutions, but most of their competitors stick to more realistic assumptions.
Then, in general, the website is light on any tangible information:
- Company information: management, history, size, investments, strategic partners, location of the company
- Concrete working principle of their LIDAR: Detector principle, wavelengths, laser type used, means of scanning, optical aperture
- Images of actual hardware
- Does their price-tag still apply after they have managed to implement proper functional safety processes and measures?
Very weak, especially considering how much competition is out there. There seems to be a new LIDAR startup every other week.
Just pricing out some '98-'02 Hondas and Acuras as daily mileage gobblers as well; they're insanely cheap and pretty reliable. These are 20-year-old cars that are widely available and very, very cheap ($1,700 in Seattle), and I have no complaints about driving one.
30 years may be optimistic, except that kids aren't learning to drive as much, which provides the downward pressure.
Super cool that this stuff is getting into pricing ranges reachable by mere mortals.
edit: for just one: https://store.dji.com/de/product/livox-mid?vid=48991
You can do this, but unless it's super cheap, why? Price/performance isn't that good.
LiDAR for autonomous vehicles is cool and all, but how will this perform when it's not just the occasional car making test runs through town, but every vehicle on a busy road or intersection is equipped with one or more units?
I understand that both radar and ultrasonic suffer from interference, so will this be different?
Are there plans for doing any kind of (cooperative?) frequency or time division to avoid interference between cars?
How DoS-able would LiDAR be? Would I be able to bring traffic in downtown LA to a halt in 2040 with my portable LiDAR jammer?
The reason lidars are so useful is that they're very directional in a way that radar isn't. That means an effective lidar jammer can't just emit enough to interfere when the lidar is pointed at it; it needs to make every other object present emit that strongly as well. This is possible, but it will cause nearby people to catch fire, at which point it probably doesn't make sense to call the object a "lidar jammer".
You can already "jam" human eyes pretty easily with an "eye jammer", a.k.a. a 5 mW green laser pointer.
This is called "jamming" and was a real problem for early radar solutions.
Curiously, you will rarely see anyone mention this in the LIDAR space. For many startups, addressing this issue seems to be an afterthought.
One proper way to solve this is to use orthogonal codes in the scanning scheme. There are different ways to implement this, and the complexity depends a lot on the LIDAR architecture used. It would not be surprising to see some of the startups weeded out by this during real-world tests.
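A toy sketch of the idea, using a pseudo-random binary code and FFT-based correlation (illustrative only, not any vendor's actual scheme):

```python
import numpy as np

# Each unit fires a pseudo-random pulse code and correlates the received
# signal against its own code; another unit's (uncorrelated) code shows
# up only as low-level noise in the correlation output.
rng = np.random.default_rng(0)
N = 1024
my_code = rng.choice([-1.0, 1.0], N)      # this unit's transmit code
other_code = rng.choice([-1.0, 1.0], N)   # an interfering unit's code

# Received: my own echo delayed by 100 samples, plus interference + noise.
received = np.roll(my_code, 100) + other_code + 0.5 * rng.standard_normal(N)

# Circular cross-correlation via FFT; the peak lag is the round-trip delay.
corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(my_code))).real
delay = int(np.argmax(corr))              # recovers the delay despite the jammer
```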
Hang it under a drone and bring down air traffic at the same time!
> Compared to other sensors, lidars are not that prone to interference because:
> * it only takes 1 microsecond to make a ranging measurement up to 150 m, so your detector is on for a short time
> * lasers only illuminate a small spot, and the detector is also looking at a similarly small spot, so it is unlikely for two lidars to point in the same spot
> Now, even if it does interfere, you may see a stream of random points pointed towards the interference source. This may happen if, say, you point a lidar directly at the sun, or if you have multiple lidars mounted on the same vehicle. Such random points are easily rejected as outliers and do not affect the vast majority of the scene. Most self driving cars (I hope) should have outlier rejection schemes that deal with outliers caused by this and other sources, such as snow, smoke, and so on.
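The outlier rejection described in the quote can be sketched as a k-nearest-neighbour distance filter (a common point-cloud technique; the scene and thresholds here are synthetic):

```python
import numpy as np

# Points whose mean distance to their k nearest neighbours is anomalously
# large (stray returns from interference, snow, smoke, ...) get discarded.
rng = np.random.default_rng(1)
scene = rng.uniform(0, 10, size=(2000, 3))     # dense genuine returns
junk = rng.uniform(50, 500, size=(20, 3))      # sparse spurious returns
cloud = np.vstack([scene, junk])

k = 8
dists = np.linalg.norm(cloud[:, None, :] - cloud[None, :, :], axis=-1)
knn_mean = np.sort(dists, axis=1)[:, 1:k + 1].mean(axis=1)  # skip self (0)

threshold = knn_mean.mean() + 2.0 * knn_mean.std()
keep = knn_mean < threshold                    # True for inliers
```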
Now, as for bringing traffic to a halt by bringing your portable lidar jammer... https://www.xkcd.com/1958/
Even that will not be necessary, a splat of dirt on the sensor is enough.
I don't understand their obsession with LADARs. MM-wave radars have successfully handled the same tasks in the industry for a few decades while being much cheaper and more reliable.
NIH too strong. I feel calling that out here is 100% appropriate and objective.
They are really great at figuring out whether cars are braking, though: since cars are moving with respect to the ground, you can just filter the ground out, and with radar you can sense the change in speed directly rather than having to infer speed by comparing distance over time.
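For a sense of scale, assuming a 77 GHz carrier (a common automotive radar band; the exact figure is an assumption):

```python
# Radar reads relative speed directly from the Doppler shift of the return.
C = 299_792_458.0        # speed of light (m/s)
F_CARRIER = 77e9         # assumed automotive radar carrier frequency (Hz)

def doppler_shift_hz(relative_speed_mps: float) -> float:
    """Two-way Doppler shift for a target closing at the given speed."""
    return 2.0 * relative_speed_mps * F_CARRIER / C

# A lead car braking hard enough to change the relative speed by 10 m/s
# moves the return by roughly 5 kHz, measured in a single shot, with no
# need to differentiate successive range measurements.
```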
EDIT: Actually, I think automotive radars don't give any directional information at all, since they don't use big dishes. I'm used to working with aerospace radars, but even with those I wouldn't rely on one for car navigation.
It does, and quite well. Without any additional processing you easily get a few cm of accuracy, and sub-wavelength resolution is possible if you do more processing. That's more than enough.
The latest automotive radars use electronic beam steering, not much different from what is used in the latest missiles.
I'd say, companies opting for ladars for driving assists don't have good engineering expertise.
Apparently also some Nissan cars, but only as an emergency braking sensor.
Does such an agreement have legal force (in the US/EU)? I get patent law and not violating IP, but I find it strange that mere curiosity about how something works can get lawyers attacking you.
Earlier this year, the Tenth Circuit court upheld a preliminary injunction granted in favor of an electronics equipment manufacturer against a reseller of its goods in a trademark infringement action. In Beltronics v. Midwest Inventory Distribution, the reseller (Midwest) argued that it was able to resell the manufacturer’s goods based on the first sale doctrine. The court, however, disagreed with this assessment and ruled that the resellers violated the manufacturer’s trademark rights because Midwest’s sales caused consumer confusion.
Isn't our perception of "reflectivity" essentially logarithmic? What we perceive as the "middle" between black and white only reflects 18% of light. Meaning anything that's darker than gray will be exponentially less reflective than that.
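Roughly, yes. The CIE L* lightness function is the standard model of this nonlinearity (a cube-root law, close to logarithmic over the working range), and it puts 18% reflectance near the perceptual midpoint:

```python
def cie_lightness(reflectance: float) -> float:
    """CIE L* (0 = black, 100 = white) from linear reflectance in [0, 1]."""
    if reflectance > 0.008856:            # (6/29)**3, the CIE cutoff
        return 116.0 * reflectance ** (1.0 / 3.0) - 16.0
    return 903.3 * reflectance            # linear segment near black

# cie_lightness(0.18) ~= 49.5: an 18% target sits at perceptual mid-gray,
# while a 4% target (dark tire rubber) is already down near L* ~= 24.
```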
The metric really just reflects (:-D) the signal-to-noise ratio and dynamic range of their sensor. Max range, min/max reflectivity, SNR, accuracy, and everything else are all intertwined, so it's very difficult to compare things on equal footing unless you know exactly how it was measured. LIDAR OEMs seem to have settled on a 10%/80% rule of thumb.
As with software performance, it's all about reasonable benchmarking. If you're in some lidar application, for example building a self-driving car, and you currently use Velodyne HDL-64 sensors that can register returns from 10% targets at 80 m, then Livox's specifications give you a clue as to how their unit might compare in a similar circumstance. That's all. Past that, you have to rig up a test with the unit yourself and profile; it's the only way. I'd also add that many objects would appear different in brightness if you looked at them under a pure wavelength like 905 nm, rather than the white light your eye sees.
All that said, your concerns aren't misplaced. One of the leaders in the 'new wave' of lidar OEMs is Luminar, and one of their original value propositions was that they went to a different wavelength (1550 nm), which has a higher eye-safe power limit. This means they could pump out higher-energy pulses and thus get more photons back from low-reflectivity targets such as tires and dark cars. The jury is still out on what works best across the real-world range of reflectivities, largely because there are a lot more variables at play than simple thresholding would imply.
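For intuition, under the simplest link-budget model (return power proportional to reflectivity over range squared, for a diffuse target filling the beam; real sensors deviate from this), a "10% target at 80 m" spec rescales to other reflectivities as:

```python
import math

def equal_signal_range(spec_range_m: float, spec_rho: float,
                       target_rho: float) -> float:
    """Range giving the same return signal as the spec'd target, under the
    toy model: received power ~ reflectivity / range**2."""
    return spec_range_m * math.sqrt(target_rho / spec_rho)

# A unit spec'd at 80 m on a 10% target would, under this model, see an
# 80% target out to ~226 m, but a 4% dark car only out to ~50 m.
```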