Lidar industry unites to make roadways safer for pedestrians (autojobsnow.com)
34 points by sbuttgereit on June 18, 2022 | 41 comments



Not quite on topic, but most here will be familiar with the problem of autonomous cars weighing the life of the driver against that of a kid running in front of the car. (An autonomous car does have time to react, so do you crash the car, endangering the driver, or keep course and kill the child?)

I recently learned of a new perspective when talking to a lawyer and hearing his first reaction. Apparently here in Germany, courts assume you accept a certain degree of responsibility for an accident simply by deciding to drive. So if two cars crash, culpability starts at 50/50, and the circumstances (who crashed into whom because of what) decide the final apportionment. The argument is that you knowingly stepped into something with very well-known dangers. You took a risk when you stepped into what is effectively a weapon.

The child made no such decision. As such, you could (in theory) make the argument for suicide programming in such situations.

I only mention it because, despite having discussed the scenario in university, this perspective never came up.


The trolley problem is considered to be a pretty theoretical problem in the autonomous vehicles space. The 'real' solution is that you drive defensively enough to not get into those kinds of situations. That's what the law requires too.


That makes sense. Real-life trolley problems do happen in places like an ICU, where people are already in a position where a marginal call can cause a death.


The Netherlands also has such a legal principle. For example, in the case of a car crashing into a bicyclist or pedestrian, the car driver is automatically assumed to be at-fault unless the other party made a severe traffic error.


A related thought: if autonomous cars always prioritize pedestrians, I wonder what will happen to downtown main streets. Will you still have to give way to cars when crossing the road? Or will there be enough pedestrians that you could simply walk across the road at any point and the cars would be forced to wait?

The traffic lights around me are really annoying for pedestrians; you have to wait for ages. I'm happy just crossing on a red if there is no traffic. But if every car were autonomous, maybe I'd just make the cars wait for me?


Why don't you already do this? Human drivers generally don't want to kill pedestrians and will stop, but they'll still get mad at you. Just because they're not controlling the accelerator and brake doesn't mean they'll be any less mad about you forcing them to stop.


Depends on where you are. For example, in New York City pedestrians have the right of way and drivers are responsible for avoiding them. In Texas automobiles have the right of way and it is the pedestrian's responsibility to avoid the automobiles (I didn't believe that the first time I heard it, but it's true; almost nobody walks in Texas, and I think the reasoning is that a pedestrian can more easily get out of a car's way than the car can avoid them). In India I think everyone involved just prays and goes for it; on my cab ride to the airport the driver jumped a curb and started doing 50 through people's front yards. Boss level.


> In Texas automobiles have the right of way and it is the pedestrian’s responsibility to avoid the automobiles

Source?


I think they would be less directly mad. If it’s truly driverless and the passengers are doing leisure activities, they may not even notice the first time or two unless it becomes a huge issue.


Having been stuck for 15 minutes trying to make a right turn by the street mall in Santa Monica: yes, there are places where defensive driving will totally stop you. There's a spot there that's effectively one lane each way when cars are parked, and pedestrians have a walk signal during the green and will not stop. If you want to turn right or left, you have to nose in and cut people off at certain times of day.


You don't need autonomous cars for that to happen. Many cities turn some of their downtown streets into pedestrian streets in the summer and it forces cars to wait to cross at intersections.


One should mention that this applies to civil liability, not criminal liability (in Germany).


He mentioned this as well (and I forgot to). Could you elaborate on the implications/relevance? It seemed important to you and him.

edit: So is it lawyers being pedantic (no offense, it's the job description after all), or is there an argument why the logic of criminal law would be more applicable here?


Lawyer here in Germany. I don't actually know the answer. However, criminal law and civil law regulate different relationships: criminal law regulates the relationship between the state and the individual, while civil law regulates the relationships between individuals. Criminal law is bound much more closely by the constitution. Specifically, the state may not punish an individual for engaging in lawful behavior. So while another citizen may raise the objection, “But you chose to drive this two-ton machine knowing full well that it was capable of inflicting harm on other people even when driven with the greatest care”, the state may not.

Also, for civil liability, there is (mandatory) insurance. You, as a driver, do not usually end up paying yourself (provided that you have not been negligent in your operation of the machine). Criminal law, on the other hand, always hits you as the individual; you cannot insure against it. This is to say that, if the same standard were applied in criminal law, nobody could afford to drive a car, or to operate any dangerous equipment at all, because one mistake, or even just one instance of bad luck, could land you in prison.


The relevance is the difference between getting sued and getting thrown in prison.


I was aware of that. But unless you make the argument that suicide programming would be a hard-coded death penalty for careless driving, and as such only possible under criminal law, I still don't see how it's relevant to the example at hand. Could you elaborate?


I have no clue what you mean by "suicide programming" and I don't see the relevance of the death penalty (criminal manslaughter would not get you sentenced to death even in America), so no, I can't elaborate on that.


> suicide programming

The autonomous car crashing itself with you on board instead of running over a child. That is what these posts are about, if you look back at the initial post. The question is: does the distinction that the rationale of implied culpability applies only to civil law have any relevance for this example?

You answered that one gets you thrown in prison and the other gets you sued. I don't see the relevance of that to the question at hand.

I didn't see the relevance of the death penalty either; I was guessing at how criminal law might be more applicable than civil law, and it's the only thing I could come up with.


I admit you have me confused, but I'm sure I don't want my car to kill me to spare me from a lawsuit. For that matter, killing me to spare me from a criminal manslaughter conviction is quite a raw deal too!


Complete misrepresentation: LIDAR manufacturers are trying to mandate LIDAR for driver safety systems, by citing statistics about human error.

The net result of that would be fewer consumers being able to afford driver safety systems, and progress on driver safety systems would be impeded.


This seems transparently like the lidar industry uniting to ask for laws requiring car companies to purchase lidar for their driving systems.


Yes. That doesn't necessarily invalidate their points, but as with any information one has to consider the bias of the source.


The thought of automotive lidar units being repaired in the same manner as other automotive parts, with dodgy parts ordered off eBay, terrifies me. The advertised power ratings and filtering of lasers sold second-hand online have only a loose relationship to reality, and such lasers can cause permanent vision damage very quickly.


If TSLA succeeds without lidar, it will make lidar obsolete.

Time to push for lidar now, before it is too late.


This comment would probably be a lot stronger if Tesla didn't have such an abysmal safety record.


Sorry, what? They are literally the safest cars on the road in terms of both A) accidents per mile driven and B) probability of injury in an accident.


Is this compared to other autonomous cars with lidar, like Waymo or Cruise?


Looks like Waymo uses the Chrysler Pacifica, which got a 4/5 for rollover, so it is worse in crashes than a Tesla (perfect score).

https://www.nhtsa.gov/vehicle/2019/CHRYSLER/PACIFICA/VAN/FWD...

In terms of crash counts, NHTSA reported Waymo as having the most total crashes last year of any automated driving system.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADS-SGO-...

Waymo had 62 crashes in 2.3 million miles driven, which is one crash per ~37K miles, compared to Autopilot's one crash per 4.31M miles, or one crash per 1.9M miles with a human driving (plus the emergency safety features included free in every vehicle).

So the average Autopilot vehicle travels roughly two full Waymo fleet-years of driving before crashing once, rather than crashing ~124 times (62 crashes x 2 fleet-years).
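
To make the arithmetic explicit, here's a quick sanity check (a rough sketch in Python; the mileage and crash figures are the ones quoted in this thread, not independently verified):

    # Sanity check of the crash-rate comparison above. All figures are the
    # ones quoted in this thread, not independently verified.
    waymo_miles = 2_300_000                # reported Waymo fleet miles
    waymo_crashes = 62                     # reported Waymo crashes
    autopilot_miles_per_crash = 4_310_000  # Tesla's reported Autopilot figure

    waymo_miles_per_crash = waymo_miles / waymo_crashes
    print(f"Waymo: one crash per {waymo_miles_per_crash:,.0f} miles")  # ~37,097

    # "Fleet-years": how many times Waymo's reported annual mileage fits
    # into one Autopilot crash interval.
    fleet_years = autopilot_miles_per_crash / waymo_miles
    print(f"Autopilot interval ~ {fleet_years:.1f} Waymo fleet-years")  # ~1.9

    # Expected Waymo crashes over that same distance at Waymo's reported
    # rate (the ~124 above rounds 1.9 fleet-years up to 2).
    expected = fleet_years * waymo_crashes
    print(f"Expected Waymo crashes over that distance: {expected:.0f}")  # ~116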


I think you are conflating multiple reports. The second paper you linked compares the crash counts between Waymo and Tesla (62 vs. 1), but does not show the number of miles driven for each system. Also, those numbers are for Automated Driving Systems (ADS), which are defined as "vehicles equipped with SAE Levels 3 through 5". On the other hand, the 4.31 million miles you cited is from Tesla's "Autopilot" program [1], which is only a Level 2 system [2]. The second paper you linked also confirms this, saying "there are currently no ADS vehicles for sale to the general public". So the crashes reported in that paper seem to be from company tests and robotaxi programs, not from consumer drivers.

[1]: https://auto.hindustantimes.com/auto/cars/tesla-cars-registe...

[2]: https://en.m.wikipedia.org/wiki/Tesla_Autopilot


Are there actual numbers for this?


Just make drivers liable no matter what if they hit a cyclist or pedestrian.


Only when cities prioritize pedestrians instead of cars, and are designed for them, will pedestrian safety improve.

We need more spaces that are pedestrian and bicycle only.


But hear me out...

What if we could make every car $5-10k more expensive instead?

</satire>

I once thought that using AI to make all cars self-driving might be the key to making pedestrians and cyclists safe. But self-driving cars are playing a game of Russian roulette: the systems will get it wrong occasionally, with LIDAR or without. Not if, but when. Whether or not someone dies depends on the situation.


I work on self driving vehicles. I would also like more walkable, bikeable cities.

But I've also sat in those city council meetings and seen the inane opposition people have to any sort of positive reform in that direction. Self driving vehicles have the potential to actually improve road safety because local governments won't be involved.


> I work on self driving vehicles. I would also like more walkable, bikeable cities.

Two questions, then. First, do you think that self-driving vehicles will ever get even close to human standards of driving? And second, what do you see as the big challenges to getting them to be acceptably safe?


I obviously can't talk specific numbers, but there are reasonable arguments to be made that in certain limited scenarios, we may already be hovering around or exceeding equivalent human metrics. Turning that into "unequivocally safer than humans all the time, everywhere" is still an open problem.

As for safety, that's both a big topic and a "legal has explicitly told me not to talk about this in public" topic. The teams and organizations I've worked for take it very seriously, but things can always be improved. Phil Koopman puts out some excellent information about where we are currently and where the industry could broadly improve.


> we may already be hovering around or exceeding equivalent human metrics.

Okay, bearing in mind your second paragraph, what are the conditions under which they're safer? I've been in a few self-driving cars and I'd struggle to see how they would ever get to an acceptable standard - like, pass UK driving test kind of standard.


The satire is at the heart of it all, innit?

Why think through and implement hard political decisions when you can throw money at nerds to make a profit instead?


That's fine, actually. We (American cities) should adopt systems like Singapore's, where the certificate of entitlement to purchase a car costs, by itself, $70k-120k on top of the cost of the car. The externalized costs of private cars are extremely high, and it's completely insane that we have what amounts to welfare for drivers.


It seems like there is very little voter willpower to make this happen.

Simple traffic cameras would also make roads much safer but they are politically very difficult.


And they need to connect, efficiently.

It's just not possible for soft 70kg humans to safely be near hard 2000kg objects moving quickly.
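
For a sense of scale, a back-of-the-envelope kinetic energy comparison (a rough sketch; the masses are from the comment above, the speeds are illustrative assumptions):

    # Kinetic energy KE = 1/2 * m * v^2; masses from the comment above,
    # speeds are illustrative assumptions.
    def kinetic_energy_joules(mass_kg, speed_kmh):
        speed_ms = speed_kmh / 3.6  # convert km/h to m/s
        return 0.5 * mass_kg * speed_ms ** 2

    pedestrian = kinetic_energy_joules(70, 5)   # brisk walk: ~68 J
    car = kinetic_energy_joules(2000, 50)       # urban car speed: ~193,000 J
    print(f"ratio: {car / pedestrian:,.0f}x")   # roughly 2,900x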



