
This highlights an interesting general point: in many situations there is no simple safe fallback policy. On a highway, an emergency stop is not safe. This is a recurring problem in AI safety, and it's covered nicely in this YouTube video, as well as the paper referenced there: https://www.youtube.com/watch?v=lqJUIqZNzP8



> On a highway, an emergency stop is not safe.

That depends; there could simply be no traffic behind you, which an experienced driver, and hopefully an automated one, would be monitoring.

Besides, there are many situations on the highway where an E-stop is far safer than any of the alternatives, even if there is traffic behind you. Driving as though nothing has changed in the presence of an E-stop-worthy situation is definitely not the right decision.
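
To make that concrete, here is a minimal, purely hypothetical sketch (in Python; none of the names or thresholds come from any real AV stack) of why a fallback action can't just be hard-coded: whether an emergency stop is the least-bad option depends on context the vehicle has to keep monitoring, such as trailing traffic and whether a shoulder is available.

    from dataclasses import dataclass

    @dataclass
    class Context:
        speed_mps: float          # current vehicle speed
        gap_behind_m: float       # distance to the nearest trailing vehicle
        closing_speed_mps: float  # how fast that vehicle is closing on us
        shoulder_clear: bool      # is a drivable shoulder available?

    def choose_fallback(ctx: Context) -> str:
        """Pick a degraded-mode action when the main driving policy fails.

        The point of this sketch: there is no single always-safe answer.
        An emergency stop is only the best option when nothing is about
        to rear-end you.
        """
        # Time until the trailing vehicle reaches us if we stopped right now.
        time_to_impact = (
            ctx.gap_behind_m / ctx.closing_speed_mps
            if ctx.closing_speed_mps > 0 else float("inf")
        )

        if time_to_impact > 4.0:      # ample margin: stopping is safe
            return "emergency_stop"
        if ctx.shoulder_clear:        # otherwise prefer leaving the lane
            return "pull_onto_shoulder"
        return "gradual_slowdown_with_hazards"  # least-bad remaining option

Even this toy version needs live sensing of what is behind the vehicle before it can decide anything, which is exactly the kind of monitoring a competent driver, human or automated, would be doing.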


How intelligent is the ML driving the car? If the car slowed down and hit the 49-year-old at a reduced speed, the insurance payout to a now severely disabled individual would be far more expensive than the alternative payout for a pedestrian fatality. A choice between paying for 40 years' worth of around-the-clock medical care vs. a one-time lump-sum payout to the victim's family would be pretty obvious from a corporate point of view.


Are you seriously suggesting that the better software strategy is to aim for the kill because it is cheaper than possibly causing 'only' injury?

That should be criminal.

I'm all for chalking this one up to criminal negligence and incompetence; outright malice is, for now, off the table, unless someone leaks meeting notes from Uber where they discussed that exact scenario.


My point is that it's a black box and nobody outside of Uber knows what its priorities are. It could just as easily have mistaken the pedestrian, bent over pushing her bike, for a large dog and then proceeded to run her over because it's programmed to always run dogs over at full speed on the highway. Outside of Asimov's "Three Laws of Robotics" there is nothing that dictates how self-driving cars should behave, so my unpopular idea above isn't technically breaking any rules.


You should check out what happened to Volkswagen for a similar trick.


Fines.



