Why Tesla’s ‘Beta Testing’ Puts the Public at Risk (nytimes.com)
14 points by edward on July 31, 2021 | hide | past | favorite | 6 comments


Looking at the article, self-driving cars will likely face a much higher psychological safety hurdle than human drivers.

We have grown a bit fatalistic towards the fact that some fellow humans will cause deadly accidents by snoozing or texting behind the wheel; there are efforts to reduce the problem by policing, but newspapers of record rarely preach about these well-known evils.

Killer robots, that is something different, right in the uncanny valley. Even isolated accidents are taken much more seriously and have more publicity.


For me it's more about skin in the game.

What happens to Musk and friends when his car catches fire by itself and kills the people inside? Not much, it seems. So "the people" do have a point: it's not so much that there is a higher safety hurdle, but that there is a clear lack of accountability to guarantee he can't screw up too badly.

Take the opposite: if you build an experimental automotive technology and it happens to cause an accident, there will be consequences - you pay a fat fine and serve jail time.


If the car catches fire from a faulty battery, Tesla will be accountable, just like many other automotive corporations before. Technical issues killing people isn't exactly a rare thing in car history.

If the car gets into trouble because of faulty AI, that is a different story, because our legal systems cannot really grok AI yet.

That said, the same was true of the Internet, hacking, nuclear technology, airplanes, electricity, etc. There isn't a single technical development in history that was put on hold until lawyers had constructed an optimal legal structure around it before being allowed to proceed.

It is always geeks first, attorneys and lawmakers later.


Agreed. People will die on the roads for lack of automation while the media makes a circus out of it, but eventually there'll be self-driving cars.


This seems to be a great example of humans' inability to calculate risk. Do we blame automakers for selling cars to people without having them pass a driver's test, or do we just accept that people who hold a driver's license passed a test at some point and are therefore qualified to drive? Maybe what we need to do is include automated driving in the standard driver's test, in order to absolve Tesla of responsibility for the misuse or inattentive use of its auto drive technology. I'm confident that over time the data will show that the risk of automated driving is an infinitesimal fraction of the risk of human driving.


In other news: "Student drivers put Public at Risk"

Beta driving automation with an informed driver may be less safe than the currently shipping automation. But without statistics showing there is a significant additional risk, simply declaring it irresponsible is ... irresponsible.

What would be irresponsible is shipping software to everyone before beta versions were tested by volunteers.



