A Tesla has crashed backing out of a garage (washingtonpost.com)
12 points by dwighttk 64 days ago | 10 comments



Has Tesla ever taken the blame for any incident caused by its Autopilot?


Here's a useful study design for autopilot:

Group A - autopilot drives

Group B - participant drives

For the first few sessions, nothing happens. Then a dog, a human, a car... appears: the kinds of situations in which self-driving cars have been known to malfunction.

The aim of the study is to gauge the difference in reaction times between these two groups.

It would help us figure out what to do about bored, and consequently distracted, drivers caught in accidents.

We could add a realistic third group: an autopilot group whose initial sessions show the car successfully handling two or more near misses.
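
A rough sketch of how the resulting reaction-time data could be compared across the three groups (Python; the group means are invented, purely for illustration):

    # Hypothetical analysis sketch: compare reaction times (seconds) across
    # the three groups with a one-way ANOVA. All numbers are made up.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    group_a = rng.normal(1.8, 0.4, 30)  # autopilot drives
    group_b = rng.normal(0.9, 0.2, 30)  # participant drives
    group_c = rng.normal(2.2, 0.5, 30)  # autopilot, prior near misses handled successfully

    f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")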


I've never owned a Tesla or even been very near one: does the Summon feature have an option for "hitting the brakes," as the article states happened? Quote: "The maimed Tesla looked as if it would have kept driving, Gururaj said, if his wife hadn’t hit the brakes."


On other vehicles, like BMWs, you have to hold a button on the fob while the car moves, and releasing it brings the car to a stop, but Tesla does it slightly differently, I think.

https://www.teslarati.com/how-to-use-tesla-summoning-video-d...
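
That hold-to-move behavior is basically a dead man's switch. A minimal sketch of the idea (the car and fob callables here are hypothetical stand-ins, not any manufacturer's actual API):

    # Dead man's switch sketch: the car only creeps while the fob button is
    # reported as held; releasing the button stops it immediately.
    import time

    def remote_park(car, fob_button_held, poll_interval=0.05):
        while fob_button_held():        # keep moving only while the button is held
            car.creep_forward()         # one small, low-speed movement step
            time.sleep(poll_interval)   # re-check the button frequently
        car.stop()                      # button released: halt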


So the Washington Post is saying that the Tesla is so successful at mimicking human behavior that it now mimics our common mistakes as well?


It was probably distracted while browsing the internet for news on the new iPhone.


In other news, a man has crashed his car backing out of a garage!


This is misleading. I read your comment and was ready to dismiss the article, but decided to click on it and discovered that this was an automation feature that failed, not a driver error. Please try not to make such simplistic dismissals.


I had a hard time editing down the article's headline to post it. I may have gone a little too far, now that I'm rereading it. Maybe I should have left something in about the Summon feature being in use.


A Tesla automation feature not operating as expected and causing minor property damage is a guaranteed headline and a lot of eyeballs.

But the massive number of fatalities that occur every day in cars is rarely, if ever, mentioned by local news. We need fewer automation features and more assistive features like automatic braking.

More hardware and more lines of code written by fallible humans isn't going to take human drivers out of the equation in my lifetime.



