I don't understand this anti-autonomy cheerleading. It's like people on HN live in a parallel universe where there have been a bunch of deaths from cars running Autopilot, whereas in the world I live in, it seems to be somewhat safer than a human alone. Like, people can mess up either way, but they seem to be less likely to do so when the car is also looking out for them. What am I missing?
You have to compare the fatality rate with Autopilot engaged to the fatality rate of people driving Teslas without Autopilot. Musk instead compared it against the universe of all drivers (Teslas, kids driving crappy cars, etc.), which is a false comparison.
So the reason it was a big deal is that it was a high-profile fatality. Tesla drivers are generally a pretty safe bunch. Statistically, if Autopilot hadn't been engaged, that death would not have occurred. Autopilot makes Tesla drivers less safe, not more safe.
Also, the government is doing the self-driving industry a huge favor. These fatalities could screw over the whole industry if they get out of hand. Musk is giving self-driving a bad name.
The dash cam video was only released last week, in conjunction with a lawsuit. Now it's on all the mainstream news outlets, from the Wall Street Journal to the New York Times to Fox News.
This is yet another "Tesla hit slow/stopped vehicle on left of expressway" accident. There are now three of those known, two with video, one fatal. Watch the video. The vehicle is tracking the lane very accurately. Either the driver is very attentive or the lane-following system has control. Then, with no slowdown whatsoever, the vehicle plows into a stopped or slow-moving street sweeper.
Here's one of the other crashes in that situation.[1] This was slower, so it wasn't lethal. There's another one where a Tesla on autopilot sideswiped a vehicle stopped at the left side of an expressway.
So don't brake hard when it's unnecessary, only in a genuine emergency. In other cases, just slow down gradually to a full stop and alert the human driver in the process, giving them more time to react.
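Roughly the policy being described, as a minimal sketch in Python. The thresholds and names here (time_to_collision, HARD_BRAKE_TTC, etc.) are made up purely to illustrate the decision logic; they're not taken from any real AEB system.

```python
# Hypothetical sketch of the braking policy described above.
# All thresholds and names are illustrative, not a real AEB implementation.

HARD_BRAKE_TTC = 1.5    # seconds to impact below which we assume a true emergency
GENTLE_BRAKE_TTC = 6.0  # seconds to impact below which we start slowing and warn

def braking_decision(time_to_collision: float) -> str:
    """Pick an action for a detected obstacle ahead."""
    if time_to_collision < HARD_BRAKE_TTC:
        # Genuine emergency: brake hard, there's no time for a handoff.
        return "hard_brake"
    if time_to_collision < GENTLE_BRAKE_TTC:
        # Not yet critical: decelerate smoothly toward a full stop
        # and alert the driver so they have time to take over.
        return "gentle_brake_and_alert"
    return "continue"

print(braking_decision(0.8))  # hard_brake
print(braking_decision(4.0))  # gentle_brake_and_alert
```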
IMHO, automatic emergency braking should be mandatory for every new car with a top speed greater than 60 km/h.
Last I heard, nobody has been able to verify whether the car was actually in Autopilot mode. However, if the police report that no attempt to stop was made is accurate, the automatic emergency braking also clearly failed.
It's the offence an engineer feels at something being marketed as something it's not.
Tesla is fooling the public. Among the general public who don't drive Teslas, the perception is that automated driving is already here and Tesla is leading the way.
In your reply to a well-argued post, you offer no similarly well-argued refutation of its points, just an emotional appeal based on generalities (I say 'emotional' because you take an explicit 'it is us against them, you are either for us or against us' position). What you are missing is that opposition to simplistic arguments and false dichotomies where safety is an issue is not opposition to autonomy. What you are missing is that one can look forward to autonomy while advocating reasonable caution.
Autonomy will be great. We aren't there yet. Tesla is/was deceptively marketing their capabilities so they can risk their customers' safety with the opposite of informed consent (misled consent? [0]). They are doing it in order to collect data that will get them to real autonomy first. That's fucked up and greedy. The other comment demonstrated that it is in fact risking their customers' lives. Autopilot may be safer than some random human alone, but not safer than a comparable human alone.
[0] You can say they tell you to keep your hands on the wheel and all that, but they themselves manufactured/fanned a ton of hype to the contrary. It's like arguing that you should have paid more attention to the EULA.