Hacker News
Autopilot was active when a Tesla crashed into a truck, killing driver (arstechnica.com)
25 points by close04 on May 16, 2019 | 16 comments



>"...our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance."

This is the key phrase. Tesla is all in on marketing "Autopilot" as a fully autonomous driving agent when it's not. It's an advanced lane-keep assist system combined with dynamic adaptive cruise control. No more advanced than GM Super Cruise or Honda Sensing. The lies from their marketing department and Musk himself are causing these deaths, not the technology. If they were responsible in representing what the system really is, people would actually be more attentive to the fact that they are still driving a car, and not blindly trust their lives to a level 2 system. Honda and GM do this, and have yet to see any fatalities.


>> "...who is prepared to take control at all times..."

I honestly don't see the point of "autopilot" if I have to be prepared to take control at all times. It's not like it's any less work than actually driving manually (the hard part is maintaining focus, not twiddling a steering wheel back and forth). In reality, I don't think people who use it are actually prepared to take control at all times. Hence the accidents.

> It's an advanced lane-keep assist system combined with dynamic adaptive cruise control.

Can these features be activated independently in Tesla vehicles?

I like the idea of adaptive cruise control (ACC), but I don't think I'd like it if my car steered for me. That seems reasonable, and many cars do have ACC without lane-assist, but what about the other way around? What if I want to control the gas pedal, but have the car keep itself in the lane? Like if I'm eating a sandwich and need to take my other hand off the wheel for a few seconds, but I'm not prepared to put it into full auto mode.


It is WAY less work than actually driving. You maintain your alertness, but it takes away much of the physical load of driving. It might even preserve your alertness.

It's the BEST THING EVER in bumper-to-bumper traffic.

Also you can steer yourself and let it maintain the distance to the car in front of you by turning on only Traffic Aware Cruise Control.

That said, you should be alert while you use it. If you're a text-while-you-drive person, the only positive thing I can say is it protects other people from you.


You can have Adaptive Cruise Control without Auto-Steer enabled, but it's not possible to have it the other way around.


>"...our data shows that, when used properly by an attentive driver who is prepared to take control at all times

If it needs vigilance at all times to be safe, then it's no "autopilot".


These crossing-the-T accidents are difficult to avoid manually, but radar might have helped. The article's claim that "Radar systems lack the angular resolution to distinguish a truck crossing the road from a large street sign suspended over the road, so they tend to simply ignore stationary objects" is spurious, because signs don't typically move across one's straight-ahead line of travel. Edited to add: a doppler-only radar would fail (perhaps that's what they meant), as would one with awful vertical resolution. But a radar that exploits static returns, like the Bosch adaptive cruise unit in my ancient chariot, has enough horizontal angular discrimination to usually distinguish the car in my lane of travel from vehicles in adjacent lanes subtending less than 10-15 degrees off my heading.
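To make the doppler-only failure mode concrete, here is a simplified sketch (not any real radar stack; the speeds and tolerance are hypothetical) of why a filter that drops "stationary" returns rejects a perpendicular crossing truck exactly as it rejects an overhead sign: both produce a radial closing speed roughly equal to the ego vehicle's own speed.

```python
import math

def radial_closing_speed(ego_speed, target_speed, target_heading_deg):
    """Closing speed along the radar's straight-ahead line of sight.

    target_heading_deg: 0 = target moving directly toward the radar,
    90 = target crossing perpendicular to the ego vehicle's path.
    """
    return ego_speed + target_speed * math.cos(math.radians(target_heading_deg))

def is_filtered_as_stationary(ego_speed, closing_speed, tolerance=1.0):
    """Doppler-only filter: drop targets whose closing speed matches the
    ego speed, i.e. objects that appear stationary in the world frame."""
    return abs(closing_speed - ego_speed) < tolerance

ego = 30.0  # m/s, roughly highway speed

sign = radial_closing_speed(ego, 0.0, 0.0)     # overhead sign
truck = radial_closing_speed(ego, 13.0, 90.0)  # truck crossing at 90 degrees

print(is_filtered_as_stationary(ego, sign))   # True: sign is ignored
print(is_filtered_as_stationary(ego, truck))  # True: crossing truck is ignored too
```

A radar that also keeps static returns with enough angular resolution, as described above, would not have to throw the truck away with the signs.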

It's not clear what time of day we're talking about, but if it was in the afternoon on a clear day, the truck, south of the Tesla and traveling west to east, may have been in or near sun glare and, given poor roadside maintenance, obscured by the foliage along both roadways.


The overarching tone of the article is "Tesla isn't doing a very good job with its Autopilot and could have prevented this". I wonder if there's any data about how many Teslas get in crashes versus other cars. I'd suspect, given that only 2 people have died in over 1 billion miles driven on Autopilot, that Teslas are safer than other cars.
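As a rough back-of-envelope check of that suspicion (both inputs are assumptions: the US average of roughly 1.1 fatalities per 100 million vehicle miles, and the "2 deaths in over 1 billion Autopilot miles" figure from the comment; neither controls for road type or driver demographics):

```python
# US average fatality rate, ~1.1 per 100 million miles -> per billion miles
us_avg_per_billion = 1.1 * 10        # ~11 deaths per billion miles

# Claimed Autopilot record: 2 deaths over 1 billion miles
autopilot_per_billion = 2 / 1.0      # 2 deaths per billion miles

print(autopilot_per_billion < us_avg_per_billion)  # True under these numbers
```

Under these (uncontrolled) numbers the Autopilot rate comes out lower, but Autopilot miles skew toward highways, which are safer than average to begin with.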


When a human driver causes an accident, you can assume it's an individual mistake. But with a self-driving car you know it's a "collective" issue: every (identical) car would probably make the same mistake under the same conditions, since they share the same hardware, software, and training data.

This is an advantage when learning, but a disadvantage until that learning is done. Which is why, beyond Tesla's intentionally misleading branding for the AP, I consider the drivers ultimately responsible for controlling the vehicle.


I feel like that's absolutely irrelevant. If you trust a device with your life, it shouldn't kill you under any circumstances other than actual malfunction.

I like to bring up the example of radiotherapy machines here: a machine that treats you with radiation cannot have a state in which it kills the patient, even if the chance of that happening is 1 in a million. In that scenario it doesn't matter that a manually operated machine would kill more people on average; an automatic one should kill absolutely zero.

Or, for another example: plane autopilots have saved countless lives. But any time one fails, every single plane of that type is removed from service until the issue is found and fixed.

I feel like at the moment the approach these companies take is "well, yes, the autopilot can make mistakes, but it's still safer than manual driving, so it's fine, yeah?". No, Tesla, it's absolutely not fine on any possible level. Tesla should be disabling Autopilot on every single Model S and Model 3 sold until the issue is found and fixed.


That's a staunch overreaction. You're correct that Tesla's auto-steer and adaptive cruise control can be equated to the autopilot in an airplane, but what you fail to account for is that in an airplane operating on autopilot, the pilots are still required by law to remain attentive and pay full attention to the system. This is very similar to how Tesla portrays the current generation of Autopilot. Sure, their marketing team does state that all cars are capable of self-driving (i.e. they have the required hardware, bar the HW3 processor), but they never state that current-generation cars are self-driving right now. It's on consumers to understand this difference and pay full attention while driving. My mom was a flight attendant for a major U.S. airline until a few years ago, and it wasn't uncommon at all to hear of an autopilot system malfunctioning ever so slightly, steering a bit off course due to a miscalibrated sensor; but since the pilots are actively paying attention, such errors never put the lives of those onboard in danger. I would say that Tesla's software should be held to the same standard, in that the operator is required to and absolutely needs to pay full attention to the system, but there is no need for Tesla to "disabl[e] autopilot on every single Model S and Model 3 sold until the issue is found and fixed".


If anything, it's an underreaction: not only should Tesla temporarily disable the autopilot on all their models, there should be heavy fines both for allowing it on the roads in this state and for calling it "autopilot" in the first place.

>>It's on consumers to understand this difference and pay full attention while driving.

Then this is a crazy assumption to be making. Like literally crazy. Pilots are trained over and over again to pay attention, and they still fall asleep during flights. With cars, even engineers who are employed and paid(!!!) to pay attention have fallen asleep while testing level 3 autonomy [0]. Expecting a regular customer to both understand the difference and pay attention is wishful thinking. The lowest bar we should be aiming for with this technology is "the vehicle should never, under any circumstances, fail to observe and react to a stationary object in front of itself". Simple as that. If Tesla's Autopilot cannot meet this bar, it shouldn't be on the road, no matter how many times safer on average it is than a regular driver.

[0] https://www.thedrive.com/tech/7730/ford-engineers-are-fallin...


More than 2 people have died using Teslas, and more than 2 people have died specifically using Autopilot.

What's extremely worrisome is that multiple people have died from the exact same failure in Autopilot, indicating that Tesla's claims of "data-driven" learning are bunk.


It feels like we are getting further away from autonomous vehicles, rather than closer.


Autonomous vehicles went through an investment and media hype cycle, and it's now on its last legs.

We still might have them some day -- but the vision sold throughout the 2010s of us having them anytime soon is bogus, like other such fads before it (fuel cells, VR, AR, and tons of other things besides).


>it's still worth asking whether lidar could have prevented the crash that killed Jeremy Banner.

Lidar doesn't automatically make a car safer without software written to accompany its use. It's a very trite summation of a very complicated and cost-conscious subject. The same sentence could equally be rewritten:

>it's still worth asking whether paying more attention to the road, while using the system the way you are supposed to, could have prevented the crash that killed Jeremy Banner.

And that's more to the point.


This discussion is in the context of relying upon an automated driving system. Depth perception is a key requirement of a self-driving car. When these conversations mention lidar, the implication is that its information is fused with other sensors to produce a coherent mapping of the environment.
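The fusion idea can be sketched minimally (hypothetical function and values, not any production stack): accept an obstacle's depth only when independent sensors agree within a tolerance, and escalate rather than trust the reading when they don't.

```python
def fused_depth(camera_m, lidar_m, tolerance_m=2.0):
    """Return an agreed depth estimate in meters, or None when the two
    sensors disagree and the detection should not be trusted blindly."""
    if abs(camera_m - lidar_m) <= tolerance_m:
        return (camera_m + lidar_m) / 2.0
    return None

print(fused_depth(41.0, 40.0))   # 40.5 -- sensors agree
print(fused_depth(41.0, 120.0))  # None -- camera badly misjudged the range
```

The point is that a second, independent depth measurement gives the system a way to notice when one sensor's estimate is wrong, which a camera-only stack lacks.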

The technical limitation of every Tesla vehicle thus far is the inability to reliably judge the depth of objects.

People have died as a result of these design flaws. That is the ultimate cost of making an automated car.



