
> Currently AP's "self-driving" features cannot save a life, they have killed people.

That is just not true. Distracted/impaired human driving is a real thing that kills people. Just two weeks ago there was a DUI event where a Tesla driver on AP literally passed out behind the wheel and the car was trivially brought to a stop anyway.

> This is tricky to think about, but consider

This is postulating a two-mode failure! That's just bad engineering. Yes, that can happen, but by definition (involving two rare events overlapping) it's an extremely rare condition and will appear at only a vanishing frequency in any final statistics.
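To put rough numbers on the "two rare events overlapping" point (the rates below are made up purely for the sake of the arithmetic, not measured data): if each failure mode independently occurs on 1 in 10,000 trips, the overlap shows up on the order of 1 in 100,000,000 trips.

```python
# Hypothetical per-trip rates, assumed independent -- illustrative only.
p_autonomy_fault = 1e-4  # first rare event (e.g. a bad autonomy decision)
p_driver_lapse = 1e-4    # second rare event (e.g. a distracted/impaired driver)

# Under independence, the two-mode failure rate is the product of the two rates.
p_two_mode = p_autonomy_fault * p_driver_lapse
print(f"{p_two_mode:.0e}")  # on the order of 1e-08
```

Real-world rates obviously aren't independent or this clean, but the multiplicative effect is why the overlap barely registers in aggregate statistics.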

Never compromise core features (in this case "don't let distracted drivers hit the thing in front of them") chasing this kind of thing. Work on the secondary/tertiary modes once you have everything else down.




> Just two weeks ago there was a DUI event where a Tesla driver on AP literally passed out behind the wheel and the car was trivially brought to a stop anyway.

Nope, and this is a perfect example!

AP didn't save that person's life. Or, more specifically, AP wasn't needed to save that person's life.

Tesla made Emergency Lane Departure Avoidance standard on all Teslas a while back.

Emergency Lane Departure Avoidance only intervenes if the driver deviates from expected behavior

So in situations where AP would avoid a collision while in control, the car still has a feature, using the same sensor suite and the same capabilities as AP, that will intervene.

A Tesla without AP would have stayed in its lane in the absence of driver input and slowed to a halt with the flashers on.

Honestly a better outcome...

It's as simple as this: AP is half-heartedly handling the happy path + the unhappy path.

It's ok to not have perfect handling of the unhappy path, as is currently the case; something is better than nothing in 99% of cases. But it's not ok to have this mode that falsely gives drivers the impression they're covered on the happy path.

Like your example could not be more perfect because a person driving under the influence relied on AP to handle the happy path for them. Now you can get into "well maybe they would have driven anyways if it didn't have AP"

And that's where emergency LKA and AEB come in. They wouldn't have given this driver a crutch to get on the road, but if they still chose to drive, they'd avoid an accident in the same manner AP did once things went off the rails.

This all goes back to the fact that the convenience features by definition cannot save a life. AP, as in the feature manually activated to assist the driver actively, cannot save a life, because it only works on the happy path. Other features like LKA and AEB are what save lives

> This is postulating a two-mode failure! That's just bad engineering. Yes, that can happen, but by definition (involving two rare events overlapping) it's an extremely rare condition and will show only a vanishing frequency in any final statistics.

> Never compromise core features (in this case "don't let distracted drivers hit the thing in front of them") chasing this kind of thing. Work on the secondary/tertiary modes once you have everything else down.

I have no idea at all where any of this is coming from... please quote where I suggested core features should be compromised for secondary features.


> A Tesla without AP would have stayed in its lane in the absence of driver input and slowed to a halt with the flashers on.

That's not how those systems work. You're confused, I think. If you let Jesus take the wheel in a Tesla under manual control, it will absolutely depart your lane and hit whatever you are aimed at. The emergency features are last-minute corrections to avoid an imminent collision, and they work about as well as other car vendors' products do: they reduce the likelihood and velocity of an imminent collision. They don't drive the car for you. The feature that does is AP.

> please quote where I suggested core features should be compromised for secondary features.

You want to refuse autonomy solutions to everyone to save hypothetical bikers at sunset in Mendocino! (Edit: sorry, I got the wrong elaborate example and was paraphrasing another poster. You want to save drivers where they incorrectly countermand an incorrect autonomy decision in a circumstance where the incorrect autonomy would have been correct. Or something. But the logic is the same.)


> If you let Jesus take the wheel in a Tesla under manual control it will absolutely depart your lane and hit whatever you are aimed at.

You are wrong.

https://www.teslarati.com/tesla-lane-departure-avoidance-sav...

I understand your confusion though, Tesla is well behind other manufacturers here.

VWs have had this behavior for years now, almost as soon as they started dabbling in advanced safety assists, meanwhile Tesla got it just two years ago.

https://youtu.be/TITEf_taUto

Kind of shows where the priorities were early on, saving incapacitated drivers vs "look ma no hands!"

-

Also your second quote just leaves me with more questions.

What on earth are you talking about

> You want to save drivers where they incorrectly countermand an incorrect autonomy decision in a circumstance where the incorrect autonomy would have been correct.

What?


I don't know what to tell you. I have tested this feature on my own car, and its behavior matches the public documentation (cf. https://www.tesla.com/blog/more-advanced-safety-tesla-owners). It will scream at you as it departs the lane, but it will depart the lane. It will correct only if it detects an object or barrier it needs to avoid. It does not do what you think it does. Tesla does offer a product to do that. It's called "Autopilot", and it absolutely works great. For example, it prevented a collision (or even lane departure) entirely in that DUI case we're discussing.


Your test sounds like it failed then... did you read your own link?

> Emergency Lane Departure Avoidance is **designed to steer a Tesla vehicle back into the driving lane** if our system detects that it is departing its lane and **there could be a collision, or if the car is close to the edge of the road**.

And again, AP acted as a crutch that an intoxicated driver attempted to lean on.

Now, an intoxicated driver could have decided to drive without the crutch... but Emergency Lane Departure Avoidance would have kicked in when they passed out, keeping the vehicle centered and bringing it to a stop instead of trying to maintain a certain speed.

(And let's head off the straw man here: no, I'm not blaming AP for an intoxicated driver, but I am stating the reality that AP. Autopilot. Is marketed in a way that causes sober people to think of it as a co-pilot, let alone drunk people.)

AP literally disengages when off the happy path, it's other safety features that help when things go wrong.

It's amazing that people still haven't caught onto this.


You just keep digging here, and it's all wrong. I don't know how else to reply but with facts:

> AP literally disengages when off the happy path, it's other safety features that help when things go wrong.

AP never disengages autonomously, ever. You turn it off via driver action, either via steering input that countermands it or via the stalk control. (Same deal for TACC -- it's on until you hit the brakes to kill it).

Really, go rent a Tesla for a few days and experiment. I think you'd find it enlightening.


Wrong again...

https://jalopnik.com/this-video-of-a-terrifying-high-speed-t...

https://www.techtimes.com/articles/254126/20201112/watch-tes...

> Autopilot disengaged about 40 seconds prior to impact due to the Tesla issuing a Forward Collision Warning (FCW) chime

Here's a reckless driver in a Tesla assuming AP could handle unsafe speeds on their behalf.

Once again AP is not the one to blame but... a driver tries to use it as a crutch and it disengages due to the FCW.

Due to the complete lack of input from the driver, many suspect the driver didn't realize AP would be disengaged by the FCW, and while AP disengaging would have given an alert, it'd be at the same time as the multiple FCW alerts they got.

So just like the DUI, the safety features would have worked *better* without AP. If they had been driving without AP, they would not have expected a now-disabled feature to respond to the FCW for them...

-

Seems like you didn't realize this was the case either with FCW and AP, so you're welcome....

Ps: My Model S test drive (remember those?) was not going so hot when the headliner started peeling off the demo car, but then went cold when the not-a-salesperson panicked when I didn't disable AP fast enough (since we were approaching a section of lane markers in their test circuit where it was known to veer towards barriers).

I'm pretty familiar with Teslas though; I was a huge fan until *gestures at everything* from AP marketing to the silent CPO->used change, etc.

Pps: Please don't go and try this on a public road to prove an internet stranger wrong.

You already admitted to trying to test ELKA (which won't intervene until things are going seriously wrong).

You're not going to be able to test a FCW with AP without doing something silly...


It’s a setting. You can have just a warning or a corrective maneuver applied.


Emergency LKA (which maneuvers) turns on every time the car is started regardless of the setting, and has to be re-disabled, unless something has changed since the press release.



