Lawsuits test Tesla claim that drivers are solely responsible for crashes (washingtonpost.com)
19 points by mikequinlan 17 days ago | 39 comments



> Meanwhile, the company has defended itself in court documents by arguing that its user manuals and on-screen warnings make “extremely clear” that drivers must be fully in control while using Autopilot.

Somewhat unrelated: I have a personal list of automatic liar indicators, and "extremely clear" is high on that list.


> drivers must be fully in control while using Autopilot

Sales guy: "You will love the Full Self Driving feature"

Customer: "So I can just let the car drive?"

Sales guy: "Technically, you must be fully in control, but yes, the car will drive itself"

Customer: "But I thought I had to be fully in control"

Sales guy: "Oh ya, absolutely, you must be fully in control"

Customer: "So I'm fully in control but the Full Self Driving is fully driving?"

Sales guy: "Absolutely, it's an amazingly advanced feature that you will love"

Customer: "But if I'm fully in control, when does Full Self Driving get a chance to be in control"

Sales guy: "The minute you turn it on, there's no delay, it immediately begins Full Self Driving"


This is exactly the problem. I think there is a very dangerous gap between cruise control (just controlling the speed) and full self-driving (the real thing, not the Tesla marketing). I don't know exactly where the line is (automatically keeping distance from the car in front seems fine to me, staying in the lane doesn't, but it's hard to say), but it seems like there should be laws against this dangerous middle area. Personally, I think past this point the manufacturer should have to take full responsibility for accidents. (Although questions about maintenance, repairs, and modifications will get very complicated.)

Humans are really bad at focusing during repetitive tasks. Driving is already pretty repetitive and the more tasks you take away (or take away 99.9% of the time) the more difficult it is for people to stay attentive. I think self-driving is great and will make our roads safer and free up people's time. But I find this "safety driver" situation far too dangerous.


I believe the official term is "Sales Advisor"; it would be interesting to see the script they follow in such cases.


I don't know what happened, but it's like all the regulators disappeared and it's just anarchy in corporate America.


How much responsibility does Tesla have in attempting to encourage willfully irresponsible and negligent humans to be less so? Or are we all beholden to the lowest common denominator?

The warnings are clear when enabling the driver assistance system. You are responsible. Pay attention. If you don’t, you’re liable. Any Tesla drivers who say otherwise are some combination of ignorant and lying.


Maybe they should call it something like "driver assistance system" rather than "full self driving"? I've never driven a Tesla, so I'm not aware of what warnings are in place, but doesn't it seem reasonable to think that a full self driving system could fully self drive the car? Perhaps it is Tesla who is lying if the car is not capable of driving itself fully?


This "Full Self Driving" thing is absurd. It should already be illegal to sell "Self Driving" cars that are 0% responsible for any crash. But Elon thought it wasn’t enough and decided to go Full Hypocrisy and called it "Full Self Driving". What a joke.


I guess the outcomes of these lawsuits will tell.

I'd phrase your question differently though: how much responsibility does Tesla have in selling a mass consumer product that allows consumers to be irresponsible and negligent in a friction-less manner.


Consumers are already irresponsible (DUIs, road rage, inattentive driving); how is Tesla making it worse with a system that aggressively monitors attention via camera and wheel torque? Should they not sell to irresponsible drivers, based on driving record and insurance scoring? I’d tolerate that if it meant responsible drivers don’t get dragged down by the low-to-median-quality driver.


I am not obliged to contribute to Tesla's business plan and strategy; the question is about legal obligation, not profit maximization.


Agreed; hopefully the law rules for the responsible, not the irresponsible. But this is America, so it could go either way, unfortunately.


How do you define "responsible" in this context such that it describes Tesla?

Tesla has had to be dragged kicking and screaming into compliance for safety issue after safety issue:

Early AP/FSD required a hand on the steering wheel only once every FIFTEEN MINUTES.

Tesla is the reason regulators had to require that an incident occurring within 5 seconds of an AP/FSD disengagement still be counted as an AP/FSD incident.

Tesla was the last manufacturer to implement cabin monitoring.

(I despise them too, for the record.) Musk has said that he'd have Tesla pay a fine for every vehicle shipped just to avoid having the airbag warning on sun visors.

Many other tales abound, like the component firmware developers with a 36-hour test suite and release cycle who were aghast that, 3-4 hours after shipping new firmware to Tesla, they'd get emails saying "looks great, thanks"; Tesla had "just flashed it onto a car they had and took it for a drive".


These actions have not incurred a material number of deaths. Zero deaths is unattainable; a handful is tolerable.

https://www.tesladeaths.com/


You have presented a lower bound on deaths. You can establish that it is at least as dangerous as X. To demonstrate safety you need to establish a credible upper bound; then you can credibly assert it is at most as dangerous as X.

Even if we do not attempt to estimate the true underlying rate, there are over 1,000 reported crashes that Tesla has declined to investigate. The upper bound cannot be tightened further than that without investigation or a robust predictive/statistical model. Even underestimating the credible upper bound at 1,000 fatalities constitutes a material number of deaths, as that would be many times less safe than a human driver.

You need to tighten the upper bound before your assertions that there have not been a material number of deaths are credible.
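
To make the bound arithmetic concrete, here is a toy sketch in Python. Every number in it is hypothetical (the fleet miles and confirmed deaths are made up; the human baseline is just the oft-cited ~1.3 fatalities per 100M vehicle-miles ballpark), so it illustrates the shape of the argument, not real Tesla data:

    # Toy bound arithmetic: all numbers hypothetical, for illustration only.
    human_rate = 1.3 / 100e6        # ~1.3 fatalities per 100M vehicle-miles (ballpark)
    fleet_miles = 5e9               # assumed AP/FSD miles driven (made up)
    confirmed_deaths = 50           # confirmed fatalities: the LOWER bound (made up)
    uninvestigated_crashes = 1000   # crashes never investigated (from the comment above)

    lower_rate = confirmed_deaths / fleet_miles
    # Worst case: every uninvestigated crash was fatal. Without investigation,
    # the credible UPPER bound cannot be tightened below this.
    upper_rate = (confirmed_deaths + uninvestigated_crashes) / fleet_miles

    print(f"lower bound: {lower_rate / human_rate:.2f}x the human rate")  # looks "safe"
    print(f"upper bound: {upper_rate / human_rate:.2f}x the human rate")  # many times worse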


Pretty bold of you to decide that people who were killed (particularly non-Tesla owners) because of issues with drivers using Tesla's barely competent (at BEST) software are "tolerable".


I did not decide. Regulators have decided; the fleet retains the driver-assist capability. Changes have of course been made to appease regulatory bodies, which is good and desirable. Ergo, the quantity and types of deaths incurred are within government regulators' risk appetite.

As mentioned upthread, irresponsible drivers hold the blame for the deaths they cause. If you can’t use the feature safely, you shouldn’t use it, or have access to it. Irresponsible drivers in non-Tesla vehicles will continue to maim and kill others just fine on a daily basis (~40k deaths/year, ~109 per day).

(I despise Elon too, but the data tells the story for us)


The standard is to not prominently advertise the opposite of what the manual and the fine print say.

People regularly do not read the manual. People regularly do not read warning messages. Everybody knows that. Claiming that you have cleared up confusion because you wrote a warning message that says the opposite of the advertising you created is deliberate and calculated deception.

NOBODY in that room made an honest mistake: "Oh my goodness, I thought people read and carefully consider warning messages." BULLSHIT. Everybody knows this is pure legal head-up-ass-covering. Twisting ethics into a pretzel to come up with implausible excuses for such blatant deception is just plain absurd.


Just calling it Autopilot pushes the idea that the car requires no attention.


Schrödinger's autopilot. It's fully self-driving when you're paying for it, but it's all your fault when it crashes.


The 'self-driving' feature is at least an attractive nuisance. If they want to press their point (that drivers are responsible), it's simple: disable the self-driving feature.


I’m in the camp that you are given a warning that you must be in control at all times. It’s not a multi-page agreement; it’s a simple and direct message that anyone should be able to digest.

Send people back to the DMV and make them earn a self-driving-technologies endorsement before enabling it on the vehicle. Then the people who want to cater to the lowest common denominator of stupidity can really feel the effects of that.


Then what use is self-driving? If you're poised to take over at any moment, it's providing nothing.

No, the very act of promoting self-driving creates some kind of expectation that the human isn't doing the driving. Right? That seems like a simple, direct message that anyone should be able to digest.


I disagree; it’s well understood that L2 requires human intervention. Sounds like you have higher expectations of the capabilities than what the system offers. It’s an assist. That’s it.

Whether it’s driving or not isn’t the focus here; it’s the expectation of who has ultimate control: the driver.


Nothing screams "it's an assist; that's it" like "full self driving".


Where does it say it’s “full self driving”?

“Autopilot is a hands-on driver assistance system that is intended to be used only with a fully attentive driver. It does not turn a Tesla into a self-driving vehicle nor does it make a vehicle autonomous.”

https://www.tesla.com/support/autopilot#:~:text=Do%20I%20sti....


Um, the mode that you put it into, which is labelled 'Full Self-Driving'.


I think it should be partial Tesla responsibility. If the system is Level 2 out of 5, then it’s 40% Tesla’s and 60% the driver’s.


The claim is not really based on Tesla's technology - the problem is Tesla's dishonest advertising and Elon Musk being an unethical idiot.

Your rule makes some sense as a guideline, but it only works in the case where the driver and the auto manufacturer are doing the right thing yet something goes wrong. It does not work if the driver was watching YouTube instead of paying attention, and it does not work if the manufacturer routinely advertises an L2 system as having L4 capabilities. Even the names "Autopilot" and "Full Self-Driving" are misleading. IMO it's so misleading that Musk deserves to go to jail.


"Multiple civil cases -- and a federal investigation -- contend that Tesla’s technology invites ‘drivers to overly trust the automation.’"


By nagging every 30 seconds and giving you a strike against the account if you look away for more than a few seconds. Older cars don't have a cabin camera, but the nags for newer vehicles are frequent. At some point you have to give liability to the end user or we can't have ADAS anymore. Which means we'll delay self-driving for personal vehicles for many years, costing thousands of lives every year.


> At some point you have to give liability to the end user or we can't have ADAS anymore.

Alternatively, the ADAS providers could take the liability on themselves. That would be a strong show of confidence and commitment to their technology and to saving lives.


What do you think would have happened to the automobile if Ford had had to take liability for every accident? There will be a few years when FSD is worse than human drivers, and then it will be superhuman. But to get there you need data. Or you can go the slow route of Waymo; but delay this technology by 5 years and you've killed a great number of people, probably thousands. 40k people die in car accidents every year in the US, and only a handful of those are ADAS-related. Not only that, but Tesla releases statistics showing that cars using Autopilot have fewer accidents than those that don't. There may be issues with the data, but it's clear that it's not significantly worse.


If the Model T had been offered without a steering wheel, I'm thinking jail time.


Why focus on Tesla then?

No manufacturer covers liability except MB, and only in some super tight scenario (maybe 1-5% of the total).

Why the moral panic around Tesla? Why does this forum turn into a crowd of fascist bootlickers every time Tesla is mentioned?


The problem is that the people developing these systems don't care who gets endangered in the process of rolling them out. They care about being beneficiaries of the fiscal engine that creates them. I don't need moneygrubbers developing such a system.


Your comment is a prime example of how operators are confused and unclear about how to operate the system. You call it a "nag", but the manual [1] clearly states:

1. Autosteer Bubble 1: "Keep your hands on the steering wheel at all times, be mindful of road conditions..."

2. Navigate on Autopilot Bubble 2: "Always keep your hands on the wheel and your eyes on the driving path in front of you"

3. Navigate on Autopilot Bubble 3: "keep your hands on the steering wheel at all times, and remain aware of your navigation route"

4. Autosteer on City Streets Bubble 2: "Keep your hands on the steering yoke (or steering wheel) at all times, be mindful of road conditions..."

A system that requires hands on the wheel at all times for safe operation should, like a regular car which also requires hands on the wheel at all times, only have hands off the wheel for a few seconds at most. This warning is provided side-by-side with the warning to always pay attention and thus demands a similar degree of importance and adherence. A warning after 30 seconds of failing to operate the system safely is not a "nag". You are operating it incorrectly for a tremendous amount of time every time that warning occurs.

In fact, it should trigger after 1-3 seconds, like the attention monitoring, as it is something you must do at all times. They could remove that restriction by designing a system that does not require hands on the wheel at all times, but while that is a requirement, it should be enforced correctly instead of as a useless legal ass-covering afterthought.
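
As a toy sketch of the enforcement being argued for here (the thresholds, sensor interface, and function names are all hypothetical, not Tesla's actual implementation):

    import time
    import random

    HANDS_OFF_LIMIT_S = 2.0    # the 1-3 s window argued for above (hypothetical value)
    TORQUE_THRESHOLD_NM = 0.5  # assumed minimum torque that counts as "hands on" (hypothetical)

    def read_wheel_torque() -> float:
        # Stand-in for a real torque sensor; random values keep the sketch runnable.
        return random.uniform(0.0, 2.0)

    def monitor_hands_on_wheel() -> None:
        hands_off_since = None
        while True:
            if read_wheel_torque() >= TORQUE_THRESHOLD_NM:
                hands_off_since = None                 # hands detected: reset the timer
            elif hands_off_since is None:
                hands_off_since = time.monotonic()     # hands just left the wheel
            elif time.monotonic() - hands_off_since > HANDS_OFF_LIMIT_S:
                # Escalate within seconds, not after a 30 s "nag" interval.
                print("WARNING: keep your hands on the wheel")
                hands_off_since = None
            time.sleep(0.1)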

That you confidently assert that you must take responsibility for attention, while the equally important requirement to keep hands on the wheel is viewed as an unnecessary "nag" (and is repeatedly referenced and advertised as such by official spokespeople), shows that you are fundamentally confused about the safe operation of the system, and that the commercial messaging around it is inherently confusing even to sophisticated operators.

[1] https://www.tesla.com/ownersmanual/modelx/en_us/GUID-E5FF5E8...


For FSD, or even Autopilot?


Maybe just FSD.



