> As soon as it’s good enough for Tesla to accept liability for accidents.

That makes a lot of sense, and not just from a selfish point of view. When a person drives a vehicle, the person is held responsible for how the vehicle behaves on the roads, so it's logical that when a machine drives a vehicle, the machine's manufacturer/designer is held responsible.

It's a complete con that Tesla promotes their autonomous driving while also having their vehicles suddenly switch to non-autonomous driving, which they claim moves the responsibility to the human in the driver's seat. Presumably, the idea is that the human should have been watching and approving everything the vehicle has done up to that point.




The responsibility doesn't shift, it always lies with the human. One problem is that humans are notoriously poor at maintaining attention when supervising automation.

Until the car is ready to take over as legal driver, it's foolish to set the human driver up for failure in the way that Tesla (and the humans driving Tesla cars) do.


> The responsibility doesn't shift, it always lies with the human.

Indeed, and that goes for the person or persons who say that the products they sell are safe when used in a certain way.


What?! So if there is a failure and the car goes full throttle (not an autonomous car), it is my responsibility?! You are pretty wrong!!!


You are responsible (legally, contractually, morally) for supervising FSD today. If the car decides to stomp on the throttle, you are expected to be ready to hit the brakes.

The whole point is that this is somewhat of an unreasonable expectation, but it's what Tesla expects you to do today.


> If the car decides to stomp on the throttle, you are expected to be ready to hit the brakes.

Didn't Tesla have an issue a couple of years ago where pressing the brake did not disengage the throttle? I.e. if the car has a bug and puts the throttle to 100% and you stand on the brake, the car should say "cut throttle to 0", but instead you just got 100% throttle and 100% brake?
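
For what it's worth, the behaviour being asked for is usually called brake override. A minimal sketch of the idea (hypothetical Python, not Tesla's actual implementation; the threshold value is made up for illustration):

    # Hypothetical brake-override logic: any meaningful brake input forces
    # the commanded throttle to zero, so a stuck or buggy throttle request
    # can't fight the brakes.

    BRAKE_OVERRIDE_THRESHOLD = 0.05  # 5% pedal travel, illustrative only

    def commanded_throttle(throttle_request: float, brake_position: float) -> float:
        """Return the throttle actually sent to the motor controller (0.0-1.0)."""
        if brake_position > BRAKE_OVERRIDE_THRESHOLD:
            return 0.0  # brake wins, regardless of the throttle request
        return max(0.0, min(1.0, throttle_request))

    # Buggy request for 100% throttle while the driver stands on the brake:
    assert commanded_throttle(1.0, 0.9) == 0.0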


If it did, it wouldn’t matter. Brakes are required to be stronger than engines.


That makes no sense. Yes, they are. But brakes are going to be more responsive and effective with the throttle at 0 than at 100.

You can't seriously claim that the stopping distances will be the same.
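
A rough worked example (illustrative numbers, not figures for any particular car): say the brakes alone can manage about 1 g of deceleration and the motor adds about 0.3 g of thrust at full throttle. The net deceleration drops to roughly 0.7 g, and the stopping distance d = v^2 / (2a) grows accordingly:

    G = 9.81          # m/s^2
    v = 100 / 3.6     # 100 km/h in m/s

    d_throttle_off  = v**2 / (2 * 1.0 * G)          # brakes alone: ~39 m
    d_throttle_full = v**2 / (2 * (1.0 - 0.3) * G)  # brakes fighting the motor: ~56 m

    print(f"{d_throttle_off:.1f} m vs {d_throttle_full:.1f} m")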


My example was explicitly NOT about autonomous driving, because the previous comment seems to imply that you are responsible for everything.


Autopilot, FSD, etc. are all legally classified as ADAS (advanced driver-assistance systems), so it's different from e.g. your car not responding to controls.

The liability lies with the driver, and all Tesla needs to prove is that input from the driver will override any decision made by the ADAS.


The point at which we decide that a defect is serious enough to transfer liability is quite case-dependent. If you knew that the throttle was glitchy but hadn't done anything to fix it, yes. If it affected every car from the manufacturer, it's obviously their fault -- but if you ignore the recall then it might be your fault again?

In this case, the behaviour of the system and the responsibility of the driver are well-established. I'd actually quite like it if Tesla were held responsible for their software, but they somehow continue to skirt the line: they require the driver to remain vigilant, so any system failures are the (legal) fault of the human, not the car, despite advertising it as "Full Self Driving".


> The point at which we decide that a defect is serious enough to transfer liability is quite case-dependent. If you knew that the throttle was glitchy but hadn't done anything to fix it, yes. If it affected every car from the manufacturer, it's obviously their fault -- but if you ignore the recall then it might be your fault again?

In most American jurisdictions' liability law, the more usual thing is to expand liability rather than transfer it. The idea that exactly one -- or at most one -- person or entity should be liable for any given portion of any given harm is a popular one in places like HN, but the law is much more accepting of the situation where lots of people have overlapping liability for the same harm, with none relieving the others.

The liability of a driver for maintenance and operation within the law is not categorically mutually exclusive with the liability of the manufacturer (and, indeed, every party in the chain of commerce) for manufacturing defects.

If a car is driven in a way that violates the rules of the road and causes an accident, and a manufacturing defect in a driver assistance system contributed to that, it is quite possible for the driver, the manufacturer of the driver assistance system, the manufacturer of the vehicle (if different from that of the assistance system), and the seller of the vehicle to the driver (if different from the last two), among others, to all be fully liable to those injured for the harms.


>> When a person drives a vehicle, the person is held responsible for how the vehicle behaves on the roads, so it's logical that when a machine drives a vehicle, the machine's manufacturer/designer is held responsible.

Never really understood the supposed dilemma. What happens when the brakes fail because of bad quality?


> What happens when the brakes fail because of bad quality?

Depends on the root cause of the failure. Manufacturing faults would put the liability on the manufacturer; installation mistakes would put the liability on the mechanic; using them past their useful life would put the liability on the owner for not maintaining them in working order.


Then this would be manufacturer liability, because the brakes are not fit for purpose.



