
Lawsuit: Autopilot didn't prevent Model 3 from driving under semi at 68 MPH - lukewrites
https://www.sun-sentinel.com/local/palm-beach/delray-beach/fl-ne-tesla-autopilot-lawsuit-20190801-zokyrcqxz5cn7jpjeenaoud6iy-story.html
======
danso
As the article notes, the driver killed in this case died in circumstances
similar to those of the very first driver killed while using Autopilot in 2016
[0]: driving into the side of a tractor trailer. However, back then, Tesla
implied that the Model S cameras may not have seen the tractor trailer because
of the lighting conditions:

> _Neither Autopilot nor the driver noticed the white side of the tractor
> trailer against a brightly lit sky, so the brake was not applied._

The trailer in this recent case was also white (not sure what the lighting
conditions were). Regardless of whether it is legally the driver's fault, how
can AP be making what seems to be the exact same error it made 3 years ago,
when Musk says Tesla's full self-driving will be ready in 2020? White trailers
are not exactly an edge case on U.S. roads.

(worth noting that in 2016, Tesla's AP was built by Mobileye; now it is in-
house)

[0] [https://www.tesla.com/blog/tragic-loss](https://www.tesla.com/blog/tragic-loss)

~~~
zaroth
It’s the broad side of the trailer that is the challenge. Tesla will stop for
the rear end of a trailer, but may disregard the broad side of a trailer
stretching across the highway because it looks too much like an overpass.

Corner cases like the broad side of a tractor trailer stretched across the
highway will definitely prevent FSD from being remotely possible in 2020.
Complex traffic light arrangements in multi-lane intersections, unprotected
left turns, even driving over a lane marker to avoid a fixed obstacle are
still TBD features in the Autopilot code base.

Those are the features they will be adding in 2019/2020 to get to what Musk is
calling “feature complete.” But regardless of what he has claimed, from
personal experience I do not see how the cars could operate without any human
driver in 2020, except under highly constrained conditions like “in a parking
lot, not to exceed 3 mph” — and even that, I think, will expose corner cases
with cars getting stuck and needing driver assistance.

What I do think we will see in 2020 is a respectable percentage of trips being
completed start-to-finish with AutoPilot engaged the entire time.

However, then and now, the driver absolutely, positively, must always be
looking out the window when operating AutoPilot.

~~~
yborg
Musk is Tesla's worst enemy here - first in naming the thing "AutoPilot" and
then in doubling down with his absurd claims about "Full Self Driving" within
a year. What the general public thinks "Full Self Driving" means is Uber - I
get in a car, I set a destination, and it goes there as if a human driver was
at the wheel - I can browse the web, take a nap, whatever. We are a decade
from that, if not more. But as long as Tesla insists on calling it "AutoPilot
(now with Full Self Driving!)" they will keep getting sued when an unfinished
technology encounters a problem.

~~~
spaceisballer
I think autopilot is an apt name. It’s not like planes on autopilot do
everything and just fly themselves; it’s more like setting the cruise control.
I still need to be paying attention when my car is on autopilot. Real pilots
don’t set their autopilot and then just leave the cockpit.

~~~
Jasper_
Reminder that the start of the marketing video for Autopilot says "The person
in the driver's seat is only there for legal reasons. He is not doing
anything. The car drives itself." [0]

[0] [https://www.tesla.com/autopilot](https://www.tesla.com/autopilot)

------
abalone
Let's analyze Tesla's statement here, because it's useful training in
intellectual self defense:

"Our data shows..."

Note the use of the keyword _data_. This appears frequently in Tesla
statements. It implies certainty and technical understanding.

"...that, when used properly by an attentive driver who is prepared to take
control at all times..."

Interesting condition.

"...drivers supported by Autopilot are safer than those operating without
assistance"

This is garbage logic. It's just saying any errors that aren't corrected by
the driver are cases of "not using it properly." Of course the "data" will
show improved safety _if you exclude all the accidents._

~~~
kazinator
Is it garbage logic?

Dev: "The compiler's automatic error checking didn't catch the array overrun
in my code, resulting in a costly security breach!"

Compiler Vendor: "Our data shows that compiler diagnostics, when used by a
skilled and conscientious programmer who is prepared to review and test his or
her code, reduce software defects compared to no diagnostics, making software
development safer."

~~~
babyloneleven
Your example is garbage logic because array overruns are so harmful that
they should be hard errors. Of course, this requires a runtime cost, which can
be elided in most cases by the compiler's optimizer. But even low-level
languages nowadays can be safe by default (see Rust) - the unsafe behavior
should always be a conscious, well-studied decision by the developer.
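
The safe-by-default behavior is easy to demonstrate. A minimal Rust sketch
(the vector and its values are just made-up illustrations): indexing is
bounds-checked, `get` surfaces the failure as an `Option`, and the unchecked
path requires an explicit `unsafe` block.

```rust
fn main() {
    let v = vec![10, 20, 30];

    // Plain indexing is bounds-checked: `v[5]` would be a hard error
    // (a panic) at runtime, not a silent overrun.

    // `get` makes the out-of-bounds case an explicit Option to handle:
    assert_eq!(v.get(1), Some(&20));
    assert_eq!(v.get(5), None);

    // Skipping the check is opt-in and clearly marked `unsafe` -
    // a conscious decision by the developer:
    let y = unsafe { *v.get_unchecked(0) };
    assert_eq!(y, 10);
}
```

The optimizer can often remove the checks entirely (e.g. when iterating with
a known length), so the default safety usually costs little.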

------
a3n
They really should stop calling it "auto pilot." Because it isn't fit to be
used _as_ autopilot in its environment, yet people use it as such.

~~~
bigkm
It does exactly what autopilot does in a plane. Which will slam into anything
in its path.

------
siliconc0w
Driver-facing cameras have to become standard for systems like Autopilot that
are still 80% solutions, so they can disengage when drivers invariably start
leaning on the system too much and using it inappropriately.

------
DoofusOfDeath
Unless the autopilot sped up the vehicle, it sounds like the driver was going
_at least_ 13 mph over the speed limit. By speeding, the human driver
increased the danger to himself _and everybody around him_.

I truly am sorry he died. But I don't think Tesla should be on the hook for
the scenario he seemingly created.[1]

[1] In my ethical framework, adults are responsible for their choices, and in
a functioning democracy, for obeying even laws they don't personally like. I
realize I'm in the minority on this.

~~~
jmpman
I don’t think it was the 13 mph over that killed him. It was the lack of
braking before impacting the semi.

------
belltaco
Asking for damages of only $15,000 ?

That seems quite low, are they hoping Tesla will just settle without fighting
it since the plaintiffs know their case is weak?

~~~
calny
The plaintiffs allege "more than" $15,000 in damages. The $15,000 figure is
likely just a jurisdictional minimum. In other words, they need to allege
damages over that amount in order to stay out of small claims court or
something similar (am a lawyer--never practiced in Florida, but this would
make sense). It's almost certainly not their full request, and they'll ask for
much more at trial.

~~~
rhacker
I hope so. When I read that, steam came out of my ears. All I could think was
that Hulk Hogan got $140M for... cheating (yeah, I know it's more complicated
than that), and here we have this poor family that is only going to get $15k.
Just... not sure where this world is heading.

------
mikeash
Ten seconds from engaging Autopilot to ramming a truck, and he didn’t see it
before he engaged Autopilot?

~~~
danso
The article doesn't say anything about the truck being in front of the Tesla's
path for 10 seconds:

> _The crash in west Delray Beach happened four months ago when a tractor-
> trailer pulled out in front of a bright red Tesla Model 3 driven by 50-year-
> old Jeremy Banner._

~~~
mikeash
Neither did I! But it surely had to be a visible threat. A road where you can
drive 68MPH is going to have enough visibility that you’ll see a truck that
might cross your path at least that far in advance. Especially in flat-as-a-
pancake Florida.

~~~
danso
Is it Autopilot's standard behavior to decelerate/pump the brakes when it
detects a vehicle at an intersection? Yes, I agree 10 seconds should be more
than enough time for AP to begin preliminary evasive/mitigating maneuvers, but
according to what u/zaroth posted [0], the preliminary findings do not
indicate this.

[0]
[https://news.ycombinator.com/item?id=20597980](https://news.ycombinator.com/item?id=20597980)

~~~
mikeash
I’m not saying Autopilot should have handled this. I’m saying the driver
should have seen the potential threat sometime before engaging the system, and
realized it was not a safe time to let it run unassisted. (It’s never truly
safe in the current incarnation, but there are degrees.)

~~~
danso
I don't have a Tesla and so don't know how AP is used day-to-day, but I'm
assuming it's frequently activated while driving at highway speeds, even amid
traffic – similar to standard cruise control? But you're saying that drivers
should never activate AP while driving, if any vehicles are also on the road
within 5-10s of traveling speed?

~~~
mikeash
Autopilot is a driver assistance feature. You are required to keep your eyes
on the road and your hands on the wheel while using it.

Naturally, not everyone does that, although they definitely should. If they do
choose to break those requirements, they should only do it for long enough
that safety can be assured.

If you use Autopilot for a bit, you get a feel for how it handles different
situations, and adjust your attention accordingly. If the road is wide, clear
and well marked then I can look far ahead and don’t have to focus too much. If
I’m in a construction zone driving right next to a barrier, I need to keep a
firm grip to ensure it stays away from that barrier and within the lines. If
there’s stopped traffic ahead, I need to start watching to ensure it brakes,
and take over to do it myself if it doesn’t.

This guy never should have taken his attention away from the road. If he did,
he never should have done it for so long. If he did, he never should have done
it in a situation where a threat could turn into danger within that time.

------
NextHendrix
GDPR-walled in Europe

~~~
alexjm
What about this archived version?

[https://web.archive.org/web/20190803003025/https://www.sun-s...](https://web.archive.org/web/20190803003025/https://www.sun-sentinel.com/local/palm-beach/delray-beach/fl-ne-tesla-autopilot-lawsuit-20190801-zokyrcqxz5cn7jpjeenaoud6iy-story.html)

