
Third fatal Tesla Autopilot crash renews questions about system - SolaceQuantum
https://www.reuters.com/article/us-tesla-autopilot/third-fatal-tesla-autopilot-crash-renews-questions-about-system-idUSKCN1SM1QE
======
Roark66
I suggest people watch the actual Tesla autopilot in action
[https://www.youtube.com/watch?v=sk0eZRVw9x4](https://www.youtube.com/watch?v=sk0eZRVw9x4)
before making definite judgements. I watched this video, and I counted at
least 3 situations where the car would've had a minor crash, and one situation
where the crash could have been head-on, because the car couldn't detect wavy
lanes properly without road markings.

The technology Tesla is using in the autopilot is pretty cool, but the feature
definitely has the wrong name. It should be called something like Driver
Assist Plus.

Oh, and there is no way in hell they'll have functioning robotaxis outside
predetermined geographic areas in one year's time. I'll print this post and
eat it if they do!

~~~
notaki
The entire video is Autopilot in city driving. Autopilot, at this time, is
only for "driving on dry, straight roads, such as highways and freeways. It
should not be used on city streets."

~~~
cmsonger
Completely agree, but if users followed Tesla's instructions about AP, we
would not be here.

There is precedent that "it's up to the user" is not a sufficient societal
standard for what is legally allowed (for example, seat belt laws). It is
reasonable to ask: "Should we allow such a feature in cars on our roads?" For
example, what if the car had struck a cyclist rather than a truck and the
fatality went the other way?

~~~
notaki
What precedent? If "it's up to the user" is not sufficient, then all vehicles
should be speed capped at the legal speed limit and prohibited from moving if
seat belts are not on. Clearly that's not the case. Unfortunately, freedom
includes the freedom to do dumb shit. Some people do dumb shit and are
responsible for the consequences.

~~~
cmsonger
> Unfortunately, freedom includes the freedom to do dumb shit.

You are not legally free to do all dumb shit. For example, you are not free in
many states to ride around without a seatbelt on. Car makers are not free to
make cars without seatbelts.

That's the precedent.

~~~
notaki
That's exactly my point. Seat belts are required by law. Obeying the speed
limit is required by law. But users are not forced to obey these laws by
restrictions placed on the vehicles by manufacturers. If the user disobeys the
law, they are responsible. If a Tesla driver disobeys Tesla's Autopilot
warning, they are responsible. Are you proposing that we need a law that makes
Autopilot illegal to use on city streets? Or highways as well?

------
DannyBee
"immediately removed his hands from the wheel."

Tesla keeps saying this, but as far as I know, they can't actually detect that
(it's a torque sensor, so it can only tell whether the driver is turning the
wheel).

Does anyone have different info?

You can even find plenty of reports of even light touches of the wheel
causing it to issue warnings that hands are not on the wheel.

They have never let one of these lawsuits get past discovery (AFAIK) before
settling, and this would almost certainly come out there if true.

~~~
hef19898
So the sensor is completely useless on a straight road.

~~~
Domenic_S
Nah. Autosteer is constantly applying counter-torque unless you exceed the
takeover limit, deactivating autosteer. Someone calculated the required force
and it's something like 2.5 lbs. My commute has lots of straight freeway and
resting my hand on the wheel works fine.
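As a rough sketch of the torque-based scheme described in this subthread: detection hinges on two thresholds, a light "someone is resisting" torque and a heavier "driver takeover" torque. All numbers and names here are invented for illustration; Tesla's actual firmware and thresholds are not public.

```python
# Hypothetical sketch of torque-threshold "hands on wheel" detection.
# Thresholds are made up, loosely inspired by the ~2.5 lb figure above.

HANDS_ON_TORQUE_NM = 0.3   # assumed: light resting torque counts as "hands on"
TAKEOVER_TORQUE_NM = 2.5   # assumed: above this, autosteer disengages

def classify_torque(measured_torque_nm: float) -> str:
    """Classify a steering-column torque reading.

    Note the limitation discussed above: this infers hands from *steering
    input*, so a hand resting with zero applied torque is invisible.
    """
    t = abs(measured_torque_nm)
    if t >= TAKEOVER_TORQUE_NM:
        return "driver takeover: disengage autosteer"
    elif t >= HANDS_ON_TORQUE_NM:
        return "hands detected"
    else:
        return "no input: start nag timer"

print(classify_torque(0.0))  # no input: start nag timer
print(classify_torque(0.5))  # hands detected
print(classify_torque(3.0))  # driver takeover: disengage autosteer
```

This is why "resting my hand on the wheel works fine" for some drivers and not others: whether a resting hand registers depends entirely on how much incidental torque it applies.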

------
SOLAR_FIELDS
If Tesla backed down on the Autopilot branding and called the feature what it
is, fancy cruise control (Cruise+ maybe?), they wouldn't have to fight a PR
battle every time something like this happens. It's way too late for that now,
though, unfortunately.

~~~
sjwright
Imagine a driver driving down a parkway at speed and thinking "I'm on a
parkway, so I should shift the car into Park."

That's how dumb this argument is.

If a driver ignores the requirements of their driver license AND the vehicle's
clearly worded warnings, then the driver is entirely responsible. Claiming
that the driver is not at fault because they interpreted the word "autopilot"
to mean "I can disregard my responsibility as a driver and ignore the
vehicle's very clear warnings" is deeply absurd.

~~~
rjdagost
I would be more sympathetic to your argument if Tesla marketed Autopilot as a
driver assistance feature. But that's not at all how they market it. When you
go to Tesla's Autopilot page, you immediately see a video of a Tesla driving
itself: the user doesn't touch the steering wheel for minutes as it navigates
complicated scenarios:
[https://www.tesla.com/autopilot](https://www.tesla.com/autopilot)

Tesla's own Autopilot page video has this disclaimer in bold at the very start
of the video: "THE PERSON IN THE DRIVER'S SEAT IS ONLY THERE FOR LEGAL
REASONS. HE IS NOT DOING ANYTHING. THE CAR IS DRIVING ITSELF". Then there are
multiple videos of Elon Musk driving hands-free on Autopilot for extended
periods of time, like this one:

[https://youtu.be/MO0vdNNzwxk?t=123](https://youtu.be/MO0vdNNzwxk?t=123)

So Tesla can bury whatever carefully worded legalese they want in their terms
and conditions, but they know that these videos will get shared and watched
millions of times, and that they strongly convey the message that "Teslas
drive themselves on Autopilot". These videos set the expectation of how
Autopilot should function, and I don't blame Autopilot users for not
understanding this.

~~~
mannykannot
So the driver is not there to take over if the system does something
dangerous? But that's what Tesla says every driver killed in one of these
crashes was not doing, and why they were at fault. This is an unambiguous case
of Tesla sending a mixed message, and completely unacceptable.

------
mouzogu
Maybe I misunderstood, but why was the vehicle doing 68mph on a 55mph road?
Is that the driver's fault or autopilot's?

Perhaps they should rename it to assisted driving or something like that. I
think I saw something about someone sleeping while on autopilot. There's only
so much parenting the manufacturer of the car can do. The ethics of all this
are much more complex, of course, but perhaps the messaging needs to be
clearer that autopilot is more akin to advanced cruise control, as opposed to
"you can go to sleep, the car can drive itself."

~~~
Shivetya
As an owner and autonomous driving fan, I keep asking the same question: will
autonomous cars be legislated to always obey the law, or will that still be a
judgment left to the driver if the car can be driven either way?

I am surprised there are not laws requiring cars to alert you to the fact that
you're speeding, for systems that can read speed limit signs in addition to
having current maps.

~~~
gambiting
>>I am surprised there are not laws requiring cars to alert you to the fact
that you're speeding, for systems that can read speed limit signs in addition
to having current maps.

Well, there are systems that do this. Many new cars have speed limit sign
recognition built in, and they will display the current limit on the
dashboard (and sometimes show you an icon indicating that the vehicle thinks
you are speeding).

The reason why this is not more aggressive is that it's only seemingly a
simple problem; in reality it's far, far from it. In my experience of driving
around the EU: I had a Mercedes CLS (brand-new 2019 model) with adaptive
cruise control and speed limit sign recognition, and while amazing when it
worked, it worked maybe 50% of the time. Issues I noticed:

1) it kept detecting speed limit signs on roads parallel to the one I was on
[0]

2) it kept detecting speed limit signs on side roads, leading to actually
dangerous situations: I was on a 50mph road, going past a 20mph side
street, and the car would suddenly hit the brakes because it detected a 20mph
speed limit. That's 10000% unacceptable in a production vehicle. [1]

3) it kept reading speed limit signs on the slip roads leaving the motorway -
so you're doing 70mph on the motorway, and it suddenly reads the 50mph speed
limit that's for the motorway exit. Same problem as with point 2.

4) In situations where there are no speed limit signs, it would use the map
data - and even though this was a brand new, March 2019 vehicle, it had no
idea that a stretch of motorway next to my house is not limited to 50mph
anymore, and hasn't been for the last 2 years.

In practice, I had to keep overriding the speed limit chosen by the car
constantly, and after a few days I switched it fully off, as it led to me
nearly shitting my pants a few times when the car hit the brakes for seemingly
no reason whatsoever.

And then I have concerns about countries where the speed limit is only valid
to the nearest intersection - and the definition of an intersection varies.
There are paved roads merging with the main road which are not intersections
because the side road leads to private property, and then there are dirt
tracks which are intersections because technically it's a public road open to
motor traffic. There are no signs to indicate either - the driver is just
"supposed to know". It's idiotic, I know. And in some countries (Poland) the
speed limit can change as often as 4-5 times in the span of a single
kilometre. What's even worse is that there are situations where a regular
speed limit sign can actually increase(!!!!) the limit set in a certain zone
(built-up zones have a 50km/h speed limit, but this can be raised by a regular
speed limit sign, and then yes, it goes back down(!!!) to 50km/h after an
intersection). All of this stuff would be incredibly hard to code into a
system like this, and it's one of the reasons why I suspect the EU's proposal
to have automatic speed limiters built into cars will either fail, or people
will just keep disabling them altogether, making the whole idea pointless.

[0] Here it would think the side road has a speed limit of 50mph, because it
would read the wrong sign:

[https://www.google.pl/maps/@54.9929676,-1.5788994,3a,75y,78....](https://www.google.pl/maps/@54.9929676,-1.5788994,3a,75y,78.69h,85.53t/data=!3m6!1e1!3m4!1ssEpcslIqt97nfVrtZtQYhA!2e0!7i13312!8i6656)

[1] Here for example the 20mph sign is placed at an angle to the road, so the
car would read it going past it - and set the adaptive cruise control to 20mph
which is incorrect for this street:

[https://www.google.pl/maps/@55.0074774,-1.615172,3a,41.3y,90...](https://www.google.pl/maps/@55.0074774,-1.615172,3a,41.3y,90.87h,86.39t/data=!3m6!1e1!3m4!1s5WsAtaR3hv8QI1b5PG371w!2e0!7i16384!8i8192)

Same here:

[https://www.google.pl/maps/@55.0071611,-1.6214135,3a,75y,240...](https://www.google.pl/maps/@55.0071611,-1.6214135,3a,75y,240.68h,82.02t/data=!3m6!1e1!3m4!1s9NH6vTDZaU5smuIYopqBMQ!2e0!7i16384!8i8192)
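For what it's worth, the built-up-zone rule as described in this comment (default 50 km/h, an explicit sign can raise it, and the limit reverts at the next intersection) can be sketched as a tiny event-replay function. This is purely illustrative, based only on the rules as stated above, not on any country's actual traffic code or any production system.

```python
# Sketch of the built-up-zone rule described in the comment above:
# default 50 km/h, an explicit sign may raise (or lower) the limit,
# and the limit reverts to 50 km/h after any intersection.

DEFAULT_BUILT_UP_LIMIT = 50  # km/h

def current_limit(events):
    """Replay road events in order and return the effective limit.

    events: list of ("sign", kmh) or ("intersection",) tuples.
    """
    limit = DEFAULT_BUILT_UP_LIMIT
    for event in events:
        if event[0] == "sign":
            limit = event[1]                 # explicit sign overrides, even upward
        elif event[0] == "intersection":
            limit = DEFAULT_BUILT_UP_LIMIT   # reverts after any intersection
    return limit

# A 70 sign raises the limit, then an intersection silently drops it back:
print(current_limit([("sign", 70)]))                     # 70
print(current_limit([("sign", 70), ("intersection",)]))  # 50
```

The hard part, as the comment notes, isn't this logic; it's that the car has no reliable way to know which merging roads legally count as intersections.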

~~~
tomp
Not to mention that "speed limit recognition" is a huge waste of resources in
the era of GPS. If countries want that, they should just keep an up-to-date
database of roads with their speed limits, and update that appropriately.
Alternatively, companies could drive cars around (like Google does for its
Maps), run speed limit recognition, do manual adjustments, and upload that to
their online maps.
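The database approach suggested here is, at its core, just a keyed lookup against map data; the failure mode shifts from misread signs to stale or missing entries. A minimal sketch (segment IDs and limits are invented for illustration):

```python
# Minimal sketch of the map-database approach: look up the speed limit
# by road segment instead of reading signs with a camera.
# Segment IDs and limits below are hypothetical.

from typing import Optional

SPEED_LIMITS = {              # hypothetical: segment_id -> limit in km/h
    "A1-seg-0412": 140,
    "A1-exit-17": 80,
    "city-elm-st": 50,
}

def limit_for(segment_id: str) -> Optional[int]:
    """Return the mapped limit, or None if the segment is unknown.

    A None here is exactly the stale/missing-data problem raised in the
    replies: construction zones, smart motorways, outdated maps.
    """
    return SPEED_LIMITS.get(segment_id)

print(limit_for("A1-seg-0412"))    # 140
print(limit_for("unmapped-road"))  # None
```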

~~~
DFHippie
You still need to deal with temporary speed limits in construction zones.

~~~
maccard
And smart motorways

------
gibolt
It doesn't seem to have been mentioned here that this happened over a year
ago, in March 2018.

I believe the time between each crash is lengthening, even as the number of
cars with AP balloons. That would speak to regular improvement.

~~~
taneq
Maybe they meant to post this one?

[https://www.theverge.com/2019/5/16/18627766/tesla-autopilot-...](https://www.theverge.com/2019/5/16/18627766/tesla-autopilot-fatal-crash-delray-florida-ntsb-model-3)

~~~
unityByFreedom
No, 2018 is the model of the car. Commenter above misread the first sentence.

Accident happened on March 1, 2019, and the car was a 2018 Model 3.

> Tesla Inc’s Autopilot system was engaged during a fatal March 1 crash of a
> 2018 Model 3 in Delray Beach...

------
pwncake
Every incident is obviously tragic, but statistically, given the number of
Teslas on the road, is self-driving roughly on par with the expected rate of
deadly accidents in manually driven cars? This is still a nascent technology
with a lot of promise, and it is naive to expect that there will never be a
fatality or issue.

~~~
notahacker
Statistically, Teslas are much, much more dangerous than other new cars in the
same price bracket.[1] To be fair, I suspect the driver demographics skew
younger, more male, and more urban than most luxury cars, and that has at
least as much impact as any actual driving characteristics, but the statistics
really don't bear out the _people killed by glitches or systematic flaws in
Tesla's software pale in comparison with those saved by it_ line of defence.

I find it truly remarkable how on the one hand HN can rage for days over
Boeing introducing software with flaws that killed people and on the other
hand when it's Tesla's software flaws killing people in a similarly repeatable
manner, the general consensus veers towards deaths from poor software
implementation not mattering nearly as much as the potential of the
technology...

[1][https://medium.com/@MidwesternHedgi/teslas-driver-fatality-r...](https://medium.com/@MidwesternHedgi/teslas-driver-fatality-rate-is-more-than-triple-that-of-luxury-cars-and-likely-even-higher-433670ddde17)

~~~
taneq
> Statistically, Teslas are much, much more dangerous than other new cars in
> the same price bracket.[1] To be fair [...]

To be fair, how many of the other new cars in the same price bracket and
category class (midsize sedan, large sedan) can be ordered with factory 700+hp
and sub-3.3s 0-100km/h times?

I haven't seen numbers but I'd suspect that the vast majority of Tesla
fatalities are due to the fact that the cars are stupidly much faster than
anything new Tesla drivers will have driven before.

~~~
leetcrew
the model s with the performance characteristics you mention starts around
$90k (or $110k with the ludicrous mode option). at this level you can start
getting into 911s, BMW m4/m5, or the highest trim American muscle cars. these
cars have comparable acceleration figures to the performance model s, and the
sports cars will thrash it in any situation that doesn't involve 0-60 in a
straight line.

edit: I reread your post and realized you are talking about midsize sedans and
larger. Mercedes amg e63 is a better counterexample. most of the cars I
mentioned don't have 700hp either, but you don't need it when you start with a
500-1000 lb weight advantage.

~~~
taneq
Part of the discrepancy is also that I'm from the Australian car market and
our prices for fast cars are insane compared with the U.S. so my baseline is
skewed. :/

Also a P3D is much cheaper for similar performance
([https://forums.tesla.com/forum/forums/p3d-base-price](https://forums.tesla.com/forum/forums/p3d-base-price)),
and the 3.3s 0-100 mentioned was for a P3D+, not an S100D.

~~~
leetcrew
oh okay, I shouldn't assume everyone can buy cars at the US price. still if
you move down to BMW m3 or amg c63, price and performance are comparable to
p3d+ (though the Tesla does have a 0.3 - 0.7 lead in straight line
acceleration).

I think your point still has some merit though. it could be that fast and
quiet EVs don't provide as much sensory feedback to say "hey guy, you're
actually going really fast". it could also be that people who are not usually
interested in performance ICE vehicles might go for a performance EV for the
novelty.

------
miguelmota
Elon needs to turn it down a notch on marketing the Autopilot system as a full
self-driving option, or he risks being seen as the Theranos of automobiles.

~~~
mimixco
And of space exploration, transit tunnels, lithium batteries, and solar
panels!

~~~
neural_thing
He already is the Theranos of solar panels. He promised a fake product (the
solar roof) to bail himself and his family out of a failed company.

~~~
zaroth
Except that they have actually installed solar roof on a few dozen houses as
test cases while they continue to refine their product.

Theranos never had a functioning product; they used traditional equipment
secretly behind the scenes.

Is this house secretly using solar panels?

[https://youtu.be/zMR10vbo3A0](https://youtu.be/zMR10vbo3A0)

Tesla wants to iterate on Solar Roof a few more times to increase yield and
performance before they install it on 10,000 houses with a lifetime glass
warranty and a 30-year weatherization warranty.

~~~
neural_thing
The value proposition is fake, not the technology. Solar roof tiles have been
commercially available since 2005, they are just too expensive.

Musk promised that their roof tiles would be cheaper than regular roofs. I
have seen zero evidence of that. The roofs on the houses you mention cost WAY
more than regular roofs.

~~~
unityByFreedom
Precisely. Musk is a con artist, none of his businesses are or ever will be
profitable. He exists to facilitate the flow of money from middle class to the
powerful few.

~~~
zaroth
This is just plain libel. SpaceX is profitable. Tesla has been profitable on a
quarterly basis.

------
jannyfer
I always wonder if the Autopilot software has an exception that lets it drive
through white rectangular objects in the middle of the road, because tunnel
exits can look like a white rectangle to a low dynamic range camera.

~~~
kylecordes
I suspect the real problem will turn out to be extremely deep and difficult:
human drivers drive not only with great pattern recognition around shapes, but
also with an actual understanding of the idea of a truck, the idea of a
tunnel, and the idea of a sign. There is a substantial chance it will turn out
that full self-driving is a fully general, human-level AI problem.

That said, even the existing partial self-driving is extremely useful. I would
be hard-pressed to consider another brand of vehicle until the others catch up
with this.

------
zaroth
> _The NTSB’s preliminary report said the driver engaged Autopilot about 10
> seconds before crashing into a semitrailer, and the system did not detect
> the driver’s hands on the wheel for fewer than eight seconds before the
> crash._

> _Tesla said in a statement that after the driver engaged the system he
> “immediately removed his hands from the wheel. Autopilot had not been used
> at any other time during that drive.”_

It was obvious from the crash that autopilot was engaged and the driver was
distracted. It's heart-wrenching to think the system had only just been engaged
mere seconds earlier.

It seems like the driver had something happen in that 10 seconds and thought
they could attend to it with their eyes off the road.

I hope one day they get the system to the point where it can safely be used
like that.

~~~
dahdum
I’ve read many times that Tesla owners say the warnings pop up even when their
hands are on the wheel, if they have a light touch. Has Tesla fixed this issue
yet?

~~~
zaroth
On the Model 3 at least I believe the sensor is based on torque. You have to
apply turning pressure to the wheel. Just enough so it knows you’re there, not
too much to disengage AP.

I find jiggling the volume wheel or the cruise speed wheel on the steering
wheel up/down a click is more reliable because that won’t ever disengage AP.

I have no idea what they mean “did not detect hands on the wheel” because it
doesn’t detect your hands on the wheel at all. It detects steering input on
the wheel.

~~~
mehrdadn
> I have no idea what they mean “did not detect hands on the wheel” because it
> doesn’t detect your hands on the wheel at all. It detects steering input on
> the wheel.

Wow! Really?! That sounds like one hell of a lie on Tesla's part... cynical
part of me wonders if they deliberately avoided trying to detect hands for
liability reasons?

~~~
reubenswartz
Hands on wheel doesn't really mean anything. You could be asleep. Applying a
slight torque to the wheel or nudging one of the buttons requires conscious
action, unless you deliberately hack it. (From this thread it sounds like
there are some tricks that work, if you don't particularly value your life or
the lives of others on the road.)

That said, Tesla's PR often seems misleading at best.

------
tedsanders
I don't understand why a person would say his cars will be fully self-driving
within a year when they still run into semi trailers. I guess it's because he
wants people to give him money. It's sad from all directions.

And let me pre-empt the replies by saying yes, what matters in safety is the
statistics, not a single data point. And let me pre-empt the replies to the
replies by saying yes, but the statistics on self-driving cars are not at all
encouraging.

I suspect Autopilot's running-into-semis rate is worse than humans', for three
reasons:

(1) Tesla has been cagey about their Autopilot safety stats. If Autopilot was
clearly safer, I think they'd be less cagey. Absence of evidence is evidence
of absence. This is just speculation, of course.

What concrete data they have shared is summarized here:
[https://www.tesla.com/VehicleSafetyReport](https://www.tesla.com/VehicleSafetyReport)

Teslas with Autopilot on crash roughly once every 3 million miles. To their
credit, that's really good! However, they don't say: (a) how often Teslas
crash on freeways with Autopilot off, or (b) how often drivers have to take
control to prevent Autopilot from crashing.

(2) The fatal crash in 2016 was also with a semi trailer. Two of the four-plus
fatal crashes sharing this failure mode suggests its rate is especially high
for their vision-based system. Not a good look.

(3) Pretty much all self-driving companies have disengagement rates way worse
than humans'. Waymo is leading the pack, by far, at something like once every
10,000 miles in California. There are tons of caveats in interpreting these
numbers, and they should probably be treated as an upper bound, since it's at
the discretion of the company to self-report only disengagements that would
have led to crashes, I believe. But human crash rates are something like once
every 300,000 miles. So even lidar-based systems are an order of magnitude
away from humans, by one super rough measure.
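Making that comparison explicit, using the figures exactly as stated in this comment (they are rough, self-reported numbers, not ground truth):

```python
# Back-of-envelope check of the comparison above, with the comment's
# own rough figures.

waymo_miles_per_disengagement = 10_000   # per California reporting, as cited
human_miles_per_crash = 300_000          # rough human crash rate, as cited

ratio = human_miles_per_crash / waymo_miles_per_disengagement
print(ratio)  # 30.0 -- roughly an order of magnitude (and change)
```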

Lastly, a small correction to the title: this is the third known fatal US
Autopilot crash. I believe there was a suspected fatal Autopilot crash in
China some years back as well, and I'm not sure about other countries.

~~~
nick_kline
Apparently many or all of the current self-driving cars don't do well at
avoiding fixed (i.e. stopped) physical objects. It must mean that, in the way
they detect things, they have to discount stopped objects, and 'moving
objects' like cars are detected in a different way. I'd love to hear more
background about why this is the way it is.

I agree that when a Tesla can't handle this kind of situation, it really
challenges the idea that we'll have total self-driving in a year. There must
be a lot of software under testing that could be more reliable than what they
have today, yet the issue is still there. The last part (5 or 10%) of regular
software dev is really hard, and it should be even harder for self-driving
software that has never yet been approved or shown to be safe enough.

~~~
ygra
As far as I remember, it is because radar gives you too many points that are
usually false positives if you don't discard stopped objects (which include
signs, guardrails, and anything on the border of the road, which you'll be
driving right _at_ in a corner, after all). This is something that ACC in all
radar-using cars does, AFAIK. AEB works differently in that stopped objects
_are_ tracked, but it only works well at slower speeds. If I had to guess,
it's because they have to track a potential stationary object over time to
determine that, yes, it's an object that concerns us, it's in the way, and
it's not just noise.

In the case of the first trailer crash, it apparently looked similar to an
overhead sign to the sensors; also, approaching a stopped object would be AEB
territory, not ACC, and AEB doesn't work well at higher speeds.
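The stationary-target filtering described above can be sketched in a few lines: radar reports closing speed (Doppler), and anything whose inferred ground speed is near zero is indistinguishable from signs and guardrails, so classic ACC drops it. Numbers and structure below are illustrative, not any vendor's actual code.

```python
# Sketch of stationary-target filtering in radar ACC. A stopped object
# closes at exactly our own speed, so its ground speed works out to ~0
# and it gets discarded along with signs, guardrails, and bridges --
# including, fatally, a stopped vehicle in our lane.

EGO_SPEED = 30.0       # m/s, our own speed (assumed)
STATIONARY_TOL = 1.0   # m/s tolerance for "not moving" (assumed)

def is_trackable(closing_speed: float) -> bool:
    """Keep a target only if its ground speed is meaningfully nonzero."""
    ground_speed = EGO_SPEED - closing_speed
    return abs(ground_speed) > STATIONARY_TOL

print(is_trackable(closing_speed=5.0))   # True: lead car doing ~25 m/s
print(is_trackable(closing_speed=30.0))  # False: stopped object, discarded
```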

~~~
gamblor956
Step 1: identify _where_ stopped objects are relative to the sensor.

Step 2: if the stopped object is not immediately in front of the car, ignore
it; otherwise, ANALYZE IMMEDIATELY, because a stopped object is right in front
of the car.

It's not rocket science, and it's something so ridiculously basic that Tesla's
failure to do something this simple borders on gross criminal negligence. It's
something their competitors already do; in fact, Waymo's problem is that it's
_too_ sensitive to stopped objects in front of the vehicle, hence the Waymos
constantly stopping and starting or driving very slowly... but that's the
better and safer way to do it, especially when you're testing on real streets
with third-party vehicles.

~~~
tapland
It's an issue that needs to be fixed, but it's not extremely simple.

The car always has still-standing objects in front of it (the road, anything
around the road, barriers, anything on opposite sides of barriers, things in
the direction of the car when the road is turning, etc.).

The car has to know that the road doesn't slope up over the truck or go
around it. I'm guessing that's where the split-barrier failures occur.

It really needs to create a real-time 3D map of the entire environment
(including what's visible from eye height, not only from the front of the
hood) and map a route through that.

~~~
gamblor956
_The car always has still-standing objects in front of the car (the road,
anything around the road, barriers, anything on opposite sides of barriers,
things in the direction of the car when the road is turning, etc.)_

Right, none of which are _directly in front of the car_ when the objects and
car are plotted in 3D space. Easy (but not trivial) to do with LIDAR, but not
yet doable with purely visual systems like "Autopilot." Which is my point, and
which is the reason every other self-driving car company uses a combination of
sensors to get data about the car's surroundings.
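The "directly in front of the car" test can be sketched as a corridor check in the car's own coordinate frame (a 2D simplification; real systems must also handle road curvature and elevation, which is exactly where the naive version breaks down, as the reply above notes). All constants are assumptions for illustration.

```python
# 2D simplification of "is this stopped object directly in our path",
# in the car's frame: x metres forward, y metres to the left.
# Ignores curvature and slope -- the hard parts discussed above.

CAR_HALF_WIDTH = 1.0   # metres, assumed
LOOKAHEAD = 100.0      # metres, assumed

def in_forward_corridor(x: float, y: float) -> bool:
    """True if point (x, y) lies in the straight-ahead corridor."""
    return 0.0 < x <= LOOKAHEAD and abs(y) <= CAR_HALF_WIDTH

print(in_forward_corridor(40.0, 0.2))  # True: object dead ahead
print(in_forward_corridor(40.0, 3.5))  # False: parked off to the side
```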

------
paulsutter
What about those non-autopilot vehicles?

- Nearly 1.25 million people die in road crashes each year, on average 3,287
deaths a day.

- An additional 20-50 million are injured or disabled.

From [https://www.asirt.org/safe-travel/road-safety-facts/](https://www.asirt.org/safe-travel/road-safety-facts/)

~~~
camjohnson26
This idea gets brought up every time autopilot is discussed here, but it's not
really relevant. If autopilot were safer than human drivers there would be an
argument for allowing it, but there's no evidence that it is. Sure, Tesla
claims that it is, based on very specific, cherry-picked internal metrics, but
independent analysis shows there's just not enough data to say. Also, the
class of errors that autopilot makes is completely different from a human's.
Most human car crashes are preventable, caused by either driving under the
influence or distracted driving. I would much rather be in a crash with a
drunk driver where liability was clear than have my car drive underneath a
tractor trailer in the middle of a clear day just because I hit some neural
network edge case.

~~~
Faark
> I would much rather be in a crash with a drunk driver where liability was
> clear than have my car drive underneath a tractor trailer in the middle of a
> clear day just because I hit some neural network edge case.

Preferring to die by human error instead of shitty software doesn't make
logical sense to me, and it feels like you're trying to justify something
irrational. Autopilot either drives more safely than me or it doesn't. And I
wouldn't let a drunk friend drive me either.

> but independent analysis shows there’s just not enough data to say

Do you have a link to a trustworthy analysis? I can't remember reading one
that wasn't from a highly polarized source.

~~~
EthanHeilman
>Preferring to die by human error instead of shitty software doesn't make
logical sense to me and feels like your trying to justify something
irrational.

The poster, camjohnson26, did not say _die_, the poster said _crash_.

Camjohnson26's point as I understood it was to compare two situations:

1. A crash that occurs because one of the drivers was extremely drunk.

2. A crash that occurs because an autopilot in one of the vehicles did not
take the correct action.

In the first case most people and the legal system will likely conclude that
the drunk driver was at fault. In the second case it will be more difficult to
determine who was at fault. This second case may add legal, insurance and
ethical complications that camjohnson26 wishes to avoid. For instance what if
your insurance decides you are at fault for activating the autopilot under
unsafe conditions?

~~~
Domenic_S
> _what if your insurance decides you are at fault for activating the
> autopilot under unsafe conditions?_

Seems reasonable to me. The driver is always responsible.

~~~
EthanHeilman
Which driver? How do you determine it was actually the autopilot that caused
the crash? What if it wasn't the autopilot?

------
josefresco
This morning I drove a new car with adaptive cruise control. Having never
experienced this tech before, I found it pretty amazing. It makes the cruise
control in my own vehicles seem crude and dangerous by comparison. I'd venture
to guess that if standard cruise control were introduced today, it would be
rejected by the public as too dangerous. Is there any data on how many
accidents occur with standard (non-adaptive) cruise control activated?

~~~
rodeoclown
Part of the safety of standard cruise control is the fact that it is clearly
crude and dangerous - it keeps the driver engaged.

~~~
josefresco
Good point. It was my first time, and as a result I was very engaged. But I
could imagine, as I got used to and came to depend on the feature, how I could
lose some of that attentiveness.

It was my first experience with lane departure warnings as well, which were
pretty neat but soon became a small distraction. I could see myself turning
that off if it were my daily driver.

The best feature of adaptive cruise control is how it keeps you a safe
distance from the car in front (about 4 seconds) - whereas if I was using my
own judgement, the spacing would probably be halved.

------
rayiner
I find it amusing that in marketing Musk is always like the “hands on policy
is just there to appease the regulators.” But after every fatal crash, the
first thing in the press release is “the driver took his hands off the wheel!”

~~~
mehrdadn
Would you have a link on the first one? Would be nice to keep around for
later!

~~~
Donald
This is precisely their messaging on the embedded vimeo video on
[https://www.tesla.com/autopilot](https://www.tesla.com/autopilot)

"The person in the driver's seat is only there for legal reasons"

[https://i.imgur.com/74XHPiL.png](https://i.imgur.com/74XHPiL.png)

In fact, the entire video shows the driver with their hands off the wheel.

~~~
berberous
This is misleading. That video is demonstrating Tesla's "full self-driving"
feature, which has not yet launched. You can future-proof your car by paying
for it now, but Tesla is clear that FSD is still forthcoming and that their
current cars are not fully autonomous.

I think you can rightfully give them shit for calling the current gen
"autopilot", and think they're full of shit for claiming FSD is as close as
they say, but this video is not intended to suggest you can operate the
existing car in that manner.

~~~
Donald
You should look at things from the perspective of someone not familiar with
Tesla's product line. Sure, if you own or interact with Teslas, the feature
set of autopilot vs. FSD is clear.

But if you're a newbie, you just Googled "tesla autopilot" and clicked on the
first result. You see that Tesla has the "future of driving" - that must be
autopilot, right? You scroll down and watch the video. There's a notice that
people are optional and a guy being driven around hands-free.

The messaging is painfully clear - autopilot does not require hands - and the
feature sets of FSD vs. autopilot are conflated in a very misleading way on
that page in general. If you scroll down (albeit under a heading that
describes FSD), you see "The system is designed to be able to conduct short
and long distance trips with no action required by the person in the driver's
seat."

Tesla's autopilot landing page is clearly designed to mislead unsuspecting
consumers into believing that autopilot is a self-driving capability.

~~~
fastball
How does one demonstrate a car driving itself (at any level) in an online
video if there is someone in the driver's seat with their hands on the wheel?

To the viewer, that would just look like a person driving a car.

~~~
romwell
>To the viewer, that would just look like a person driving a car.

...which is _exactly_ how current autopilot systems are _meant to be
operated_, if you listen to the post-crash analysis.

Which, yes, defies the whole point of having these systems entirely.

Which is why they would be hard to sell.

Which would be _right_.

~~~
zaroth
Every single manufacturer demos their cars with hands off the wheel in YouTube
videos explaining their lane keeping system.

~~~
romwell
They also label it as a _lane keeping system_, not a _self driving capability_.

The difference in labeling is what killed several people already.

~~~
fastball
Literally every person that has been killed has been aware that their Tesla is
not a fully autonomous vehicle, and has been aware of Tesla's rule that your
hands should remain on the wheel at all times.

------
DavideNL
Somewhat related... in The Netherlands the police tried to stop a man in a
Tesla tonight, who fell asleep drunk while driving on the highway in his Tesla
with the autopilot enabled:

(translated from Dutch to English)

[https://translate.google.com/translate?hl=en&sl=auto&tl=en&u...](https://translate.google.com/translate?hl=en&sl=auto&tl=en&u=https%3A%2F%2Fnos.nl%2Fartikel%2F2285141-politie-wekt-beschonken-tesla-bestuurder-op-a27-met-sirene.html)

------
freddref
How many Teslas are on the road, how many have been involved in fatal crashes,
and for how many of those crashes was autopilot turned on?

~~~
close04
How many times would the AP have crashed the car but the driver intervened?
Are these instances showing up in any statistic?

A student driver may never crash because the instructor is there to take over.
Would you say student drivers are good drivers based on that?

~~~
Stubbs
And how many times has AP avoided accidents the driver wouldn't have? The
stats aren't that simple any more, are they?

There are plenty more news articles, especially in local news, about
fatalities involving non autopilot cars, but they're not as interesting so
don't get shared as often.

~~~
close04
Such stats are never simple. But when important data is withheld they
definitely become irrelevant. They become more of a PR tool.

Statistically speaking 100% of people are able to pilot a plane with their
eyes closed and not crash it... as long as a qualified pilot is ready to take
over at any time. Such a stat is only valuable if you want to show the standby
pilot is useful.

What would you base your assumption that AP saved even 1 life on when that
data is not provided? All we know is that it crashed a number of times
exclusively when it wasn’t supervised as required. The pragmatic conclusion is
that it works _because_ it’s supervised. I imagine Tesla would release more
data if it supported the performance of the AP.

------
pooya13
I think it is an amazing achievement by Tesla that these hit pieces cannot
find more than a handful of incidents going back 3 years while there are half
a million Teslas out there. Their quarterly safety statistics are a great
strategy to counter the propaganda that “Tesla cars are unsafe” or that “Tesla
cars catch fire more often”.

~~~
Fins
If only Teslas killed fewer people than cars that sell in many millions. But
alas...

------
Shivetya
As the owner of a TM3, I have never believed any promises about the Autopilot
or Full Self Driving features. I did buy the upgrade to FSD when it became
available for two thousand dollars, as a hedge against resale value.

I do believe their camera-focused approach will work. However, if the YouTube
videos showing all the cameras in action, and the footage from our available
dash cam, are representative of the color richness, it might be an issue:
there may not be enough differentiation to properly see all objects in some
lighting conditions.

~~~
lolc
You'll soon get an option to reclaim those 2000. Or the option to join a
class-action suit seeking the same.

------
azhenley
It doesn't _renew_ the questions. The questions started before there was ever
a crash and continued without stopping!

------
scotty79
Does the autopilot beep annoyingly when you take your hands off the wheel,
just like any car does when you forget to fasten your seat belt?

And if not, why not?

------
notaki
I wonder how many people see this headline but don't realize this happened
over a year ago, when they also saw a similar headline for the same accident.
Combine that with constant headlines about Tesla vehicle fires, financial
troubles, production problems, and Elon's tweets, and you end up with a
terribly misinformed, skeptical public, hesitant to embrace the technology and
company the world desperately needs.

~~~
Fins
Sorry, but why does the world need, let alone desperately, any company? Let
alone as questionable an enterprise led by a dishonest sociopath as Tesla?

Unless Elon is going to release Real Soon Now super-special spectacles that
let you really see those reptiloid shortsellers from Nibiru that are
conspiring in their reptiloid lairs against all that is good (for Elon)? Maybe
bundle them with those flamethrowers?

------
kevin_thibedeau
Was there a whitelisted billboard in the vicinity?

------
DeonPenny
It's insane how negative even this community is about this tech. It's coming
regardless of regulation, because who's going to tell Tesla no once they add
traffic light and stop sign detection?

Actually, don't they already do those things in beta for some users?

~~~
BoorishBears
I've said it before.

I'm negative because Tesla has a track record of killing people willingly.

AP1's entire sensor suite was in other cars at the same time it was in Teslas.

Other companies _rightfully_ used those sensors for Lane Keep Assist, Adaptive
Cruise Control and Automatic Emergency Braking.

Lane Keep Assist does EXACTLY what it says on the tin. It ASSISTS in keeping a
lane. It will not automatically center in a lane. It will not let you let go
of the wheel.

It WILL save you in 99% of situations where one may _naturally_ be distracted,
like drowsy driving.

Adaptive Cruise Control does EXACTLY what it says on the tin. It is cruise
control... that adapts to traffic conditions. It will not stop the car by
itself. It will not lull you into a false sense of safety by trying to claim
it detects stopped objects. It will not give you the confidence to stare at
your phone in traffic.

Automatic Emergency Braking does EXACTLY what it says on the tin. It's a
backup that will only activate in a real emergency. It will precharge the
brakes opportunistically, but it won't activate until the last moment
possible. It instead alerts the driver BEFORE action is taken.

AEB from Mobileye alerted me to an overturned box truck on an interstate in
pitch darkness and heavy rain before I could visually confirm it.

-

Now you'll notice that if you just take these building blocks and increase how
often they engage to _continuous_, you can lie and say you have a rudimentary
self-driving car. I say lie because this level of autonomy is not what people
have in mind when they say "self-driving car".

And of course no one would do that.

And certainly no one would do that and then call it AUTOPILOT.

Except Tesla did. And people died.

And they're still keeping up the charade.

And that's (part) of why I went from a Tesla fanboy to a staunch hater of
Tesla and what they represent.

~~~
_Microft
If a driver uses a system based on information from advertisements instead of
reading the manual that comes with it, they might not be suited to drive _any_
car either because using a safety-critical system without knowing what its
limitations are is reckless and irresponsible.

~~~
BoorishBears
If that's the problem you took away from everything I said, Tesla should
extend you an offer.

They've fostered this culture of setting up a fundamentally inadequate system,
getting the user to use it with criminally optimistic language, then blaming
the user when the inadequacies, which they knew about, result in death.

Before the accident Musk will say:

"Your hands are only on the wheel for regulatory reasons"

To imply your hands are not on the wheel for safety reasons.

After the accident Tesla will say:

"Their hands were not on the wheel as they should have been to safely use our
product"

Rinse and repeat.

-

Honestly what's impressive is how many people on here condone this.

Imagine a plane's autopilot suddenly banked into a mountain. It wouldn't
matter if both pilots had been sitting in the back of the plane chatting up
passengers, no one would hesitate to ground hundreds of planes until the cause
was fixed.

The NTSB would not release a report saying "As long as autopilot kills fewer
people than manual flying, it's fine!"

Yet every time this happens apologists come pouring out of the woodwork to
parrot this stuff. It's honestly getting a little disturbing.

~~~
hi5eyes
totally not a cult

------
GiorgioG
Boeing should buy Tesla, they seem to be of the same mind as it pertains to
safety.

------
thatoneuser
How many millions of miles have these cars self-driven, and people lose their
minds over a handful of deaths. I mean, don't get me wrong - it's death and
I'm an atheist. It's a big deal. But how many deaths did people incur in the
first years of cars? Hell, you could die just from the crank on the front
kicking back and crushing your skull - and that was just to turn the thing
on.

People take risks when new tech comes out. If they wanna be the Guinea pig
then let them do it. Everyone sees musk as two faced and playing both sides
but honestly - if you're an adult paying for a tesla then I'm fine with it
being in your hands to decide if you're willing to roll the dice. Marketing or
not, no one should be stupid enough to think ANY device has a 0% chance of
killing you, let alone a computer driving.

~~~
cromulent
I've told this story here before, but I've had an oncoming Tesla change lanes
onto my side of the road and drive right at me. Fortunately there was time and
space for me to stop and the Tesla driver to take control again. It could have
been very different.

Why should I "roll the dice" with me and my family?

And no, they weren't overtaking, etc - if you need more details I can provide
them.

~~~
cheeko1234
I'd like more details if you're willing.

~~~
cromulent
Driving down an 80km/h straight back road, single lane each way, with a car
and truck following me. The lanes are free of snow but there is old ice/snow
packed on the center line and some small drifts of minor snow, so the white
lines are obscured in places. Two teenagers in the back of my car.

Coming towards me is a Tesla model 3, followed by a Peugeot.

The Tesla neatly lane changes into my lane and drives straight at me. Closing
speed is around 160km/h. I hit the brakes and brake to a halt. The Tesla
driver seems to take control at some point as the car first heads back to his
side, but doesn't change back into his lane (maybe due to the Peugeot), then
moves across my lane to the narrow shoulder on my side of the road, while
braking fairly heavily (as was I).

We both reduce speed quickly enough that we aren't going to crash. The
vehicles behind me have also done so. As we pull level with him (facing the
wrong way on my shoulder) I can't see the driver as I am in an SUV and he is
low down, opposite side of the vehicles of course. He has stopped. I continue.
The car behind me stops to check on him.

The teenagers in my car saw the whole thing and are exclaiming about it.

I guess maybe his car was seeking the white line and couldn't find it in the
snow, so it hunted into my side. I don't know though.

