
Tesla that crashed in Autopilot mode sped up before hitting truck: police - pmoriarty
https://www.theguardian.com/technology/2018/may/24/tesla-that-crashed-in-autopilot-mode-sped-up-before-hitting-truck-police
======
cameldrv
Tesla "Autopilot" cannot see stationary objects. It has a radar, but it is 1-D
and low resolution. It uses this to localize other moving cars. In theory, it
could use this to see a highway divider or firetruck it's about to hit, but
since it's 1-D that would also mean it would have to slam on the brakes
whenever it sees an overpass coming up, because it can't tell the difference
between an overpass and a highway divider or a firetruck. It can assume that
overpasses aren't driving at 60mph, so it will see other cars. The Tesla
algorithm is "look for lane markings and try to stay between them, and don't
hit any moving vehicles." If there is any stationary object, including another
vehicle in your lane, the "Autopilot" will plow right into it. If they're
not explaining this to customers, it's negligent.
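The filtering trade-off described above can be sketched in a few lines. The numbers, threshold, and function are illustrative assumptions, not anything from Tesla's actual stack:

```python
# Toy sketch of the stationary-return problem described above.
# A 1-D automotive radar measures range and relative (Doppler) speed per
# return, but not elevation, so an overpass and a stopped firetruck can
# produce near-identical readings. All values here are made up.

EGO_SPEED = 27.0  # m/s, roughly 60 mph

def is_tracked(target_range_m, closing_speed_mps, ego_speed=EGO_SPEED):
    """Keep only returns that are themselves moving.

    A return closing at exactly our own speed is stationary in the
    world frame; it could be a bridge, a sign, or a stopped truck,
    so a naive filter drops it to avoid braking at every overpass.
    """
    world_speed = ego_speed - closing_speed_mps
    return abs(world_speed) > 1.0  # moving object: track it

print(is_tracked(80.0, 15.0))  # car ahead doing ~27 mph: True
print(is_tracked(80.0, 27.0))  # overpass OR stopped firetruck: False
```

The second call is the problem case: with only range and closing speed, the filter cannot distinguish a harmless overhead structure from a vehicle parked in the lane.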

~~~
biswaroop
But the Autopilot has cameras, and therefore it has 3+1D data (2D frames, 1D
inferred depth from stereo or training, 1D time from frame diffs). Even with
a 30fps frame rate at 60mph, two consecutive frames from a video feed would
see the truck move closer by about a meter. The network can be trained to
distinguish a truck in the lane from an overpass, so those two frames should
be sufficient to signal danger. Instead, the car accelerated over 70 meters
before the human, again using just her eyes, applied the brakes.

This is an edge case, and clearly something that requires more training to
avoid. The challenge in edge-case learning is to encounter them often enough,
or perhaps create them artificially, and then to ensure the learning can be
transferred and generalized to all similar situations.
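The frame-to-frame arithmetic in the comment above is easy to check; this sketch just plugs in the speed and frame rate quoted there:

```python
# Back-of-the-envelope check: how far does a stopped truck close on the
# camera between two consecutive video frames?

MPH_TO_MPS = 0.44704  # exact miles-per-hour to metres-per-second factor

def closure_per_frame(speed_mph, fps):
    """Metres the gap to a stationary object shrinks per frame."""
    return speed_mph * MPH_TO_MPS / fps

gap = closure_per_frame(60, 30)
print(f"{gap:.2f} m per frame")  # ~0.89 m, so about a metre over two frames
```

So a pair of frames 1/30 s apart really does see roughly a metre of closure at highway speed, consistent with the claim.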

~~~
pavs
This is not even the first time they hit a giant red stationary fire truck.

~~~
biswaroop
Seems like it happens whenever a car in front moves out of the way. It goes
full stupid-cruise-control mode and just accelerates to the stored speed. Of
course there's a lot of noise in terms of things in the stationary reference
frame, but still, big red firetrucks don't exactly look like lane markings.

At the very least, as a driver, I'm slightly cautious when I see a car move
out of the lane in front. I feel like the solution lies in: (a) separating the
signal from the noise in the set of things moving with the ground, and (b)
getting the car to 'understand' why other cars change lanes. Sometimes it's so
they can avoid obstacles in the road. It almost feels like cars need to have a
theory of mind of other cars. I certainly drive with an intuition of driver
intentions stuck in my head.

~~~
pavs
The thing is Tesla will never be an autonomous car, or even a reliable semi-
autonomous one, with its current hardware. Elon needs to admit it (he insists
this isn't the case) and he needs to convince his customers of the same. To
drive safely, Tesla needs to see a couple of cars ahead (these are not edge
cases, this happens in real life a lot), but it can't see that far ahead
without LiDAR, and Elon insists they don't need LiDAR.

The problem is you can't have a pre-existing bias when working with this kind
of technology, you need to be able to adapt to whatever changes that need to
be made.

~~~
biswaroop
Well, it's not completely clear to me that cameras can't look a few cars
ahead. If we can, cameras can too. Maybe the systems behind them aren't smart
enough yet, but I don't see the fundamental limitation. Apart from that, I
agree that cheap + effective LIDAR would help significantly to get autonomous
vehicles on the road now.

~~~
pavs
> If we can, cameras can too.

A camera is not a replacement for our eyes, or for the way our brain
distinguishes between objects, draws on prior experience, and makes
independent decisions on the fly. Neither is LiDAR, but with LiDAR and good
machine learning (ie, Waymo), you can drastically improve the reliability of
your autonomous system.

Elon is insisting on a no-LiDAR system while at the same time advertising
Tesla Autopilot as an autonomous system, and he has said multiple times that
every Tesla on the road is already equipped with the hardware needed for
fully autonomous driving. This is dangerously false.

There is not a single autonomous car out there (at least among the serious
ones that we know of) that uses a non-LiDAR system.

The Reality Distortion Field is strong with Elon. Unlike Steve Jobs and Apple
products, this one will kill people.

~~~
nmca
So I do a lot of work with both vision and lidar data, and am pretty
intimately familiar with the public state of the art in perception systems
that rely on each or both, if not planning.

I don't think you need lidar.

I'm not saying that vision-only won't take longer, because it will, but I
believe it'll probably happen unless the cost of lidar falls a great deal.

Have a look at, for example, "learning to see in the dark" (the CVPR 2018
one), DensePose (the FB research one) to get a sense of how quickly difficult
problems are being solved.

~~~
pavs
The cost of LiDAR has been falling for a while now:
https://arstechnica.com/cars/2018/01/driving-around-without-a-driver-lidar-technology-explained/

Waymo's LiDAR is custom made and costs a lot less than what is available on
the market:
https://arstechnica.com/cars/2017/01/googles-waymo-invests-in-lidar-technology-cuts-costs-by-90-percent/

I don't think it's about cost. Elon has a personal bias, or maybe he didn't
account for the fact that in the not-so-distant future the cost of LiDAR
will go down drastically.

------
tobyhinloopen
Tesla has no autopilot or self-driving capabilities. It just has what other
vehicles call Adaptive Cruise Control and Lane Assist, with some minor
additions. Tesla’s marketing gives the impression that it is much more, but
it isn’t.

~~~
bjelkeman-again
After talking to quite a few drivers and driving thousands of miles on
Autopilot myself, I don’t think people are confused by the name. It is pretty
clear very quickly what the car can or can’t do. I think people are poor at
estimating risk, and we would have these accidents whatever we call these
semi-autonomous features. The interesting thing for me is that despite the
shortcomings of the current Autopilot, I wouldn’t go back to a car without
it. It lowers the strain of driving enough to really be worth it. And I am a
driver with a hand on the wheel and eyes on the traffic, all the time.

~~~
chrisper
Then tell me: why do we hear news about Tesla cars all the time, and none
about VW, BMW, etc., which have had adaptive cruise control for quite some
time?

~~~
Arn_Thor
Why do we see headlines every time an iPhone spontaneously combusts, but
nothing when there are (isolated) incidents with Samsung, Huawei, LG etc.?
Because one brand generates clicks and the others don't.

~~~
jonhendry18
Samsung phone combustion gets coverage. You may recall that a while back they
had to recall a phone because units kept bursting into flames, and some
airlines specifically banned the combusting model.

~~~
Arn_Thor
I had that in mind, which is why I specified isolated incidents

------
blackrock
I am now starting to fear being in front of a Tesla on the freeway, or on the
streets.

Why? Because the idiotic driver might not even be paying attention to the
road, and is spacing out and looking at the clouds, or at his phone, instead
of focusing on the car in front of him or the surrounding traffic.

These Tesla crashes are not only a danger to the Tesla occupant; they are
even more dangerous to the victim the Tesla hits, who can now suffer
whiplash, or worse.

~~~
dirktheman
The trouble is, I see a lot of drivers that aren't paying attention to the
road, who are dazing off, who are looking at their phones/clouds whatever. The
majority of them aren't driving Teslas. At least with a Tesla I know there's a
backup system (the 'autopilot').

I'm not saying the Autopilot is 100% faultless/reliable. I am sure however
that humans have a much higher chance of handling a situation badly than the
autopilot, simply because they're human and humans do stupid shit.

~~~
blackrock
Let's take a scenario. This seems to happen all the time on the freeway.

You're cruising down the freeway at 75 MPH. All of a sudden, traffic comes to
a screeching halt.

There are 3 cars in this scenario. The first car is the one that slams on his
brakes and comes to a stop.

The second car cannot stop in time, so he swerves out of the way and safely
gets into the adjacent lane. Close call, but he drives away safely.

The third car is the Tesla, with the driver on Autopilot and browsing
Facebook on his iPhone. So, he is inattentive.

The whole sequence plays out in less than 5 seconds.

In this scenario, the third car, the Tesla, will not recognize that the
second car swerved out of the way, so the Tesla will collide with the first
car at high speed. The velocity of the impact kills all occupants of the
first car and of the third car, the Tesla. So, if both cars are full of
occupants, then maybe up to 10 people are now dead.

The problem is that the Tesla does not have situational awareness. It cannot
read the current road condition. It cannot see through the tinted windows of
the 2nd car, to see that the first car has slammed on his brakes.

So the severity of the accident with the Tesla is much greater than in a
scenario where a human was driving the third car.

In this scenario, I think the human might be able to see the situation and
attempt to avoid it, or at least mitigate its severity. He would slam on his
brakes, or steer out of the way, or something. He might still collide with
the first car, but probably at a much lower velocity, one that all occupants
of both cars could survive.

It just seems that the severity of accidents with current robotic cars is
greater than when a human is the driver.
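For a rough sense of scale in this scenario, here is a sketch of textbook stopping distances; the deceleration and reaction-time figures are generic assumptions, not measurements from any real crash:

```python
# Rough numbers for the three-car scenario: how much room does car three
# actually need? Assumes hard braking at ~8 m/s^2 on dry pavement and an
# attentive driver's ~1.5 s reaction time; both are typical textbook
# values, not measured ones.

MPH_TO_MPS = 0.44704

def stopping_distance(speed_mph, reaction_s=1.5, decel=8.0):
    """Total distance covered: reaction-time travel plus braking travel."""
    v = speed_mph * MPH_TO_MPS
    return v * reaction_s + v * v / (2 * decel)

print(f"{stopping_distance(75):.0f} m")       # ~120 m for an attentive driver
print(f"{stopping_distance(75, 5.0):.0f} m")  # ~240 m if distracted for 5 s
```

Under these assumptions, being heads-down for the full 5-second window roughly doubles the distance needed, which is why the inattentive third car arrives at the pile-up at nearly full speed.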

~~~
chc
Now let's add a fourth car, a Toyota without TACC, whose driver is also
browsing Facebook on his phone. This driver also does not recognize the
situation and will collide with the first car at high speed. Is this worse
than the third car? No, probably not. In short, the assumption that a human
driver without Autopilot is paying attention and will swerve out of the way
is not a valid one. People drive distracted in traditional cars _all the
time_.

BTW, from what I have heard, Teslas might actually be better at seeing through
other cars than human drivers are, since radar can bounce in ways visible
light can't.

~~~
jonhendry18
A Toyota driver probably wouldn't put that much faith in a mere Toyota. Tesla
drivers who are high on Elon's farts put that much faith in their cars.

> BTW, from what I have heard, Teslas might actually be better at seeing
> through other cars than human drivers are, since radar can bounce in ways
> visible light can't.

Or maybe not, since a Tesla couldn't even see a stopped fire truck.

~~~
chc
People who drive normal cars are on their phones while driving _all the time_.
There's no "probably" about it — I know lots of people who have had their cars
hit by inattentive drivers. I myself got rear-ended by a guy who was honest
enough to straight-up admit that he didn't see me stop because he was
messaging on his phone.

Also, I'm not sure what connection you see between recognizing a stationary
fire truck and seeing two cars ahead with radar. They seem like pretty
different tasks.

------
aresant
I don't get it.

"Autopilot" aka adaptive cruise control is going to continue failing since
drivers think it's more than that.

More people are going to die in spectacular, headlining fashion.

Imagine if that was a schoolbus instead of a firetruck.

Even Tesla's "non-autopilot" crashes are front page news because it's such a
story.

Every front page story whittles away a little at the magical Tesla brand.

A brand, which in the midst of production issues, capitalization issues, and
leadership issues is their major asset that _could_ carry them to the other
side.

Turn the thing off if hands aren't on the wheel, end of story.

~~~
michaelmior
> Turn the thing off if hands aren't on the wheel, end of story.

I assume you mean that Tesla should do this automatically.

------
aylmao
It irks me that this feature is not only called Autopilot, but doesn't even
come with a "beta" label. Gmail was in beta for like 5 years.

You go to Tesla's website and it greets you with (as of the time this comment
was written):

"Full Self-Driving Hardware on All Cars"

"All Tesla vehicles produced in our factory, including Model 3, have the
hardware needed for full self-driving capability at a safety level
substantially greater than that of a human driver."

Which gives the impression this car can drive better than you can, and gives
little indication of the fact that the _software_ is in active development,
even if the hardware for "full self-driving" is already there.

------
megakid
As a regular Autopilot user, I hate reading stories such as this. Just last
night I had a 150 mile drive in non-perfect conditions (light rain and dusk)
and had Autopilot (v1, with a single camera) engaged for the vast majority of
it. It is probably my favourite feature of my Tesla. It is significantly
better than Audi’s lane assist features I have used and genuinely reduces
strain on long journeys. I now undertake journeys without hesitation
(sometimes looking forward to them, coupled with a supercharger stop to
recharge myself) when before I’d expect to arrive at my destination stressed
and exhausted.

However, I will admit that to use Autopilot safely and effectively you MUST
understand the system’s inputs, what its behaviour is based on those inputs
(and driver-controlled variables), and what its limitations are. I am a
software developer by trade and whilst I couldn’t hope to replicate the
Autopilot software stack, I could probably write a detailed pseudocode
representation of the system’s behaviour (without factoring in sensor
failure, etc).

~~~
yellow_postit
How much do they actually document those inputs, variables, and limitations,
versus you making assumptions about them?

------
burlesona
It seems fairly obvious to me that an imperfect autopilot is worse than no
autopilot at all, since it trains drivers to “relax” and let the computer
drive when some non-trivial percentage of the time the autopilot will get
things wrong.

It’s unfortunate that there’s no real way to build a perfect computer driver,
and that all the real-world training data these things are collecting is
needed to ever get close.

But if these systems keep making mistakes like this that seem “stupid” to
humans, I suspect these things will be banned before long.

~~~
MikkoFinell
I agree. Moreover, it's obvious to me that an imperfect human pilot is worse
than no human pilot at all.

------
haser_au
> The driver of the vehicle, Heather Lommatzsch, 29, told police she thought
> the vehicle’s automatic emergency braking system would detect traffic and
> stop before the car hit another vehicle.

> Police say car data show Lommatzsch did not touch the steering wheel for 80
> seconds before the crash. She told police she was looking at her phone at
> the time and comparing different routes to her destination.

This is not news. This is someone not following the explicit instructions of
the manufacturer to maintain control and awareness while driving. She chose
to ignore those instructions and it resulted in a crash.

Replace "Tesla" with "Audi/VW/Toyota/Holden/Ford/GM using adaptive cruise
control", and this doesn't make front page news.

aresant's comment below is spot on. "Turn the thing off if hands aren't on the
wheel, end of story."

~~~
kevincrane
Audi/VW/Toyota/Holden/Ford/GM didn't name their feature "autopilot" though.

~~~
mikejb
And don't advertise on tesla.com/autopilot that the feature is capable of
driving itself.

~~~
ddoolin
In bold letters on the same page:

> Please note that Self-Driving functionality is dependent upon extensive
> software validation and regulatory approval, which may vary widely by
> jurisdiction. It is not possible to know exactly when each element of the
> functionality described above will be available, as this is highly dependent
> on local regulatory approval.

Are you suggesting that it's not the people's fault they can't get to the
bottom of the marketing page before going out, buying a Tesla, ignoring
obvious UI warnings, and driving it without knowing that it currently doesn't
fully drive itself?

~~~
Jasper_
Yes, it's not the people's fault, when there's a large video right there
showing the car driving itself. The text makes it seem like it's just a "red
tape" thing where autopilot is there like in the video but it just needs
approval in your jurisdiction. If this marketing came from any other car
company, you probably wouldn't be defending it.

------
staunch
Why doesn't Tesla have an eye tracking camera that ensures the driver is
watching the road? It would cost hundreds of dollars at most. Basic eye
tracking is a solved problem and would ensure drivers are actually paying
attention.

~~~
barbegal
It's definitely not a solved problem in the real world. Eye tracking in the
lab works great but in the real world there are lots of challenges. Low light
levels at night, drivers wearing sunglasses, drivers squinting etc.

~~~
staunch
That's solvable: if it can't see your eyes, then you can't use Autopilot. That
should still let the majority of people use Autopilot the majority of the
time.

~~~
marmshallow
I want to wear sunglasses.

~~~
kurtisc
Your desire to wear sunglasses is a lower priority than the health of the
people you share the road with.

~~~
tafycent
So being blinded with autopilot on is safer than being able to see, but
allowing the possibility of inattentiveness? That's ridiculous.

~~~
kurtisc
No. Your assumption that I'm saying that is, frankly, ridiculous. If you're
being blinded, put the sunglasses on and turn Autopilot off. Of course, I
doubt that eye-tracking works particularly well if you're squinting and
averting your eyes anyway: the act of being attentive precludes being blinded
in itself.

~~~
tafycent
Why would the attentiveness monitor be tied to Autopilot? Some modern cars
without autopilot already have attentiveness monitors. It's ridiculous to
require people not to wear sunglasses just for the sake of an attentiveness
monitor. Anybody driving east to west in the afternoon, or west to east in
the morning, wouldn't be able to use it at all.

------
cheerioty
Hm. Anyone here willing to talk about the fact that the driver didn't have
her hands on the wheel for 80 seconds before the crash and got away with
"just" a broken foot?

~~~
jonhendry18
> got away with "just" a broken foot?

She's fortunate she didn't plow into the back of a church van full of kids
instead of a fire truck. The kids would probably be dead thanks to the high
speed impact of a 4,900 pound car.

------
siliconunit
I think a lot of factors could be ironed out by a solid mathematical grasp of
the problem, instead of letting customers become guinea pigs, or running
simulations and then deducing a blanket solution that looks okay in the
average cases. A room with a bunch of mathematicians will surely state the
problem, its limits, and its solution in an infinitely :) more solid way. I
have seen this happen with software countless times: spend the money on a
math PhD and save the endless testing time spent on half-guessed solutions
where limit cases are ignored... etc.

------
mannykannot
To be fair, the fact that the car increased speed just before the crash is
entirely irrelevant here. It is not as if the car was some conscious agent
being willfully reckless.

------
whataretensors
I'm an early adopter, but I have no idea why anyone would early-adopt self-
driving or autopilot features, much less pay for them.

------
thelastidiot
Another one of those dreaded "autopilot" malfunction reports. When are people
going to realize that an 'autopilot' system that must be supervised at all
times is not something any non-superhuman driver can override in time?

~~~
yellow_postit
When Tesla stops advertising it as such? Maybe also when the system
aggressively disables itself when not under human supervision (hands on the
wheel).

------
beenBoutIT
The Darwin Awards are going to become a big Tesla advertisement.

------
aviv
The use and marketing of the word "Autopilot" will go down in history as one
of Tesla's biggest mistakes.

------
myaccountforhn
Huh, I'm a little concerned with how much information they were able to
collect, and how it was presumably used to get her to admit she was looking
at her phone instead of keeping her eyes on the road and hands on the wheel.

I hope she's got great insurance, because that guy who got whiplash is
probably going to get a bit of money.

