
Hackers steer Tesla into oncoming traffic by placing three stickers on the road - velmu
https://www.businessinsider.com/tesla-hackers-steer-into-oncoming-traffic-with-stickers-on-the-road-2019-4
======
chrisbolt
From yesterday:
[https://news.ycombinator.com/item?id=19536375](https://news.ycombinator.com/item?id=19536375)

------
gregmac
While I'm hugely skeptical of the current state of self-driving cars, you
could probably get human drivers to make the same mistake if you were to
repaint the lines. However, humans will also notice the oncoming cars (if
there are any) and avoid getting in a head-on collision.

The thing missing from this test is that critical practical piece: if there
was an oncoming car, will the Tesla do something to avoid the collision? I
would assume that not getting in a head-on crash is higher priority than
staying in the lane markings.

Without oncoming traffic, all this is testing is what the Tesla considers
valid line markings. I'm sure there's room for improvement here (such as
checking where the other lane is, raising the requirement for how well-defined
the lines have to be, etc), but those are also going to involve trade-offs
where there are legitimate situations that will stop working.

I think you could just as easily title this video "Tesla auto-pilot follows
road markings even if they're really bad".

Edit: The best shot I could get from the video [1] makes me even more upset at
this test: these look like the temporary markings often used during
construction, just before they come and paint the normal lines using the big
line-painting truck. There aren't even regular lane lines after this. I
wouldn't even be surprised if this is something Tesla specifically trained the
software to handle.

[1] [https://i.imgur.com/aLbhnzQ.jpg](https://i.imgur.com/aLbhnzQ.jpg)

~~~
vkou
> The thing missing from this test is that critical practical piece: if there
> was an oncoming car, will the Tesla do something to avoid the collision? I
> would assume that not getting in a head-on crash is higher priority than
> staying in the lane markings.

Considering that a Tesla is happy to ram you into the back of a stopped
vehicle, at highway speeds, I wouldn't have that hope.

Autopilot does two things: it follows the vehicle in front of you, and it
stays between the lane lines.

~~~
EngineerBetter
This isn't strictly true - Teslas have the ability to perform emergency stops
if they detect something in front of them.

I don't believe that they're likely to pick up something driving straight at
them, as it's not something the neural net has been trained for.

My Model X often alerts me to parked cars, thinking I might drive into them :P

------
breakingcups
We are definitely not even close to having "solved" self-driving cars.

~~~
WhompingWindows
If you listened to the hype, then yes, we are not close; but that's the fault
of giving in to the early hype, written by tech/science journalists without
domain expertise or insider information.

If you look at the reality, certain companies have advanced a TREMENDOUS
amount in the last few years. It was less than two decades ago that DARPA
introduced its self-driving vehicle challenge, and none of the entrants could
even make it a couple of miles.

If you had asked me 20 years ago whether there would be a driverless taxi
service in Phoenix in 2020, I would have said absolutely no way: with the Y2K
bug scare fresh in mind and the tech so hard, we'd never get there.

------
hello_friendos
Why are these cars legal on the road?

------
jbob2000
No shit. What about when you erase all the lines on the road? What about when
you remove the road? What's next, "Luxury Tesla defeated by a cheap piece of
duct tape over the main sensors"? "We removed a wheel from a Tesla and it
crashed into a wall!". These tests are so contrived, honestly.

~~~
mplewis
They're not contrived tests. They're attacks that work on specific NN-powered
self-driving cars but don't work on humans.

No human would be tricked into driving into oncoming traffic by the three
small stickers in the road used in this attack.
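To see why these attacks work on neural nets but not people, here's a toy
numpy sketch. Nothing here resembles Tesla's actual stack: the linear
"classifier", the `sees_lane` function, and all the numbers are invented
stand-ins, but the gradient-sign idea is the same one these attacks exploit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins: a "road image" as a flat feature vector, and a linear
# proxy for a trained lane classifier (real systems are deep nets,
# but the principle illustrated here is the same).
w = rng.normal(size=1000)   # classifier weights
x = rng.normal(size=1000)   # input "image"

def sees_lane(img):
    # Hypothetical decision rule: positive score = "lane goes this way"
    return float(w @ img) > 0.0

score = float(w @ x)
# Nudge every pixel a tiny amount in the sign of the gradient, just
# far enough to cross the decision boundary (an FGSM-style perturbation).
direction = 1.0 if score < 0 else -1.0
eps = abs(score) / np.sum(np.abs(w)) + 1e-3
x_adv = x + direction * eps * np.sign(w)

print(np.max(np.abs(x_adv - x)))       # per-pixel change stays tiny
print(sees_lane(x), sees_lane(x_adv))  # yet the decision flips
```

A human looking at `x_adv` would see essentially the same image, because each
pixel moved by a fraction of its normal variation; the classifier flips anyway
because the tiny changes all push in the same direction at once. That's the
asymmetry the sticker attack exploits.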

~~~
jbob2000
You've never seen a car that's accidentally driven into wet concrete/pavement?
If a construction worker places the pylons a little too far apart, it looks
like you might be able to use the lane!

~~~
SrslyJosh
Ah, the good ole "You can't make X _perfectly_ safe, so therefore we shouldn't
try to make it any safer than it is right now!" argument.

A favorite of gun enthusiasts and gun lobbyists.

~~~
ben_w
A case in point about human perception: while you read jbob2000’s comment that
way, I read that same comment as nothing more than a counter to the idea that
humans were somehow invulnerable to errors.

This seems a bit strange, given the main argument in favour of replacing human
drivers with AI is that it _should_ be possible to make one that works as well
as the best human driver, only without getting tired, high, or distracted, and
with reaction times measured in microseconds instead of hundreds of
milliseconds.

We’re not there yet. I don’t know how far we are from that future.

