
Tesla logs show that Model X driver hit the accelerator, not Autopilot - belltaco
http://electrek.co/2016/06/06/tesla-model-x-crash-not-at-fault/
======
mtgx
I wonder though, how easy would it be for someone to hack a Tesla car, crash
the car while it's on Autopilot, and then have the logs say that it was the
driver's fault, so it was "just an accident" and not an assassination?

I imagine the people at NSA/CIA are already testing for this. Self-driving
cars are scary, especially when you consider that the vast majority of car
makers who want to build self-driving cars still aren't taking security
seriously, and even continue to send unsigned updates over HTTP. Whatever
security Tesla has for its cars is actually the exception, not the rule, but I
doubt even Tesla has good enough security to protect its cars against state
actors, especially when all of its cars can be updated/attacked remotely.
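The unsigned-update point is the core of the complaint: if the client never checks a signature, anyone who can intercept the HTTP download can install arbitrary firmware. A minimal sketch of the missing check, assuming a made-up vendor key and using a stdlib HMAC as a simplified stand-in for the asymmetric signatures (Ed25519, RSA) real OTA systems use:

```python
import hashlib
import hmac

# Hypothetical example: VENDOR_KEY and the update blob are invented for
# illustration. Real OTA pipelines sign with a private key and verify
# with a public key; HMAC keeps this sketch stdlib-only.
VENDOR_KEY = b"vendor-signing-key"

def sign_update(blob: bytes) -> str:
    """Produce a detached signature over the firmware image."""
    return hmac.new(VENDOR_KEY, blob, hashlib.sha256).hexdigest()

def verify_update(blob: bytes, signature: str) -> bool:
    """Refuse to install unless the signature matches the blob."""
    # compare_digest avoids timing side channels on the comparison.
    return hmac.compare_digest(sign_update(blob), signature)

firmware = b"ota-image-v2.1"
sig = sign_update(firmware)
assert verify_update(firmware, sig)            # legitimate update passes
assert not verify_update(firmware + b"x", sig) # tampered blob is rejected
```

Without a check like `verify_update`, an on-path attacker on plain HTTP can swap the blob undetected; with it, tampering is caught before install (though a shared-key HMAC, unlike a real public-key signature, would also let anyone who can verify also forge).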

------
dekhn
I think it's really interesting how Tesla instrumented their devices to
provide them with effectively audit-level diagnostics that can help them
disprove driver claims (note: humans often misattribute their own errors to
mechanisms), and purchasers of the car have effectively agreed to be audited
in this way.
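Audit-level logs are only convincing as evidence if they are tamper-evident, which connects to the hacking worry upthread. One standard construction is a hash chain, where each record commits to the hash of the previous one, so rewriting any earlier entry breaks every later link. A sketch under invented event names (nothing here reflects Tesla's actual log format):

```python
import hashlib
import json

GENESIS = "0" * 64  # hash placeholder before the first record

def _record_hash(event: str, prev: str) -> str:
    # Canonical JSON so the hash is stable across insertion order.
    payload = json.dumps({"event": event, "prev": prev}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append_event(log: list, event: str) -> None:
    """Append an event, chaining it to the previous record's hash."""
    prev = log[-1]["hash"] if log else GENESIS
    log.append({"event": event, "prev": prev,
                "hash": _record_hash(event, prev)})

def verify_chain(log: list) -> bool:
    """Recompute every link; any edited record breaks the chain."""
    prev = GENESIS
    for rec in log:
        if rec["prev"] != prev or rec["hash"] != _record_hash(rec["event"], prev):
            return False
        prev = rec["hash"]
    return True

log = []
append_event(log, "autopilot engaged")        # hypothetical event names
append_event(log, "accelerator pedal 100%")
assert verify_chain(log)
log[1]["event"] = "brake pedal 100%"          # tamper with a record...
assert not verify_chain(log)                  # ...and verification fails
```

This only proves integrity relative to whoever holds the chain head; a logger that can rewrite the whole chain (or a manufacturer producing its own exculpatory logs) still requires trust, which is exactly the audit-relationship question being raised.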

The implications of this are profound, and I'm curious how it will play out:
will people reject cars that produce evidence showing they caused crashes?
Embrace them?

