
Elon Musk announces another price hike for “full self-driving” package - buran77
https://arstechnica.com/cars/2019/07/elon-musk-announces-another-price-hike-for-full-self-driving-package/
======
java-man
What I really want is a hardware switch that disables any kind of input from
the "self-driving" software. Something that cannot be overridden by an OTA.

~~~
m463
First, the self-driving features are all opt-in.

Also, a lot of features are passive, like sound alerts.

You have to enable things like Autosteer, Summon or Navigate on Autopilot in
settings before you can use them. Then you have to hit a switch while driving
to let the car help you drive.

As for emergency features (which all cars get), they also have to be enabled
in settings, like Lane Departure Warning, Automatic Emergency Braking and
Object Aware Acceleration.

My question to you: what if you turned off Object Aware Acceleration and then
ran into something... or someone?

~~~
java-man
I am looking at it from a security perspective. Any sufficiently complex
software system, especially one that does not have security as a prime design
goal, exhibits a large attack surface. A compromise is not a matter of if,
but of when.

Example scenarios: the CEO of a large public corporation that has attracted a
lot of short sellers. A journalist who has exposed far-reaching shenanigans
of the rich. A political opponent.

I am sure all the relevant agencies are quite busy analyzing Tesla code right
now, for exactly this reason. And I know for sure that security is NOT a
design goal at Tesla, at least it was not when I talked to them. I would have
designed the hardware differently from the start.

Again, I am not talking about a common use scenario for regular people, or a
driver turning off some safety feature. I am concerned with the security of
the system as a design goal.

