After the Toyota witch-hunt trial against "foreign" companies (read: more American than Ford/Chrysler/GM, who outsource everything possible, including assembly, which drives down cost but also drives down quality and safety) over the sudden-acceleration problem, which was ultimately deemed user error by both government agencies that oversee car safety, Tesla is going to do everything possible to repeat this mistake (read: Tesla is going to keep manufacturing the highest-quality cars humanly possible and exceed every safety requirement as far as possible using existing and next-generation technology, following the same model that made Toyota the best car manufacturer in the world).
Every software update will need to comply with the testing methodology put forth by the US government, and will most likely be tested against a battery of inputs that are impossible in the real world, just to see if the software malfunctions.
Safety-oriented software (such as in cars, military weapons, or spacecraft) is written so that operations are bounded, hard real-time deadlines can be enforced in hardware, potential infinite-loop conditions can't occur, runtime memory allocation doesn't happen, and anything that can be mathematically proven is proven.
See the NASA programming manual; I think it's been linked here on HN before. This is the kind of programming you see in mission-critical software, and it's the kind of programming Tesla does.
You know you're making the same argument as the light-hearted quip in your parent post, right? Everything you mentioned that goes into safe programming, combined with meatspace actors, yields the bug report "if I accidentally press the accelerator instead of the brake, I crash faster".