Although I imagine the bug reports still read similarly: "Unexpected user input causing fatal crashes"
After the Toyota witch hunt trial against "foreign" companies (read as: more American than Ford/Chrysler/GM, who outsource everything possible including assembly, which drives down cost but also drives down quality and safety) over the sudden acceleration problem, which both government agencies that oversee car safety ultimately deemed user error, Tesla is going to do everything possible to repeat this mistake (read as: Tesla is going to keep manufacturing the highest quality cars humanly possible, and exceed every safety requirement as far as existing and next-generation technology allows, following the same model that made Toyota the best car manufacturer in the world).
Every software update needs to comply with the testing methodology put forth by the US government, and will most likely be tested against a battery of inputs that are impossible in the real world, just to see whether the software malfunctions.
Safety-oriented software (such as in cars, military weapons, or spacecraft) is written so that operations are bounded, hard real-time deadlines are backed by hardware, potential infinite-loop conditions cannot occur, runtime memory allocation doesn't happen, and anything that can be mathematically proven is proven.
See the NASA programming manual; I think it's been linked here on HN before. This is the kind of programming you see in mission-critical software. This is the kind of programming Tesla does.