While I agree with everything above, it can be humbling to consider the huge number of people already in control of that car (at the car company, software partner, hosting partner, phone maker), but extending that trust to the local network amounts to an inexcusable security problem.
It is interesting that having legitimate control over a certificate makes this a desired feature rather than a huge security problem. The real world may not be all that black and white.
I don't think it was the bug itself that bothered me so much as their response. I sent them an extremely clear email with the exact steps I took and screenshots showing how other apps responded to my fake cert with error/warning dialogs. It was escalated directly to the engineering team, and they seemed to have no idea what I was describing or why it was an issue. I assumed at that point the issues went a little deeper than what I had uncovered, and it seems from this post I wasn't too far off the mark.
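For context on why the other apps threw those warning dialogs: one common defense against a forged certificate is pinning, where the client compares the server cert's fingerprint to a known-good value. A minimal sketch, assuming the app ships with the expected SHA-256 fingerprint (all names and byte strings here are illustrative, not the vendor's actual code):

```python
import hashlib
import hmac

def cert_matches_pin(cert_der: bytes, expected_sha256_hex: str) -> bool:
    """Compare the server certificate's SHA-256 fingerprint against a
    pinned value; the client should abort the connection on mismatch."""
    actual = hashlib.sha256(cert_der).hexdigest()
    # Constant-time comparison avoids leaking how many leading
    # characters of the fingerprint matched.
    return hmac.compare_digest(actual, expected_sha256_hex)

# A forged certificate hashes to a different fingerprint, so the
# check fails and the client refuses to talk to the impostor.
pinned = hashlib.sha256(b"real-cert-der-bytes").hexdigest()
assert cert_matches_pin(b"real-cert-der-bytes", pinned)
assert not cert_matches_pin(b"fake-cert-der-bytes", pinned)
```

An app that skips this check (or ordinary CA validation) will happily talk to anyone holding a self-signed cert, which is what the fake-cert test above exposed.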
The only reason to do OTA updates is convenience.
Assume there's a bug, a safety-critical bug. You cannot reasonably bring everyone into the shop at the same time, and you continue to risk their lives as they "don't upgrade".
Over time, OTA updates also increase software quality and enable experimentation and slow, controlled rollouts.
You are worried about short-term problems over long-term promise.
That's how it worked for all the cars before Tesla, so... Yes you can.
Remember Windows before automatic updates: full of security holes with no way to patch, and bugs that lingered and created problems that could never be fully eliminated.
Let me tell you something you didn't think of. Imagine I'm doing my diligence before releasing software but didn't fully factor in all the unknowns in the process. It happens, shit breaks all the time, right? Now imagine that instead of uncontrollably calling everyone into the shop, I start a 1% rollout, gather data, find some problems, investigate, and push the real fix. See the efficiency gains? I prevented chaos for 99% of the cars, and I did something data-driven.
Good luck on that without an ota.
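The 1% rollout described above is usually implemented as deterministic bucketing: hash a stable identifier into 100 buckets and compare against the rollout percentage. A sketch under assumed names (`vehicle_id`, the VIN-style identifiers, and the hash choice are all illustrative):

```python
import hashlib

def in_rollout(vehicle_id: str, percent: int) -> bool:
    """Deterministically assign a vehicle to a bucket 0-99. The same
    vehicle always lands in the same bucket, so raising `percent`
    only ever adds vehicles to the rollout, never swaps them out."""
    digest = hashlib.sha256(vehicle_id.encode()).digest()
    bucket = int.from_bytes(digest[:4], "big") % 100
    return bucket < percent

# Start at 1%, widen to 10% once telemetry from the first wave
# looks healthy.
fleet = [f"VIN{n:05d}" for n in range(10_000)]
wave1 = [v for v in fleet if in_rollout(v, 1)]
wave2 = [v for v in fleet if in_rollout(v, 10)]
assert set(wave1) <= set(wave2)  # early adopters stay included
```

The monotonicity property is what makes gathering data between waves meaningful: vehicles that already took the update stay in every later wave.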
When people are relying on software for their safety, no, it doesn't. Bugs in critical systems for things like planes and cars are rare because going faster kills people. Using 1% of your users as tests is fine if you're making a website but much less fine if your new code means they might die.
Bringing cars into the shop makes no improvement over what I am proposing.
I know I've had cars where certain (non-safety) defects were recalled, yet it took 2-3 weeks of lead time to get an appointment with the dealership, and several days in the shop once the car was there (without a loaner vehicle). And I'm not talking full engine rebuilds either, just simple fixes. Most of the time I don't even bother anymore because it's such a hassle.
Why do my brakes need a software update? Is that not something that we as an industry can get right before shipping a car?
You're mistaken. Almost every ECU on the planet right now is flashable and they are indeed often updated as part of routine servicing, particularly on brand new models.
You also won't have any way to measure the impact of the update; chances are it is not perfect...
Here in the UK, the insurance industry collectively funds Thatcham Research, an independent body that assesses the safety, security and repairability of new cars. Thatcham's assessments are hugely important to motor manufacturers, because they directly influence the cost of insurance; a good rating from Thatcham means a low insurance group rating, which is an important selling point. It's a fantastic example of what happens when everyone's interests are aligned.
Thatcham also assesses aftermarket security equipment; most insurers offer discounts on premiums where Thatcham-approved equipment is fitted. It is worth noting that neither of the products mentioned in this article is Thatcham approved.
I'm skeptical it does. I saw a few times how regulated software (certain billing systems) was certified, and it was a bad joke. Maybe you have different experience, though.
If you want government bodies to spend taxpayers' money on something, I'd rather suggest spending it on funding security researchers actively attacking systems and cooperating with manufacturers on fixing the discovered issues (and you can legally mandate such cooperation). This might work, actually improving end-user security. Although you'd have to somehow audit that those researchers are actually doing something...
For manufacturers to actually listen to security research, you'd need regulation as well.
You could also require all or certain software in cars to be open source.
Yes, it can. DO-178B is a widely used security standard in military equipment. It's difficult and expensive to obtain, and caters to fighter jets, not cell phones, but there is precedent for true technical security improvements through government programs.
Because confusing safety and security is exactly the kind of awful goof that we're talking about here. I'm sure these car alarms are _safe_ the problem is they don't keep your car _secure_.
The security record of military procurements is... not good. Same for the financial industry. Do a bad job, hope nobody finds out, if they do insist you didn't do a bad job and hope nobody who understands the difference is empowered to do anything about it.
Let's take something easy, communications. Your generic Android phone is capable of doing secure voice communications over a distance subject only to traffic analysis and other inevitabilities.
The British infantry have Bowman. At squad level it's unencrypted spread spectrum voice radio. So, much worse than that Android phone. A sophisticated bad guy (so, not some random bloke who decided to join ISIS last week, but say, a Russian armoured division you've been deployed to counter) can literally listen to everything you say, seamlessly, without giving away their position. Brilliant.
Now regulation _can_ improve things by mandating something that people who know what they're doing already recommend. But you're not going to get there with things like DO-178B.
It's about both. The DO-178B/C standards require software to work as formally specified, and require robust branch testing to ensure the code conforms to the spec. This means different things for different applications, but in an operating system, for example, it means that no process can affect any other process if the two are deemed independent. This requires that the OS prevent all covert channels, strictly limit memory, CPU performance, etc. For example, a fork bomb wouldn't be able to affect other processes, and caches are wiped on every context switch.
So yes, it is definitely relevant to security as well as safety (which in military aerospace go hand-in-hand anyway).
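The branch-testing discipline described above can be illustrated with a toy example; this is only a sketch of the idea of testing every branch outcome against a written spec, not DO-178B/C's actual process (the function, its name, and its spec are invented for illustration):

```python
def brake_assist_allowed(speed_kph: float, sensor_ok: bool) -> bool:
    """Toy spec: assist may engage only when the wheel sensor is
    healthy AND the vehicle is moving (speed strictly above zero)."""
    if not sensor_ok:
        return False
    return speed_kph > 0.0

# Conformance tests exercise every branch outcome, not just the
# happy path -- the style of coverage the standard mandates:
assert brake_assist_allowed(50.0, True)        # moving, sensor ok
assert not brake_assist_allowed(50.0, False)   # sensor-fault branch
assert not brake_assist_allowed(0.0, True)     # stationary branch
```

Real certification goes much further (MC/DC coverage, traceability from each test back to a requirement), but the principle is the same: every decision point in the code must be shown to behave as the specification says.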
How are these companies remaining in business? Call yourself unhackable and then don't bother to even authenticate API requests... the mind boggles.