They ran into a similar situation with the banks and answered with one of the best academic slap downs I have ever read.
This is the preceding request:
Dr Anderson has reportedly been injuncted on other occasions but those don't seem to relate to chip and PIN.
I think that the intrusion into academic freedom is lamentable but without any link to the case documents it's hard to say exactly on what basis the interim decision was made. In any event as it's only an interim injunction it may well be overturned at full trial.
My two cents is that the case for disclosure here, while instinctively convincing, is not as urgent as it has been with the vulnerabilities revealed in financial systems. In those cases, for example, banks were refusing to repay money stolen from individuals on the basis that the banks' systems were secure so it must have been the customers at fault. Dr Anderson and others were extremely convincing in showing that this was not necessarily the case. (I don't know if that has filtered through to the financial world at large - are people still being told there are no security holes?)
Perhaps it's just that I don't feel that bad about people having their Bentleys stolen but probably we can wait for trial here (assuming it's not just a strategy by the car manufacturers to bankrupt or intimidate the academics involved).
The rash of high tech car robberies, I think, used the OBD port to reprogram the car to recognize their fake key. Like a dealer would. So they didn't break the actual crypto, as is claimed here.
The software that is referenced here could be the software VW distributes to dealerships to reprogram the car when a customer has lost their key. So that's certainly one possible way to find out how the crypto system works: by observing the interaction of the software with the car.
The article also mentions decapping the actual ICs that do the crypto. That's a very time-intensive way to find out how a crypto system works, but it may be the only one when you are dealing with fixed master keys and proprietary algorithms. If the keys alluded to here are actually master keys, burned into every car, then they should certainly be published along with the other results, since they are an integral part of the system.
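To see why a fixed master key burned into every car is so dangerous, here is a toy challenge-response immobilizer sketch (this is NOT the actual Megamos cipher; the key value, the use of HMAC-SHA256, and the 8-byte response are all made-up assumptions for illustration):

```python
import hashlib
import hmac
import secrets

# Hypothetical fleet-wide master key burned into every transponder chip.
MASTER_KEY = bytes.fromhex("00112233445566778899aabbccddeeff")

def transponder_response(challenge: bytes) -> bytes:
    """What the legitimate key fob computes. Every car shares MASTER_KEY."""
    return hmac.new(MASTER_KEY, challenge, hashlib.sha256).digest()[:8]

def car_authenticates(fob) -> bool:
    """The car sends a random challenge and checks the fob's response."""
    challenge = secrets.token_bytes(8)
    expected = hmac.new(MASTER_KEY, challenge, hashlib.sha256).digest()[:8]
    return hmac.compare_digest(fob(challenge), expected)

# The legitimate fob works:
assert car_authenticates(transponder_response)

# An attacker who extracts MASTER_KEY from *one* decapped chip can now
# clone a working "fob" for *every* car in the fleet:
cloned_fob = lambda c: hmac.new(MASTER_KEY, c, hashlib.sha256).digest()[:8]
assert car_authenticates(cloned_fob)
```

The point of the sketch: with key diversification the second assertion would fail for any car other than the one whose chip was sacrificed; with a shared master key it succeeds everywhere.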
"Finding in Volkswagen's favour, Mr Justice Birss said he recognised the importance of the right for academics to publish, but it would mean 'that car crime will be facilitated'."
If "crime would be facilitated" is to be the criterion of suppressing communications, there can be no digital security and no freedom of speech.
(Preferably to me first, but still.) If you've got them, then chances are I'm screwing up somewhere. I'd like to know so that I can do something about it.
Security by obscurity is flawed.
Unfortunately, it is how most of the banking world works right now. It is quite easy to initiate transactions and create accounts knowing a few basic numbers about you: bank account numbers, name, date of birth, SSN (I'm talking about the US, of course), home address. And it is very hard to change or hide many of these numbers and data items. The system is very fragile right now, and the only reason it works is that the overwhelming majority of users aren't crooks and losses from the crooks are small enough to be covered without triggering a move to a more resilient system.
For example, if a car manufacturer (or bank) makes public claims that their system is secure, yet they use 56-bit DES keys to encrypt data, then I think the public should know.
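For scale, a back-of-envelope estimate of why 56-bit DES is indefensible today. The keys-per-second rate below is an assumption, roughly the ballpark reported for FPGA-cluster DES crackers like COPACOBANA:

```python
# 56-bit DES keyspace vs. an assumed exhaustive-search rate.
keyspace = 2 ** 56          # 72,057,594,037,927,936 possible keys
rate = 65e9                 # keys/second -- assumed FPGA-cluster ballpark

full_search_days = keyspace / rate / 86400
print(f"keyspace:    {keyspace:,} keys")
print(f"full search: {full_search_days:.1f} days")
print(f"average hit: {full_search_days / 2:.1f} days")
```

Under that assumption a full sweep takes under two weeks, and the average key falls in under one, on hardware a motivated criminal group could afford.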
Your "account number" analogy was off because it is generally not in the public interest to reveal a private citizen's account number.
However, if an account number (email address or ip address) was being used frequently to commit crime, say in an advance fee fraud, then it makes sense to blow the lid on the said account number. That's what I meant by "in the public interest".
In any case, if you're being that pedantic, why didn't you notice that I actually asked revelation that question, and not DJN?
Shameful that they'll now move to try and blame these guys when they probably botched the security in the first place.
Leaving "bad guys" having to repeat the feat until they can use the weakness should buy some time for everybody.
Publishing the vulnerability also allows other white hats to propose workarounds that owners could implement before dealers have a permanent fix (e.g. physically removing part of the management interface from the vehicle), or simply lets owners know not to park such vehicles in unsecured lots in high-crime areas, where the car could be more easily stolen.
It would seem quite a tractable problem for a keen hobbyist. Build a robot, something like a 3D printer in reverse, to alternately remove thin layers from a chip and image the newly exposed layer, until the chip is gone. Use a program to assemble the images into a 3D representation and extract the circuit.
In fact, such a project would be a relatively simple way to start gaining the knowledge required for the reverse process, of building a chip.
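The "assemble the images into a 3D representation and extract the circuit" step can be sketched in a few lines. This is a toy with hand-drawn 3x3 boolean layers (real tooling would need image registration, segmentation, and vastly higher resolution), but it shows the core idea: stack the layer images into a voxel volume, then treat electrical connectivity as flood-fill reachability through metal voxels:

```python
from collections import deque

# Each "image" is a 2D grid: 1 where metal was seen in that exposed layer.
layer0 = [[1, 1, 0],
          [0, 1, 0],
          [0, 1, 1]]   # a trace snaking across the top layer
layer1 = [[0, 0, 0],
          [0, 1, 0],
          [0, 0, 0]]   # a via connecting down to...
layer2 = [[0, 0, 0],
          [1, 1, 1],
          [0, 0, 0]]   # ...a trace on the layer below

volume = [layer0, layer1, layer2]  # indexed as volume[z][y][x]

def connected(volume, a, b):
    """Flood-fill to test whether two voxels are electrically connected."""
    seen, queue = {a}, deque([a])
    while queue:
        z, y, x = queue.popleft()
        if (z, y, x) == b:
            return True
        for dz, dy, dx in ((1,0,0), (-1,0,0), (0,1,0),
                           (0,-1,0), (0,0,1), (0,0,-1)):
            n = (z + dz, y + dy, x + dx)
            if (n not in seen
                    and 0 <= n[0] < len(volume)
                    and 0 <= n[1] < len(volume[0])
                    and 0 <= n[2] < len(volume[0][0])
                    and volume[n[0]][n[1]][n[2]]):
                seen.add(n)
                queue.append(n)
    return False

# The top-layer trace reaches the bottom-layer trace through the via;
# enumerating such connections is essentially netlist extraction.
print(connected(volume, (0, 0, 0), (2, 1, 0)))  # True
```

Turning every such reachable component into a node list is exactly the "extract the circuit" output; tools like Degate (linked below) automate a more serious version of this.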
Byuu published some wonderful articles (including some figures) on having the SNES coprocessors decapped so they could be successfully emulated for the first time in over a decade.
* Degate, a somewhat automated "aid in reverse engineering of digital logic in integrated circuits" - http://www.degate.org/
* Silicon Zoo offers a tutorial / background info on this - http://siliconzoo.org/tutorial.html - and it mentions that Pentium I-era chips were "easily viewable", probably with optical microscopes.
* A blog about IC reverse engineering - http://uvicrec.blogspot.com/ (from the owner of http://siliconpr0n.org/ , which is also relevant)
The physical deconstruction would be fraught with peril for the chip, which would end up as powder! I'm not sure what the best technique would be. Maybe a grinding wheel, if it could be controlled well enough? Maybe a flat plate with abrasive paste, or a diamond coated nail file? That would probably be easy to control, albeit time consuming. Laser ablation? Heat the chip to slowly and continuously evaporate it, whilst videoing the evaporation process?
One would have to conduct an experiment to see whether it is best to slice the packaged IC, or remove the encapsulation first. The encapsulation can be removed with nitric acid and acetone, or even a blast with a hot flame. I'd guess it would be worth removing the encapsulation.
If I had to pick a technique from above, I'd first try removing the encapsulation then using a diamond coated nail file.
Getting through the package to the chip's top surface isn't too bad, because you can play rough with it until you get pretty close to the chip itself. So you have all sorts of fun options: wet chemistry, laser ablation, and physical milling being most common. Once you get up all in the chip's personal space, wet chemistry is probably the way to go, though nitric acid will wreak havoc on any copper elements, potentially including bond wires if they're not gold. Alternately, you can go at it with a specialized plasma tool.
Delayering the chip is time-consuming, but not prohibitively so. Your choice of wet chemistry, plasma toolsets, and physical grinding on a wheel (which works _shockingly_ well for what feels like a stone-age process). It can take a lot of practice to do this cleanly so you don't penetrate and damage a lower layer while working on an upper one, but it can be done.
The nasty bit, from the point of view of doing this outside a major megacorp, is probing and analyzing smaller geometries. As things get smaller, they get a lot more delicate. You can't just scratch through the insulative layer above metal lines with a big needle anymore, because the tip of that needle is significantly larger than multiple metal lines under it. Laser ablation can still work for mid-sized geometries, but with modern digital ICs it's all about focused ion beam tooling. That's a high-vacuum device that slowly and precisely mills and/or deposits metal with...well, an ion beam. You can get down way below the visible light range in terms of size and precision. Really cool stuff, but good luck finding one for under seven figures!
Once that's done, if your geometry is large enough to use an optical microscope, probe needles range in price from a few bucks a pop to well into the multi-hundred range. If it's too small, the next option is to get a scanning electron microscope with built-in microprobes. That's...not exactly hobbyist budget.
Doing this for an entire modern CPU-scale IC (instead of focusing on a target block) would take ages and ages and ages. I don't even want to think about it for too long. Months, at the least.
Like I said, possible...but expensive (both in engineer and tool time), hard, and time consuming. The thing is, it's time consuming because the bulk of the work of decapsulating, probing, deprocessing, and analyzing the ICs is done manually and iteratively. A TON of it could potentially be automated, but the motivation to automate all this has traditionally been pretty low because the tooling itself is so expensive that it's low-volume work.
Let's consider parallel situations not involving protecting rich people's luxury possessions, which seems to be clouding everyone's judgement here.
Some examples where an encryption key is discovered or reverse engineered, and a scientist wants to publish them:
- a key which can shut down every ventilator
- a key which can remotely control the throttle on a high-speed train
- a key which can explode a nuclear warhead
- the key to your bitcoin stash
- the Google master SSL private key
There are an infinite number of such examples. I'm shocked and disappointed that the HN community finds publishing keys, as opposed to systematic flaws, acceptable.
Presumably the cognitive dissonance arises from a distaste for rich people. However even if this mostly results in mere car theft, it could also easily result in the innocent being harmed.
Free speech, even under the US first amendment, rather clearly does not apply to publishing private encryption keys, particularly ones that can cause grave harm.
Shame on the HN community.
What if the headline were:
Scientist banned from revealing codes used to control school bus brakes
In many cases, without publishing the keys to make it PAINFULLY obvious to everyone that the vulnerability exists, large companies can spread disinformation and influence public perception that the vulnerability is minimal or doesn't really exist outside of a special case/etc.
In this case, VW is very obviously not planning on updating things, fixing the vulnerability, or addressing things. The vulnerability and the codes have been available on the internet for YEARS without a proper response from VW or a bulletin or other addressing of the issue (and obviously no 'fix' either).
This is one of the key points of the 'responsible disclosure' debate: many companies DONT CARE unless they have to, and will just sit on things indefinitely. With all this publicity, I bet VW addresses this pretty significant vulnerability sooner rather than never now.
Do you disagree with free speech being used to publish DeCSS or the Blu-ray decryption keys? If your security depends entirely on a single key not being discovered and re-used (because you have no way of changing it, for example), you really have a horrible security model. If you're selling that security to people, and it's really not effective at all for its purpose, then how different is that from false advertising or even fraud (given that you KNOW it's not effective, or has already been easily subverted)?
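The standard fix for the "single fixed key" model is key diversification: derive a unique per-device key from a master key kept in the factory, so extracting one car's key compromises only that car. A minimal sketch, assuming HMAC-SHA256 as the derivation function (the master key value and VINs below are made up):

```python
import hashlib
import hmac

# In practice this would live only inside the factory's HSM,
# never inside any shipped device. Value is made up.
MASTER = b"factory-hsm-master-key"

def per_car_key(vin: str) -> bytes:
    """Derive a unique immobilizer key for one vehicle from its VIN."""
    return hmac.new(MASTER, vin.encode(), hashlib.sha256).digest()

key_a = per_car_key("WVWZZZ1JZ3W000001")
key_b = per_car_key("WVWZZZ1JZ3W000002")

# Decapping car A's transponder yields key_a only; car B is unaffected,
# and publishing key_a endangers exactly one vehicle.
assert key_a != key_b
```

With this design, publishing a recovered key is roughly as newsworthy as publishing one person's house key: embarrassing, but not fleet-ending.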
Given that Volkswagen spent significant effort to block the publication, I have to presume you are just making shit up.
Even if what you say is true, the argument being made here on HN is that the keys should be published regardless of whether they are available already - which is, quite simply, ethically indefensible.
It's easily ethically defensible - there is no moral imperative to keep the knowledge of something secret which may cause injury to others by being kept secret. In fact, just the opposite. VW is in an ethically indefensible position, as they are in the position of selling vehicles with systems marketed specifically as 'secure' that are, in fact, not secure at all; a fact which has been known to a smaller community (and VW) for over 4 years. THAT is ethically indefensible.
Sometimes, publishing details in a painfully easy to reproduce manner is the only way to get a company to FIX the problem, which is the point of all this. For a great physical analogue, see the 'pen and U-lock' trick: it wasn't until a YouTube video appeared showing just how ridiculously easy that lock was to open that the company updated its design and fixed things.
You then go on to say there is no ethical imperative to withhold information that may harm others, which is both wrong and contrary to your prior implication - that publishing is ok after a window has passed for the issue to be resolved.
This reasoning is contradictory and flawed.
Personally I feel that an outright ban is unacceptable, however a six month delay is reasonable.
In fact the scientist may have been sitting on this information for quite some time now, and Volkswagen et al. have probably already been notified but refuse to fix it (be it laziness or stupidity, it's outright negligence). My point is we don't know anything except that there's a vulnerability.
Car cyber-security has been in the news recently, and the reports indicate that cars represent a massive attack surface that is very poorly protected. Automobile manufacturers need a swift kick in the ass now more than ever.
I agree that a perpetual ban is not acceptable in this case. Industry should have to fix the situation and the keys should not be predictable from this hardware.
You do NOT need to publish the codes to allow others to replicate this research. Publishing the codes simply allows you to bypass spending the $50k to replicate this research and break into any car with little effort.
It seems quite "not right" to me that my own government could legally prohibit me from doing something in another country (jurisdiction).
It makes a bit more sense when applied to things like child abuse - US citizens who travel abroad and pay to have sex with someone under the age of 18 can be tried in the US. (https://en.wikipedia.org/wiki/Child_sex_tourism#Tourists_fro...)
If what they did was illegal in the country they did it in, then they should be punished there - making it straightforward to extradite them back for trial would be reasonable (assuming it isn't already). And if it isn't illegal there, well, it ought to be.
It also sounds like it would be legal for them to marry a 17-year-old (as it is in many US states), but paying that 17-year-old for sex would mean 30 years in prison.
Similar to your example, in the state where I live (Indiana) it would be legal for me (a 34-year-old man) to have sex 10 times a day with a 16-year-old kid but it would be a felony punishable by years in prison if I took a naked picture of the same kid. (I don't have any desire to do either one, of course, I just think that's an illustrative example.)
In some cases they have done this, such as in the PROTECT Act of 2003 which contains a prohibition on child sex tourism. But there is no general assumption that US law applies everywhere.
The actual codes are worthless for that.
It's not odd when you consider the need of researchers to allow other researchers to reproduce their results for peer review.
Disassembling the chip isn't part of the experiment, it's a precursor. You don't have to build your own particle accelerator just to replicate a subsequent experiment that was originally conceived based on data from the Large Hadron Collider.
Not the results of this research.
Maybe it will open some eyes in industry that you need to hire experts for that sort of thing, or at least demand external expert auditing of the software.
I don't see a problem with this per se, in cases where there would be severe harm (like significant crime) without such a ban, provided that the ban is time limited to the minimum time required to fix the problem in the wild.
This means, IMHO, that the injunction should come with a requirement that the manufacturer fix vulnerable systems quickly, even if that costs them quite a bit.
If this is done, then I don't see this as a bad thing. "The manufacturer's security is so bad that they had to get a court order to stop people from explaining how while they fixed it" is a pretty good incentive, I think.
Would the high court prefer that, or a legitimate academic publication that allows us all to learn the lessons from this vulnerability?
I should add of course, that in the spirit of responsible disclosure, that this should only be done after the car manufacturers have had adequate time to fix the problem.