Nor did the original article specifically allege that it was "the Chinese", or that the backdoor was malicious. It did allege that it was inserted by the manufacturer (although technically anything on the chip is inserted by the manufacturer), presumably because it differed from a public spec, but the veracity of that statement is still unknown (at least to us). But I don't think that's enough to call it "bogus".
The technology behind this story is serious and interesting, and all anyone wants to talk about is politics.
Which is why, at the end of this article, I suddenly went "what?!" when I read:
"And researchers will not probably hunt for similar JTAG backdoors in other chips."
If there's one good thing to come of all this fearmongering, it is that chips will be subjected to more scrutiny. Hopefully someday chips will be much more open, standardized, and well documented. That's a long way off, but there are many reasons to hope it might happen.
I don't know, maybe I read that sentence wrong. Maybe the author meant, "researchers will, not probably, but for sure hunt for backdoors."
By implication, and by selectively leaving out information about how this attack required the use of the JTAG port, the article intentionally implied that it was a Chinese attack, even if it didn't lay out that claim in so many words.
It's not a new problem.
You could argue that the original was worded to be deliberately confusing, perhaps even implying that the claims were proven, but I certainly didn't find it that way when I read it.
The Stuxnet angle is trickier. To make use of this remotely, you (or a Stuxnet-style virus) would need access to a JTAG connection. These come in many forms, including USB (needing access to the host computer, as Stuxnet had) or Ethernet (needing access to the network). It seems a bit unrealistic because JTAG tends to be used for development, but field reconfiguration is one of the advantages of FPGAs.
Of course, I believe this specific Actel FPGA uses flash for configuration, which makes updating it in the field somewhat inconvenient and therefore less likely to be used in practice. I remember hearing that this is why NASA switched to Xilinx, as they now require field reconfigurability.
Still, the article certainly wasn't 'bogus', and the new article claiming so contained far more errors, especially when you read the actual paper and not just the linked press release.
Good call on the Chinese front; we don't know who generated the key material to block the JTAG.
Incidentally, on z/OS systems, things that open the system up to external access are sometimes referred to as backdoors, which is what this is. It's a way of accessing the chip, nothing more.
To call it a military chip is inaccurate.
Heck, people could (if they're nuts) call 44Con a military conference given that it's attended by various armed forces folk. I'd rather people didn't though, but if they want to look stupid while thinking they look smart, who am I to stop them?
As Feynman put it, when a researcher overblows his findings, he's doing "Cargo Cult Science" (see the closing chapter of "Surely You're Joking, Mr. Feynman!").
Quote: """One of the most common building-blocks is the debugger, known as JTAG. This is a standard way of soldering some wires to the chip and connecting to the USB port,"""
JTAG is just the low-level interface to a debugger. "Soldering some wires" is not the building block, and USB is in no way inherent to it (for example, my workhorse JTAG interface connects over Ethernet).
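To illustrate the point that JTAG is transport-agnostic, here is a minimal toy sketch (in Python, purely illustrative; the register width and IDCODE value are hypothetical): the chip only ever sees serial bits clocked on its test pins, and whether those bit operations arrive via a USB or Ethernet adapter is invisible to it.

```python
# Toy model of what a JTAG adapter ultimately does: shift bits through a
# data register one TCK clock at a time. The host-side transport (USB,
# Ethernet, parallel port) never reaches the chip itself.

class ShiftRegister:
    """Hypothetical 32-bit JTAG data register (e.g. an IDCODE register)."""
    def __init__(self, value, width=32):
        # Store bits LSB-first, as they will be shifted out.
        self.bits = [(value >> i) & 1 for i in range(width)]

    def clock(self, tdi):
        # On each TCK edge one bit leaves on TDO while the host's TDI bit
        # shifts in at the other end.
        tdo = self.bits.pop(0)
        self.bits.append(tdi)
        return tdo

def read_register(reg, width=32):
    """Debug reads over JTAG boil down to: shift bits out, reassemble."""
    value = 0
    for i in range(width):
        value |= reg.clock(0) << i
    return value

idcode = 0x2A140093          # made-up device ID for the sketch
reg = ShiftRegister(idcode)
assert read_register(reg) == idcode
```

The adapter hardware just wiggles TCK/TMS/TDI/TDO; everything above that is host software.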
Quote: """Whereas companies (should) disable the debug feature in the version they send to customers, that's not so easy with chips. It requires millions of dollars for every change to chip design. Therefore, chips always have the JTAG interface enabled."""
At least parts of JTAG need to be enabled for proper testing of complex circuit boards (most notably the boundary scan that lets you read/set individual pins), but that is not the problem here: it seems they left some instructions active that read back supposedly write-only values (e.g. the AES key in question). Designing one of these internal, protected bits to act as a "disable JTAG debugging" switch would not be that hard. CPUs with integrated flash have been doing this for years: a certain signature in the internal non-volatile memory disables flash readout and CPU debugging, but boundary scan stays active.
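The gating scheme those flash CPUs use can be sketched in a few lines. This is a generic illustration, not Actel's (or any specific vendor's) actual mechanism; the signature value and instruction names are made up for the example.

```python
# Sketch of a security-fuse scheme: a signature word in non-volatile
# memory locks out debug/readback instructions, while board-test
# (boundary scan) instructions remain available.

SECURITY_SIGNATURE = 0xDEADC0DE   # hypothetical lock value in flash

BOUNDARY_SCAN_INSTRUCTIONS = {"SAMPLE", "PRELOAD", "EXTEST", "BYPASS"}
DEBUG_INSTRUCTIONS = {"READ_FLASH", "READ_AES_KEY", "HALT_CPU"}

def jtag_instruction_allowed(instruction, security_word):
    locked = (security_word == SECURITY_SIGNATURE)
    if instruction in BOUNDARY_SCAN_INSTRUCTIONS:
        return True          # board-level testing always works
    if instruction in DEBUG_INSTRUCTIONS:
        return not locked    # blocked once the part is locked
    return False             # unknown instructions rejected

# A locked part still supports boundary scan but refuses key readback:
assert jtag_instruction_allowed("EXTEST", SECURITY_SIGNATURE)
assert not jtag_instruction_allowed("READ_AES_KEY", SECURITY_SIGNATURE)
# An unlocked (development) part allows debugging:
assert jtag_instruction_allowed("READ_FLASH", 0x0)
```

The whole point is that "disable debugging" is one protected bit of state, not a new mask set.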
Quote: """ As real silicon chips are becoming more expensive to manufacturer, FPGAs are becoming a more popular alternative. (...) Every change to a chip design requires millions of dollars in changes to the masks that print gates onto a chip."""
Actually, at a fixed complexity, ASICs are getting cheaper to manufacture over time, just like everything else in chip-making. So are FPGAs. And again: high-end special-technology ASICs might cost "millions of dollars", but no one in their right mind would re-design a complete ASIC for a change as simple as disabling JTAG debugging:
Chips are built in layers, and it is quite common to produce a whole batch of wafers with the "lower layers" that form the actual transistors. The metal layers on top of them (those that form the wires interconnecting the transistors) may then be added to, say, one third of them.
Then when errors are found during testing, one could take another wafer from the lot, apply a corrected metal-mask and check if the error could be remedied by re-wiring (often a few spare gates are spread over the wafer "just in case" one has to splice in an inverter in a signal... or such things).
Such a relatively cheap change (say, 10% of the cost of the complete ASIC production run) would be the right way to build a chip with JTAG completely disabled. It would be impossible to re-enable the feature from the outside, though of course, by opening the chip and re-wiring the metal (possible with a focused ion beam on a bare die), one could still do it. But that was not the message of the quoted article.
"Ultimately, an attacker can extract the intellectual property (IP) from the device as well as make a number of changes to the firmware such as inserting new Trojans into its configuration."
Using a flaw in the system to "insert" a new trojan is not the same as finding an existing one. This, and many other things one sees when looking at both papers, the vendor response, and then their response to the vendor, make it pretty obvious that they are sticking to the backdoor claim to save face (perhaps for the original grant or for clients).
But the best gem of the new paper is the claim that a crypto flaw requiring physical access to exploit equals a denial of service, even though by then you have already taken out the chip, or at least have physical access.
That's the most famous drone.
But there have been other, tactical drones shot down on the Afghan and Iraqi borders and over the Gulf.
Also, the Americans have shot down Iranian drones over Iraq.
It's best to Google with a custom date range ending before last December, so as to avoid all the RQ-170 stuff clogging the results.
Consider something like the drones shot down by Iran. The reason is that they are designed to be cheap, to be frequently lost while flying over the enemy. Thus, it's likely that one of these FPGAs was inside the drone shot down by Iran. While it's unlikely the FPGA had any secrets worthwhile, issues like this make it easier for Iran to reverse engineer the drone and manufacture their own.
The RQ-170 Sentinel was developed by Lockheed Martin's Skunk Works as a stealth Unmanned Aerial Vehicle (UAV)... Few details of the UAV's characteristics have been released, but estimates of its wingspan range from approximately 65 feet (20 m) to 90 feet (27 m).
Even the US public knows next to nothing about it, not even the wingspan, as it probably stems straight from some black project out of Area 51.
So it is not only VERY expensive, it also includes some of the most TOP SECRET technologies developed by the USAF, like stealth and whatnot. In military jargon, it's called a high-value asset!
Blogspot is well known for some of the highest quality software(1) and adult paradigm-shifting(2) link sites on the planet.
Blogspot, along with Errata Security(3), provides only the highest of high quality security(4) information. After all, the tagline is "Errata Security is a high-end cyber security consulting company."
(1) NSFW: http://iphonevolt.blogspot.com
(2) NSFW: http://fascormet.blogspot.com/
(4) NSFW: http://tophackdownloads.blogspot.com/
Robert Graham, aka ErrataRob, is well-known and well-respected in the information security industry.
Although you obviously wouldn't know this just from his Blogspot subdomain (and I agree he should probably just register an actual domain name), his content should also stand on its own merit.
There is a lot of crap code on Github and there are a lot of idiots on Twitter, but you shouldn't discount all users of a service because of the quality of some of its users.
Hell, that goes for HN, too.
Ya. He sounds real thorough, too.
> ...[the problem] we should insist on fixing it.
Wait, I was under the impression that no backdoors exist, and even if they did, everyone does it anyway, as the author claims.
> ... there are a lot of idiots on Twitter ...
Don't forget Blogspot and HN. This site isn't just for editorials from Forbes and the New York Times, after all; it also features lots of PR from TechCrunch and Kickstarter. ;)
The parent post is even more problematic; it's implying that the information in the article is low-quality because it's hosted on a service on which other people host shady software and pornography.
"In addition to supporting portable, consumer, industrial, communications and medical applications with commercial and industrial temperature devices, Actel also offers ProASIC3 FPGAs with specialized screening for automotive and military systems."
There seems to be a special variant of the chip for military use, hopefully without this 'debugging feature'.
As an example: for military chips they will certainly run every (non-destructive) test they know on every single device, say at elevated temperature with a little less than the minimum specified operating voltage... The test devices themselves might cost $1M and be occupied for one hour per chip, hence they will charge you more for the final chip.
Well, just because it isn't certified doesn't mean it can't be used to leverage oneself into sensitive devices. Cf. the Chrome attacks of a couple of days ago, where the exploit leveraged several small issues that weren't dangerous by themselves.
It's neither a Xilinx, nor an Altera.
They also make properly rad-hard FPGAs, although these are ridiculously expensive (I've heard that if you are looking at building more than ~10-15 units, an ASIC may actually be cheaper).
Edit: I should add that you could still use Xilinx and Altera devices in these situations, but you really need some way of mitigating the problem, such as triple modular redundancy (TMR) for memory.
About the rad-hard cost: yes, they are more than US$10k each in most models, but rad-hard ASICs are also very expensive; I doubt an ASIC would be cheaper unless you buy by the millions. Anyway, if you need something to be rad-hard, cost is the least of your concerns.
Also, note that the Spartan 3AN does not seem to use flash for configuration. It looks like the internal flash is just for storage and the configuration is copied into RAM, as with normal Spartan devices. This means one less chip on the board and potentially better security, but no real advantage for avoiding SEUs.
Military operations and espionage by state actors are rarely limited by commercial constraints. In other words, access to confidential data on a vast array of devices would be a bargain at several million dollars when it comes to a national intelligence agency (or Apple or Google or Microsoft for that matter).
I should think that embedded development environments would have a foolproof, built-in way of excluding debug code. If they don't, then perhaps this is an opportunity?
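The usual pattern is a build-time flag so debug paths simply don't exist in release builds. Here's a minimal sketch in Python (the command names are invented for the example); a C toolchain would do the same thing with `#ifdef DEBUG`, so the debug code isn't even present in the shipped binary.

```python
# Build-time debug gating: DEBUG is set by the build system, never at
# runtime. In a release build the key-dump path below is dead code; in C
# an #ifdef would remove it from the binary entirely.

DEBUG = False   # release build

def handle_command(cmd, secrets):
    if DEBUG and cmd == "dump_keys":   # debug-only escape hatch
        return secrets
    if cmd == "status":
        return "ok"
    return "unknown"

# In a release build, the debug command behaves like any unknown command:
assert handle_command("dump_keys", "AES_KEY") == "unknown"
assert handle_command("status", "AES_KEY") == "ok"
```

The "foolproof" part is making the build system, not the developer, responsible for flipping that flag, which is exactly what didn't happen with the chip's JTAG instructions.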