Bogus story: no Chinese backdoor in military chip (erratasec.blogspot.com)
322 points by vgnet on May 29, 2012 | 56 comments



"While they did find a backdoor in a popular FPGA chip, there is no evidence the Chinese put it there, or even that it was intentionally malicious."

Nor did the original article specifically allege that it was "the Chinese", or that the backdoor was malicious. It did allege that it was inserted by the manufacturer (although technically anything on the chip is inserted by the manufacturer), presumably because it differed from a public spec, but the veracity of that statement is still unknown (at least to us). But I don't think that that's enough to call it "bogus".


Agreed; I feel like I've misjudged the story here, because apparently the idea that silicon is as riddled with backdoors as software isn't the big deal; the only question anyone cares about is "is China cyberwarring us".

The technology behind this story is serious and interesting, and all anyone wants to talk about is politics.


> The technology behind this story is serious and interesting

Which is why, at the end of this article, I suddenly went "what?!" when I read:

"And researchers will not probably hunt for similar JTAG backdoors in other chips."

If there's one good thing to come of all this fearmongering, it is that chips will be subjected to more scrutiny. Hopefully someday chips are much more open, standardized, and well documented. That's a long way off, but there are many reasons to hope it might happen.

I don't know, maybe I read that sentence wrong. Maybe the author meant, "researchers will, not probably, but for sure hunt for backdoors."


I read that as a typo, i.e. "not" -> "now"?


They said that this chip was manufactured in China and that the manufacturer had inserted this backdoor which "could be turned into an advanced Stuxnet weapon to attack potentially millions of systems."

By implication, and by selectively leaving out the information about how this attack requires the use of the JTAG port, the article intentionally implied that it was a Chinese attack, even if it didn't lay out that claim in so many words.


As always, logical reasoning arrives just a tad too late to the party. The original story has already circulated around the web and stirred up anger and mistrust in plenty of people who will never read this common-sense follow-up. Chalk up one more win for sensationalism and fear-mongering.


The reason corrections to sensational stories never travel as far as the original is that the people who propagated the first story have to admit they were fooled, or at least misinformed. I'm about to share this correction to my own link feeds -- but I'll be honest, I'm not enthusiastic about looking dumb.


You can cover for it by raging against the people who misinformed you.


Most of the replies on HN to the original story lined up with this follow up article. Wait and see, it's a debug feature, not necessarily a military chip, etc. On many other sites it was a different story, and I'm sure this follow up won't make their headlines either.


It's an incredibly frustrating phenomenon. Time and again a sensational story causes people to behave like the sky is falling.


"A lie gets halfway around the world before the truth has a chance to get its pants on" - Winston Churchill

It's not a new problem.


I believe that the original version of that quote is, "A lie can travel halfway around the world while the truth is putting on its shoes." In which case http://www.quotationspage.com/quote/23633.html says that it should be attributed to Mark Twain.


Too late. Everyone believes the Churchill quote is the original one now. ;)


I'm sure it was Oscar Wilde~


Except that it wasn't a lie, as the original article didn't make these claims. Only the HN headline -- which wasn't in the original article -- and this new article contain sensationalised claims ('bogus story', for example).

You could argue that the original was worded to be deliberately confusing, perhaps even implying that the claims were proven, but I certainly didn't find it that way when I read it.


Are we talking about the same article that had claims about how most chips are manufactured in China (false) and that the vulnerability could be used in a Stuxnet style attack?


Can you offer proof that those are both false? The China claim I grant you — especially as the actual claim was 99% (I somehow missed this in my initial reading).

The Stuxnet thing is trickier. To make use of this remotely you (or a Stuxnet-style virus) would need access to a JTAG connection. These come in many forms, including USB (needing access to the host computer — like Stuxnet) or Ethernet (needing access to the network). It seems a bit unrealistic because JTAG tends to be used for development, but field reconfiguration is one of the advantages of FPGAs.

Of course, I believe this specific Actel FPGA uses flash for configuration, which makes updating it in the field somewhat inconvenient and therefore less likely to be used in practice. I remember hearing that this is why NASA switched to Xilinx, as they now require field reconfigurability.

Still, the article certainly wasn't 'bogus', and the new article claiming so contained far more errors. Especially when you read the actual paper and not just the linked press release.


This is a military chip. It's used in military applications. I'm not saying that the original article didn't use this to sensationalise 'We JTAG-fuzzed a chip and found an AES key', but to deny that the chip is used in the military is inaccurate.

Good call on the Chinese front, we don't know who generated the key material to block the JTAG.

Incidentally on z/OS systems things that open the system up to external access are sometimes referred to as backdoors, which is what this is. It's a way of accessing the chip, nothing more.


The ProAsic3 is a commercial FPGA. So no, it's not military. The military may use it, but that doesn't change that it is a commercial part built to commercial specifications.

To call it a military chip is inaccurate.


It's ambiguous, not inaccurate. If the military used a certain COTS revolver as standard equipment, and the revolver had an eye-first camera with cellular upload to the public Internet, and that feature was not approved of or known to the gun-selection people, it would be fair to say that "military revolvers upload battlefield images to the Internet, a serious security breach."


It's false; it implies everything the military uses is military grade. By that reasoning I'm having military-grade coffee this morning. The term you need is "defense-grade". Most FPGA vendors sell "defense-grade" FPGAs, for example the Xilinx Virtex-6Q and Spartan-6Q (the 'Q' means defense-grade), which are a very different product line.


There's a difference between grade and use. Grade implies a certain standard of assurance for a specific market or environment. Use just means it's used by someone, or a group (in this case security).

Heck, people could (if they're nuts) call 44Con a military conference given that it's attended by various armed forces folk. I'd rather people didn't though, but if they want to look stupid while thinking they look smart, who am I to stop them?


The original article was censored, so it's hard to tell whether it's bogus or not. I believe it's still very interesting. Here http://bit.ly/JKatpV is an older paper from the same author detailing a technique for reading thermal signatures from individual transistors on the chip with a microscope and an astronomy camera. A little more advanced than JTAG fuzzing. However, I agree that it's very hard to insert a hardware backdoor on a mask, even an FPGA mask. Given the hundreds of variations and revisions that FPGAs normally have, it's nearly impossible to cover even a single family. A more plausible explanation: some big client got very angry in the past because they lost the password and the FPGA vendor didn't have a "master key" to unlock it. Still, a very stupid move.


Are the critics basing their objections on the rough, generalist abstract for this paper or the paper itself?

http://www.cl.cam.ac.uk/~sps32/Silicon_scan_draft.pdf


The bigger issue here is whether such a backdoor/debug mode was accidentally or intentionally left in any of the Actel FPGAs that use their anti-fuse technology. At present, anti-fuse seems to be the most robust technology for preventing read-out of the configuration bit-stream, as there is no serial data being pushed around each time the logic resets. ProASIC3 is a great platform when you are on a very constrained power budget; however, I would not consider it one of their leading security-hardened chips. There is a lot of design reuse in complex semiconductor products, so it is possible that this portion of the design leaked into other devices.


This is some sensible response to the evil Chinese hype.

As Feynman put it, when a researcher overblows his findings, he's doing "Cargo Cult Science" (see the closing chapter of "Surely You're Joking, Mr. Feynman!").


I thought cargo cult science was (mis)copying techniques without understanding how they work. The original example was building a fake airport / ATC at an abandoned military base and hoping cargo planes would land there.


Thanks for the reference. Hadn't come across that concept before. Assimilating...


For the record - Original discussion on HN: http://news.ycombinator.org/item?id=4030746


Whoever wrote that article is a little sketchy with his facts:

Quote: """One of the most common building-blocks is the debugger, known as JTAG. This is a standard way of soldering some wires to the chip and connecting to the USB port,"""

JTAG is just the low-level interface to a debugger. "Soldering some wires" is not the building block, and USB has nothing to do with it (for example, my workhorse JTAG interface connects over Ethernet).
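
To make that concrete: at the electrical level JTAG is nothing more than four signals (TCK, TMS, TDI, TDO) being toggled; USB, Ethernet, or a parallel port is only the transport the adapter happens to use. A minimal sketch follows, assuming the pin-access functions are provided by whatever adapter is in use; this is a simplified illustration, not a complete TAP driver.

    #include <stdint.h>

    /* Adapter-specific pin access, assumed to be provided elsewhere.
       This is the only place where USB, Ethernet or parallel-port
       adapters differ -- the chip never sees any of that. */
    extern void set_tck(int level);
    extern void set_tms(int level);
    extern void set_tdi(int level);
    extern int  get_tdo(void);

    /* One simplified TCK cycle: present TMS/TDI, sample TDO (which was
       driven on the previous falling edge), then pulse the clock. */
    static int jtag_clock(int tms, int tdi)
    {
        set_tms(tms);
        set_tdi(tdi);
        int tdo = get_tdo();
        set_tck(1);
        set_tck(0);
        return tdo;
    }

    /* Shift nbits of data through the currently selected register,
       LSB first, raising TMS on the last bit to leave the Shift state. */
    static uint32_t jtag_shift(uint32_t data, int nbits)
    {
        uint32_t captured = 0;
        for (int i = 0; i < nbits; i++) {
            int last = (i == nbits - 1);
            int tdo  = jtag_clock(last, (data >> i) & 1);
            captured |= (uint32_t)tdo << i;
        }
        return captured;
    }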

Quote: """Whereas companies (should) disable the debug feature in the version they send to customers, that's not so easy with chips. It requires millions of dollars for every change to chip design. Therefore, chips always have the JTAG interface enabled."""

At least parts of JTAG need to stay enabled (most notably the boundary scan that allows you to read/set individual pins) for proper testing of complex circuit boards, but that is not the problem here: it seems that they left some instructions active that read back supposedly write-only values (e.g. the AES key in question). Designing one of these internal, protected bits to act as a "disable JTAG debugging" flag would not be that hard. CPUs with integrated flash have been doing that for years: a certain signature in the internal non-volatile memory disables flash readout and CPU debugging, but boundary scan stays active.
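
A minimal sketch of the kind of read-protection logic meant here, purely for illustration: in real microcontrollers this check is done by hardware at reset, and the address and magic value below are invented, not taken from any particular vendor's part.

    #include <stdint.h>
    #include <stdbool.h>

    /* Hypothetical lock word in internal non-volatile memory. */
    #define LOCK_WORD_ADDR  ((const volatile uint32_t *)0x0800FFFCu)
    #define LOCK_MAGIC      0xA5A5A5A5u

    /* If the lock signature is programmed, flash readout and CPU
       debugging are refused; boundary scan stays available so the
       finished board can still be tested. */
    static bool debug_readout_allowed(void)
    {
        return *LOCK_WORD_ADDR != LOCK_MAGIC;
    }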

Quote: """ As real silicon chips are becoming more expensive to manufacturer, FPGAs are becoming a more popular alternative. (...) Every change to a chip design requires millions of dollars in changes to the masks that print gates onto a chip."""

Actually, at a fixed complexity, ASICs are getting cheaper to manufacture over time, just like everything else in chip-making, and so are FPGAs. And again: high-end special-technology ASICs might cost "millions of dollars", but no one in their right mind would re-design a complete ASIC for such a simple change as disabling JTAG debugging:

Chips are built in layers, and it is quite common to produce a whole batch of wafers with only the "lower layers" that form the actual transistors. The metal layers on top of them (those that form the wires interconnecting the transistors) may then be added to, say, one third of them.

Then, when errors are found during testing, one can take another wafer from the lot, apply a corrected metal mask and check whether the error can be remedied by re-wiring (often a few spare gates are spread over the wafer "just in case" one has to splice an inverter into a signal... or such things).

Such a relatively cheap change (say, 10% of the cost of the complete ASIC production run) would be the right way to build a chip with JTAG completely disabled. It would be impossible to re-enable the feature from the outside; of course, by opening the chip and re-wiring the metal (which is possible using focused ion beams on a bare die) one could still do it. But that was not the message of the quoted article.


His article might be a bit off, but it is actually true that the leaked University of Cambridge document is off-base, and even their official paper continues to claim there is a backdoor, but in a way that negates itself:

"Ultimately, an attacker can extract the intellectual property (IP) from the device as well as make a number of changes to the firmware such as inserting new Trojans into its configuration."

Using a flaw in the system to "insert" a new trojan is not the same as a trojan already being present. This, and many other things one sees when looking at both papers, the vendor response, and then their response to the vendor, make it pretty obvious that they are sticking to the backdoor claim to save face (perhaps for the original grant or clients).

But the best gem of the new paper is the claim that a crypto flaw that requires physical access to exploit amounts to a denial of service, considering that you have already removed the chip, or at least have physical access.


He also keeps presenting the view that the drone was "shot down" over Iran as fact. While many people have different views on how the drone was captured intact, it was not "shot down".


It's a contentious point whether the RQ-170 was shot down or just malfunctioned or ran out of fuel or whatever.

That's the most famous drone.

But there have been other, tactical drones shot down along the Afghan and Iraqi borders and over the Gulf.

Also, the Americans have shot down Iranian drones over Iraq.

It's best to Google with a custom date range ending before last December so as to avoid all the RQ-170 stuff clogging the results.


More importantly, the article says drones, plural? I thought it was only one drone.


I generally liked the article but this drone part is totally off the hook.

Consider something like the drones shot down by Iran. The reason is that they are designed to be cheap, to be frequently lost while flying over the enemy. Thus, it's likely that one of these FPGAs was inside the drone shot down by Iran. While it's unlikely the FPGA had any secrets worthwhile, issues like this make it easier for Iran to reverse engineer the drone and manufacture their own.

What!?

http://en.wikipedia.org/wiki/Lockheed_Martin_RQ-170_Sentinel

The RQ-170 Sentinel was developed by Lockheed Martin's Skunk Works as a stealth Unmanned Aerial Vehicle (UAV)... Few details of the UAV's characteristics have been released, but estimates of its wingspan range from approximately 65 feet (20 m)[6] to 90 feet (27 m).

Even the US public knows next to nothing about it, not even the wingspan, as it probably stems straight from some black project out of Area 51.

So it is not only VERY expensive, it also includes some of the most TOP SECRET technologies developed by the USAF, like stealth and what not. In military jargon it's called a high-value asset!


Is this debunking from a reputable source? (I've never heard of this blog before.)


Of course it is reputable. It's on Blogspot, a free web host known for exclusivity and high standards.

Blogspot is well known for some of the highest quality software(1) and adult paradigm-shifting(2) link sites on the planet.

Blogspot, along with Errata Security(3), provides only the highest of high quality security(4) information. After all, the tagline is "Errata Security is a high-end cyber security consulting company."

(1) NSFW: http://iphonevolt.blogspot.com

(2) NSFW: http://fascormet.blogspot.com/

(3) http://erratasec.blogspot.com/

(4) NSFW: http://tophackdownloads.blogspot.com/


>Of course it is reputable. It's on Blogspot, a free web host known for exclusivity and high standards.

Robert Graham, aka ErrataRob, is well-known and well-respected in the information security industry.

Although you obviously wouldn't know this just from his Blogspot subdomain (and I agree he should probably just register an actual domain name), his content should also stand on its own merit.

There is a lot of crap code on Github and there are a lot of idiots on Twitter, but you shouldn't discount all users of a service because of the quality of some of its users.

Hell, that goes for HN, too.


> We shouldn't be surprised by this backdoor, ... And researchers will not probably hunt for similar JTAG backdoors in other chips.

Ya. He sounds real thorough, too.

> ...[the problem] we should insist on fixing it.

Wait, I was under the impression that no backdoor exists, and even if one did, everyone does it anyway, like the author claims.

> ... there are a lot of idiots on Twitter ...

Don't forget blogspot and HN. This site isn't just for editorials from Forbes and the New York Times after all, it also features lots of PR from Techcrunch and kickstarter. ;)


Questioning the author's reputation rather than the information itself, as the grandparent post does, is ad hominem. The article deals with standard practices in a large industry, and it probably wouldn't be hard to obtain confirmation or refutation of any factual claims in it.

The parent post is even more problematic; it's implying that the information in the article is low-quality because it's hosted on a service on which other people host shady software and pornography.


There are non-malicious reasons for designing in backdoors (debugging) but the manufacturer is based in the PRC, is subject to PRC laws and pressure, and the PRC government undoubtedly knows that the company supplies hardware to the US military. However benign the original motivation for creating the backdoor may have been, it's potentially bad news.


There is no evidence this chip is being used by the military for sensitive tasks, it isn't even certified for those uses. It's just an off-the-shelf FPGA chip that was found to have a backdoor (requiring physical access!), likely for debug purposes. While that should certainly be a concern for many of the company's customers, it is not necessarily a national security crisis.


Not evidence, but the product page indicates:

"In addition to supporting portable, consumer, industrial, communications and medical applications with commercial and industrial temperature devices, Actel also offers ProASIC3 FPGAs with specialized screening for automotive and military systems."

http://www.actel.com/products/pa3/

There seems to be a special variant of the chip for military use, hopefully without this 'debugging feature'.


"Specialized screening for automotive and military systems" means that they have done more extensive tests on otherwise identical chips.

As an example: for military chips they will certainly run every (non-destructive) test they know on every single device, say at elevated temperature with a little less than the minimum specified operating voltage... The test devices themselves might cost $1M and be occupied for an hour per chip, hence they will charge you more for the final chip.


"There is no evidence this chip is being used by the military for sensitive tasks, it isn't even certified for those uses. "

Well, just because it isn't certified doesn't mean it can't be used to leverage oneself into sensitive devices. Cf. the Chrome attacks of a couple of days ago, where the exploit chained several small issues that by themselves weren't dangerous.


Of course, so could a paperclip.


Who even uses that FPGA?

It's neither a Xilinx, nor an Altera.


Actel devices are often used because they use flash memory for the configuration, which makes them usable in situations where RAM may be untrustworthy for something so important (e.g. where you can get bit-flipping due to ionising radiation).

They also make properly rad-hard FPGAs, although these are ridiculously expensive (I've heard that if you are looking at building more than ~10-15 units, an ASIC may actually be cheaper).

Edit: I should add that you could still use Xilinx and Altera devices in these situations, but you really need some way of mitigating the problem -- such as TMR memory.
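
For what it's worth, a minimal sketch of what TMR means in software terms: keep three copies of each critical word and take a bitwise majority vote on read, so a single upset bit in one copy is masked. Real designs typically do this in the FPGA fabric or with dedicated IP; the C below is only illustrative.

    #include <stdint.h>

    typedef struct {
        uint32_t copy[3];   /* three redundant copies of one word */
    } tmr_word_t;

    static void tmr_write(tmr_word_t *w, uint32_t value)
    {
        w->copy[0] = w->copy[1] = w->copy[2] = value;
    }

    static uint32_t tmr_read(const tmr_word_t *w)
    {
        /* Bitwise majority vote: each result bit is 1 iff at least
           two of the three copies agree on 1. */
        return (w->copy[0] & w->copy[1]) |
               (w->copy[1] & w->copy[2]) |
               (w->copy[0] & w->copy[2]);
    }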


Xilinx also has flash FPGAs, the Spartan-3AN, and I believe Altera should have one as well.

About the rad-hard cost: yes, they are >US$10k each for most models, but rad-hard ASICs are also very expensive; I doubt an ASIC will be cheaper unless you buy by the millions. Anyway, if you need something to be rad-hard, cost is the least of your concerns.


ASICs have their hardware 'configuration' effectively in the metal layers, which are not so susceptible. The configuration is the biggest concern, as this is essentially the processor (or whatever the circuit is) itself. RAM used by a processor is also at risk, but that is easier to handle. From what I'm told, a reasonably large-feature-size ASIC can have far fewer problems than a modern FPGA.

Also, note that the Spartan 3AN does not seem to use flash for configuration. It looks like the internal flash is just for storage and the configuration is copied into RAM, as with normal Spartan devices. This means one less chip on the board and potentially better security, but no real advantage for avoiding SEUs.


Even I, who promised myself I would take all future stories about "cyber threats" with a huge grain of salt because I know the US government is very keen on expanding its (offensive) cyber agencies and would stop at nothing to spread propaganda about it, almost believed this story. Damn it.


Seems as though there is some truth and some doubt. Perhaps not "Bogus" but rather "Over-Hyped" --

http://www.securityweek.com/china-wrongfully-accused-over-ba...


While I find much of the author's argument relevant, "it would be expensive" is very unconvincing.

Military operations and espionage by state actors are rarely limited by commercial constraints. In other words, access to confidential data on a vast array of devices would be a bargain at several million dollars when it comes to a national intelligence agency (or Apple or Google or Microsoft for that matter).


The cause of these backdoors isn't malicious, but a byproduct of software complexity. Systems need to be debugged before being shipped to customers. Therefore, the software contains debuggers.

I should think that embedded development environments would have a foolproof, built-in way of excluding debug code. If they don't, then perhaps this is an opportunity?
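
On the software side at least, most toolchains can already do this with the preprocessor. A minimal sketch, assuming a hypothetical DEBUG_BUILD flag, so the debug hooks simply don't exist in the release binary (this obviously doesn't address debug features baked into the silicon itself):

    #ifdef DEBUG_BUILD
    /* Debug-only helper: opens up internal state for inspection.
       Compiled in only when building with -DDEBUG_BUILD. */
    static void enable_debug_access(void)
    {
        /* ... unlock debug-visible registers, dump state, etc. ... */
    }
    #endif

    void system_init(void)
    {
    #ifdef DEBUG_BUILD
        enable_debug_access();
    #endif
        /* normal initialisation continues here */
    }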


This article doesn't seem to actually debunk anything at all. It's a weakly supported opinion piece.


Cue Bogus Bogus story link...



