NIST reopens draft recommendation on random number generation for comment [pdf] (nist.gov)
140 points by healsjnr1 on Sept 11, 2013 | hide | past | favorite | 46 comments



It should probably be noted that this is not some sort of confirmation that "the NSA owns this particular DRBG".

Surprisingly (to me), this is merely a signal of a government agency that takes public perception to heart and issues a vote of not-complete-confidence in standards it has previously prescribed, and is today seeking to rectify the problem by looking for nothing-up-my-sleeve numbers [0] agreed upon by security researchers and the public at large. A smart move, and no doubt a difficult one to make, as even the slightest suggestion of no-confidence in a prescribed standard is quite damaging to the reputation of an institution devoted to maintaining reliable standards.

More info on nothing-up-my-sleeve: [0] http://en.wikipedia.org/wiki/Nothing_up_my_sleeve_number


Sure, but the Times piece /very strongly/ suggests it: http://bits.blogs.nytimes.com/2013/09/10/government-announce...

(Perlroth quotes from a few unpublished, leaked memos.)


Somewhat off topic, but it seems like it would be better to use some future unpredictable event to really remove any "nothing up my sleeve" doubt, e.g., a hash of the sum of all S&P 500 companies' closing stock prices on a specific future date.


I assume you need some flexibility in choosing a nothing-up-my-sleeve number, in case the first number you try has properties that are bad for the algorithm.

Imagine if the super-official, international standard nothing-up-my-sleeve number was 1. Any time you need consistent but arbitrary bits in a cryptographic algorithm, they must be ...000000000000001. That doesn't sound like it would work very well.


In that case, you announce a reroll, along with a published paper explaining that x^1 == x. But, assuming you use SHA-256 or higher, the chances of that happening are less than one over the number of atoms in the observable universe, so you shouldn't worry about it the same way you don't worry about hash collisions happening purely by chance.


Ok, then describe some algorithm with exact criteria (and explanations of the criteria) needed for the number, but still seeded by future random events, e.g. "if the first hash doesn't meet these criteria, hash it again and again until it does."
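That procedure is easy to make concrete. Here is a minimal Python sketch, assuming a hypothetical public seed string and a made-up acceptability criterion; both are just placeholders for whatever the published rules would actually be:

```python
import hashlib

def nums_candidate(seed: bytes, counter: int) -> bytes:
    """Derive a candidate nothing-up-my-sleeve value from public data."""
    return hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()

def acceptable(candidate: bytes) -> bool:
    # Hypothetical published criterion: reject degenerate candidates.
    # A real standard would spell out exact, justified conditions here.
    return candidate not in (b"\x00" * 32, b"\xff" * 32)

def derive_constant(seed: bytes) -> tuple[int, bytes]:
    """Hash-and-reroll: increment a counter until the criteria are met."""
    counter = 0
    while True:
        c = nums_candidate(seed, counter)
        if acceptable(c):
            return counter, c
        counter += 1

# Public, unpredictable-in-advance input, e.g. published closing prices
# on a pre-announced future date (a made-up example string).
counter, constant = derive_constant(b"S&P 500 close 2013-10-01: 1695.00")
```

Because both the seed source and the reroll rule are fixed in advance and publicly checkable, anyone can re-run the derivation and confirm no other constant could have been substituted.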


The variety of "nothing up my sleeve" numbers listed on Wikipedia suggests that the NSA could brute-force a "nothing up my sleeve" source, working backward from a vulnerable number they wanted to use.


> It should probably be noted that this is not some sort of validation to the fact that "the NSA owns this particular DRBG".

It sounds like the NYT is convinced. The caption under the header picture reads:

"As part of its efforts to foil Web encryption, the National Security Agency inserted a backdoor into a 2006 security standard adopted by the National Institute of Science and Technology, the federal agency charged with recommending cybersecurity standards."


I doubt NIST will ever be trusted again, as any standards or specs they are in favor of will immediately be suspected of harboring some vulnerability favorable to the NSA.

Let's say they hold a contest for people to submit next generation cryptosystems, and that Algorithms A,B, and C make it to the final. If the NIST publishes critical remarks on A and C and seems to favor B, immediate skepticism and red flags will be raised. Does B have a hidden weakness the NSA knows about?

A standards organization can only run on its transparency and integrity.


First, a lot of NIST crypto standards are relatively anodyne; for instance, the NIST GCM standard basically just explains how to do multiplication in GF(2^128), and the NIST CTR mode standard just lays out a bunch of ways you can arrange your counter block. Those standards remain valuable and aren't likely to harbor backdoors.

Second, it has always been the case that favorable responses from the USG in general and NSA in particular have cast a pall over proposed standards. Isn't that why there's a RIPEMD160, for instance?


There has always been suspicion, but nothing close to proof. Now there is close to proof that the NSA has been inserting weaknesses for its own benefit (unlike the DES S-box modifications it made, which actually strengthened DES). The Clipper proposal seemed to suggest that NIST/NSA were interested in strong cryptosystems with explicit, transparent backdoors, and there were also the export limitations, so many non-conspiracy-minded people assumed the NSA was more interested in cryptanalyzing foreign communications and keeping our domestic cryptosystems out of foreign hands.

What used to be long-held cryptoanarchist conspiracy theories are no longer so far-fetched: NIST/NSA now seem more concerned with targeting the existing communication networks of friendly western democracies and domestic intercept, on which terrorists may piggyback, than with foreign non-aligned powers.

That is, the greatest threat is no longer foreign comm networks and cryptosystems, but domestic. We have become the adversary.

Anyway, what do you call AES? It was selected as the winner by the NIST and it has widespread industry adoption.


No. You didn't read my comment. The notion that NSA might know of weaknesses in any standard it endorsed has never been far-fetched. It's closer to an article of faith among cryptographers. Again: there are whole standards that exist solely because of that concern.


Yes, but whenever the NSA "suggests" changes, such as certain constants or the classic S-boxes from DES, those don't usually come with a clear explanation. It's more of a "here, make these changes, it will be better, trust us" kind of idea.

Another point is that most people (especially non-US citizens) don't necessarily view the NSA and NIST as separate. They are seen as parts of the same government, more like two offices in the same government department.

Now this also brings about an interesting thing I have been thinking about. NSA is also in charge of protecting its own data. So recommendations, practices and policies they tweak go into keeping its (and other agencies') classified data secure.

Given that they have managed to "tweak" and insert backdoors into some algorithms or systems, how likely are they to recommend those systems for their own and other government agencies' use? Do they want the communications or keys to the nuclear launch sites to use the "tweaked" version? They would need to have a pretty good feeling that no other agency out there has also figured out the backdoor.


And in the DES s-box case, you'd have been right to trust them, because they fixed a vulnerability in DES 20 years before the theory behind it would become public. They couldn't disclose the reasoning behind the new s-boxes without disclosing differential cryptanalysis, because they generated the new s-boxes by first generating random candidates and then testing them for resilience against differential cryptanalysis.


True, and that is the problem they are facing. Before, when they came in and said "trust us, it is better," NIST and everyone else would say "yup, we trust you." Now it has switched to "no way, you are just building in a backdoor." That is the sad part.


That's why they generally design the backdoor so that it's based on a key that only they have.

For example, with Dual EC DRBG, researchers discovered that it would be possible to create the constants based on another constant, with which you could predict the output easily. But without prior knowledge of that constant, it would be an infeasible brute-force search to find it.

Likewise, previous publicly known backdoors like the one in the export version of Lotus Notes depended on a key that the NSA had. There it was even simpler, and not obfuscated; it would just encrypt a portion of the session key with the NSA's public key, which they could decrypt and then easily brute-force the rest of the session key.[1]

The NSA doesn't want to make security weak against arbitrary attackers, they just want to give themselves the keys.

[1]: http://www.cypherspace.org/adam/hacks/lotus-nsa-key.html
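The arithmetic behind that Lotus Notes design (the "differential workfactor" scheme described in [1]) is worth making explicit. The 64/24-bit split below matches the figures commonly reported for the export version; treat them as illustrative:

```python
# Workfactor arithmetic for the Lotus Notes "differential workfactor"
# scheme: a 64-bit bulk key, 24 bits of which were encrypted to an
# NSA public key and shipped alongside the ciphertext.
TOTAL_KEY_BITS = 64
ESCROWED_BITS = 24  # recoverable only with the NSA's private key

nsa_search_space = 2 ** (TOTAL_KEY_BITS - ESCROWED_BITS)  # 2^40
everyone_else = 2 ** TOTAL_KEY_BITS                       # 2^64

# The NSA's brute-force job is 2^24 times cheaper than anyone else's.
advantage = everyone_else // nsa_search_space
print(advantage)  # 16777216
```

A 2^40 search was feasible for a well-funded agency even in the 1990s, while 2^64 was not, which is exactly the asymmetry the scheme was built to create.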


Ah, it makes sense now. Thank you for explaining it.


Exactly.

I believe we will see a rise in the credibility of foreign (for Americans) standards bodies. It could be Germany (would it be the BSI?) or another country. I know, for example, that Red Hat had been getting its Common Criteria certification (needed to more easily sell its systems to some government agencies) from Germany's BSI, but that was because of red tape, not credibility.

http://investors.redhat.com/releasedetail.cfm?ReleaseID=7168...

I believe similar things will happen with other standards, products, and services related to security. It will be beneficial to advertise that some agency other than NIST certified the product or standard, or that a service or product is somehow better because we know the NSA didn't stick its fingers in the pie. Kind of a negative advertisement. A real shame. This will hurt American companies (including jobs and taxes) quite a bit.


As a rule of thumb I would recommend that no government agency be trusted about any projects they develop. Everyone would be better off {blank stare} Beyond that, there really needs to be more focus on open-source and public code review with an industry funded focus group that takes on reviewing core elements of technology and the internet down to the atomic level.

The chances are that if there are flaws and deliberate corruption to be discovered, it is to be discovered now. Not every line of code needs reviewing at all times, whole projects need periodic, thorough review.

Can I just say, I wish someone at some kind of reputable journal would calculate the damage and cost of what our government has done. It goes far beyond the obvious costs, by affecting things like, e.g., Apple's new, excellently timed iPhone 5S Touch ID feature, which will surely not go over well with consumers, especially globally. The US Government has now taken a pipe to the shins of the most valuable company on the globe; all because our psychopaths in charge aspire to be despotic authoritarians who have to control everything in order to prevent discovery of their incompetence.


Background: https://en.wikipedia.org/wiki/Dual_EC_DRBG

> Dual_EC_DRBG or Dual Elliptic Curve Deterministic Random Bit Generator is a controversial pseudorandom number generator (PRNG) designed and published by the National Security Agency. [...] Shortly after the NIST publication, it was suggested that the RNG could be a kleptographic NSA backdoor.


It's an awfully weird trojan horse --- or, as Daniel Franke put it on Twitter, a trojan platypus.

First: NIST RNG designs aren't particularly important (unlike the curve standards); there is a broad diversity of CSPRNG designs, applications tend to "borrow" the OS's, and no OS I know of uses a design taken directly from NIST.

Second: Dual EC DRBG is a CSPRNG that uses elliptic curve point multiplications; in other words, it requires bignum math. If you're unfamiliar with CSPRNG design: that's not a normal requirement. Dual EC is very slow. Nobody would willingly use it.

Why would that be the big NSA standard back door? I'm not saying it isn't. Something hinky happened there. I'm just asking: what did they have to gain from trying to backdoor that standard?


> applications tend to "borrow" the OS's, and no OS I know of uses a design taken directly from NIST.

Windows has support for it (although it's up to applications to choose to use it or not).

If it's part of a standard then it might be required by some organisations, for example other government/military agencies and people dealing with them. They could also look to pass laws requiring corporations to protect their customers' private data using the standard, and simply have people recommend the standard to everyone; after all, it was designed by NIST/NSA.

It's also likely to end up in software that wants to support those use cases which could have seen it filter out to other uses. And possibly hardware support.

If the potential for a backdoor hadn't been discovered people wouldn't have objected/noticed.

> Dual EC is very slow. Nobody would willingly use it.

Which leaves the question of why it would end up in the NIST standard if it's not very good.

Of course, maybe they were just testing the waters.


> It's an awfully weird trojan horse --- or, as Daniel Franke put it on Twitter, a trojan platypus.

Heh, that is a pretty good term for it.

> no OS I know of uses a design taken directly from NIST.

Except for all of the pressure recently on the Linux kernel developers to use Intel's RdRand directly rather than mixing it into their existing entropy pool (see, for example, https://plus.google.com/117091380454742934025/posts/SDcoemc9... and https://lkml.org/lkml/2013/9/5/212), where apparently the reason is "Customers want a SP800-90 source available through the OS interface" (quote from David Johnston, designer of the RdRand hardware, on the Google Plus link).

So, apparently there is a lot of pressure to get the OS to adopt the NIST standards directly.

There are lots of reasons for this kind of pressure. Obviously, if you sell to the government, it'll be easier if you follow the NIST standards. There are likely lots of other compliance related reasons you would want to, such as encryption requirements for HIPAA. I wouldn't be surprised if some of those standards either required or were easier to comply with if you just used NIST approved algorithms, and it's easiest to use those NIST algorithms systemwide if the OS CSPRNG uses them (and directly, without extra unapproved random number generation on top).

> Second: Dual EC DRBG is a CSPRNG that uses elliptic curve point multiplications; in other words, it requires bignum math. If you're unfamiliar with CSPRNG design: that's not a normal requirement. Dual EC is very slow. Nobody would willingly use it.

Yes, this is the odd part. On the other hand, you do have to recall that the NSA is a big government bureaucracy. It may be that their SIGINT enablement department (the one that's responsible for weakening exportable crypto, planting backdoors, and the like) had promised to get some backdoors into widely used standards, but couldn't find a better way to do so surreptitiously and effectively without weakening security against foreign attackers as well.

It may be that Dual EC DRBG was just inserted so they could check off a box and continue to get funding for the standards body division of SIGINT enablement, and not as an actually realistic attack.


> but couldn't find a better way to do so surreptitiously and effectively without weakening security against foreign attackers as well.

Have we seen any evidence that the NSA cares _at all_ about avoiding "weakening security against foreign attackers" in their quest to weaken security against themselves as attackers?

Aren't they _necessarily_ weakening security against foreign attackers when they intentionally weaken crypto, which we now know they do?


> Aren't they _necessarily_ weakening security against foreign attackers when they intentionally weaken crypto, which we now know they do?

No. In fact, most of the schemes that we know about in which they have tried to weaken crypto have involved them having some secret key which can be used to crack it, but without which you don't have a better attack than the standard brute-force attack.

That's the case with Dual EC DRBG. What researchers discovered was that the constants in it could have been picked such that, with knowledge of a secret constant, you can predict future output given only a relatively small amount of past output. Without knowing that constant beforehand, you wouldn't be able to do better than brute force.

Previous attempts have been similar; the Clipper Chip was supposed to have strong crypto, but store a master key in escrow with the NSA that they could use to crack it. Lotus Notes would encrypt part of the session key with a public key, for which the NSA had a corresponding private key, so if the NSA wanted to eavesdrop they could decrypt that and use it to speed up the brute-forcing process[1].

So, there are numerous cases of the NSA trying to balance the need for crypto that is strong for other attackers, while leaving them a backdoor that only they can use.

1: http://www.cypherspace.org/adam/hacks/lotus-nsa-key.html
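The trapdoor structure described above can be demonstrated end to end on a toy curve. To be clear, this is a deliberately simplified model, not the SP 800-90A algorithm: the real generator uses NIST P-256 and truncates each output block by 16 bits, while this sketch uses a 19-point textbook curve and emits whole x-coordinates, which only makes the attacker's job easier:

```python
# Toy model of the Dual EC DRBG trapdoor: curve y^2 = x^3 + 2x + 2
# over GF(17), with generator G = (5, 1) of order 19.
p, a, b = 17, 2, 2
G = (5, 1)

def ec_add(P1, P2):
    """Add two points; None represents the point at infinity."""
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, pt):
    """Double-and-add scalar multiplication."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

# The designer publishes P and Q, but secretly chose P = d*Q.
d = 7                   # the trapdoor only the designer knows
Q = G
P = ec_mul(d, Q)

def drbg_step(s):
    """One toy Dual-EC step: update the state with P, output with Q."""
    s_next = ec_mul(s, P)[0]      # new state = x(s * P)
    out = ec_mul(s_next, Q)[0]    # output    = x(s_next * Q)
    return s_next, out

s1, r1 = drbg_step(3)   # r1 is observed by the attacker
s2, r2 = drbg_step(s1)  # r2 is "future" output to be predicted

# Attacker with d: lift r1 back to a point R = +/-(s1 * Q); then
# x(d * R) = x(s1 * d * Q) = x(s1 * P), which is the next state.
# (Either lift works, since negation preserves the x-coordinate.)
for y in range(p):
    if (y * y) % p == (r1 ** 3 + a * r1 + b) % p:
        recovered_state = ec_mul(d, (r1, y))[0]
        break
predicted = ec_mul(recovered_state, Q)[0]
print(predicted == r2)  # True: one output block reveals all future ones
```

The real attack (Shumow and Ferguson, 2007) additionally has to guess the 16 truncated bits of each output, which costs about 2^15 candidate lifts per block: entirely practical.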


It's unclear what the NSA thinks it's doing.

Backdooring the RNG, if they can keep the trapdoor secret (and the NSA would think that, despite the fact that, given Snowden, they seem to have some security problems), doesn't obviously weaken security against foreign attackers who don't have the key, unless those attackers can solve the discrete log problem. Plus, it's possible (I think likely) they didn't intend for it to be widely used (it's slow as hell, after all, and they knew that), but wanted it on systems so they could swap it out for targeted attacks.

Weakening crypto standards (as the NYT reported), on the other hand, seems very counterproductive. Though I suppose, from the NSA's point of view, it depends on which standards. Screwing with IPsec would seem to hurt US national security, and they've been accused of doing that. Screwing with the encryption standards of mobile phone voice communications, on the other hand, would seem to have far lower consequences.


Prior to this (or maybe Crypto 2007, where the suspicions were first aired), would you have batted an eye if you saw Dual_EC_DRBG in a code base with some configuration bit for which NIST RNG you used?

Messing with the standard gets a backdoored RNG onto almost every system. Now the NSA just has to convince companies to use it (and it's a NIST standard, so the company has some deniability) or surreptitiously swap it out.

Microsoft's CNG api has a default random number generator. How hard would it be to detect if that default was switched to BCRYPT_RNG_DUAL_EC_ALGORITHM for say, all windows copies sent to Iran or China?


I wouldn't have batted an eye, no. But that's not my point; what makes it weird isn't how poorly the "trojan" was hidden, but in what a pointless place it was hidden.

The "backdoor Chinese Windows" idea is plausible, though.


Saw this pointed out in an LWN comment:

> Apparently, RSA Security BSAFE Share for Java 1.1 has DUAL_EC_DRBG as a default:

> "The default Pseudo Random Number Generator (PRNG) is the Dual EC-DRBG using a P256 curve with prediction resistance off."

> I didn't find an obvious link for the equivalent C/C++ library documentation, but the RSA BSAFE CNG Cryptographic Primitives Library 1.0 FIPS 140-1 Security Policy document from RSA Security at the NIST site says (p.14):

> "The Module provides a default RNG, which is the Dual EC DRBG, using a P256 curve and SHA-256."

https://lwn.net/Articles/566329/

So yes, there are real products out there using Dual EC DRBG. In other news, never trust any crypto from RSA Security. After the SecurID fiasco, and now this, it's pretty clear that they are not worth trusting for anything security related.


The best explanation is probably that NSA isn't infallible. Like any government organization, they do things that turn out not to be very effective.


Total tin foil hat territory here, but I see two explanations:

1) they didn't know it was going to be unpopular and dog slow when they started out.

2) they knew it was going to be unpopular and dog slow and didn't invest much time in covering their tracks.


I've actually seen someone choose to use Dual EC DRBG until I yelled "NOOOO DONT": Their software was implementing ECDSA signing, so all the primitives were already there!


It doesn't matter if you have all the primitives. The problem isn't that it's hard to code; the problem is that it's very slow.


Which seems to be confirmed by Snowden's leaks, according to the NYT article from earlier this week.

> Cryptographers have long suspected that the agency planted vulnerabilities in a standard adopted in 2006 by the National Institute of Standards and Technology and later by the International Organization for Standardization, which has 163 countries as members. Classified N.S.A. memos appear to confirm that the fatal weakness, discovered by two Microsoft cryptographers in 2007, was engineered by the agency. The N.S.A. wrote the standard and aggressively pushed it on the international group, privately calling the effort "a challenge in finesse."

Bottom of page 3: http://www.nytimes.com/2013/09/06/us/nsa-foils-much-internet...


This should be upvoted instead, as it provides context, and also links to this PDF (or its official announcement).

https://news.ycombinator.com/item?id=6364340

Or to provide minimum context, the announcement should've been submitted instead.

http://nist.gov/director/cybersecuritystatement-091013.cfm


For those curious (as I was) as to what the heck that title means, the press release linked deals with The National Institute of Standards and Technology (NIST) and what kind of random number generators (RNGs) they recommend using.


Agreed, a rewording of the title to something a little more accessible would not go amiss.


The terms EC-DRBG and NIST have come up quite a bit lately on HN with regards to the NSA, so it's not unreasonable to use them to ensure a descriptive title fits in the 80 characters allotted.

The other cryptic term, SP800-90a, looks like the issue number of an official standard, like ISO-8859 or EIA/TIA-568.


Can anyone provide a mirror, please?


Here's my comment: This is the dumbest PRNG ever.


Look up RANDU.


RANDU didn't claim to be cryptographically secure.


Oh.

Well, in that case look up Debian's OpenSSL from a few years ago.


Debian's OpenSSL was a perfectly good PSEUDO RNG a.k.a. DRBG


But it claimed to be a cryptographically secure one.

And I'm pretty sure even the NSA-adjusted EC PRNG standardized by NIST offers more than 15 bits of security.


Debian's broken OpenSSL claimed to be a cryptographically secure true random number generator (CSRNG). But it ended up being seeded with only 15 bits of entropy, so it failed at the "true random" part. Nevertheless, the pseudorandom number generator part of it (CSPRNG, or as NIST calls it, a DRBG) still sorta worked (I don't recall if you could successfully seed it manually).

But regardless of how you were planning to use it, if an adversary has a backdoor in your PRNG/DRBG then it's not cryptographically secure (CS). That, and this Dual EC contraption is probably much slower than a conventional design.
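To make the 15-bit point concrete: a keyspace that small can be enumerated almost instantly. A toy sketch (this is not OpenSSL's actual generator, just a stand-in with the same effective seed space as the broken Debian build, where the only entropy was the process ID):

```python
# Sketch of why a ~15-bit seed is fatal for a CSPRNG: if the only
# entropy is a PID (at most 32768 values), an attacker can simply
# enumerate every possible generator state.
import hashlib

def toy_prng_output(pid: int) -> bytes:
    # Stand-in for "generator seeded only with the PID".
    return hashlib.sha256(pid.to_bytes(2, "big")).digest()

def recover_seed(observed: bytes) -> int:
    # 32768 candidates: trivially brute-forceable.
    for pid in range(1, 2 ** 15):
        if toy_prng_output(pid) == observed:
            return pid
    raise ValueError("seed not in the 15-bit space")

victim_output = toy_prng_output(1337)
print(recover_seed(victim_output))  # 1337
```

This is essentially how the real-world exploits worked: precompute the keys for every possible seed, then match them against observed SSH and TLS keys.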



