

NIST reopens draft recommendation on random number generation for comment [pdf] - healsjnr1
http://csrc.nist.gov/publications/nistbul/itlbul2013_09_supplemental.pdf

======
lmgftp
It should probably be noted that this is not some sort of validation of the
claim that "the NSA owns this particular DRBG".

Surprisingly (to me), this is merely a government agency taking public
perception to heart, issuing a vote of not-complete-confidence in standards it
has previously prescribed, and seeking to rectify the problem by looking for
nothing-up-my-sleeve numbers [0] agreed upon by security researchers and the
public at large. A smart move, and no doubt a difficult one to make, as even
the slightest suggestion of no-confidence in a prescribed standard is quite
damaging to the reputation of an institution devoted to maintaining reliable
standards.

More info on nothing-up-my-sleeve: [0]
[http://en.wikipedia.org/wiki/Nothing_up_my_sleeve_number](http://en.wikipedia.org/wiki/Nothing_up_my_sleeve_number)

~~~
tlrobinson
Somewhat off topic, but it seems like it would be better to use some future
unpredictable event to really remove any "nothing up my sleeve" doubt, e.g. a
hash of the sum of all S&P 500 companies' closing stock prices on a specific
future date.

~~~
rspeer
I assume you need _some_ flexibility in choosing a nothing-up-my-sleeve
number, in case the first number you try has properties that are bad for the
algorithm.

Imagine if the super-official, international standard nothing-up-my-sleeve
number was 1. Any time you need consistent but arbitrary bits in a
cryptographic algorithm, they must be ...000000000000001. That doesn't sound
like it would work very well.

~~~
johnsoft
In that case, you announce a reroll, along with a published paper explaining
that x^1 == x. But, assuming you use SHA-256 or higher, the chances of that
happening are less than one over the number of atoms in the observable
universe, so you shouldn't worry about it the same way you don't worry about
hash collisions happening purely by chance.
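The derive-then-reroll procedure described above could be sketched roughly
like this (the "degenerate value" check and the input string are hypothetical
placeholders; any real standard would pin these down publicly in advance):

```python
import hashlib

def nums_constant(public_data: bytes, is_degenerate) -> bytes:
    """Derive a nothing-up-my-sleeve constant from public data via SHA-256.

    If the digest happens to be degenerate for the algorithm at hand,
    publicly announce a reroll: append a counter and hash again.
    """
    counter = 0
    while True:
        candidate = hashlib.sha256(
            public_data + counter.to_bytes(4, "big")
        ).digest()
        if not is_degenerate(candidate):
            return candidate
        counter += 1  # each reroll is published so anyone can verify it

# Hypothetical degeneracy check: reject 0 and the "...0001" value from
# the example above. In practice this branch essentially never fires.
bad = lambda d: int.from_bytes(d, "big") in (0, 1)
constant = nums_constant(b"S&P 500 closing prices, 2013-09-13", bad)
```

Because every input is public and the procedure is deterministic, anyone can
recompute the constant and confirm nothing was hidden up a sleeve.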

------
cromwellian
I doubt NIST will ever be trusted again, as any standards or specs it favors
will immediately be suspected of harboring some vulnerability favorable to
the NSA.

Let's say they hold a contest for people to submit next-generation
cryptosystems, and that algorithms A, B, and C make it to the final round. If
NIST publishes critical remarks on A and C and seems to favor B, immediate
skepticism and red flags will be raised. Does B have a hidden weakness the
NSA knows about?

A standards organization can only run on its transparency and integrity.

~~~
tptacek
First, a lot of NIST crypto standards are relatively anodyne; for instance,
the NIST GCM standard basically just explains how to do multiplication in
GF(2^128), and the NIST CTR mode standard just lays out a bunch of ways you
can arrange your counter block. Those standards remain valuable and aren't
likely to harbor backdoors.

Second, it has _always_ been the case that favorable responses from the USG in
general and NSA in particular have cast a pall over proposed standards. Isn't
that why there's a RIPEMD160, for instance?

~~~
rdtsc
Yes, but when the NSA "suggests" changes, such as certain constants or the
classic S-boxes in DES, those suggestions don't usually come with a clear
explanation. It's more of a "here, make these changes, it will be better,
trust us" kind of thing.

Another point is that most people (especially non-US citizens) don't
necessarily view the NSA and NIST as separate. They are seen as part of the
same government, more like two offices in the same government department.

Now this also raises an interesting question I have been thinking about. The
NSA is also in charge of protecting its own data, so the recommendations,
practices, and policies they tweak go into keeping its (and other agencies')
classified data secure.

Given that they have managed to "tweak" and insert backdoors into some
algorithms or systems, how likely are they to recommend those systems for
their own and other government agencies' use? Do they want the communications
or keys to the nuclear launch sites to use the "tweaked" version? They would
need to be pretty confident that no other agency out there has also figured
out the back door.

~~~
lambda
That's why they generally design the backdoor so that it's based on a key
that only they have.

For example, with Dual EC DRBG, researchers discovered that its constants
could have been derived from another, secret constant, with which you could
easily predict the output. But without prior knowledge of that secret
constant, finding it would be an infeasible brute-force search.

Likewise, previously known public backdoors like the one in the export
version of Lotus Notes depended on a key that the NSA had. There it was even
simpler, and not obfuscated: it would just encrypt a portion of the session
key with the NSA's public key, which they could decrypt and then easily
brute-force the rest of the session key.[1]

The NSA doesn't want to make security weak against arbitrary attackers, they
just want to give themselves the keys.

[1]: [http://www.cypherspace.org/adam/hacks/lotus-nsa-
key.html](http://www.cypherspace.org/adam/hacks/lotus-nsa-key.html)
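The workload-reduction idea behind that Lotus Notes scheme can be shown with
a runnable toy (the key sizes are shrunk from the real 64-bit/24-bit split so
the search finishes instantly, and the hash stands in for "does this key
decrypt the traffic?"; none of this is the actual Notes construction):

```python
import hashlib
import secrets

# Toy parameters: a 24-bit session key with 14 bits escrowed, leaving
# 10 bits to brute-force. The real export scheme used a 64-bit key with
# 24 bits escrowed, leaving an easy 40-bit search.
KEY_BITS, ESCROW_BITS = 24, 14
FREE_BITS = KEY_BITS - ESCROW_BITS

def keycheck(key: int) -> bytes:
    # Stand-in for "can I decrypt the intercepted traffic with this key?"
    return hashlib.sha256(key.to_bytes(3, "big")).digest()

session_key = secrets.randbits(KEY_BITS)
target = keycheck(session_key)

# The escrowed portion was encrypted to the agency's public key; an
# agency holding the private key recovers these bits for free.
escrowed = session_key >> FREE_BITS

# Brute-force only the remaining FREE_BITS: 1024 candidates here,
# versus 2**24 for an eavesdropper without the escrowed bits.
recovered = None
for low in range(1 << FREE_BITS):
    candidate = (escrowed << FREE_BITS) | low
    if keycheck(candidate) == target:
        recovered = candidate
        break
```

Against anyone who cannot decrypt the escrowed portion, the full keyspace
still has to be searched, which is exactly the asymmetry described above.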

~~~
rdtsc
Ah, it makes sense now. Thank you for explaining it.

------
lifthrasiir
Background:
[https://en.wikipedia.org/wiki/Dual_EC_DRBG](https://en.wikipedia.org/wiki/Dual_EC_DRBG)

> Dual_EC_DRBG or Dual Elliptic Curve Deterministic Random Bit Generator is a
> controversial pseudorandom number generator (PRNG) designed and published by
> the National Security Agency. [...] Shortly after the NIST publication, it
> was suggested that the RNG could be a kleptographic NSA backdoor.

~~~
tptacek
It's an awfully weird trojan horse --- or, as Daniel Franke put it on Twitter,
a trojan platypus.

First: NIST RNG designs aren't particularly important (unlike the curve
standards); there is a broad diversity of CSPRNG designs, applications tend to
"borrow" the OS's, and no OS I know of uses a design taken directly from NIST.

Second: Dual EC DRBG is a CSPRNG that uses elliptic curve point
multiplications; in other words, it requires bignum math. If you're unfamiliar
with CSPRNG design: that's not a normal requirement. Dual EC is very slow.
Nobody would willingly use it.

Why would _that_ be the big NSA standard back door? I'm not saying it isn't.
Something hinky happened there. I'm just asking: what did they have to gain
from trying to backdoor _that_ standard?

~~~
lambda
> It's an awfully weird trojan horse --- or, as Daniel Franke put it on
> Twitter, a trojan platypus.

Heh, that is a pretty good term for it.

> no OS I know of uses a design taken directly from NIST.

Except for all of the pressure recently on the Linux kernel developers to use
Intel's RdRand directly rather than mixing it into their existing entropy pool
(see, for example,
[https://plus.google.com/117091380454742934025/posts/SDcoemc9...](https://plus.google.com/117091380454742934025/posts/SDcoemc9V3J)
and [https://lkml.org/lkml/2013/9/5/212](https://lkml.org/lkml/2013/9/5/212)),
where apparently the reason is "Customers want a SP800-90 source available
through the OS interface" (quote from David Johnston, designer of the RdRand
hardware, on the Google Plus link).

So, apparently there is a lot of pressure to get the OS to adopt the NIST
standards directly.

There are lots of reasons for this kind of pressure. Obviously, if you sell to
the government, it'll be easier if you follow the NIST standards. There are
likely lots of other compliance related reasons you would want to, such as
encryption requirements for HIPAA. I wouldn't be surprised if some of those
standards either required or were easier to comply with if you just used NIST
approved algorithms, and it's easiest to use those NIST algorithms systemwide
if the OS CSPRNG uses them (and directly, without extra unapproved random
number generation on top).

> Second: Dual EC DRBG is a CSPRNG that uses elliptic curve point
> multiplications; in other words, it requires bignum math. If you're
> unfamiliar with CSPRNG design: that's not a normal requirement. Dual EC is
> very slow. Nobody would willingly use it.

Yes, this is the odd part. On the other hand, you do have to recall that the
NSA is a big government bureaucracy. It may be that their SIGINT enablement
department (the one that's responsible for weakening exportable crypto,
planting backdoors, and the like) had promised to get some backdoors into
widely used standards, but couldn't find a better way to do so surreptitiously
and effectively without weakening security against foreign attackers as well.

It may be that Dual EC DRBG was just inserted so they could check off a box
and continue to get funding for the standards body division of SIGINT
enablement, and not as an actually realistic attack.

~~~
jrochkind1
> _but couldn't find a better way to do so surreptitiously and effectively
> without weakening security against foreign attackers as well._

Have we seen any evidence that the NSA cares _at all_ about avoiding
"weakening security against foreign attackers" in their quest to weaken
security for themselves as attackers?

Aren't they _necessarily_ weakening security against foreign attackers when
they intentionally weaken crypto, which we now know they do?

~~~
lambda
> Aren't they _necessarily_ weakening security against foreign attackers when
> they intentionally weaken crypto, which we now know they do?

No. In fact, most of the schemes that we know about in which they have tried
to weaken crypto have involved them having some secret key which can be used
to crack it, but without which you don't have a better attack than the
standard brute-force attack.

That's the case with Dual EC DRBG. What researchers discovered was that the
constants in it could have been picked such that, with knowledge of a secret
constant, you can predict future output given only a relatively small amount
of past output. Without knowing that secret constant beforehand, you wouldn't
be able to do better than brute force.
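A toy sketch of that structure, shrunk to a 17-element curve so it actually
runs (the real standard uses the P-256 curve and truncates its outputs, which
this omits; the secret relation P = d*Q is the hypothesized backdoor):

```python
# Toy model of the Dual EC DRBG backdoor on the tiny curve
# y^2 = x^3 + 2x + 2 (mod 17), whose base point has order 19.
p, a, b = 17, 2, 2

def add(P1, P2):
    """Elliptic-curve point addition; None is the point at infinity."""
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P):
    """Double-and-add scalar multiplication."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

Q = (5, 1)      # public point Q
d = 5           # the designer's secret: P = d*Q
P = mul(d, Q)   # public point P, published as an innocent-looking constant

def drbg_step(s):
    """One Dual-EC-style step: output x(s*Q), next state x(s*P)."""
    return mul(s, Q)[0], mul(s, P)[0]

# Victim generates two consecutive outputs.
s0 = 2
r1, s1 = drbg_step(s0)
r2, _ = drbg_step(s1)

# Attacker sees only r1, but knows d. Lift r1 back to a curve point
# (either square root works, since x(d*R) = x(d*(-R))) ...
y = next(y for y in range(p) if (y * y - (r1**3 + a * r1 + b)) % p == 0)
R = (r1, y)
# ... then d*R = d*(s0*Q) = s0*P, whose x-coordinate is the next state,
# so all future output can be predicted from one observed output.
predicted_state = mul(d, R)[0]
predicted_r2, _ = drbg_step(predicted_state)
```

With honestly generated constants nobody knows d and the lift buys an
attacker nothing; the suspicion is precisely that NIST's P and Q were never
shown to be generated without such a relation.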

Previous attempts have been similar; the Clipper Chip was supposed to have
strong crypto, but store a master key in escrow with the NSA that they could
use to crack it. Lotus Notes would encrypt part of the session key with a
public key, for which the NSA had a corresponding private key, so if the NSA
wanted to eavesdrop they could decrypt that and use it to speed up the brute-
forcing process[1].

So, there are numerous cases of the NSA trying to balance the need for crypto
that is strong for other attackers, while leaving them a backdoor that only
they can use.

1: [http://www.cypherspace.org/adam/hacks/lotus-nsa-
key.html](http://www.cypherspace.org/adam/hacks/lotus-nsa-key.html)

------
pdknsk
This should be upvoted instead, as it provides context, and also links to this
PDF (or its official announcement).

[https://news.ycombinator.com/item?id=6364340](https://news.ycombinator.com/item?id=6364340)

Or to provide minimum context, the announcement should've been submitted
instead.

[http://nist.gov/director/cybersecuritystatement-091013.cfm](http://nist.gov/director/cybersecuritystatement-091013.cfm)

------
Wingman4l7
For those curious _(as I was)_ as to what the heck that title means, the press
release linked deals with The National Institute of Standards and Technology
(NIST) and what kind of random number generators (RNGs) they recommend using.

~~~
seldo
Agreed, a rewording of the title to something a little more accessible would
not go amiss.

~~~
nitrogen
The terms EC-DRBG and NIST have come up quite a bit lately on HN with regards
to the NSA, so it's not unreasonable to use them to ensure a descriptive title
fits in the 80 characters allotted.

The other cryptic term, SP800-90a, looks like the issue number of an official
standard, like ISO-8859 or EIA/TIA-568.

------
brokenparser
Can anyone provide a mirror, please?

------
marshray
Here's my comment: This is the dumbest PRNG ever.

~~~
mpyne
Look up RANDU.

~~~
marshray
RANDU didn't claim to be cryptographically secure.

~~~
mpyne
Oh.

Well, in that case look up Debian's OpenSSL from a few years ago.

~~~
marshray
Debian's OpenSSL was a perfectly good PSEUDO RNG, a.k.a. DRBG.

~~~
mpyne
But it claimed to be a cryptographically secure one.

And I'm pretty sure even the NSA-adjusted EC PRNG standardized by NIST offers
more than 15 bits of security.

~~~
marshray
Debian's broken OpenSSL claimed to be a cryptographically secure _true_
random number generator (CSRNG). But it ended up being seeded with only 15
bits of entropy, so it failed at the true-random part. Nevertheless, the
_pseudo_ random number generator (CSPRNG, or as NIST calls it, a DRBG) part
of it still sorta worked (I don't recall if you could successfully seed it
manually).
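Why 15 bits is fatal is easy to demonstrate with a toy model (the key
derivation below is a stand-in, not the actual OpenSSL code path; the point
is that the process ID was effectively the only varying seed input):

```python
import hashlib

PID_SPACE = 1 << 15  # Linux PIDs topped out at 32768 at the time

def buggy_keygen(pid: int) -> bytes:
    # Toy stand-in for the broken generator: with the mixing code
    # patched out, the PID was essentially all that varied per run.
    return hashlib.sha256(b"debian-openssl" + pid.to_bytes(2, "big")).digest()

victim_key = buggy_keygen(12345)

# An attacker simply precomputes every possible key: 32768 candidates.
# This is why all keys generated on affected systems could be put on a
# blacklist and rejected outright.
candidates = {buggy_keygen(pid) for pid in range(PID_SPACE)}
```

So the pseudo-random part being intact doesn't help: with the whole seed
space enumerable, every output the generator can ever produce is enumerable
too.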

But regardless of how you were planning to use it, if an adversary has a
backdoor in your PRNG/DRBG then it's not cryptographically secure (CS). That,
and this Dual EC contraption is probably much slower than a conventional
design.

