Don’t Rush Quantum-Proof Encryption, Warns NSA Research Director (nextgov.com)
123 points by jonbaer on Nov 20, 2019 | 111 comments



The whole premise of the article is broken:

1. Every respectable cryptographic protocol designer would hedge their bets by combining a post-quantum cryptography (PQC) algorithm with a classical one, preferably elliptic curve cryptography (ECC), such that you first have to break ECC in order to attack the post-quantum cryptography. ECC is great in that it is fast and secure, and its signatures, ciphertexts, and keys are all small. Every post-quantum algorithm fails in at least one of these categories, but since ECC excels everywhere, the overhead of the combination is basically bounded by a factor of two.

2. Our focus can't be on choosing one PQC algorithm now (and keeping it forever), as the field is still comparatively young (as the article agrees). Instead, we need to build up algorithm agility in our protocols and software, because we are probably going to change PQC algorithms at least once, when the cryptographic community has gained experience in PQC design/cryptanalysis and we switch to the second wave of PQC algorithms. In practice that means: never assume keys, ciphertexts, or signatures are small. Investigate whether it is possible for keys to have state (see Lamport signatures, sketched below). And the only way to demonstrate these properties of protocols and software is to try out some PQC algorithms now. (Why the urgency? Because software and protocol turnaround time is bonkers in commercial applications. Heck, parts of the payment industry still use single DES in 2019...)
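To make the statefulness point concrete, here is a minimal Python sketch of a Lamport one-time signature (the hash choice and sizes here are illustrative, not any standard): each key pair may sign exactly one message, so the software around it has to track which keys are spent.

    import hashlib, secrets

    H = lambda data: hashlib.sha256(data).digest()

    def keygen():
        # 256 pairs of secret preimages; the public key is their hashes
        sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
        pk = [(H(a), H(b)) for a, b in sk]
        return sk, pk

    def sign(sk, msg):
        # reveal one preimage per digest bit; the key is now spent forever
        d = int.from_bytes(H(msg), "big")
        return [sk[i][(d >> (255 - i)) & 1] for i in range(256)]

    def verify(pk, msg, sig):
        d = int.from_bytes(H(msg), "big")
        return all(H(sig[i]) == pk[i][(d >> (255 - i)) & 1] for i in range(256))

Note the sizes: each signature is 8KB and each public key 16KB, which is exactly why "never assume signatures are small" matters.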

Given that the NSA knows both of these things, the question is whether the author was clueless or the NSA spokesperson was being malicious.


My employer produces low-power devices with hardware cryptography built into them. Without the crypto hardware, almost all crypto (including ECC) is too slow for practical use. It's all well and good to have "crypto agility" but that ends when it comes to depending on silicon. So if we're making our 20 year plan for, say, the next generation hardware platform that we're going to invest many millions of dollars and thousands of engineer hours to build, then which PQC algorithms will we select to be built into our platform? It's very unclear at this point.

Certainly we can be flexible in key and cert sizes, but also I happen to live in a world where a 1200 byte MTU actually matters a great deal, so it's easier to just push the requirement for dealing with enormous certificates down the road for the day when we actually have enormous certificates. Future-proofing isn't an issue yet because legacy devices will never be able to do PQC.

The premise is not broken at all, for us.


> So if we're making our 20 year plan for, say, the next generation hardware platform that we're going to invest many millions of dollars and thousands of engineer hours to build, then which PQC algorithms will we select to be built into our platform? It's very unclear at this point.

We've deprecated SSL 2.0 (2011), SSL 3.0 (2015), and TLS 1.0 & 1.1 (2020). We've gone from "use any >128-bit cipher", to "don't use SSL CBC, but RC4 is okay" (POODLE, BEAST), to "RC4 is not okay" (Bar Mitzvah, NOMORE), to "only use AEAD ciphers".

All in the last ten years.

If you have to add support for crypto acceleration in your products, perhaps look into FPGAs?


One of DJB's premises has been implementing algorithms that are efficient on off-the-shelf hardware. Would you say that is wrong? If not, would it be better to buy hardware with generic vector units that newer algorithms targeting that more common hardware could then use?


The marketplace is clearly willing to implement hardware speedups for crypto, and in the low-power market, where consumption of joules is tracked with multi-digit precision, shuttling power-expensive operations to specialized hardware that can be powered off when not needed (e.g. after transitioning into a symmetric scheme) is a great thing. So I have to wonder if chasing off-the-shelf efficiency (e.g. in-CPU) is maybe not the right thing. If I have to power up a beefy CPU just to do a key exchange or validation every few months, and pay for it the entire time it sits idle, that's just not a good power equation.

One of the benefits of SIDH and friends is the ability to re-use some existing ECC hardware. This is a very good thing for key exchange, but the signature problem remains challenging with enormous keys or signatures.


Is that a pertinent question? The availability of devices with integrated cryptography is very, very low due to ITAR. Perhaps the only thing I have encountered is a bluetooth controller.

Many things are not so much space constrained as cost constrained. It would be easier to put in a core powerful enough to get acceptable performance for the one connection it needs to service.

It will be more expensive, probably in terms of development work. Will people do it? Probably not, but people weren't doing security right anyway.


We're not talking about desktop machines or even mobile phones. I'm talking about specialized sensors that are sold to big customers who deploy millions of them at a time. A 10-cent increase in the cost to manufacture easily results in millions of dollars lost over the lifetime of the product's sale and operation. We also sell battery operated devices expected to continue operating flawlessly for 20 years, on a small clutch of AA batteries. So no, we don't just "put in a core powerful enough", because that boosts the production cost, eats into the power budget, and causes us to lose contracts.

We are absolutely watching the PQC space, but we absolutely will not move at all, beyond experimental toys, until NIST is done with its first round, and maybe not even then if there aren't any actual QCs around doing real work.

Also, pull out your wallet. I bet you can find several devices with embedded crypto. All of my debit cards, for example.


Yeah, I know all of this. Smart cards are kind of the exception because they aren't reprogrammable. But security costs money. As you've made clear, your business and customers prefer cost savings to security.


They're willing to pay for security, but they don't want to pay a penny more than they must. Security is not, and never will be, an open-ended budget.


So they prefer cost to security. That's fine; most people do. If being quantum resistant were a priority, they would figure out a way to do it (it may be similar to what I described, or not), and it could happen without a hardware implementation if it were truly desired.


This just isn't true.

Nearly every SoC you can buy today has hardware accelerators in it, from STM32s up to Xeons. You have to be looking at really tiny, generally pretty old micros before you literally don't have any.

On top of that, hitting hardware speeds by putting in faster cores just isn't a thing for most parts. It's pretty easy to get 8-9x throughput wins on many primitives with a hardware accelerator, but getting a similar improvement just by getting bigger chips is often impossible and always expensive.


> Nearly every SoC you can buy today has hardware accelerators in it

True, but few are full-featured HW acceleration SoCs. Most support a few operations, for instance AES-ECB and maybe AES-CBC, but if you want AES-CCM or AES-GCM you still need to implement parts of it in software. The HW may be super fast at ECB:ing many blocks of memory, but the setup cost is steep, so when you need to ECB just a single block (for your counter in CCM) it buys you very little over just doing ECB in SW. (Of course, what you do then is set up several counters in a larger block of memory, one after another; this works because the counters are just increments, and then you ECB a whole bunch of blocks at once. Next you need to solve how to do the same to get CBC-MAC with just CBC HW...)
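For the curious, here is a Python sketch of that counter-batching trick, with the PyCA "cryptography" library's ECB mode standing in for the HW engine (the GCM-style 12-byte nonce / 32-bit counter layout is my assumption, just for illustration):

    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def ctr_keystream(key, nonce, n_blocks):
        # lay the counter blocks out back to back (nonce || 32-bit counter),
        # then push the whole buffer through ECB in one call, paying the
        # engine's setup cost once instead of once per block
        counters = b"".join(nonce + (i + 1).to_bytes(4, "big")
                            for i in range(n_blocks))
        enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
        return enc.update(counters) + enc.finalize()

    def ctr_xor(key, nonce, data):
        # CTR encrypt/decrypt: XOR the data against the bulk keystream
        ks = ctr_keystream(key, nonce, -(-len(data) // 16))
        return bytes(a ^ b for a, b in zip(data, ks))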


This is just moving the goalposts. First it was "crypto accelerators are rare because ITAR", now it's "crypto accelerators are rare because they don't buy you much". Neither is true.

Crypto accelerators are extremely common, including those that implement full cryptosystems or even complete protocols. Nearly every wireless part will have them (especially for CCMP), as well as basically every modern+common consumer device SoC (eg, all Qualcomm, Samsung, Apple, AMD, and Intel parts). Several of these actually have overlapping accelerators for eg memory encryption or wireless (full protocol) and acceleration instructions like those for ARMv8. And they are there because they work.

Setup cost is a thing, but A) is largely paid when you rekey and therefore rarely for most protocols, B) is acceptable in many protocols because you can interleave other operations to prevent port contention without sacrificing throughput, and C) is often buried by the cost of a very small number of blocks, or even just one.


He didn't move the goalposts; he usefully expanded on my point. Those devices you're talking about notably adhere to other external standards and are not typically user reprogrammable (where the user is the integrator). Also important: I would not consider them secure in general, due to the standards they implement. You also certainly realize that their power consumption, when present, massively dwarfs that of the type of processor we were first discussing?

By the time you get to the ARMv8 accelerators, yes, you're going to exactly the same place I was arguing we should go in my original comment. There are actually a number of primitives that could be reused across various systems.


The original claim was that these parts were rare because of ITAR. They aren't rare, and ITAR doesn't have much to do with where they're present or absent. Shifting the argument to a different point about a specific accelerator or specific class of parts is exactly as I said: moving the goalposts.

The question of whether they're user programmable or not is nearer to the mark because EAR cares about it, but it still doesn't present a formidable barrier-- at least, I've been shipping parts with crypto accelerators at various levels of user configurability for a long time, and so has everybody else.


> Setup cost is a thing, but A) is largely paid when you rekey

Well, it depends on the crypto HW. Some HW is designed for "throughput", which is completely useless for single-block ECB but looks good on a spec sheet ("Our HW AES does 10MB/s!"). So you set it up with src, dest, and key pretty much as you set up a typical DMA transfer, only you almost never want to encrypt more than 16 bytes at a time with ECB, so it's mostly wasted.

> consumer device SoC (eg, all Qualcomm, Samsung, Apple, AMD, and Intel parts)

We are not all so fortunate that we get to work with such powerful SoCs. In my job it's mostly small embedded MPUs.

> B) is acceptable in many protocols

I think we are talking past each other here. I haven't even gotten to the protocol part yet. In order to support a wireless and/or network protocol you will need better building blocks than AES-ECB. You need AES-GCM (or at least AES-CCM). Not to mention ECDSA or RSA(>=3072)...


I'm super confused. Let's back up a step.

Most accelerators come in one of a few flavors:

1/ They implement the expensive parts of a primitive for you and let you chain them together. This is how AES-NI and the ARMv8 crypto extensions work. Performance for these is generally measured in terms of cycle latency, or with a reference piece of software in cycles per byte. Common values for cycles per byte are anywhere from about 0.2 to 30. Much higher than that and people will start to go look at software as an option. You tend to see these on beefy systems with out-of-order cores.

2/ They implement a primitive for you, eg AES-ECB or SHA256, or more rarely AES-GCM and similar. These can then be chained together as with the above to build even higher level primitives like AES-CTR or AES-CCM, or they can be used as-is. These are usually found on micros as additional selling points, and therefore show up just above the bottom of most manufacturers' product lines as an upsell. These are typically measured in something like MB/s throughput, and I assume they're what you're focused on.

3/ They implement a full protocol, like TLS, CCMP, or secure boot. These show up on things that might more properly deserve the term SoC rather than microcontroller, largely because they tend to be attached to high-speed I/O. They generally aren't measured for cryptographic performance but rather for the performance of the implemented protocol.

In my mind, all three of these are using crypto accelerators. Taken together it is extremely common that a part will have one or more of these, and I'm not sure if we're still disagreeing on one or both of those points.

Regarding ECB, I don't know what you mean. Almost nobody uses ECB alone (thank goodness). Even if they have an accelerator for it, it's usually used to implement something like CTR with some software to glue it together (maybe then with yet more glue to do GCM). In that way, those accelerators act like a just-barely-higher-level version of the first type-- and if what you have is the first type, of course you'll do that no matter what. This is still an accelerated implementation, it's just not 100% done in the accelerator. Of course, if you're doing that you're very often encrypting more than a block at a time. And because it's quite rare that you will be performance-bottlenecked on a small, infrequent operation in any context, you generally only do the work to turn on the accelerators when you care about that.

Regarding working on MCUs, I agree there's a minimum size past which you don't get crypto primitives, but overall don't think characterizing those parts as modern SoCs is terribly accurate (which was my claim).

Regarding needing better building blocks than ECB for a protocol... well, no, not necessarily. AES-NI doesn't even give you a full AES primitive, and yet it's extremely widely used.


Yes, my main experience is with 2), and these are pretty "modern" (as in recently released) MCUs that support AES-ECB (and maybe a few more modes) in HW. These are not ARMv8 but Cortex-M level MCUs.

The problem, and the point I'm trying to make, is that a few platforms implement their ECB support in such a way as to make it almost useless as a building block. They do not do it as processor instructions the way x86 does (the right way, IMHO); instead it's implemented in a separate co-processor that you program much as you set up a typical DMA transfer. If you aim to encrypt 1KB or more, the setup cost is negligible and you get comparatively good speed. However, as we both agree, there are very few cases (if any) where you _actually_ want to run ECB over 1KB blocks at a time. When you want to build something like CTR (or CBC), what you need is a fast way to ECB _a single_ AES block (i.e. 16 bytes). With this kind of design, setting up the co-processor eats up almost any gain from doing ECB in HW, because the setup cost (it's I/O, after all) comes close to the cost of a SW-only ECB of 16 bytes.


Hmm? With CTR you usually just want to fill a long buffer with the appropriate counters and then shove it all through the accelerator. The resulting stream can then be used until exhausted by whatever higher level primitive you're working with. Obviously there's a trade-off in sizing the buffer correctly, but dozens of blocks would be more typical than one.


Yes, and that is what I said in my very first post in this thread. You still need to handle the counters in SW and do the XORs in SW (unless you have some HW that does that for you as well); then if you want CCM you need to solve CBC-MAC (maybe you have CBC in HW, but then there's the memory trade-off again). If you want GCM you need to do big-integer multiplies (Cortex-M MCUs do not support 128-bit multiplies). So either way you end up doing pretty substantial parts of it in SW, which limits the usability.
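For reference, the multiplication GHASH needs is in GF(2^128) (per NIST SP 800-38D), and a bit-at-a-time software fallback looks roughly like this sketch; on ARMv8 the PMULL carry-less multiply accelerates it, but Cortex-M has no equivalent:

    def gf128_mul(x, y):
        # multiply two 128-bit field elements as GHASH defines it:
        # 128 rounds of conditional XOR plus shift-and-reduce
        R = 0xE1 << 120  # GCM's reduction polynomial
        z, v = 0, y
        for i in range(128):
            if (x >> (127 - i)) & 1:
                z ^= v
            v = (v >> 1) ^ R if v & 1 else v >> 1
        return z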


> Nearly every SoC you can buy today has hardware accelerators in it, from STM32s up to Xeons. You have to be looking at really tiny, generally pretty old micros before you literally don't have any.

Well... the SoC in the Raspberry Pi 4 doesn't have one. Although it does have enough CPU (and in theory GPU) oomph to still do crypto at reasonable rates: AES-128 at 85 MB/s per CPU core.


The RPi 4 SoC does have crypto accelerators, specifically for HDCP. It does not have the ARMv8 crypto extensions.


Does it have any HW crypto accelerator that could be utilized from ARM side software? I couldn't find anything.


Not that I know of, but they're so cagey on details for those parts I wouldn't be surprised if it did and they just hadn't documented it. Certainly lots of quasi-similar boards like the espressobin have them (which I like better for the topaz switch anyway).


That is an exaggeration in my opinion. Many µCs, as well as many sensors, don't have a real use case for encryption.

Maybe that's what you meant by "really tiny"; if so, forget it. But I would think a lot more units of these tiny chips are sold than of fully featured 32-bit ARM processors.

Maybe that will change with price.


That covers AES and hashing. Try generating a 2048-bit RSA key on a Cortex-M. It will take minutes. ECC is thankfully more performant on resource constrained devices.


Devices with integrated cryptography are uncommon? There are AVR parts with integrated AES.


Algorithm agility has been a hilarious disaster for designing secure systems. In reality even protocols like TLS that are nominally agile advance primarily by versioning, not agility - and TLS is somewhat of a best case here compared to e.g. the mess of JOSE.


You're more right than the parent comment --- algorithm agility is, I believe, an increasingly discredited idea among cryptography engineers --- but there's truth to the idea that a serious PQC scheme is going to be paired with a conventional key exchange, so that a new lattice crypto attack won't break the whole handshake. That's not "agility" --- the schemes will almost certainly be hermetically sealed, one PQC KEX and one curve KEX --- but it does mean you can deploy PQC now without compromising your whole cryptosystem.

The big issue here is that this observation doesn't break the premise of the article. It remains true that we don't know enough about how real-world quantum computers, if they ever exist at scales useful to attack cryptography, will work.


> In reality even protocols like TLS that are nominally agile advance primarily by versioning, not agility

In the last ten years we've been able to disable CBC-based ciphers to protect against POODLE and BEAST, and disable RC4 to protect against Bar-mitzvah and NOMORE, all the while a legacy device could stick with TLS 1.0 without any code changes. All that was necessary was to tweak the cipher preferences.

Being able to support both AES and ChaCha20 is a good thing, and when AESng/Mambo40 come along it will be easier to add them to the protocol in a rolling fashion IMHO.


If that TLS 1.0 device had never needed to support RC4 in TLS to begin with, which practically speaking it never really needed to, we could've done less work at implementation time, provided clearer guidance to users when the vulnerabilities were discovered, and likely had fewer affected users to begin with.


> If that TLS 1.0 device had never needed to support RC4 in TLS to begin with, which practically speaking it never really needed to ...

Except that we did not know we didn't need it. When BEAST came out, a lot of recommendations said to switch over to RC4:

* https://en.wikipedia.org/wiki/Transport_Layer_Security#BEAST...


This was the reply I came here to write.


1. Yes, https://www.imperialviolet.org/2019/10/30/pqsivssl.html and https://blog.cloudflare.com/the-tls-post-quantum-experiment/ describe actual experiments Google ran using randomly selected Chrome canary users against Cloudflare, and indeed their test algorithms CECPQ2 and CECPQ2b are both hybrids built using ECC plus a post-quantum algorithm.

2. However, at the present time there is a strong constituency (e.g. see morelisp's comment in this sub-thread) which is happy to blame cryptographic agility for problems it arguably had little or no part in, so you can expect them to fight this, while of course not committing to any particular post-quantum crypto either, because they also hate being wrong, and if you pick one now you'll invariably be wrong.


Supersingular isogeny Diffie–Hellman key exchange (SIDH) has post-quantum keys comparable to current RSA keys.

It seems limited to DH key exchange with its favorable properties, however...


Its downside is that it is computationally very expensive. All of the other NIST candidates are not very computationally demanding but have much larger keys. So the trade-off with SIDH is that it spends a lot of time calculating while the others are transmitting bytes over the network. There are trade-offs with all the proposed schemes.


> hedge their bets by combining a post-quantum cryptography (PQC) algorithm with a classical one

OpenSSH does this with certain algo choices for anyone particularly inclined.


Don't rush to improve the chicken coop, warns local fox


It's an intentionally misleading title to get clicks. The actual context of the quote refers to the fact that encryption methods built in too much of a rush may introduce other vulnerabilities. This article has no substance or constructive purpose.


I read this title as "NSA thinks they are getting close to enough qubits to mount real attacks, please don't switch to PQC before we get a return on our investment!"

In other news, your totally-trustworthy not-at-all-compromised neighborhood NIST is working on new PQC algs!


If this was the case why does the NSA issue warnings about things from time to time?


Because by then other intelligence agencies or mere commoners can also break the crypto.


Great metaphor. I hope the right people (academics, crypto authors, etc.) don't get deterred from reviewing this in earnest, but for lay people like myself, I cannot help but feel as you do.


I feel like the irony here is that, assuming the director of the NSA is even minimally self-aware, there's actually no way for me to know whether this is:

a) an attempt to prevent me from using encryption they can't yet break

- or -

b) reverse-psychology, assuming that a halfway intelligent person would guess this was a case of A, therefore prematurely switching to an untested quantum crypto that they might actually have an _easier_ time breaking.

and honestly, either way I'm left with no actionable information.


Why not a third option, that the research director of the NSA is giving her honest informed opinion?


Because that would be ignoring over 25 years of documented history.


Both are true because this is a government official. A key skill for a government official is to communicate in a way that gives no information to people who disagree with them. This is a sensible adaptation to their conditions. You disagree with this government official. And for that reason you gain no information from them.


How about c) making NSA's adversaries nervous by indirectly suggesting that they might be able to break current crypto and monitoring how different parties sending encrypted traffic react to this "information" to discover interesting/high value targets?

Side channel information exposure attack against human targets.


You've got to give them enough time to dig a tunnel underneath.


Just a note that there's a relatively fail-safe way to avoid concerns about rushing pqcrypto too soon: Just couple it with established crypto.

This is e.g. what Google is doing in all their pqcrypto experiments. They use an elliptic curve key exchange combined with a post-quantum key exchange. If you don't make any really big mistakes, you get at least the security of the stronger of the two.

Given that elliptic curve crypto is really cheap, such a combination will probably be used with most post-quantum schemes for a while.
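A minimal sketch of the combination, for illustration: X25519 and HKDF below are real (PyCA "cryptography"), while pq_encapsulate() is a hypothetical stand-in for whichever post-quantum KEM is being trialed. Both shared secrets feed one KDF, so an attacker has to break both exchanges to learn the session key.

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def hybrid_shared_key(peer_ec_pub, peer_pq_pub):
        eph = X25519PrivateKey.generate()
        ss_classical = eph.exchange(peer_ec_pub)      # curve shared secret
        pq_ct, ss_pq = pq_encapsulate(peer_pq_pub)    # hypothetical PQ KEM call
        # concatenate and hash: recovering either half alone yields nothing
        key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                   info=b"hybrid-kex-demo").derive(ss_classical + ss_pq)
        return key, eph.public_key(), pq_ct           # last two go on the wire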


Eventually quantum computing will also be really cheap. I don't think it'll take all that much time, given the amount of money and attention.

Thinking of security in layers, as you've suggested, is the way to go. Secure communications has always been about being "too expensive" to decrypt in a relevant time span.

Just because there's a new kid on the block, we need to keep our old friends around.


I have a hard time reading this with an open mind, because I mistrust the NSA so much. That being said, the processes described in the article sound like they will arrive at a fairly transparent (in a good sense), publicly-vetted algorithm --- or at least one where it's easy to perceive if it's not trustworthy.

I'm not sure, based on my conversations with people in the field, that I put so much stock in the "20 years 'til Shor" prediction. Most seem to be of the opinion that the best currently available encryption should be okay until we're very old.


The NSA put an (effective) backdoor in Dual_EC_DRBG:

https://www.schneier.com/blog/archives/2007/11/the_strange_s...

Thread by someone directly involved on why the ISO rejected NSA ciphers in the past (hint: they refused to justify design decisions, lied, and attacked the credibility of people who had put out actually-secure crypto):

https://twitter.com/TomerAshur/status/988696306674630656

Either they're too incompetent to be trusted or are bad actors and should be treated as such.


> Either they're too incompetent to be trusted or are bad actors

I agree but am curious about one thing. If they aren't stupid won't they realize it's a terrible idea to embed weaknesses in standards destined to become so pervasive your own economy will rely on them (as well as your own military/intel)?

Also, history virtually guarantees that your most secret secrets are reasonably likely to become known to your adversaries and used against you, with catastrophic consequences. Even if they could entirely subvert the standards process to plant their own backdoor that ends up everywhere, game theory dictates they might be creating their own greatest future weakness.


> If they aren't stupid won't they realize it's a terrible idea to embed weaknesses in standards destined to become so pervasive your own economy will rely on them (as well as your own military/intel)?

The Dual_EC_DRBG backdoor [1] is an asymmetric-crypto-based backdoor. Only the one who built the backdoor has the key to access it. It's even impossible to prove the backdoor is there without knowing the key.

This way you don't weaken yourself or allies WRT 3rd parties when standardizing on that specific weakened algorithm (of course you need to closely guard your key).

[1] https://en.wikipedia.org/wiki/Dual_EC_DRBG


The military is still on unencrypted IRC; the dog has already been fucked for years on that one. The economy doesn't really matter, and for anyone they deem does matter, they can just tell them under NSLs to use something DJB made instead, if they even understand the difference between protocols to begin with.


As far as I'm aware, the military doesn't use IRC? IRC was inspired by BITNET, which was used by the DoD briefly and not by the military as a whole? Military networks are encrypted at the network layer? User authentication is enforced at the session and presentation layers?



Unencrypted IRC (esp on a secure network) sounds like a massive upgrade from what I'd answer if someone here said "Guess what the military just switched to for their secure chat??" (slack or some other commercial hosted web atrocity). :)


They can have highly competent researchers and still be institutionally stupid.


The NSA fixed the S-boxes in DES, protecting users from a type of attack that wasn't yet public knowledge. Many people wrongly assumed that the NSA was trying to weaken DES, but they strengthened it.


> That being said, the processes described in the article sound like they will arrive at a fairly transparent (in a good sense), publicly-vetted algorithm

Or a family of them for different applications anyway, which increases chances both that a) at least one of them is actually quantum resistant and b) at least one of them will have a flaw exploitable by even classical computers.


In 15-20 years I really hope a book comes out about the NSA's quantum computing work in this decade. I'm immensely curious to know how far they are/are not ahead of the Google/IBM/D-wave crowd.

They pretty obviously haven't recreated the magical chip from Sneakers yet (I think), but are they close? Or maybe they are there, but they have to be so very careful with how they use it so as not to reveal its existence?

Or maybe they're far enough along to realize that's not a thing that can be built in this century?


My guess is they aren't far ahead, which explains why they are constantly telling companies not to encrypt data. The reality is they've spent billions developing a "super weapon" of mass surveillance, and that entire system is heavily reliant on people not using effective encryption and on companies willingly handing over data or creating backdoors.


I agree, this is the simplest and most likely explanation. As tempting as it is to buy into the romantic idea that they’re 100 years ahead of everyone else but can only use their powers in extreme circumstances, that just seems very unlikely.


The funny thing is, even if they did have that, it would still be just as useless for its claimed function, given the sheer haystack of communications, and given that simple opsec makes signals impossible to decode from context alone.

A terrorist saying "Say hello to grandma for me" could mean cancel/start/delay the attack, or it might mean exactly what it says.


See you at the "fireworks" tonight. We're gonna have a "blast." Muahahahaha.


Indeed. The probability of government agencies not at least seriously considering a “Manhattan project” for quantum is zero.

But it's a different situation from the 40s in many ways. Consider this: quantum machines could potentially revolutionise several fields, including medicine. So keeping an advanced machine secret (whether it has already arrived or comes in the future) could mean passing up the opportunity to save millions of lives. It's a hell of a moral judgement call to make. I'd go so far as to say that any intelligence advantage is just not worth the trade-off. You'd have to find some way to share that technology. If your hypothetical book emerged, it'd be fascinating to see how it is justified.


I think it's entirely plausible (and likely) that the US intelligence agencies would keep life-saving technologies secret if other uses of that tech would give them an edge in intelligence-gathering.


If they do have that box, they'll still be using it in 15-20 years. If nothing else, information from it will still be of some use and they won't want to give up the game.


So, the NSA, who intentionally compromised NIST with bad crypto proofs, is now warning NIST not to make better crypto, in the name of "security".


The headline is clickbait. The only reference to not rushing into quantum-proof encryption happens near the end of the article, and the warning is that folks adopt them gradually so that any vulnerabilities are caught.

This is standard practice blown out of proportion.


I don't get that sentiment from the article at all. The article is talking about the difficulty of targeting a problem space 'too early' vs 'too late'. Too early and you solve the wrong problem, too late and, well, the obvious.


Breaking the old RSA requires more quantum computing horsepower than breaking newer ECC schemes. So although ECC is far more resistant to current attacks, it is less resistant to quantum attacks. The NSA is just saying that you're better off staying with strong RSA security than you are investing tons into migrating to Suite B since you'll just have to migrate to whatever NIST has coming that is quantum-proof.


Where did you read that? The only thing she says is to wait for the competition to end (by 2022):

> “It's very important that people wait for NIST to do its due diligence,” Frincke said.

This is the standard process.


The NSA, who have a history of feeding NIST intentionally compromised crypto algorithms, suggest we need to wait to hear what NIST recommends?

How very fucking convenient.


It's a totally different ball game though: this isn't about NIST recommending a shady algorithm with mysterious parameters, this is about a well-known standardization process that accepts submissions from cryptographers all around the world, where anyone can review the proposals and make comments.

I don't care what they end up selecting as the winner (and to be honest, I'm so ridiculously paranoid that I don't trust Keccak, for instance), I just think that having a competition where everyone is spending all their energy looking for flaws in the other candidates is a great thing.

Next year China will announce a similar standardization process. Do I trust China? Of course not, but I really welcome this initiative anyway.


Quantum computers that can break traditional crypto do not exist. Whether post-quantum crypto exists today or not doesn't matter.


> Quantum computers ... do not exist

How can you be so sure? What if a nation state invents one and treats it as a Manhattan Project style secret?

> Whether post-quantum crypto exists today or not doesn't matter

Unless the first "quantum computers that can break traditional crypto" are invented and announced tonight... the current state of post-quantum crypto very much matters.


That's exactly what they're saying.


I think the key quote is this:

> "Shor's algorithm is the attack that was developed in the absence of a quantum computer,” Frincke said. “It's hard to predict what people will actually do with one."

Once QC actually manifests we may find new properties and new approaches to attacking existing algorithms, and being 'quantum hard' before that point is impractical.

I'm not a cryptographer by any means, so I have no idea if this is actually the case. Perhaps we understand the fundamentals so well that we can truly say an algorithm today can hold up in a QC world - I have no idea, that certainly sounds bold, but we do already have purported 'quantum hard' algorithms.


> Perhaps we understand the fundamentals so well that we can truly say an algorithm today can hold up in a QC world - I have no idea, that certainly sounds bold, but we do already have purported 'quantum hard' algorithms.

We don't have any theoretical proof that we can even encrypt securely against classical computation. It's still technically an open problem whether P=PSPACE (as well as P=NP). All computationally secure encryption (quantum or not) would be broken if we could efficiently solve PSPACE-hard problems.

So really, nobody can truly say any encryption can hold up anywhere. But we still usually have a good idea of the truth of things simply based on empirical evidence - we don't think anybody is proving P=NP, much less P=PSPACE. We don't think people are going to crack our best classical encryption without brute force.

There's not as much empirical evidence that our current quantum-resistant encryption will hold up, which is the point of the assertion "it's hard to predict what people will actually do [with Shor]".


I feel like 1) that is the key quote, 2) it's true, 3) it's also setting the bar too high for current efforts.

The first go at a post-quantum public-key crypto standard may in effect be (a public-key analog of) the DES that'll need to be replaced with an AES someday. But provided we can get something that isn't a step backwards in classical security (so, a well-studied problem) and doesn't fall to Shor's algorithm or anything else we can find in a few years, that's way ahead of where we are now.

We may be feeling around in the dark because we don't know much about quantum computing, but we may also have to do that if we want reasonable security when the first big quantum computers show up.


The NSA: Don't rush building quantum proof encryption guys. Might want to take your time, make sure you really get it right. Could take years to prove it out. Maybe decades. (Psst, Joe, when will the quantum brute-force crackers be ready? How soon can you get them online?)


Joe: 2009.


I realize this is just the broken clock being right twice a day, but the fact that we give the NSA any credibility of any kind is truly mind-boggling.

Would you use YubiKey ever again after they got caught putting keyloggers into one model? That's what the entire security community does every time they so much as entertain anything the NSA says.


> That's what the entire security community does every time they so much as entertain anything the NSA says.

[Morpheus Meme] What if it doesn't matter what the NSA publicly says, but what it quietly does and which universities+researchers it funds?


Can we start a Gofundme or Kickstarter or something to get 24/7 bodyguards for DJB? We gotta keep this dude alive.


Hey NSA. You still on top of those backups you were making for me? Might need some of that data back soon... Duplicati has been spitting errors.


Send an email to the admins at the Utah Data Center. They have it.

https://en.wikipedia.org/wiki/Utah_Data_Center


"The cybersecurity community is already hedging its bets against a future when digital secrets are knowable to anyone with the right hacking chops and a couple dozen qubits."

What a bunch of journalistic bunk. All QC researchers I know say it will take at least thousands of logical qubits to break RSA. Given the large number of physical qubits required to produce one logical qubit using quantum error correction, this might entail millions of physical qubits, well beyond our current technology - and way more than "a couple dozen qubits".


Of course NSA would have this opinion. They depend on you believing it so they don't have to buy the newest D-wave or whatever.

If we had adopted properly encrypted communications back in the '90s (when others were also trying to pretty much outlaw them), nobody would be complaining about speed today.

The problem is, everyone ignored encryption ("I've got nothing to hide!"), and now that they're learning they actually do need it, they're not willing to accept the drop in speed it would entail. Everything up to now has been sold on how much faster it is. Security is still an afterthought, and the perception of even slightly slower bandwidth infuriates people for no real reason.

NSA is up to its same old predictable game. They rely on public laziness and fixation on shiny new things.


It was the NSA who shocked [0] the world of cryptography by announcing "elliptic curve cryptography is dead under the threat of quantum computers, we must move to Post-Quantum Cryptography ASAP and encourage its development" (common knowledge, uncontroversial), and "therefore, if you are running a legacy system which has not yet upgraded from RSA to ECC, you should not bother to do so, and instead should save money for the future upgrade to post-quantum protocols" (WTF? Most people thought the threat of quantum computers was serious, but that ECC should be good for another ten years, that one should definitely upgrade to ECC, and that it was worth revising the ECC standard to include newer curves).

> In August 2015, the U.S. government’s National Security Agency (NSA) released a major policy statement on the need to develop standards for post-quantum cryptography (PQC). The NSA, like many others, believes that the time is right to make a major push to design public-key cryptographic protocols whose security depends on hard problems that cannot be solved efficiently by a quantum computer. The NSA announcement will give a tremendous boost to efforts to develop, standardize, and commercialize quantum-safe cryptography. While standards for new post-quantum algorithms are several years away, in the immediate future the NSA is encouraging vendors to add quantum-resistance to existing protocols by means of conventional symmetric-key tools such as AES. Given the NSA’s strong interest in PQC, the demand for quantum-safe cryptographic solutions by governments and industry will likely grow dramatically in the coming years. Most of the NSA statement was unexceptionable. However, one passage was puzzling and unexpected:

> <quote>For those partners and vendors that have not yet made the transition to Suite B algorithms, we recommend not making a significant expenditure to do so at this point but instead to prepare for the upcoming quantum resistant algorithm transition.... Unfortunately, the growth of elliptic curve use has bumped up against the fact of continued progress in the research on quantum computing, necessitating a re-evaluation of our cryptographic strategy</quote>

> The NSA seemed to be suggesting that practical quantum computers were coming so soon that people who had not yet upgraded from RSA to ECC should not bother to do so, and instead should save their money for the future upgrade to post-quantum protocols.

> Shortly thereafter, the NSA released a revised version in response to numerous queries and requests for clarification. The new wording was even more explicit in its negative tone on the continuing use of ECC: “...elliptic curve cryptography is not the long term solution many once hoped it would be. Thus, we have been obligated to update our strategy.” Although other parts of the statement assured the public that ECC was still recommended during the time before the advent of practical quantum computers,the overall impression was inescapable that the NSA was distancing itself from ECC.

> In addition, people at the National Institute of Standards and Technology (NIST) and elsewhere have noticed that the NSA has not been taking an active part in discussions of new curves to replace the NIST curves that were recommended for ECC in 1999. The PQC announcement suggests that the NSA has no interest in this topic because it now views ECC as only a stopgap solution. The statement in fact advises against “making a significant expenditure” to upgrade to any of the Suite B algorithms, let alone to any new ECC standards using updated curves. Even industrial and government users who are using antiquated protocols should just sit tight and wait for post-quantum standards. This caught many people by surprise, since it is widely believed that ECC will continue to be used extensively for at least another decade or two.

Later, NSA ordered the NIST to start the PQC competition - which I think is good for driving further development of PQC.

But now, NSA research director is telling us "don't rush"? It doesn't make sense.

[0] See the exceptionally good paper by Koblitz, et al. https://eprint.iacr.org/2015/1018.pdf


I think the main thrust of the article is just that she feels we should let the NIST process play out, which means we're looking at no recommendations until 2022.

I'm not real familiar with the field, but it wouldn't surprise me if there are vendors out there selling "Quantum proof encryption" which actually isn't, and maybe she has some insight into that and is trying to warn people away from it, without revealing anything specific.


> vendors out there selling "Quantum proof encryption" which actually isn't, and maybe she has some insight into that and is trying to warn people away from it, without revealing anything specific.

There are. Okay, so it seems that the NSA is just here to warn people about the snake-oil, makes sense.


I don't believe it was the case that anyone was "shocked" about elliptic curve being broken by quantum computers; that was well understood long before NSA's Suite B announcement or Koblitz's paper.


Please read my edited comment again, people were shocked not by the fact of quantum attack of elliptic curve, but by

> The NSA seemed to be suggesting that practical quantum computers were coming so soon that people who had not yet upgraded from RSA to ECC should not bother to do so, and instead should save their money for the future upgrade to post-quantum protocols.


>"therefore, if you are running a legacy system which had not yet upgraded from RSA to ECC, you should not bother to do so, and instead should save money for the future upgrade to post-quantum protocols." (WTF? Most people thought the threat of quantum computers is serious, but ECC should be good for another ten years, and one should should definitely upgrade to ECC, ...)

The government moves much more slowly than consumer-oriented high-tech companies. Upgrading a large legacy defense system might very well take longer than 10 years. (And the NSA's advice mentioned above is directed toward government and contractors.)


There's a difference between encouraging security folks to start looking at new cryptosystems now and encouraging them to pick one and put it into _the_ standard right away. The NSA saying that probably also helped research get funded. Crypto standards should be picked to be good and usable for a long time; jumping on one of the first few may not be conducive to that.


I to this day think quantum-proofing encryption will greatly weaken whatever uses it upstream.


"We already have it, and we don't want you to compete with us" -NSA, probably


"Don't depend on bleeding-edge software exclusively."


The only guaranteed safe post-quantum crypto is the one time pad.


I'm a simple man. I see an algorithm backed by djb, I use it.


Hahahahah of course the NSA research director doesn't want people to start using quantum-resistant algos. What this says about their quantum capability is left as an exercise to the reader ;)


There's never been a perfectly secure lock in all of history, thousands of years of it. What makes modern day humans think that one can exist in digital form? I think there is a direct connection to the laws of physics somewhere.


> What makes modern day humans think that one can exist in digital form?

...Shannon's[1] work showing that the one-time pad maintains perfect secrecy?

[1] https://ieeexplore.ieee.org/document/6769090


copied from Reddit

No, when properly implemented, it can not be broken. "Properly implemented" pretty much implies a paper version where there are only two copies of any pad page, and both are immediately destroyed after use.

Computerized versions share the same vulnerabilities as the computer it is on. It can't be "cracked", per se, but there are side-channel attacks that can be effective.

Here is a paper by Dirk Rijmenants which explains it in the context of Cuban spy communications: http://users.telenet.be/d.rijmenants/papers/cuban_agent_comm...

I've experimented with generating them manually, using 10-sided dice to generate the code groups and an old manual typewriter (not electric!) with a very used cloth ribbon and 2-part carbonless forms. Works pretty well, and once you get into a rhythm you can make a considerable amount of key material. I just grabbed some random 10-sided dice at the local gaming store, but if you were serious about it, I'd get Game Science 10-sided dice: http://www.gamesciencedice.com/Gamescience-White--d10--Ten-s...
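The algorithm itself is trivial; as the quote says, everything hard lives outside it. A minimal sketch, with the OS CSPRNG standing in for the dice:

    import secrets

    def otp_encrypt(msg):
        pad = secrets.token_bytes(len(msg))  # one pad per message, never reused
        return pad, bytes(m ^ p for m, p in zip(msg, pad))

    def otp_decrypt(pad, ct):
        # decryption is the same XOR with the same pad
        return bytes(c ^ p for c, p in zip(ct, pad))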


You're saying we shouldn't spend time to make sure the algorithm is mathematically sound...because the implementation will have side channels? Using your metaphor, that's like saying we should never bother designing a secure lock because someone will just break a window to get in anyway.


Math, mostly.





