OpenSSH now uses hybrid post-quantum streamlined NTRU Prime X25519 by default (openssh.com)
75 points by DyslexicAtheist on April 9, 2022 | 39 comments



It's hard for me to comprehend simple asymmetric encryption, let alone understand how this solution is quantum-proof. Every time I read something like this I wonder if I'll ever catch up with the times.


QC can only solve two problems more efficiently than classical (and this is proven very thoroughly): unstructured search and the discrete log problem (well, more accurately, certain hidden subgroup problems). DLP covers factoring, hence RSA. Elliptic curve crypto is directly a discrete log problem, since the crypto is based on the group of rational points on the underlying curve. NTRU is based on neither problem; it's based on a lattice problem in a certain class of rings. Lattice problems cannot be reduced to either class of problems for which quantum provides speedups.

As a side effect of quantum only being more efficient for a very limited set of problems, there are many crypto systems for which quantum provides no effective benefits.

This can all be made very precise. For those with some math background, almost two decades ago I wrote a decent technical summary of the DLP-style aspects at https://arxiv.org/abs/quant-ph/0411037
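
(Not from the parent, but to make the factoring/DLP connection concrete: a toy Python sketch of the classical reduction at the heart of Shor's algorithm. Factoring N reduces to finding the multiplicative order of a random a mod N; the quantum speedup lives entirely in the order-finding step, which this sketch just brute-forces.)

    from math import gcd
    from random import randrange

    def order(a, n):
        # smallest r > 0 with a^r = 1 (mod n); brute force stands in
        # for the quantum period-finding step
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def factor_via_order(n):
        while True:
            a = randrange(2, n)
            g = gcd(a, n)
            if g > 1:
                return g  # lucky: a already shares a factor with n
            r = order(a, n)
            if r % 2 == 0:
                f = gcd(pow(a, r // 2, n) - 1, n)
                if 1 < f < n:
                    return f  # nontrivial factor found

    print(factor_via_order(3 * 5 * 7))  # prints a nontrivial factor of 105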


> QC can only solve two problems more efficiently than classical (and this is proven very thoroughly)

Do you mean those are the only two KNOWN problems? Otherwise the claim is very surprising. It is not even known whether BQP = P, I thought, while on the other hand there are some sampling problems in BQP that last I heard were thought to be outside PH (Aaronson and Arkhipov's boson sampling, and whatever the quantum supremacy crowd is currently doing). But those are probably useless for cryptography.


> Lattice problems cannot be reduced to either class of problems for which quantum provides speedups.

Essentially any problem can be posed as an unstructured search. But the quantum speedup for generic unstructured search is modest: a square-root reduction in the number of operations. This can be compensated for just by doubling the number of bits of the effective search space.
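
(A back-of-envelope sketch of that compensation, with my own numbers: Grover needs roughly sqrt(2^n) oracle calls against a 2^n keyspace, so doubling the key length restores the classical work factor.)

    # Grover: ~2^(n/2) oracle calls vs ~2^n classical trials (constants ignored)
    for n in (128, 256):
        print(f"{n}-bit key: classical ~2^{n}, Grover ~2^{n // 2} oracle calls")
    # a 256-bit key under Grover costs ~2^128, what a 128-bit key costs classically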


Unstructured search has a sqrt(n) speedup once all items are loaded, which generally takes O(n) classical time anyway, so Grover doesn't generally present a sqrt speedup for crypto (or for any database search where the database is not already loaded as a quantum state).

But the idea is right: sqrt speedups, even if possible, would not weaken crypto enough to break it, since encrypting would still be exponentially faster, and that's all that's needed for crypto: encrypting sufficiently faster than decrypting without knowing the secret.


The 'database' can be a function implemented as a quantum circuit, e.g. a function that takes a candidate private key (and noise inputs), computes a public key from it, and compares it to the target's key. Grover gets handed this function as an oracle and is asked to find the input that makes it return 1, which it can do in ~sqrt(|keyspace|) operations.

Of course, the circuit for reversibly computing a public key ends up ginormous... but it's still vastly smaller than the keyspace. And if we were limiting ourselves to the practical we wouldn't be talking about quantum computers. :P
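
(For illustration only, a classical sketch of the kind of predicate Grover would be handed; the hash standing in for "derive a public key" is a placeholder, not how a real attack circuit would look, and I've dropped the noise inputs.)

    import hashlib

    TARGET_PUB = hashlib.sha256(b"the-victims-private-key").digest()

    def oracle(candidate_key: bytes) -> bool:
        # compute a "public key" from the candidate private key and compare;
        # Grover amplifies the input where this returns True, in roughly
        # sqrt(keyspace) evaluations of the circuit
        return hashlib.sha256(candidate_key).digest() == TARGET_PUB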


Thanks for the article!

> QC can only solve two problems more efficiently than classical (and this is proven very thoroughly)

I didn't see a discussion of the proof in your article, although maybe I missed it. I thought your article instead emphasized that all quantum algorithms discovered so far are only more efficient than classical algorithms on problems that can be reduced to hidden subgroup problems.

Is there some other research (that maybe you mentioned) that goes on to prove that no new quantum algorithm could ever do any better on other problems? If not, isn't this a conjecture (perhaps a widely-believed conjecture) of complexity theory? When I last looked at the Complexity Zoo, I thought there were quite a few different complexity classes that had been defined relative to quantum computing, and it was apparently unresolved whether many of them could be reduced to one another.

I don't really know this area at all, beyond having heard of Grover's and Shor's algorithms and read the SMBC on what quantum computing doesn't do. And having seen some candidate PQ cryptographic algorithms, without understanding exactly why any of them were considered credible as candidates. :-)


I have a really old (college) implementation of plain old NTRU in Python, if you'd like a little insight into what NTRU is and how it works at a superficial, textbook level (kind of like how deployed RSA is never textbook RSA). There are also some slides explaining the math, but you have to be comfortable with some college-level group theory (even reading my own slides is hard, because I haven't used it since then).

The short answer is that NTRU uses convolution polynomial lattices. Breaking the cryptosystem requires finding a "short" basis for a lattice described by the private key. There are efficient algorithms for this in R^2, but not for high-dimensional convolution polynomial spaces (LLL reduction is the best I'm aware of, which is worse than O(d^5) where d is the number of dimensions: https://en.wikipedia.org/wiki/Lenstra%E2%80%93Lenstra%E2%80%...).

Do note the impl I did is absolutely terrible and probably subtly buggy but might get the idea across. Also, it splits the message into blocks and encrypts that instead of just wrapping a symmetric encryption algorithm because I was young and naive. :)

https://github.com/logannc/pyNTRUEncrypt
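
(To make the "short basis" idea above concrete, here's a minimal sketch, mine rather than the repo's, of Lagrange/Gauss reduction, the efficient R^2 case mentioned above: it finds the shortest basis of a 2D lattice, and LLL generalizes the same idea to high dimensions.)

    def gauss_reduce(u, v):
        # shortest basis for the 2D lattice generated by u and v
        def dot(a, b):
            return a[0] * b[0] + a[1] * b[1]
        while True:
            if dot(u, u) > dot(v, v):
                u, v = v, u  # keep u the shorter vector
            m = round(dot(u, v) / dot(u, u))  # integer projection coefficient
            if m == 0:
                return u, v  # v cannot be shortened against u any further
            v = (v[0] - m * u[0], v[1] - m * u[1])

    print(gauss_reduce((1, 1), (3, 4)))  # ((-1, 0), (0, 1)): same lattice, Z^2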


IIUC, NTRU is only "quantum proof" because there is no (publicly known) quantum algorithm for breaking it, while Shor's algorithm covers everything based on factoring or discrete logs (both RSA and anything elliptic curve).


Until we prove P != NP, that (i.e., it's only secure because we don't know how to break it) is pretty much true of all crypto algorithms, even classically.


This is true, but we know even less for quantum computers: we have spent far less time studying quantum algorithms, so not yet having found an algorithm is much weaker evidence that no such algorithm exists.


I can understand Curve25519 and Diffie-Hellman, but lattice-based crypto goes way over my head.


There is a comment on LWN saying that the addition of NTRU Prime for PQC is a little premature:

https://lwn.net/Articles/890788/

> The addition of NTRU Prime for PQC is a little premature, because a new paper just came out which showed that many LWE schemes are weaker than was previously thought. The paper admits that it is not directly applicable to NTRU (which is not an LWE scheme), but it might be adaptable to it.

> https://groups.google.com/a/list.nist.gov/g/pqc-forum/c/Fm4c... > https://doi.org/10.5281/zenodo.6412487

> > NTRU-based cryptosystems are among the leading candidates for lattice-based post-quantum cryptography. In this work, we propose improvements to the dual attack on LWE, and as such our attack is not immediately applicable to NTRU-based cryptosystems. It is an interesting question whether ideas from this work can be adapted to similar improvements to attacks on NTRU. Specifically, there appear to be similarities between the dual attack on LWE and the so-called “hybrid attack” [How07, Wun16] on NTRU. The hybrid attack also involves enumerating over parts of the secret, and then invoking some distinguisher to determine whether a resulting vector is close to a certain constant lattice. It seems reasonable to the writers of this paper that ideas similar to those presented here can be used to reduce the running time of such attacks as well, though anything definitive would of course require further research.


It's pretty easy for openssh to change kex algorithms: they're not used for persistent keys, and they're negotiated every session.

How many more years should it go without any attempt at resisting future decryption by quantum computers, just for the sake of potentially keeping the list of legacy ciphers that would need to be carried around one entry shorter, should it become necessary to replace this one?

I think instead it could easily be argued that this was long past due: it could have been deployed several years ago.

The security story for lattice crypto will probably remain dubious for the foreseeable future (unless it gets outright broken). But on desktop computers and servers lattice crypto is so cheap that most applications can just throw it in, just in case. Unfortunately, the obvious alternatives like code-based crypto have operating costs that make them harder to justify on a just-in-case basis.


I should also comment that even someone who believes QCs will never be a threat should be enthusiastic about this change: now an attacker who has logged SSH sessions will need to break both X25519 and Streamlined NTRU Prime. Even if you believe we'll never see a QC large enough to break x25519, you can't deny that we might someday find a classical way to break it.

For authentication keys we can rotate them out when ed25519 looks at risk, but logged data is forever. Protecting against that by using two very different asymmetric schemes seems well justified, especially when the cost is likely to be completely invisible to the user.

Reasonable people could have a debate about how far to go (e.g. I think low-volume applications like email encryption should be using ed448+McEliece, but I seem to be alone there), but just having N+1 security for the key exchange's underlying assumptions seems very defensible to me.
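
(A sketch of why "break both" holds; this is roughly the shape of what sntrup761x25519-sha512 does, though I haven't verified OpenSSH's exact KDF inputs, so treat the details as illustrative.)

    import hashlib

    def combine_shared_secrets(ntru_ss: bytes, x25519_ss: bytes) -> bytes:
        # the session key depends on both shared secrets, so an attacker
        # must recover both to decrypt logged traffic
        return hashlib.sha512(ntru_ss + x25519_ss).digest()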


I wonder how much of the warehoused data[1] being sat on by security agencies will be decrypted in any meaningful timeframe.

[1]: E.g. the exabytes at https://en.wikipedia.org/wiki/Utah_Data_Center, but there are lots of these agencies in the world


Decrypting with leaked or recovered private keys is likely the path for now, but on an infinite timeline, 100% of it will be decryptable.


Once there's an inkling it'll happen, they'll reissue keys and re-encrypt. Nobody is going to go "oh well, out of business, it was a good run".


But all the data up to that point is wide open. If it ever becomes practical to decrypt captured data, there are going to be some interesting leaks from the non-quantum-proof encryption age.

Also lots of cryptocurrencies will shit themselves, which might be a feature or a bug, depending on your perspective.


You have to operate under the assumption that state agencies already have access to your communications. Also, 256-bit hashes are considered broken, but that doesn't mean state agencies can break them at a sufficient rate, or that they are interested in breaking cryptocurrencies.


Which well-known 256-bit hash is broken?


Let’s assume you’re right and the crypto itself is sound. I agree that is likely and doubt any TLAs have a significant mathematical edge. Does the NSA just stop there and call it a day? No, of course they write Stuxnet. They attack everything else.

What well known piece of software isn’t broken?


FreeBSD?


I think you mean OpenBSD. From their homepage:

> Only two remote holes in the default install, in a heck of a long time!

FreeBSD, not so much.

https://www.cvedetails.com/vulnerability-list/vendor_id-6/pr...


Keep in mind that OpenBSD has a tiny base install, so it's not exactly surprising that there aren't a lot of remote exploits. Nothing's listening by default, so why would there be?

By the time you set up OpenBSD to do anything, that's gonna change.


Indeed. “Default install” is a big asterisk.


Up to 1024 bits is considered to be broken (not mathematically, but at least by means of brute force) against state actors in a targeted attack.


What do you mean by "brute force"? Find a collision? I think it's almost certain the combined resources of every computer ever made would be unable to brute-force a collision in a 1024-bit hash.
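
(The rough arithmetic, for anyone following along: a birthday collision on an n-bit hash takes about 2^(n/2) evaluations.)

    n = 1024
    print(f"collision on a {n}-bit hash: ~2^{n // 2} evaluations")
    # 2^512 evaluations is astronomically beyond any classical effort; even a
    # quantum collision search (~2^(n/3), with big caveats) would be ~2^341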


You're confusing symmetric and asymmetric cryptography; it's such a basic mistake that maybe you should not comment on cryptography until you've read some more.

SHA-256, SHA-3-256, BLAKE2, BLAKE3, etc. are all not-broken and have "128 bits of security".

256-bit ECC, like NIST P-256 ECDHE, ECDSA, and ECIES, and X25519 ECDHE and Ed25519 EdDSA, are not-broken and have "128 bits of security".

RSA and FFDHE at 3072 bits are not broken and have about "128 bits of security". RSA-2048 remains very popular, although it offers only about 112 bits of security.

See https://security.stackexchange.com/a/230713/70830 and the links from it for more info.
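
(The figures above as a quick lookup, matching standard estimates such as NIST SP 800-57; the labels and grouping are mine.)

    security_bits = {
        "SHA-256 / SHA-3-256 / BLAKE2 / BLAKE3": 128,  # collision resistance
        "NIST P-256 / X25519 / Ed25519": 128,
        "RSA-3072 / FFDHE-3072": 128,
        "RSA-2048": 112,
    }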


That does not sound right. Are you talking about RSA?



Am I alone in wondering if quantum attacks on existing algorithms are still very hypothetical and may never come to fruition?


Folks in quantum computing seem to think that it's a near-certainty, "just an engineering problem." As long as you don't ask for a specific timeline, they won't even lie.


The "Too Much Crypto" paper is pretty reasoned.

See section 5.2, and 5.2.3 - "Should we care?"



It mentions only a "key exchange method". Does that mean we can keep our old keys, or does it require different types of keys?


Different keys.

For example, you might currently be using a public/private keypair for 4096-bit RSA. That keypair (by definition) only works for the RSA key exchange algorithm. Likewise, an x25519 keypair is for the x25519 key exchange.

A sntrup761x25519 keypair will be its own thing. As an aside, a sntrup761x25519 public key will be two public keys glued together (one for each algorithm). [1] Likewise for the private key.

(one could reuse an existing x25519 keypair for the x25519 component of sntrup761x25519, but it seems like a bad idea)

[1] https://github.com/openssh/openssh-portable/blob/master/kexs...
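
(If I recall the sntrup761 parameter set correctly, its public key is 1158 bytes next to X25519's 32, so the "glued" blob might be split like this; the helper and names are mine, not OpenSSH's.)

    SNTRUP761_PUB_LEN = 1158  # sntrup761 public key bytes (from the spec, as I recall)
    X25519_PUB_LEN = 32       # X25519 public key bytes

    def split_hybrid_pub(blob: bytes):
        assert len(blob) == SNTRUP761_PUB_LEN + X25519_PUB_LEN
        return blob[:SNTRUP761_PUB_LEN], blob[SNTRUP761_PUB_LEN:]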


Same keys: servers and clients will keep using their ed25519 keys to authenticate each other; the keys for the key exchange are negotiated on login.


Huh! Thanks for the correction.

That makes a lot of sense -- I have some more reading to do...



