Stop using RSA (2019) (trailofbits.com)
165 points by spekcular on Nov 8, 2020 | 113 comments



This post ignores the fact that most users of RSA do not have to worry about these low-level implementation issues because they are using mature and well-vetted implementations.

Also, this part in particular displays an alarming lack of nuance:

> if you come to us with a codebase that uses RSA, you will be paying for the hour of time required for us to explain why you should stop using it.


The point of the article is that mature and well-vetted libraries have repeatedly been shown to have vulnerabilities in their RSA implementations. See [1] for just one recent example, but there have been many.

[1]: https://www.cryptologie.net/article/461/the-9-lives-of-bleic...


So the solution to a proven algo plus mature implementation with some issues is to... switch to a green algo with a green implementation?


The point is that the alternatives aren't "green" and their implementations are mature, arguably even better understood than RSA and its implementations.


Switch to other proven and mature algorithms via libraries like libsodium, Tink, etc. RSA has not been the only game in town for a long time.


There are many other algorithms that are just as proven but don't feature many of the traps contained in RSA. The main issue with RSA is that, depending on implementation and context-sensitive parameter selection (i.e. you actually have to think about the parameters and can't just always use the same ones), you get wildly different results. Other algorithms, like ECC, also have parameters, but they aren't context-sensitive, so always choosing the same one is fine for developers, and only security researchers have to think of new and better parameters. The question really is how brittle you expect/accept your encryption algorithm to be: RSA is very brittle in very unexpected ways, while e.g. ECC is only brittle if you choose a bad curve (but you can simply always select the same safe one), and people know that.


ECC also depends on good RNG for computing signatures. RSA needs it only at key creation time.


This is not necessarily true - deterministic EC signatures are now recommended, e.g. EdDSA, or [1] for ECDSA. (To prevent some side-channel attacks you might want to reintroduce some randomness.)

On the other hand, it’s now recommended to use PSS padding with RSA, which requires a random value for each signature.
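As a quick illustration, a minimal sketch (assuming the pyca/cryptography library; any similar library works): Ed25519 signatures are deterministic, while RSA-PSS draws a fresh random salt per signature:

  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.asymmetric import padding, rsa
  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

  # Ed25519: deterministic - signing the same message twice gives the same bytes
  ed_key = Ed25519PrivateKey.generate()
  assert ed_key.sign(b"msg") == ed_key.sign(b"msg")

  # RSA-PSS: a fresh random salt makes every signature different
  rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
  pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
  assert rsa_key.sign(b"msg", pss, hashes.SHA256()) != rsa_key.sign(b"msg", pss, hashes.SHA256())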

[1]: https://tools.ietf.org/html/rfc6979


It's not enough to have a well-vetted implementation. You also have to use RSA safely, and there are pitfalls in using it correctly that ECC algorithms don't have.

So, sure, use RSA certs for your TLS connections, and, I guess, fine, use RSA SSH keys.

But don't add RSA to new designs where it doesn't have to be there.


What's the proposed alternative with at least 20 years maturity?


ECDH and ECC signing systems (simpler answer, and the one everyone means when we talk about this stuff: libsodium). The "20 years maturity" thing is funny, because it's a line from 10-15 years ago that has now aged out. In reality, it was always a shorthand for "don't adopt systems you read about on sci.crypt", which, obviously, modern curves aren't.


>10-15 years ago

Was there a publicly available RSA implementation (even a commercial one) that was 20 years old 15 years ago?


It's not a "line", it's a general standard I apply to any software I build on.


The author's experience seems to be contrary to what you are saying about users and/or implementations. I don't know who to believe, but I'm not going to automatically believe you.


> alarming lack of nuance

What's the nuance you see that they don't?


The fact that many systems have no control over all parties to make such a change.


I don't like this article.

From the security perspective, RSA is a perfectly acceptable choice. EdDSA is good and also a perfectly acceptable choice. They have different tradeoffs (some of which are mentioned in the article, but not all), so you may want to consider those tradeoffs, but as a user (e.g. choosing ssh or ssl keys), not a designer, everyone should feel free to choose either from the security perspective. You need to choose the right key size for both, and you need to use the correct vetted tools for both. But there really aren't many downsides to using either.

As a system designer, there are some trade offs, but once again, in almost all practical system designs, both are acceptable security wise.


But the thing is that it isn't.

RSA is much harder to make secure than even simple ECC. It is systemically hard to make timing-safe, and more or less any mistake in padding completely compromises it. More or less every single operation involved in computing RSA exchanges is susceptible to timing attacks.

Then, as the article says, it's very easy to choose "bad" keys. Bad keys are ones whose factors aren't actually prime (I mean duh, but primality testing is probabilistic and has been found to go wrong before).

At a theoretical level RSA and ECC are essentially equivalent (in the sense that the maths is based on the same core problem), but the details of how it's approached are really important.

Then we get to basic practicality. ECC keys are much, much smaller than RSA for the same level of security - 256 bits vs. 4-8k - and while RSA is faster than ECC, the advantage is severely impacted by key size, so secure RSA keys will be slower than the equivalent ECC before too long.

That's before considering the network traffic cost and latency for key exchange and certificate validation.

The only advantage RSA has over ECC at this point is that it is conceptually easier to understand - the mathematics are all basic primary/grade school arithmetic on simple whole numbers.


I initially disliked the article as well, but I think it makes a fair point about misuse-resistant libraries. Or as you put it, "correct vetted tools".

Where are all the secure hybrid RSA schemes?

I use RSA-OAEP for tiny messages and feel quite good about it. But that's not a general purpose tool. If I had to manually do all the steps to properly combine it with AES for block or streaming encryption, I'd be in danger.
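(For the curious, a minimal sketch of that - assuming the pyca/cryptography library; any OAEP-capable library looks similar:)

  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.asymmetric import padding, rsa

  # OAEP with SHA-256; the plaintext must fit in the modulus minus padding overhead
  oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)
  key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
  ct = key.public_key().encrypt(b"tiny message", oaep)
  assert key.decrypt(ct, oaep) == b"tiny message"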

libsodium is seriously ubiquitous and you only really have to worry about nonces.
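E.g. a sealed box in PyNaCl (libsodium's Python binding) encrypts arbitrary-length messages and handles the nonce for you - a sketch:

  from nacl.public import PrivateKey, SealedBox

  recipient_key = PrivateKey.generate()
  ciphertext = SealedBox(recipient_key.public_key).encrypt(b"a message of any length")
  assert SealedBox(recipient_key).decrypt(ciphertext) == b"a message of any length"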


JWT/JWE/JOSE RFCs have some secure hybrid RSA schemes, with mature libraries. It's typically RSA-OAEP plus an AES session key.
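For instance, with the jwcrypto library (an assumption on my part; other JOSE libraries look similar), an RSA-OAEP + A256GCM token is roughly:

  import json
  from jwcrypto import jwe, jwk

  key = jwk.JWK.generate(kty="RSA", size=3072)
  token = jwe.JWE(b"payload", protected=json.dumps({"alg": "RSA-OAEP", "enc": "A256GCM"}))
  token.add_recipient(key)
  compact = token.serialize(compact=True)

  received = jwe.JWE()
  received.deserialize(compact, key=key)
  assert received.payload == b"payload"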


Here is what Bernstein has to say about RSA:

https://www.hyperelliptic.org/tanja/vortraege/facthacks-29C3...

See the conclusion slides. In short, RSA is secure if primes are large and random.

It’s worth mentioning that Bernstein is the author of a competing algorithm. There are hundreds of scientific talks and articles about RSA by renowned cryptographers.


I'm not a crypto expert, but these slides mostly focus on "factoring" attacks, and don't cover other kinds of attacks mentioned in the submission link.

For example, the slides you linked don't mention padding or exponent weaknesses.



This seems to work better as an argument against rolling your own crypto than an argument against RSA in particular.


I think it does both - rolling your own crypto is almost always a bad idea, but some crypto is harder to implement well than others.


"research has shown that roughly 1% of TLS traffic in 2012 was susceptible".

I don't think this is right. Researchers showed that 1%ish of keys they found on the internet are bad, not that 1% of traffic is susceptible.

From having studied this in a large enterprise setting more recently, there is in fact a really high prevalence of junk keys, but much of it is concentrated in self-signed certs and other garbage: VoIP devices, printers, and management cards. ROCA, internet of shit, etc. The last big issue that hit a high proportion of traffic was maybe the Debian debacle from 2008.

Now, whether this is better or worse, you decide; but it is a different claim.

The keys that do the vast bulk of traffic generally come from respectable implementations.


FWIW, none of the major cloud providers support ed25519. It's 2020 and no one seems to care about it.


AWS won’t let me add an ED25519 SSH key, but Google Cloud Platform does and it seems to work. YMMV.

(I have no idea why AWS won’t let you do this. I am pretty sure it just passes the SSH key to the VM and that it just writes it to authorized keys by default. But at least in the UI, it doesn’t seem to validate whereas RSA keys do...)


GitHub suggests it in their docs[1], which I can confirm works, as I just added one today. A coworker added my ed25519 key to a DigitalOcean server today as well, and I successfully ssh-ed in with no issue.


GitLab has not only allowed it but recommended it for several years now, iirc. I'm now in the habit of generating ed25519 keys for all of my ssh hosts.
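The usual invocation, for anyone following along:

  ssh-keygen -t ed25519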


For TLS certificates? That's because none of the CAs support it.


For TLS certs, you can just use ECDSA. Let's Encrypt has an ECDSA root.


ECDSA has some bad footguns as well. I don't see why you'd consider ECDSA to be any more robust than RSA if we're talking about certificate keys for TLS.

EdDSA is better, but is in general not yet supported in the WebPKI (browsers, BRs). I did manage to successfully use it for an internal mTLS setup though, so I think it's only going to be a matter of time.


The point isn't "good crypto must not have footguns", it's "RSA has so many footguns, that we as a community should stop silently accepting uses of it".


For SSH, I’d guess.


Yes, for managed systems (meaning not SSH inside a VPS), at least I'm sure that both AWS and Azure do not support other standards (no experience with GCP).


That particular curve is less popular. Do they support NIST P-521?


Stop using them. SSH-RSA has been deprecated in OpenSSH for a few versions now.


What is deprecated in OpenSSH is not all of RSA in OpenSSH. They've deprecated "ssh-rsa", but there is still, e.g., rsa-sha2-256. These names describe the signature algorithms used during authentication; ssh-rsa uses SHA-1. This is separate from the types of public/private keys, which is what the parent is describing.


Back in university I learned the ins and outs of RSA, and to be honest, it seemed simple, understandable, and frankly, quite solid. It absolutely depends on p and q being chosen at random, but besides that, it should be basically uncrackable.

Please let me know if I am wrong (outside of a quantum computing breakthrough).

edit: I understand that p and q need to be large primes. But there are a gargantuan number of large primes. AFAIK, if the possible orderings of a randomly shuffled 52-card deck outnumber the atoms in the universe, then surely RSA beats that by many, many orders of magnitude in terms of computational complexity, even at a 1024-bit key length?
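(A rough back-of-envelope check in Python, estimating the count of 1024-bit primes with the prime number theorem:)

  from math import factorial, log, log10

  print(len(str(factorial(52))))  # 68 digits: orderings of a shuffled deck (~8e67)
  # the number of primes below 2^1024 is about 2^1024 / ln(2^1024); print its digit count
  print(round(1024 * log10(2) - log10(1024 * log(2))))  # ~305 digits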


The article does list several real-world attacks on:

  - Lack of padding
  - Padding oracles
  - Small public exponents
  - Badly chosen private exponents
  - Badly randomized large primes
  - Re-used large primes


It's the "steel door in a wooden frame" problem. The implementation is the weakest link, and not the theory.

Don't worry too much about quantum computers for now; worry about the attacks listed in TFA, about the history of attacks being discovered, and the history of implementations being weak years after those attacks were discovered. And then consider that the NSA is the world's largest employer of mathematicians, who have each been toying with RSA since the very beginning of their careers.


I am not a cryptographer, just a developer who has used cryptography in the past.

I find the historical argument not very convincing. RSA has been around for a long time.

Will we be reading "do not use ECC" articles in 2039, after a comparable amount of research has been put into finding subtle unexpected errors in ECC?


RSA has been around a long time, yes, and the Caesar cipher has been around for even longer. Part of the historical argument that you're dismissing is a very persistent Dunning-Kruger effect: very smart software developers don't know how much there is to know about crypto, and since RSA is "simple" it's easy to delude yourself into thinking that you can do it right.

And, yes. Expect ECC to be broken. It was initially developed by Miller at the NSA, who only released that information when Koblitz discovered the cryptosystem independently. So they've been trying to crack it for a very long time, and you can be certain that they know of unpublished breaks. It's almost certain that they can break certain parameter classes, but the discrete log problem itself keeps getting weaker and weaker.

If moving away from weak crypto on a regular basis sounds like an undue burden, get out of the game, don't roll your own, leave it to the experts or you'll be doing yourself and your users a disservice.

I'm not a number theorist, but I did a lot of crypto and rubbed elbows with several NSA-employed cryptologists in my undergrad. I'm a decent developer, too, but I know far too much to think that I'm qualified to roll my own.


The main point of the article is that RSA is easy to use wrong, unlike ECC, so, no, we won't (at least for the same reason).


I suppose you could argue that weaknesses in ECC just haven't been discovered yet.

Whereas RSA is more thoroughly researched and various weaknesses revealed.

I don't know if that's true. I wrote a toy ECDSA implementation years ago (during high school), and compared to RSA, ECC is certainly more complicated. Sure, there are fewer parameters, and we currently know of fewer requirements for these parameters. But who is to say a weak class of ECC private keys won't be discovered in the future?

If you're paranoid about what weaknesses might be discovered, I suppose using RSA+ECC is an option :)


You're getting downvotes and not replies, so I'll bite. My guess is that it's because you're advancing a common crypto fallacy. Two weak cryptosystems do not combine into a strong cryptosystem. Do not wing it. Use vetted code; don't roll your own.


Speaking of the NSA, libsodium looks nice, but isn't anyone a bit worried that it's still hosted on GitHub?

That Microsoft is in bed with NSA is common knowledge at this point, and that the libsodium authors choose to (keep being) associate(d) with them doesn't exactly fill me with confidence…

(yes, the actual risk of NSA messing with the repository without libsodium authors' knowledge is probably very low, but still, it doesn't give the best impression… )


That's not how git works.


> Back in university I learned the in and outs of RSA, and to be honest, it seemed simple, understandable, and frankly, quite solid.

If you have a degree in computer science and not in cryptography it is quite frankly unlikely that you learned more than the basics of RSA.

> It absolutely depends that p and q are chosen at random, but besides that, should be basically uncrackable.

No. There are many things that can go wrong even if you choose p and q adequately.


See for example Dan Boneh’s “20 years of attacks on the RSA cryptosystem” [1]. Itself now written 20 years ago and the attacks keep coming.

[1]: http://www.ams.org/notices/199902/boneh.pdf


tl;dr: "The attacks discovered so far mainly illustrate the pitfalls to be avoided when implementing RSA. At the moment it appears that proper implementations can be trusted to provide security in the digital world." (My own emphasis.)


We’re still waiting to find one of those proper implementations 20 years later.


> Back in university I learned the in and outs of RSA, and to be honest, it seemed simple, understandable, and frankly, quite solid.

No offense, but typical uni knowledge of these things leaves you woefully underprepared for understanding security subtleties, and uni-level API design, in my experience, is not mature and skews towards power and flexibility instead of ease and safety.



This is not my realm and I am grateful for experts who can advise us well. But I have a nagging yet sincere quibble that isn't easily dismissed: Where are all the failure stories? Does "under attack" mean "successfully defeated"?

I find it singularly unsatisfying to hear in response, "Well, you know, the ones who got owned are too ashamed / too afraid of liability to come forward... [etc]"


>Moreover, p and q must be chosen independently. If p and q share approximately half of their upper bits, then N can be factored using Fermat’s method.

Wouldn't p and q normally share around half their upper bits if chosen independently? I suspect the probability distribution would be centered around 0.5


This is a good point. Perhaps they meant to say "if approximately the upper half of p and q's bits are identical"?
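That reading would make Fermat's method work: write N = a^2 - b^2 = (a - b)(a + b); the search starts at a = ceil(sqrt(N)) and terminates almost immediately when p and q are close. A minimal sketch in Python:

  from math import isqrt

  def fermat_factor(n):
      # search for a, b with n = a*a - b*b, i.e. n = (a - b) * (a + b)
      a = isqrt(n)
      if a * a < n:
          a += 1
      while True:
          b2 = a * a - n
          b = isqrt(b2)
          if b * b == b2:
              return a - b, a + b
          a += 1  # the loop is short only when p and q are close together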


That might make more sense. If you have two integers in the same approximate range you might expect the upper halves to look similar.


Why would they? Given two random n-bit numbers, the probability that their upper halves (each n/2 bits) are identical is 1/2^(n/2).


That's not how the parent is interpreting that statement; their interpretation of,

> If p and q share approximately half of their upper bits

That, of the upper bits, approximately half are the same? Since the combinations of two corresponding bits in p & q are 00, 01, 10, and 11, if they're chosen uniformly one would expect about half to be the same.

I suspect that your interpretation is what the article wants to say, but I think that the parent's interpretation is reasonable given how it was written.


I would _love_ to abandon RSA, but Azure DevOps requires that my SSH keys are RSA, and no amount of fiddling with my public key on input seems to change that. At home, and on our infrastructure, I only use ECDSA, but when providers _require_ RSA it's hard to stop using it.


I'm repeating myself, but just to chime in on the most common pushback this sentiment ("no more RSA!") gets: nobody is telling you to switch providers so you can do ECC SSH. The gist of the argument is not to introduce RSA into new designs --- which is a thing you see people do a lot if you review cryptography implementations, and it's often a clown fire, more so than NaCl ECC designs.


Azure has a knack for making everything annoying and/or dysfunctional.


The section entitled "So what should you use instead?" doesn't make a suggestion of what to use instead...


> Trail of Bits recommends using Curve25519 for key exchange and digital signatures... it is implemented in libsodium, which has easy-to-read documentation and is available for most languages.


You use ECDH to establish keys for sessions or files. You use ECDSA and ECC-Schnorr stuff like Ed25519 to sign things. That covers 99.9% of what RSA is used for. HTH.
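E.g. an X25519 exchange with the pyca/cryptography library (a sketch; in practice you'd run the shared secret through a KDF such as HKDF before using it as a key):

  from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

  alice = X25519PrivateKey.generate()
  bob = X25519PrivateKey.generate()
  # each side combines its own private key with the peer's public key
  assert alice.exchange(bob.public_key()) == bob.exchange(alice.public_key())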


This is how most critical articles of crypto end. They criticise the algorithm or system, and then don't offer much in the way of solutions.

They _do_ suggest using ECC, but in practice ECC support is super sporadic, and a lot of products charge extra for it.


Doesn't it say so in the last paragraph of that section?


A lot of the implementation challenges with RSA tend to stem from trying to be clever and prematurely optimize.

Whether it's trying to save space by using smaller keys (which is silly, because key size is rarely the bottleneck), or trying to save time with clever prime generation algorithms. The safest way to generate primes is also the easiest: just generate random numbers of appropriate size until you get one that is prime (see the sketch below). It's not blazing fast, but it's plenty usable.

Lots of security problems seem to stem from trying to be clever or optimize something that really doesn't need optimization.
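A minimal sketch of that rejection-sampling loop (assuming sympy for the probabilistic primality test; any Miller-Rabin implementation would do):

  import secrets
  from sympy import isprime

  def random_prime(bits):
      while True:
          # force the top bit (full length) and the bottom bit (odd)
          candidate = secrets.randbits(bits) | (1 << (bits - 1)) | 1
          if isprime(candidate):
              return candidate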


Didn't the article say that prime reuse was a security hazard? With your proposal there would be huge re-use.


Why would there be huge reuse? Bad pRNG implementations?


What? How do you think you avoid reusing primes?


A more useful contribution than this rant would have been a structured guide for how to do RSA without any of those issues. If they are all so well known, then, well, write down exactly how to do it properly and make a list of all attack classes together with a convincing argument why your guideline prevents them.

But that would open up responsibility on the author's side, and who wants that? It's oh so much easier to just blast out a rant with some bits of information that are enough to illustrate your rant's point but far from sufficient to actually be helpful.


When it comes to security software, that doesn't work. There are so many ways to get things wrong, it's better to just use a well maintained existing library.

Don't be the (next) person who makes one single mistake and throws everything away.


Sure, I totally agree. But that was not my point. I was trying to convey that it's always easier to rant than to actually contribute to improving things.

Mind you, the post does not just say "don't roll your own". It essentially says "don't use RSA, no matter which implementation". And the argument for that is not very strong. It's an arbitrary selection of what could go wrong. In which libraries does it? Maybe for the major implementations this is all just old news and they don't have those issues. But then I go and use one of them, and since the author just wrote a random rant, they pull the next trick out of their sleeve and claim the next thing this implementation got wrong. It's easy to be in that position. It's harder to compile a complete list so that I can go and see that my framework does not violate any of those issues, so it can be trusted just as much as their favorite ECC framework. But yeah, that would not just take work but would risk being criticized for being incomplete, so it's just easier to publish a rant instead.


> if you come to us with a codebase that uses RSA, you will be paying for the hour of time required for us to explain why you should stop using it.

Okay and what does an hour of their time cost?


Based on what I billed out at comparable shops, anywhere from $250-400/hr. Maybe $500 if one of their more crypto-focused people looks at it, but that'd be pretty unlikely for a whole codebase.


$2000/day is what you'd pay an appsec shop for a web application. Double that for cryptography from a focused cryptography practice, and add 50-75% on top if you're getting reviews by PhDs at Rambus, like Intel does for new designs.


RSA itself is secure, if your primes are large and random. You should use a decent library like OpenSSL, etc.

Transitioning from RSA to ECC has an issue: unlike RSA, where you increase the security by keeping the same algorithm but increasing the key length, in ECC there is a zoo of curves. You have heard that the parameters in 25519 are secure, but then later you upgrade to another curve with dubious parameters. Each of these curves needs its own cryptanalysis. Now you have to deal with a lot of fragmented algorithms and complexity.

Also, I kind of believe in the difficulty of the factorization problem, which has not been solved in 2,500 years, more than in the difficulty of the discrete log problem. There is no doubt more work on factorization, but that also indicates solving it gets increasingly difficult.

Most algorithms have limitations. Mentioning the limitations of one algorithm to promote another is not a proper comparison. What if the users pick up one of those NIST curves?


Security with keys mate......


Nice try NSA.

On this topic I think I'll go with Schneier and continue using RSA in general, and PGP in particular. Also Veracrypt is a good choice for an overall solution.


Schneier is wrong. Notably: this is not a subject Schneier has particular expertise in. There's a layperson belief that "cryptography" is a single domain, but it is subspecialized. You might take Schneier seriously for block cipher cryptography (though that field has moved far past the last serious engineering writing Schneier did about it), but I don't think it was ever the case that Schneier was someone you'd take to the bank for asymmetric cryptography.


Schneier has thousands of citations to his peer-reviewed scientific articles. He has authored a number of books on cryptography, and has taught cryptography at major places. He has been involved in the creation of many cryptographic algorithms. His contributions have mostly been in the area of symmetric crypto, but that doesn't imply that he doesn't understand asymmetric crypto well, especially RSA. He teaches this stuff.

A lot of us don’t have /any/ track record in /both/ symmetric and asymmetric crypto.


It is in fact not true that Schneier is an especially widely cited scholar on this subject.


The cases where ad hominem is not a red flag are rare.


I bet there are a similar number of pitfalls with ECC, but (depending on the specific weakness) we just have a ~50-2000 year head start with RSA because prime numbers have been studied for far longer.


There's a great Boneh talk discussing the history of elliptic curves that refutes this weird argument. At any rate, the history of elliptic curves and RSA in cryptography is intimately intertwined.



I'm sure there is. Why don't you provide a pointer to this talk instead of being weirdly disparaging?


DM for my rate card.


He explicitly said it doesn't suffer from identical problems, notably not having padding requirements and not having to select primes: users select random keys instead.

Anything can be badly implemented. ECC can be badly implemented. Two specific, often-repeated problems are not shared by ECC.


You, me, or djb still have to select a curve -- that's where I expect the pitfalls will be.


Libsodium. djb or his equivalent did that already, and if there is a timing attack, it will get found and fixed faster.


I'm not referring to implementation bugs, I'm referring to choice of curve, which can't be changed after the fact.

Look at the table at the bottom of: https://safecurves.cr.yp.to/

Today, Curve25519 is a good choice because it's popular and checks all the boxes. But how many undiscovered columns are in this table?

My top level comment seems unpopular, so, for what it's worth, I'll clarify that I definitely agree that there are fewer knobs for the non-cryptographer implementer to get wrong with ECC. But that's because the only way to implement an ECC system is exactly as specified by the cryptographers. That's clearly an improvement over RSA, but I remain unconvinced that the cryptographers designing the curves are infallible. There are curves and curve parameters that, for various reasons, are bad.

Maybe there's a proof that it's impossible to do better than Curve25519, but if there is, I haven't seen it, and tptacek would rather be snarky than provide a reference. Is it impossible for more columns to show up in that safecurves table? If so, prove it. Otherwise, I think my top level suspicion is reasonable.


This article reeks of arrogance. I stopped reading it after a while since I can't stand the style of the writer. They seem to think that they've seen it all and this is a huge red flag to me.


Could you give some specifics instead of a blanket "this article is shit"?


Read it and you will see.


Trail of Bits, I love your content and love your tools.

I was also proud that your CEO stood up against the Time AI bullshit at black hat. Someone had to do it.

But Christ, this article is ridiculous.


Probably what's wrong about this article is the title and the introduction.

The article isn't about stopping all use of RSA; the article is about not rolling your own crypto with RSA. I honestly don't see why any of us should write any crypto implementation: no matter how hard we try, I bet that without being a mathematician focused on crypto, implementing such an important algorithm will inevitably make it insecure.


Since this article comes up from time to time in a PGP context I generated a PGP FAN article discussing it:

* https://articles.59.ca/doku.php?id=pgpfan:rsabad

A generally relevant excerpt:

The writer ends with a discussion of alternatives to RSA. The writer says this:

> … the math behind ECC is so complicated that very few people feel confident enough to actually implement it. In other words, it intimidates people into using libraries built by cryptographers who know what they’re doing. RSA on the other hand is so simple that it can be (poorly) implemented in an hour.

In other words, the ECC encryption method is superior because it is significantly more complex than RSA. That is at the end of an article that talks about how hard RSA is to get right. This is not a compelling argument.


Wow, what a compelling rebuttal of the original article this is! </sarcasm>

> The writer mostly talks about common implementation errors. PGP has been using RSA for a very long time now. There is no real chance that there are any of those errors in the PGP code.

The opening sections of the article are about how choosing the primes and exponents in RSA is fraught with error. PGP doesn't prescribe any particular algorithm for these operations, so PGP implementations are potentially susceptible to these issues.

> The writer has a section on padding oracle attacks. Such attacks are not applicable to PGP simply because the encryption is only done once and there is no reverse channel.

Wow. Just wow. There are actually quite a few ways to get reverse channels out of email, some of which don't even require social engineering. The most trivial one is the 1×1 tracking pixel that phones home, which is pretty common in email for tracking purposes.

Sorry, this is the kind of rebuttal that proves the actual point of cryptographers: designing cryptosystems is fraught with error, and amateurs are likely to fall into traps they didn't know about. PGP is already decently well-known as being a bad cryptosystem; it's basically a message format that wraps Enc(x) and Sign(x) for your given cryptographic algorithm, and we've spent the last several decades learning that these primitives are not secure by themselves and do not naïvely compose.


>...PGP doesn't prescribe any particularly algorithm for these operations, so PGP implementations are potentially susceptible to these issues.

OpenPGP and its implementations get a fair bit of academic scrutiny. The published PGP standard refers to RFC 3447, which is PKCS #1 v2.1, for RSA. It is quite unlikely that there are any bonehead errors in any of the popular implementations.

>The most trivial one is the 1×1 tracking pixel that phones home, which is pretty common in email for tracking purposes.

I think it was clear from context that I did not mean that no reverse channels exist anywhere, only that none exist in a form suitable for oracle attacks. The HTML image thing is a straight-up leak that has actually been used to exfiltrate decrypted plaintext.

>PGP is already decently well-known as being a bad cryptosystem...

That isn't really true. It is actually one of the solidest cryptographic protocols ever.

>...do not naïvely compose.

What does that mean?


There are bonehead errors in PGP, which is not actually one of the solidest cryptographic protocols ever.


I think the last comment on the article is the most important for discussion.

— KEEP USING RSA!

This article is misleading in making it appear that RSA is not secure, when the only evidence presented is improper implementation.

Properly implemented RSA has been proven secure and unbreakable by the NSA with case studies such as Snowden, Lavabit, dark markets, and ECC is much harder to properly implement than RSA.

The NSA has been pushing ECC because their quantum chips can break it easily. D-Wave, Google, Alibaba, and others already have quantum chips. The disinformation agents claim that “quantum computers don’t exist” which is true because nobody uses a computer to break crypto, they use specialized custom chips.

All ECC (X25519-P521) will be broken by private sector quantum chips before RSA-2048 due to the physical limitations of stabilizing qubits.

The people making false claims against RSA are either being paid or they are useful idiots.


Wut. This is just false. D-Wave provides the "quantum computers" that others have and they're completely irrelevant when it comes to breaking ECC (or RSA, for that matter; ECC is broken by a variant of Shor's algorithm, which is used to break RSA). They're quantum annealing chips, not general-purpose quantum computers; they literally can't run Shor's algorithm, now or ever.

We're orders of magnitude out (qubit-wise, on real quantum computers) from breaking RSA or ECC in practice, and the difficulty level is on the same order for both.


I'm not convinced that D-Wave's hardware is relevant to anything beyond D-Wave's profits. It's neat that their technology works, I guess, but it isn't clear at this point that it does anything better than a classical computer could.


I'm actually a cryptography engineer. RSA encryption standards are hard to implement correctly. Frequently they contain padding oracles. There are safe ways to do RSA that people don't use in standards.

ECC eschews all that complexity. Writing a safe ECC implementation especially for Edwards or Montgomery curves is pretty simple.

PGP is moribund and has a lot of cruft.

RSA does some things that are really cool, like accumulators, Paillier, etc. But if you are doing them, you are doing black magic.


That comment goes off the rails once it starts talking about quantum computers breaking encryption (and only ECC, not RSA).

Yes, (real) Quantum computers can break ECC and RSA encryption. I have no idea why they think the NSA (and apparently Google as well) has secret Quantum Computers that are capable of breaking ECC, but not RSA.


Not to say I agree with that comment. But I think the case for why ECC would be easier to break than RSA using quantum computing is that ECC key sizes are much smaller, so breaking them would require fewer qubits.


You can break a 160-bit EC key with a 1000-qubit quantum machine. An equivalent 1024-bit RSA key needs about 2000 qubits. (The following paper has a cool table and formula for working out how many you need for each.)

https://arxiv.org/pdf/quant-ph/0301141.pdf

My incredulity is in the idea that Google and the NSA apparently have 1000-qubit machines that can do that, but not 2000-qubit ones (and that you should be very confident that will remain the case in the future).

Note that once you're in the "I have a stable 1000-qubit machine that can perform complex calculations" realm, you can consider doing things like entangling it with that other 1000-qubit machine you have lying around, bringing yourself rapidly into the 2000-qubit realm. Getting to the first part is really hard. If you can pull that off, you're probably only a few years away from the second. The main thing that's stopping us from doing that today is that the (physical) qubits we've made so far aren't stable enough; getting to a single logical qubit is anticipated to take 1000 physical ones.

https://en.wikipedia.org/wiki/Physical_and_logical_qubits


The few thousand qubits that you mentioned are logical qubits. With error correction, you need tens of millions of physical qubits to break RSA or ECC.

Google and the NSA can produce more of the same thing. But they probably have no major secret sauce, for things such as error correction, that the public does not have.


Curves are 2 to 3 times easier to break than RSA using Shor's algorithm on a hypothetical quantum computer, but that is not enough of a factor to really make a difference.



