EnvKey[1] moved from OpenPGP (RSA) to NaCl[2] for its v2, which recently launched.
It’s causing a difficult migration for our v1 users. Moving to a new encryption scheme is not fun for a product with client-side end-to-end encryption.
But within a year or so of releasing the v1, it seemed like the writing was on the wall for OpenPGP and RSA. I didn't want to go down with a dying standard.
NaCl is so much better. In spite of the migration headaches that will likely cost us some users, I'm very happy I made this decision. It's so much faster, lighter, and more intuitive.
It’s legitimately fun to work with, which I never thought I’d say about an encryption library after cutting my teeth on OpenPGP.
I'm not familiar with NaCl, but websites that don't seem to have been updated in 7 years (the version states 2016?) make me a little suspicious about the future viability of such projects. Perhaps it's not something that needs to be updated very frequently, but my knee-jerk reaction is that it looks abandoned, especially considering that they have an "upcoming features" section.
What made you choose it? Could PGP/GPG with ed25519 keys not have been sufficient? What makes NaCl "fun to work with"? For me, fun to work with would be Age [1] or Ring [2], with an elegant and well-designed API. I'm also aware that the older something is, the more likely it has undergone peer review and security audits, unlike new Rust crypto libraries.
It has a thick UI layer in React (which I wouldn’t want to write in anything else), and it’s crucial to be able to share types and functions across the whole stack, as it’s designed in a way where a lot is shared. That’s the main reason.
Sounds like the author would agree it's fine to use RSA, so long as you use an audited library with a well-designed API that makes it easy to do the right thing, and hard to do the wrong thing.
This makes me wonder, if we have an RSA library as good as libsodium, is ECC really a better choice than RSA?
I love libsodium and tend to choose it, but ECC seems far more mysterious to me than RSA. Curve25519 is much newer, has more parameters, and could potentially have a backdoor (like its precursor, P-256). It also has much smaller, fixed-size keys.
RSA by comparison is elegant and simple to understand, with only one parameter. It's been in wide use since the 1970s. You can choose the key size.
> This makes me wonder, if we have an RSA library as good as libsodium, is ECC really a better choice than RSA?
Yes, absolutely. Compare the key generation process, for example:
RSA: generate two large prime numbers around half the size of your key, then make sure they aren't too close to each other, or share one of the primes with another RSA key someone else generated, or have certain mathematical relations to each other, or...
ECC (specifically Curve25519): take 32 bytes of random data, set and clear a couple bits. Boom, done.
Performing operations with ECC keys is also significantly faster, and constant-time implementations are much easier to develop and verify.
Clear the highest bit, set the next highest bit, and clear the three lowest bits. This process is known as "key clamping", and ensures that the private key is in the right range and is not vulnerable to a couple of specific attacks.
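In code, that works out to something like this (a rough Python sketch of the bit-twiddling just described; the resulting `key` bytes are used as the Curve25519 private scalar):

```python
import os

key = bytearray(os.urandom(32))                # 32 bytes of random data

key[0] = key[0] & 0b11111000                   # clear the three lowest bits
key[31] = (key[31] & 0b0111111) | 0b1000000    # clear the highest bit, set the next highest bit
```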
That is radically simpler and not remotely comparable to the requirements for RSA key generation. Moreover, RSA key generation is massively slow -- enough to be irritating for users even on fast computers, so there is a lot of incentive to 'optimize' key generation and introduce complexity that results in bugs.
(do not use, I just implemented what the GP poster said, didn't check if what he said was correct, but it sounds right)
> Moreover, RSA key generation is massively slow--
Not only is it extremely slow, but it's also non-deterministically slow -- there's no constant-time way to randomly generate an RSA key, because searching for suitable primes is a "guess-and-check" process.
ECC key generation, on the other hand, is so fast that it's perfectly feasible to use it as part of the session negotiation process (e.g. in ECDHE).
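To give a sense of how cheap that is in practice, here is a sketch with libsodium's Python bindings (PyNaCl); the client/server names are just for illustration:

```python
from nacl.public import Box, PrivateKey

# Each side mints a brand-new Curve25519 key pair for this one session.
client_ephemeral = PrivateKey.generate()
server_ephemeral = PrivateKey.generate()

# Each side combines its own private key with the peer's public key
# (X25519 Diffie-Hellman) and arrives at the same shared secret.
client_box = Box(client_ephemeral, server_ephemeral.public_key)
server_box = Box(server_ephemeral, client_ephemeral.public_key)
assert client_box.shared_key() == server_box.shared_key()
```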
You confused me here; this looks like you're clearing the highest bit and then setting the highest bit... in a 7-bit integer. Of course it makes more sense that you're clearing the two highest bits and then setting the second-highest bit of an 8-bit integer, which is indeed the same thing as clearing the highest bit and then setting the second-highest bit.
But why have you written it to fiddle the second-highest bit twice? Why not just write `(key[31]&127)|64`?
(And then of course, why not use the hexadecimal form of your bitfield constants so people can tell what's going on? No need to count bits that way.)
Thank you for that self-analysis! I got lost on that part, so I will now stop writing ping 127.1 just to save four bytes. I've always felt that people learnt things by seeing that, but now I know I'm too lazy to bother learning myself.
This is not really fair. The correct way to mint RSA keys is actually pretty simple: a naive approach works fine - it's just that doing this is slow, and when people try to do something fast they keep making keys which fail the criteria you listed.
The tests you've advocated make sense if somebody else picked the keys and you're worried whether they did a good job. Some of these tests are mandatory for a Web PKI Certificate Authority checking RSA public keys, for example (the CA only has your public key, so it can't check everything), but if you are picking the keys you really can just use the naive approach. The odds of "accidentally" getting two factors that are unduly close or very smooth are negligible.
> ... are actually pretty high with some poorly conceived key generation algorithms.
To be fair, that's what they said:
> > a naive approach works fine - it's just that doing this is slow and when people try to do something fast they keep making keys which fail the criteria you listed.
The naive way to generate an RSA key is to pick a uniformly random 1024-bit (or per your security parameter) integer, test if it's prime, and if it's not, throw it out and pick a new, unrelated, 1024-bit integer, and repeat until you have two to multiply together. This will, probably, produce a secure RSA key - definitely upwards of half of the time, assuming your primality test has a sufficiently low false-positive rate.
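For concreteness, a sketch of that naive loop in Python (illustration only, not production key generation; the primality test is the textbook Miller-Rabin):

```python
import secrets

def is_probably_prime(n: int, rounds: int = 40) -> bool:
    """Miller-Rabin: false-positive probability at most 4**-rounds."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = 2 + secrets.randbelow(n - 3)        # random base in [2, n-2]
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def naive_prime(bits: int = 1024) -> int:
    """Draw fresh, unrelated random candidates until one is prime."""
    while True:
        # top bit forced set, so the candidate really is `bits` bits long
        candidate = secrets.randbits(bits - 1) | (1 << (bits - 1))
        if is_probably_prime(candidate):
            return candidate

p, q = naive_prime(), naive_prime()
n = p * q   # ~2048-bit RSA modulus
```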
It's just ('just') painfully slow, so people almost always optimize it. There are some optimizations that provably don't change the distribution of primes (the most obvious being setting the least significant bit so the number is never even), but history has shown that any optimization that doesn't exactly and exactingly preserve the distribution almost always turns out to introduce exploitable vulnerabilities.
Of course, even if you do it wrong it's still slower than barely-optimized ECC, and even if you do it right it still has vastly less security than an ECC key a quarter of its size.
>any optimization that doesn't exactly and exactingly preserve the distribution almost always turns out to introduce exploitable vulnerabilities
Not a crypto expert, but aren't there changes to the distribution that you actually want to make, like having a lower size limit for each of the factors?
> aren't there changes to the distribution that you actually want to make
Yes, but that isn't an optimization, it's a deliberate change in semantics.
> like having a lower size limit for each of the factors?
For some reason, an annoying amount of cryptography literature uses "1024-bit integer" to mean "an (arbitrary-precision) integer whose most significant set bit is in position 1023". I probably should have phrased that better to avoid confusion with "an integer stored in 1024 bits of memory".
“Seeming” “elegant” is not a good way of choosing cryptographic algorithms (and this argument is also unfair to the elegance of ECC).
RSA has been in use since the 1970s, and it has been repeatedly shown to be extremely easy to introduce seemingly minor issues that completely break the crypto.
Among the many issues is the belief that encryption is just P^^message%N. Which it is not - you must include padding, and doing that padding wrong also breaks the security.
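In practice, with a vetted library such as pyca/cryptography, the padding is an explicit construction you select rather than something you hand-roll (a minimal sketch):

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

ciphertext = private_key.public_key().encrypt(b"session key material", oaep)
assert private_key.decrypt(ciphertext, oaep) == b"session key material"
```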
In general, implementing RSA safely is very hard, and then you also get to the huge key sizes required for acceptable levels of security.
An encryption scheme seeming elegant or understandable is not a sign of strength. Neither is age, as basic ciphers like Am+B%N have been used for thousands of years, but are trivially breakable.
Furthermore, while RSA dates to the 70s, ECC still dates back to the mid-80s, IIRC.
As for RSA being easy to understand, I’m not sure why you think ECC is hard to understand - you follow the specified addition operation and you get the correct result. I would argue it’s even easier to intuitively understand than RSA, as the whole point is that you’re both adding secret numbers together in such a way that you end up with the same total. That is, unless you find the Chinese remainder theorem obvious?
Honestly my current preferred encryption scheme is McEliece. Basically you generate an error-correcting code that can correct some number t of flipped bits. The public key is your error-correcting code's generator matrix multiplied by a permutation matrix to shuffle the bits around. Encryption is done by multiplying the public key matrix by the message and then flipping t random bits. Decryption is simply inverting the permutation and performing the error correction.
As a bonus, in addition to being simple, McEliece has been around since the 70s, encryption and decryption are extremely fast, it has withstood huge amounts of research, and it isn’t broken by quantum computers. On the downside, the keys are impractically large :(
> the belief that encryption is just P^^message%N. Which it is not - you must include padding, and doing that padding wrong also breaks the security.
Actually, even that is wrong; you can't[0] securely pad a message; you have to use P^(K:=rand(N)), then use K (or preferably H(K)) as a symmetric encryption key to encrypt the actual message.
0: outside of a handful of special cases, of which `rand(N)` is the only example I'm reasonably confident in
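Concretely, the idea (sometimes called RSA-KEM) looks roughly like this in Python; `n` and `e` stand for the recipient's public modulus and exponent, and the symmetric encryption of the actual message is left out:

```python
import hashlib
import secrets

def encapsulate(n: int, e: int):
    """Pick a random K < n, apply the raw RSA operation to K itself,
    and derive the symmetric key as H(K). No message padding involved."""
    k = secrets.randbelow(n)
    c = pow(k, e, n)                                   # what actually goes on the wire
    key_bytes = k.to_bytes((n.bit_length() + 7) // 8, "big")
    sym_key = hashlib.sha256(key_bytes).digest()       # feed this to an AEAD cipher
    return c, sym_key

# The recipient recovers K as pow(c, d, n) with the private exponent d,
# then re-derives the same sym_key.
```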
> An encryption scheme seeming elegant or understandable is not a sign of strength. Neither is age, as basic ciphers like Am+B%N have been used for thousands of years, but are trivially breakable.
I don't think the argument is that "we've had RSA since the 70s". I think it's "we've had RSA since the 70s and we still can't break it".
The point is that we have broken it, over and over again. We’ve just recently had headlines about poorly chosen p and q leading to keys being broken. There have been issues with p and q being too small, too similar, etc. Then there are people not padding correctly.
RSA sounds easy, but in reality it isn’t. It is super easy to make minor mistakes that make your scheme trivially breakable.
ECDSA is a lot cheaper computationally for the same strength. Switching from RSA to ECDSA TLS certs can have a significant impact on the CPU usage of, say, a reverse proxy. (By extension, it costs less in terms of money and environmental impact.)
That being said, I do think that global monoculture and putting all eggs in one basket as a society is a bad idea. If you're willing to take the cost and prefer to use something you understand - by all means do so, as long as you're aware of the tradeoffs you're making.
With Curve25519, by contrast, DJB explains exactly what constraints were imposed (with very solid justifications for each of them) and then proves that Curve25519 is the unique solution to these constraints which minimizes the remaining free coefficient (which maximizes efficiency). NIST should appoint him to be their Czar or something.
NIST P-256 (and ECDSA in general, to some extent) is widely suspected to have been subverted by NSA, as outlined in the first link in octoberfranklin's response, page 16 of [1], and [2].
Only key size. And public exponent. And padding algorithm. And prime generation algorithm. Remember what happened with Infineon’s on-device generated keys?
Keep in mind how much stress you would be putting on only having one implementation. In practice, sooner or later someone will write another, and using a needlessly complex encryption scheme makes that implementation more likely to have bugs.
Comparing key size isn't really meaningful here because the main reason for moving away from RSA is that it needs way bigger keys to achieve the same security.
This is all true, but reads funny to me because I've implemented an intentionally vulnerable version of RSA and still had issues getting timing attacks to work on modern hardware (due to lack of sophistication in my approach, I think).
RSA is bad because developers do not implement it as specified, and ECC is good because most developers will not implement it themselves, since they do not understand ECC and therefore use libraries? IMHO a huge advantage of RSA over ECC is that it is easy to explain. After you have explained the different kinds of elliptic curves and their pitfalls, you still have to explain the integrated encryption scheme to actually use ECC for encryption. Granted, you want hybrid encryption with RSA too, but in theory you do not have to use it.
The article's argument is that because RSA is easy to explain, developers are more likely to roll their own and do it wrong. But that's mainly an argument against "roll your own", for those who aren't expert or aren't willing to take the time to learn all the pitfalls.
The highlight here is that in some cases, failure to properly validate gets an attacker the secret key material.
Note all the conditional bits. Different curves have different properties and different issues. There are a bunch of different curves in common use while RSA pretty much always uses the same value for the parameter these days (RSA literally has just one parameter. The exponent.).
The article discusses this; the difference is that ECC parameters can be chosen before library-development time by expert cryptographers, so all the developers have to do when actually developing the library is generate random bits. The obviously complex mathematics of EC intimidates non-experts away from trying to roll their own implementation and pushes them towards the most trusted expert implementation, while the false superficial simplicity of RSA seduces non-experts into greenspunning their own implementations or using a greenspunned implementation without too much thought.
>RSA literally has just one parameter. The exponent.
How so? The two most important parameters are p and q, which have so many caveats and constraints on them that you lose track of them by the midpoint of the article, and you have to generate them privately, so you can't offload this to non-affiliated cryptographers.
p and q are not parameters, they're the secret key. A "parameter" in this sense is a constant that defines the behavior of the algorithm that all users of the algorithm must agree on, otherwise they can't communicate.
In every RSA primer and cryptography reference that I'm aware of, p and q are considered parameters. Their nature as secret keys is merely a qualifier ("secret parameters").
>The obviously complex mathematics of EC intimidate the non-experts away from trying to roll their own implementation and push them towards the most trusted expert implementation...
My experience is that programmers are not so easily intimidated. If anything, complexity is an attractant...
Why would someone create their own parameters for deployment and not strictly research? There are plenty of curves studied by academic experts out there that aren't even NIST.
“RSA is bad because developers often don’t implement it correctly, leading to vulnerabilities. Instead, use ECC, which can also be implemented incorrectly, but developers tend to do this less.”
The article raises some good points, but it really explains why you shouldn’t use your own RSA or an unaudited third-party library. A good RSA implementation which has been audited by security experts and doesn’t take shortcuts for performance would alleviate the OP’s concerns.
Maybe the article should say "use the security library that has the best developer documentation, as that gives the best chances of it working correctly."
I'm not sure why, but documentation on crypto libraries tends to be noticeably worse than the documentation for any other library, pretty much assuming that the coder has already written their PhD thesis on implementing a cryptographically secure system and doesn't need the documentation to explain what anything is.
And thus you have an endless stream of products that screw up setting the IV, because there was literally no guidance anywhere in the library about how you should handle it. Even big companies are made up of individual people and not everybody has the time to take graduate level courses on every single thing they're building before they build it.
Having the library audited for correctness is of no help when the majority of problems arise from just using it wrong because the documentation was incomplete, vague, or even outright wrong/out of date.
What matters more is a library where the easiest thing to do is correct, and where it is really hard to make mistakes. Make the encrypt/decrypt functions the only public API, with as few args as possible. Manage IVs under the hood so the user can't accidentally reuse them.
That's the approach taken by NaCl, and arguably the only one worth considering.
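For example, in libsodium's Python bindings (PyNaCl), the symmetric API is essentially just encrypt/decrypt; the nonce is generated internally and carried with the ciphertext, so there is nothing for the caller to accidentally reuse:

```python
import nacl.utils
from nacl.secret import SecretBox

key = nacl.utils.random(SecretBox.KEY_SIZE)      # 32 random bytes
box = SecretBox(key)

ciphertext = box.encrypt(b"attack at dawn")      # a fresh random nonce is chosen and prepended
assert box.decrypt(ciphertext) == b"attack at dawn"
```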
The article doesn't say what your paraphrase says, though, and still less what your next sentence says. The various RSA implementations with serious problems it brings up weren't all 'own RSA or unaudited third-party library'.
So it sounds like the main pain point is improper implementation. Though the padding oracle attack is a convincing reason to use something else, since padding is necessary yet still opens up a different attack vector.
The article mentions various RSA implementations that have had problems. The other thing is, since they do audits and are telling you to avoid RSA, the advice obviously isn't 'a properly audited RSA is fine'. "it's actually ok to use RSA" is not a reasonable conclusion to draw from this piece.
I think the crux of the argument is summed up here:
> Developers could theoretically build an ECC implementation with terrible parameters and fail to check for things like invalid curve points, but they tend to not do this.
So, it's just about trusting developers to implement a different algorithm properly.
This article is a good list of implementation flaws with RSA, both in parameter selection and protocols.
However, I disagree with the recommendation to use ECIES. It uses separate MAC and encryption algorithms, an approach that is better served by AEAD algorithms these days.
E.g. for a fullstacker who spins up the latest Ubuntu LTS, then generates a 4,096-bit RSA key pair with the default openssh-server listening on a high-numbered TCP port: what should they be doing differently?
Just to say: given the complexity of the topic being discussed, this is one of the finest pieces of writing I have ever read. If you can express this issue clearly, you can express anything clearly, imo.
Isn't RSA the best asymmetric cryptosystem for when you do need that? I mean, I get that it should be avoided for signatures, key exchange and data encryption, but what else is there for encrypting symmetric keys, such as how S/MIME uses it, for example? I always thought RSA+OAEP with >= 4096 bits was an acceptable way to encrypt symmetric key material for transport.
I only know of PGP as the alternative which isn't well supported in many environments (especially commercial).
> Encryption needs to be done using a protocol called ECIES which combines an elliptic curve key exchange with a symmetric encryption algorithm.
kex+symmetric encryption is not the same as actually encrypting the symmetric key for transport. In situations where you need the recipient to decrypt it with only their private key, and the symmetric key must be anything other than an (EC)DH-derived key, this does not work.
Article is from 2016. It is a good article, but why does it say OAEP is notoriously difficult to implement? OAEP seemed very natural to me, and I "invented" it myself (long after it was well known to others, of course), i.e. the ideas in it weren't complicated. Am I missing something dumb?
I do remember hearing that Victor Shoup found some kind of bug in the security proof, but it wasn't something of practical concern.
I have a vague understanding that it is not so easy to encrypt data with ECC as it is with RSA. Is that true? This is one reason I still use RSA. What is the right way to use an ECC public key to encrypt data so only the holder of the private key can decrypt it? (Without any fancy key exchange - just fire and forget, email style)
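For what it's worth, the libsodium answer to exactly that fire-and-forget case is the "sealed box": an ephemeral key exchange is done for you under the hood, and only the recipient's long-term private key can decrypt. A PyNaCl sketch (the key generation here just stands in for the recipient's existing key pair):

```python
from nacl.public import PrivateKey, SealedBox

recipient_key = PrivateKey.generate()            # the recipient's long-term key pair

# Sender only needs the public key -- no interaction, email style.
sealed = SealedBox(recipient_key.public_key).encrypt(b"for your eyes only")

# Only the private-key holder can open it.
assert SealedBox(recipient_key).decrypt(sealed) == b"for your eyes only"
```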
meh. dude seems to be conflating "don't use RSA" with "don't roll your own crypto."
If we are to believe Scott Vanstone, ECC has security proofs that RSA doesn't. And I found Scott was a pretty trustworthy guy.
So the OP has a point. It's probably easier to mess up RSA than ECC. But it's not easy to not mess up ECC, so maybe the title should have been "for the love of god, don't roll your own crypto."
Maybe look at NTRU. It's supposedly quantum resistant, so that's a plus. But for the love of god maybe don't roll your own.
I contributed to two and a half commercial implementations of RSA and I still got Bob Baldwin to review my code. Bob was hip-deep in crypto research and knew how to avoid even obscure bugs.
I miss Scott and Bob.
Also... I'm using the term "crypto" to mean "cryptography", not the solution-in-search-of-a-problem that is cryptocurrency.
Someone still had to write it, and those people make mistakes too. "Don't roll your own" can run dangerously close to abstinence-only as a solution for teen pregnancy. Certainly there are cases where "don't reinvent the wheel" is useful advice, but there are a great many where it isn't.
Let this be a lesson to whoever comes next: cute and distinct names for your cryptographic algorithm make it easy to distance yourself from people who create similar algorithms that turn out not to be secure.
RSA is fundamentally a trick of modular arithmetic, but it's not called that, so if someone else fucks up their modular arithmetic, it doesn't immediately bring shame and doubt upon RSA.
Meanwhile the ECC community is using an algorithm that is basically 'Curve' plus a five digit number. If this name isn't already being openly mocked, it will be at some point in the future. Normal people don't have lists of 'good' and 'bad' numbers in their heads. If Curve#### is found to be bad, then they'll think Curve##### is the one everyone is talking about.
Is there any new concern that Curve25519 has been backdoored by the NSA? It looks like P-256 was, a long time ago, and reading the Wikipedia article doesn't give that impression; wanted to check though.
A backdoor in Curve25519 hasn’t really been a concern, because unlike P-256, the parameters for the curve didn’t come from NIST. Curve25519 is a djb (https://en.wikipedia.org/wiki/Daniel_J._Bernstein) special.
So unless djb was secretly working with the NSA and willing to risk his reputation to backdoor a highly scrutinized elliptic curve, the risk is low.
More importantly (because that's definitely something the NSA would try to do), curve25519 has fewer degrees of freedom to hide a backdoor in than P-256; 2^255-19 is the largest uint255 that's prime, and the other parameters (mostly the coefficient A=486662) were chosen by a similar "first value that satisfied the security requirements" process - there's a paper by DJB explaining the parameter selection rationale around somewhere[0], although they could definitely stand to be more conspicuous about it.
0: The value of A is (poorly) explained in passing in https://cr.yp.to/ecdh/curve25519-20060209.pdf under heading "Why this curve?", but that doesn't explain any details for someone who's not a cryptographer.
I believe that the thing is more that DJB wrote that huge paper about how to create a curve in such a way that everybody can know it's not backdoored, with advice that appears to be almost unanimously accepted, and created Curve25519 by following it.
It's not a matter of trusting him; instead, it's a matter of using the algorithm that requires the least trust of them all (including RSA).
There's no particular evidence that the NSA has backdoored P-256 or any other cipher in Suite B. Suite B is considered suitable for protecting compartmentalized classified information, so it would be actively dangerous for the NSA to attempt to backdoor it.
The closest thing to evidence of a backdoor is circumstantial: P-256 and the others use seeds that have never been fully explained. But this alone isn't particularly unusual: DES's S-boxes were similarly chosen opaquely, and we now know that the NSA did this to strengthen DES against the not-yet-public technique of differential cryptanalysis.
All that being said, Curve25519 (and Ed25519) is just better, and you should prefer it whenever you can[1].
There's an extremely interesting rebuttal that appears as a comment in the original article. I'm going to quote it below for the benefit of HN readers.
/QUOTE
Bob, February 28, 2020 at 12:15
KEEP USING RSA!
This article is misleading in making it appear that RSA is not secure, but the only evidence presented is improper implementation.
Properly implemented RSA has been proven secure and unbreakable by the NSA with case studies such as Snowden, Lavabit, dark markets, and ECC is much harder to properly implement than RSA.
The NSA has been pushing ECC because their quantum chips can break it easily. D-Wave, Google, Alibaba, and others already have quantum chips. The disinformation agents claim that “quantum computers don’t exist” which is true because nobody uses a computer to break crypto, they use specialized custom chips.
All ECC (X25519-P521) will be broken by private sector quantum chips before RSA-2048 due to the physical limitations of stabilizing qubits.
The people making false claims against RSA are either being paid or they are useful idiots.
The campaign against RSA is pretty evident to my pattern recognizer, for it comes in distinct waves and employs templated articles, which indicates an organized backing behind it. And it has some markers of psychological manipulation - first and foremost, when you honestly have a Y better than X, you don't yell at everyone imperatively to stop using X; instead you plausibly and intelligibly highlight all the advantages of Y and let the readers draw their own conclusions.
Clearly this is not the tactic used in the article in question and many more like it. One just doesn't promote better things by baselessly declaring all prior art inferior - not without some vested interest.
I'm not sure how Snowden or Lavabit represent a "case study" in favor of RSA over ECC. My recollection is that the federal government never cracked Lavabit's encryption, and that all interactions with the government came in the form of (sometimes gagged) court orders.
I don't think any of those "case studies" prove anything about whether or not the NSA can break RSA. But if they alone could break ECC and not RSA, they would certainly push for ECC.
Because if they were just worried that other groups could break RSA, then presumably they would be happy to provide a demonstration or show evidence for such attacks.
> Afaict RSA is simpler to crack with quantum computers than ECC.
RSA requires 2n qubits to crack, ECC requires 6n. Since the normal RSA key is 2048 bits, and the normal ECC key is 256 bits, RSA requires a 4096 qubit quantum computer, and ECC requires a 1536 qubit quantum computer. If you use 4096 bit RSA and 512 bit ECC keys, this becomes 8192 qubits and 3072 qubits respectively. I'm not aware of any ECC curves larger than 512 bits.
Ultimately, both are broken in a post-quantum world. However, in the interim-quantum world, where quantum computers exist but are noisy and unreliable, RSA is safer.
No, the discrete logarithm problem and the prime factoring problem are very similar and are both solved by Shor's algorithm. I suspect they might actually be equivalent, in a weird way.
You get RSA by default with GnuPG. Since the signature method embodies your PGP identity you want to pick the method with the widest implementation. RSA is supported by TLS 1.3 for roughly the same reason.
There are add-on standards for various curves available for PGP, should you want to mess around with them. GnuPG implements them all. Other implementations do not.
Note that the encryption issues associated with an offline compatible system such as OpenPGP are different than online connected systems like TLS. The article was mostly talking about the sort of issues that crop up with an online connected system.
1) Post needs a "2019".
2) Best comment inside was from Philip Zimmermann: "I agree. This is why I switched to El Gamal as the default algorithm for PGP version 5 in the late 1990s."
There is no joke? Stop implementing your own crypto. Vault has transit encryption out of the box, among other things.
"The transit secrets engine handles cryptographic functions on data in-transit. Vault doesn't store the data sent to the secrets engine. It can also be viewed as "cryptography as a service" or "encryption as a service". The transit secrets engine can also sign and verify data; generate hashes and HMACs of data; and act as a source of random bytes."[0]
Oh. I thought it was because there are at least a dozen ways to securely encrypt/decrypt data, most of which are audited.
Skimming their site they seem to offer some sort of encryption + service hosting? I don't see how this is much different than any of the other options out there. And not really an equivalent to using RSA as it looks to be tied to their hosting.
I also tend to not trust for profit companies with things like this (esp. if it's closed source or I can't know what the servers actually run).
Has this service been audited? Has it withstood against the US court system like veracrypt has multiple times? Do their founders have any history that goes against good data security?
Been trialing Vault as a CA for internal use, and so far everything seems to work great; setting it up with the documentation provided was quite easy. Furthermore, unlike some other FOSS developers, they also provide and support illumos (Solaris) binaries, for which I am truly grateful.
I was trying to generate a new SSH key with Vault. No dice. I needed a new cert for my SSL service. Vault let me down again. Could you help - what am I doing wrong?
>Then using HashiCorp Vault isn't the silver bullet
Except it practically is.
That's like saying Linux isn't a silver bullet because ls doesn't make filesystems... You wouldn't use Vault to create SSH keys, nor should you. You wouldn't use ls to make filesystems, nor should you.
Unless you're really good at auditing code, why wouldn't you want to use something stronger?
Memory-managed languages are generally preferred for a lot of software because it's much harder to introduce memory leaks accidentally than in something like C.
Yes, C is fine, but if your goal is no memory leaks, you could be doing a lot better.
I don't get why anyone would die on this hill over something as important as encryption. It's like using skeleton keys for your house door, because why would you be careless enough to not know when an intruder walks in?
1 - https://github.com/envkey/envkey
2 - https://nacl.cr.yp.to