GnuPG 2.1.0 “modern” released (gnupg.org)
197 points by bruo on Nov 6, 2014 | 66 comments



Congratulations Werner and GnuPG team!

> GnuPG now supports Elliptic Curve keys for public key encryption. This is defined in RFC-6637. Because there is no other mainstream OpenPGP implementation yet available which supports ECC, the use of such keys is still very limited. [1]

Google End-To-End [2] supports ECC by default. We are working on supporting Ed25519, but encrypting and signing with the NIST curves should work and be compatible with GnuPG. That said, we've had a lot of compat issues, so it'd be great if Hacker News readers and GnuPG users could help test our implementation against GnuPG. You can generate keys in GnuPG and import them into End-To-End and vice versa, or you can encrypt/sign messages with one implementation and decrypt/verify them with the other. If you find a bug or anything that doesn't work as expected, you can report it at https://code.google.com/p/end-to-end/wiki/Issues?tm=3. If it's a security bug, you're eligible for a monetary reward under our bug bounty program :-).

[1] https://gnupg.org/faq/whats-new-in-2.1.html#ecc [2] https://code.google.com/p/end-to-end/


Hopefully this is not too obvious, but what is the advantage of ECC crypto compared to what GnuPG 2.0 is using?


There are three big issues; in order of importance:

1. PGP's RSA constructions are archaic; they use a format defined in the 1990s that is vulnerable to multiple different attacks and likely to harbor more that we don't know about yet. (This, bafflingly, is also a problem with DNSSEC.) I should be clear: PGP is not itself known to be vulnerable to these attacks. But neither was Java's TLS implementation, before it was found to be vulnerable a few months back.

2. RSA is well-studied but it's hard to say how well we understand its strength. There are no credible attacks on RSA-2048, but academic progress is being made on a cousin of the factoring problem it relies on (the discrete log problem). ECC is based on a harder math problem, is also well studied, and is believed to be stronger.

3. ECC is faster and provides more security with fewer key bits.

A combination of all three of these factors gives a sort of second-order issue, which is that modern public key crypto constructions tend to be based on ECC and not multiplicative group IFP/DLP algorithms. EdDSA is good for reasons other than that it's based on good ECC crypto.

Hope that's helpful and not just noise. Looking forward to inevitable 'pbsd correction. :)


Speed. Key generation (and most other operations) in ECC is an order of magnitude faster than in RSA, DSA, or ElGamal. Last time I checked, it took End-To-End's JavaScript library seconds to generate an RSA key pair, but only milliseconds for ECC.
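To make the asymmetry concrete, here's a rough Python sketch (illustrative only; this is neither GnuPG's nor End-To-End's actual code): RSA key generation has to search through random candidates for large primes, while an X25519-style ECC secret key is essentially just 32 random bytes.

```python
import random

# Miller-Rabin probabilistic primality test.
def is_probable_prime(n, rng, rounds=40):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def gen_prime(bits, rng):
    # RSA keygen cost comes from here: on average, on the order of
    # bits/2 odd candidates are tested before a prime turns up.
    while True:
        n = rng.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(n, rng):
            return n

rng = random.Random(42)   # seeded so the demo is reproducible
p = gen_prime(512, rng)   # RSA-1024 needs two of these; RSA-4096 two at 2048 bits

# An X25519-style ECC secret key, by contrast, is just 32 random bytes:
ecc_secret = rng.getrandbits(256).to_bytes(32, "little")
```

The prime search dominates RSA keygen; the ECC "keygen" above is a single call to the RNG (plus one fixed-base scalar multiplication to derive the public key, not shown).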


Key gen time probably doesn't matter too much in the particular case of PGP, but key size is a big deal. Keys that you might conceivably type by hand. Keys, not fingerprints, that can fit in tweets. Or tattoos. :)
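To put numbers on that, a quick Python sketch (sizes are for raw key material only, ignoring OpenPGP packet framing): a 32-byte Ed25519 public key base64-encodes to 44 characters, well under tweet length, while an RSA-4096 modulus alone is 512 bytes.

```python
import base64

# Placeholder byte strings standing in for real key material
# (assumed sizes: 32 bytes for an Ed25519 public key, 512 bytes
# for an RSA-4096 modulus).
ed25519_pub = bytes(32)
rsa4096_modulus = bytes(512)

ed25519_b64 = base64.b64encode(ed25519_pub).decode()
rsa_b64 = base64.b64encode(rsa4096_modulus).decode()

print(len(ed25519_b64))  # 44 characters
print(len(rsa_b64))      # 684 characters
```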


Do you know of anyone with a signify (or reop) key tattoo yet?


Heh. No, I'm not trying to encourage that, but I think it's useful as a rough measure of practical size for exchanging data. Like "Olympic swimming pools" is a popular measure.


One particularly nice feature of the new version: gpg-agent no longer just stores the passphrase and hands it out to gnupg. Instead, gpg-agent actually holds the private keys and does crypto operations with them, and never lets any other process have the private keys or the passphrase (other than the pinentry program that prompts for the passphrase).

See https://gnupg.org/faq/whats-new-in-2.1.html#nosecring


Does that kinda imply that gpg-agent would be usable as a building block for programmatic access to gpg, i.e. a next-gen GPGME?


Yes, exactly; with 2.1 out, I expect future versions to start moving towards a usable "libgpg".


I suppose that doesn't apply to using smartcards, as the private key never leaves the card anyway.


Presumably this makes the API uniform: ask gpg-agent to sign/encrypt -- and the agent can delegate to a smart card, or do everything itself?


This is great news, but if you're life-or-death dependent on crypto (and PGP is one of the very few tools you might ever consider being life-or-death with), you might consider letting the modernized GnuPG (and particularly its ECC code) percolate a bit.


Looks like you know more about this than I do. Is the crypto code new to libgcrypt or just to GnuPG? Given your statement I assume it's the former.

---

Apparently libgcrypt hasn't been updated since late August[1]. Also relevant: GnuPG 2.1 has been in beta for over 4 years[2].

[1]: http://directory.fsf.org/wiki/Libgcrypt [2]: ftp://ftp.gnupg.org/gcrypt/gnupg/unstable/

---

Just for the record, I agree that under severe circumstances choosing the new ECC features might do more harm than good. However, given the context that somebody else asked whether it's a good idea to wait for a 2.1.1 release, I'd say no, use 2.1 right away. Interpreting version numbers like that (especially when the crypto routines have their own module anyway) seems overly suspicious and might do more harm than good as well.


It's gcrypt, but I have no idea who has looked carefully at that code.


Out of curiosity, and I apologize if this is radically off-topic, do you have a list of "usual suspects" for auditing crypto libraries (GPG, OTR, etc.)?

(Or is the answer simply: the modern-crypto list subscribers?)


Thai Duong, Juliano Rizzo, Kenny Patterson's team at Royal Holloway, Daniel Bernstein, Trevor Perrin, Nate Lawson, the Riscure guys, the Cryptography Research team at Rambus.


Are there any well-funded national intelligence organizations that do this and still make public recommendations? In theory the NSA, MI5, etc. should be advising their various governments and businesses on how to protect secrets (not just military ones, but also protecting against corporate espionage, or protecting journalists' foreign sources (think: Chinese dissidents)). Obviously the NSA has some very real trust issues -- but does anyone have an update on whether or not they've actually sacrificed large parts of their mission on the altar of total information awareness? Does Navy Intelligence still provide support for Tor?


Don't use PGP if in a life or death situation. Once someone steals the key you're going to be really screwed because all the emails you thought were so private are suddenly not private at all.


That's true in a practical sense of all encrypted messaging solutions, but the other ones have the added hazard of routinely being insecure even when you don't accidentally leak your key.


> For many people the NIST and also the Brainpool curves have an doubtful origin and thus the plan for GnuPG is to use Bernstein’s Curve 25519 as default. GnuPG 2.1.0 already comes with support for signing keys using the Ed25519 variant of this curve.

That's great to hear, but it seems like this represents a split from the OpenPGP standard, which requires NIST curves[1]. Will other OpenPGP implementations (e.g. OpenPGP.js), start to have to offer extensions to be compatible with GPG?

[1] - http://tools.ietf.org/html/rfc6637#section-4


Great question. The core crypto library in End-To-End has supported ECDH on Curve25519 and Ed25519 since day one [1]. We're discussing with the GnuPG team extending RFC 6637 to include these algorithms in the OpenPGP standard.

[1] https://code.google.com/p/end-to-end/source/browse/javascrip....


I spent the weekend digging into both End-to-End and OpenPGP.js, and it appears that End-to-End has the primitives for Ed25519 but doesn't recognize signature packets that use Ed25519. Is there an e2e bug tracking compatibility with GPG 2.1 signatures that use Ed25519?


Favorite easter egg: in https://gnupg.org/faq/whats-new-in-2.1.html the examples showing how modern gpg works includes the identities 'Glenn Greenwald', 'Laura Poitras', 'Daniel Ellsberg', and 'Edward Snowden'.


It's all well and good adding support for new algorithms and streamlining the UI. But still, access to the keyservers is done over plaintext[1], which could allow an attacker to modify your requests to or responses from the keyservers.

Am I correct in believing that leaving this unaddressed is a critical issue?

[1] "Support for keyserver access over TLS is currently not available but will be added with one of the next point releases. " -- https://gnupg.org/faq/whats-new-in-2.1.html


I don't believe that this is a critical issue. The PGP trust model doesn't require you to trust either the keyserver or the connection to it. You are supposed to look at the actual key, and at the actual signatures on the key, to decide whether you trust it.

Anyone can usually upload any key to the keyserver, so even if you use TLS that wouldn't make a difference from a security perspective.
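A sketch of what "look at the actual key" means mechanically. Per RFC 4880, a V4 key fingerprint is the SHA-1 of 0x99, a two-octet length, and the public-key packet body; if you check that fingerprint against a value obtained out of band, TLS on the keyserver link adds nothing, because any tampering changes the fingerprint. (The packet bytes below are dummy placeholders, not a real key.)

```python
import hashlib

def v4_fingerprint(key_packet_body: bytes) -> str:
    # RFC 4880 section 12.2: 0x99 || two-octet length || packet body
    data = b"\x99" + len(key_packet_body).to_bytes(2, "big") + key_packet_body
    return hashlib.sha1(data).hexdigest().upper()

packet = b"\x04" + bytes(16)   # dummy stand-in for a public-key packet body

good = v4_fingerprint(packet)
tampered = v4_fingerprint(packet[:-1] + b"\x01")

# Whether the packet arrived over TLS or plaintext is irrelevant here:
# a modified key yields a different fingerprint.
assert good != tampered
```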


  - Commands to create and sign keys from the command line without any
    extra prompts are now available.
=> thank god, at last


This should allow interoperability with Google's End-to-End PGP extension for Chrome/Gmail. I hope this ends up in package managers quickly.


ECC support?? FINALLY


What happened to the patent issues around ECC? I thought Blackberry/Certicom were holding that tech pretty close.


D.J. Bernstein stated that he wasn't aware of any patents that applied to Curve25519[1]. This leads to the following statement in the GnuPG v2.1 FAQ[2]:

>For many people the NIST and also the Brainpool curves have an doubtful origin and thus the plan for GnuPG is to use Bernstein’s Curve 25519 as default.

[1] http://cr.yp.to/ecdh/patents.html

[2] https://gnupg.org/faq/whats-new-in-2.1.html


A very large portion of the patents related to ECC are optimizations for characteristic-2 curves (interesting for smart cards, perhaps, but not used here), and/or are expiring right around now.

The non-Ed25519 ECC techniques in the OpenPGP standard have well-established, too-old-to-be-patentable prior art: https://tools.ietf.org/html/rfc6090


The Schnorr signature patent that expired in 2008? The one-bit-sign point compression patent that expired earlier this year? The algorithms in RFC6090 which would be too old for patents anyway?

While there were (and are) some specific implementation techniques for efficient ECC covered by patents, many of them have now expired, and many others have better patent-free alternatives anyway.


Anyone have any guidance on building this on Ubuntu Trusty?


    # wget http://GPG/source/gpg-blah-tar.gz
    # wget http://GPG/source/gpg-blah-tar.gz.asc
    # gpg gpg-blah-tar.gz.asc
    # apt-get build-dep gnupg2
    # apt-get install devscripts ubuntu-dev-tools
    # apt-get source gnupg2
    # cd <The gnupg2 version directory something like gnupg2-2.0.26 >
    # uupdate ../gnupg-2.1.0-blah-tar.gz
    # cd <the directory uupdate tells you to cd into>
    # dpkg-buildpackage -uc -us -nc 
    # dpkg -i ../gnupg2*.deb


Please, post the same for OS X (Yosemite)! I tried and it seems 'modern' requires extra libraries not present.


Here is the build process that worked for me:

    sudo apt-get install pinentry-curses

Add "pinentry-program /usr/bin/pinentry-curses" to ~/.gnupg/gpg-agent.conf, then:

    cd
    wget ftp://ftp.gnupg.org/gcrypt/gnupg/gnupg-2.1.0.tar.bz2
    wget ftp://ftp.gnupg.org/gcrypt/gnupg/gnupg-2.1.0.tar.bz2.sig
    gpg --verify gnupg-2.1.0.tar.bz2.sig
    tar -xvf gnupg-2.1.0.tar.bz2
    cd gnupg-2.1.0
    make -f build-aux/speedo.mk native
    export LD_LIBRARY_PATH="/home/[user]/gnupg-2.1.0/PLAY/inst/lib"
    cd PLAY/inst/bin
    ./gpg2


It will end up in your package manager eventually.


I doubt Trusty will get the upstream release in its repositories.


Well, not knowing what "Trusty" is, I'm sure it'll filter down eventually unless Trusty is no longer supported.


Great news and congrats to the GnuPG team. A practical near-term consideration: there are plenty of installs of GPG v1 still in the wild --- people who never even upgraded to 2.0. It might not be a good idea to rush to ECC keys if you want to interoperate with these legacy installs. But for the long term, this is a great development.


What's a good, cheap smartcard to use with GPG 2.1? Ideally it should support ECC as well as RSA (4096 bits).


Second that question. In fact, I'd be happy for a not-so-cheap one (up to $100). Do ECC smartcards compatible with GPG2.1 exist?


Another large codebase in C that deals with security. How thoroughly has it been audited and tested for bugs? Should we soon expect a new "Heartbleed"? Why or why not?


If you wrote it in a language you consider "more secure", you'd open yourself to timing attacks and lose portability.


How would you lose portability by moving to a high-level language?

Further, why would timing attacks necessarily be more or less of an issue in a higher-level language than C? This is a genuine question.

I feel like in any case, there are some things that should just not be done in C. For example, the base64 encoder/decoder that gpg uses for ASCII-armoring should not be written in C; in general any parser should not be written in C; there's no reason for the code that interfaces with the user (for example, checking a password given to gpg-agent) to be written in C. You could write these in OCaml, use the FFI to natively link them into a single executable, and you'd have safety for string handling while retaining fine-grained control of the hardware during crypto.

However, I'm skeptical that you literally cannot use OCaml to implement crypto that's secure against timing attacks. You can always fall back to C for primitive operations, and you can control the garbage collector enough that gc pauses wouldn't be an issue.

IMO, the most compelling reason not to do this is that for all its fatal flaws, C is the most widely-used systems programming language, and so it has the largest share of experts ready to review code. But this doesn't seem like a good enough reason, given the severity of memory corruption bugs.
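For what it's worth, the classic timing pitfall shows up in high-level languages too: an ordinary equality comparison on secret-dependent bytes can return early at the first mismatch, leaking how much of a secret (e.g. a MAC tag) an attacker has guessed. Python's standard library even ships a constant-time comparison for exactly this reason:

```python
import hmac

def naive_check(expected: bytes, provided: bytes) -> bool:
    # == on bytes may short-circuit at the first differing byte,
    # so the comparison time depends on the secret: a timing leak.
    return expected == provided

def constant_time_check(expected: bytes, provided: bytes) -> bool:
    # hmac.compare_digest examines every byte regardless of where
    # (or whether) the inputs differ.
    return hmac.compare_digest(expected, provided)
```

Both functions return the same boolean; only their timing behavior differs, which is the whole point.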


The way you lose portability by moving to a high level language is that you don't have an implementation of that high level language for as many platforms. You're not losing abstract portability in the sense that the semantics of the high level code is very highly machine-independent. But an actual port requires an actual toolchain.

C is not inherently portable, and it takes a lot of know how, care, testing, and sometimes luck, to write C that is portable.

C is de facto portable in the sense "able to be actually ported" by having the tools --- but possibly at the cost of combing the code for nonportabilities that have to be uprooted.


Typically gpg is used for asynchronous communication. I.e., I encrypt or sign something and send the file to someone else, who decrypts or verifies it at their leisure. So how can timing attacks be used?


I send you an email. Your mail client pipes it through gpg upon receipt. Meanwhile, I have a microphone pointed at your computer from across the room listening to your capacitors humming. Now I have the key you used to decrypt the message.


Nice try, but I still use vacuum tubes


While using C automatically shields you from timing attacks and buys you portability? Right. GnuPG is not even supported on 64-bit Windows yet. (I'm not saying that there's a need to support 64-bit Windows; I'm just demonstrating that it's not portable. It's not portable because the codebase assumes some flavor of Unix and because C is underspecified: long is 64-bit on 64-bit Linux platforms, 32-bit on 64-bit Windows. It's a C problem, not a Windows problem.)
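The long-width point is visible even from Python's ctypes, and the standard C fix is the fixed-width types from <stdint.h> (sketch; the sizes contrasted are those of LP64 Unix vs LLP64 Windows):

```python
import ctypes

# C's `long` is underspecified: 8 bytes on 64-bit Linux (LP64),
# 4 bytes on 64-bit Windows (LLP64). Code assuming either width
# silently breaks when ported.
print(ctypes.sizeof(ctypes.c_long))  # 8 on 64-bit Linux, 4 on 64-bit Windows

# Fixed-width types are the same size everywhere, which is why
# portable C uses int64_t/uint32_t instead of long.
assert ctypes.sizeof(ctypes.c_int64) == 8
assert ctypes.sizeof(ctypes.c_uint32) == 4
```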


How do Ada (or even SPARK) open opportunities for timing attacks? How are they less portable than any other gcc frontend (modulo the quite tiny runtime)?


Could all the downvoters explain why this isn't or shouldn't be a genuine concern? I know that GnuPG is not a network server, but there's a still lot of potential for various exploits in format parsers, etc.


When Rust becomes stable, this would probably be a good candidate.


Seeing as this is crypto software, is it a good idea to wait for a 2.1.1 bugfix release?


Well, this announcement refers to a beta version. Draw your own conclusions.


The previous announcement referred to a beta; this is the final release.


With each new release I feel kind of nostalgic. I have been using this since 2001.


Excellent news! Ed25519 signatures ftw :D


Pfft. Not Ed448-Goldilocks? I'd rather not use a long term identity key with less security than the 4096-bit RSA that people are already widely using for long term keys.

Or really, where's the SPHINCS+Ed25519 option for PGP master keys? :P


I take your point, but honestly I don't think at this time that Curve25519 (or properly implemented RSA-3072) will fall cryptologically in any reasonable timeframe without either a quantum computer or a really fundamental development in factoring/discrete logs that would probably threaten RSA-4096 roughly as badly.

Ed448-Goldilocks is great, and may indeed be a good contender for certificates as it seems to represent a good inflexion point of security versus performance.

The Crypto Forum Research Group at the IRTF is currently considering recommendations to make to the TLS Working Group (and perhaps to be used more widely) regarding elliptic curves: I've been a, um, vocal participant there already. I've raised this development there, as it may spur the discussion forward.

Goldilocks is one of the potential curves on the table there. There's potential enthusiasm for recommending two curves: one strong, fast curve with ≈128-bit workfactor (quite likely Curve25519 in my opinion) and one super-strength curve, at least 192-bit workfactor (possible candidates include MS Research "NUMS" P384/P512, Curve41417, Ed448-Goldilocks, and E-521).

The option of generating entirely new Brainpool-style random curves isn't off the table either, and some have been asking for that, but I'm really not sold: the software performance of random curves is absolutely dreadful; the implementation can be the kind of hairball we want to avoid; and those asking seem to be asking because they're heavily committed to hardware with aged generic (RSA-targeted) hardware multipliers whose anti-side-channel blinding is no longer great and doesn't work right over special primes. And besides, if we wanted Brainpool, we could just use Brainpool. (Brainpool is indeed in this GnuPG release if you want it.)

If I was not performance-sensitive and wanted the highest level of security I could reasonably get from curves, maybe I'd choose E-521, which has impressive rigidity, having been independently discovered and confirmed by multiple research teams, and good performance for its class due to being defined over a Mersenne prime. But you do pay quite a bit of performance to go from the ≈224-bit to the ≈250-bit+ range, and it's not clear to me that's particularly meaningful in all cases. (Of course, maybe PGP is one of those cases, and performance isn't usually a big deal there unless it's really bad.)

Regarding SPHINCS, 41KB signatures are remarkable for what they are, but not practically ideal for OpenPGP, especially with the way the keyservers are now, but you know, I could maybe live with it. I could see it start to get ugly for keys with a lot of verifying counter-signatures on them, however.

Do remember OpenPGP doesn't bring everything to the table: for example, forward security. And MUAs have a reputation for awful UX regarding it - operator error is extraordinarily frequent. But it seems to at least be Pretty Good in practice. Thanks Werner & team!


I'm kinda bummed that all the considered sexy curves are cofactor != 1; beyond losing a bit of rho security, it makes implementations more complex and complicates derivation schemes. (I think it also shows there is more flexibility in supposedly rigid schemes than people give credit for; IIRC Brainpool _required_ cofactor 1, but a speedup was suggested (a Montgomery ladder instead of a slower constant-time addition law) and, tada, that requirement was relaxed in future curve specs.) This is especially acute for PGP, where performance on this sort of scale is basically a non-issue: practically no one needs to verify tens of thousands of PGP signatures per second.

Indeed, PGP doesn't bring everything to the table (you fail to mention non-repudiation: the fact that I can't authenticate a PGP message without producing a transferable signature is a pretty big flaw for general correspondence)... but, considering the limitations and usability concerns, it appears to me to be most suited to, and most widely used for, higher-security applications like authenticating software distributions with long-term keys, which effectively all the rest of our computer security ends up resting on. Other things end up being protected by OTR, TLS... or not at all. :( We effectively have no widely deployed "high security" option for long-term person-to-person encryption and authentication except PGP.


This is incorrect. Cofactor 1 does not give you a complete addition law, nor did the original specs forbid cofactor >1 on security grounds. The Montgomery ladder doesn't have certain kinds of weaknesses common in naive implementations of Weierstrass-form addition. Clearing the cofactor is easier. I can fit a Montgomery ladder in 10 lines, and can't do that with a secure Weierstrass implementation.
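For illustration, here is an RFC 7748-style X25519 scalar multiplication with the Montgomery ladder spelled out in Python. The ladder core really is about ten lines; note that this sketch branches on secret bits and uses variable-time bignum arithmetic, so it demonstrates the math only and must not be used where timing matters.

```python
P = 2**255 - 19   # field prime
A24 = 121665      # (486662 - 2) / 4 for Curve25519

def x25519(k_bytes: bytes, u_bytes: bytes) -> bytes:
    # Decode and clamp the scalar (RFC 7748, section 5).
    k = int.from_bytes(k_bytes, "little")
    k &= ~7                    # clear the 3 low bits
    k &= (1 << 254) - 1        # clear bits 254-255
    k |= 1 << 254              # set bit 254
    x1 = int.from_bytes(u_bytes, "little") & ((1 << 255) - 1)
    x2, z2, x3, z3, swap = 1, 0, x1, 1, 0
    for t in reversed(range(255)):
        kt = (k >> t) & 1
        swap ^= kt
        if swap:               # NOTE: secret-dependent branch, not constant-time
            x2, x3, z2, z3 = x3, x2, z3, z2
        swap = kt
        # One ladder step: combined differential double-and-add.
        a, b = (x2 + z2) % P, (x2 - z2) % P
        aa, bb = a * a % P, b * b % P
        e = (aa - bb) % P
        c, d = (x3 + z3) % P, (x3 - z3) % P
        da, cb = d * a % P, c * b % P
        x3, z3 = (da + cb) ** 2 % P, x1 * (da - cb) ** 2 % P
        x2, z2 = aa * bb % P, e * (aa + A24 * e) % P
    if swap:
        x2, z2 = x3, z3
    # Return x2/z2 in affine coordinates, encoded little-endian.
    return (x2 * pow(z2, P - 2, P) % P).to_bytes(32, "little")
```

This matches the test vectors in RFC 7748 section 5.2, which is a reasonable sanity check for any reimplementation.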


GnuPG scares me. In all kinds of ways.

And Werner Koch must be some kind of demi-god.


I'm curious, why does GnuPG scare you?


I feel quite the opposite, Tomte: a world without GPG would mean that the bad guys who want to invade innocent people's privacy have an all-access pass to our inboxes. That world would be horrific.



