If you haven't seen it, there's a pretty neat video that outlines how public-key cryptography works using paint colors [1]. It's focused on Diffie-Hellman key exchange, but it's a great intro for beginners to be able to visualize the concepts, and I think helps provide some context before jumping right into the GPG command line.
While it's true that you can't derive the input colours given only the output, in the case of DH both the output and the yellow (common) colour are available to Eve, and with paint that would make it trivial to recover the other input.
The actual property a one-way function needs to be suitable for DH is to remain hard to reverse even when both the output and one of the inputs are known.
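For anyone who wants to see that property in code rather than paint, here's a minimal Python sketch of textbook finite-field DH with toy numbers (real deployments use primes of 2048+ bits, or elliptic curves):

    # Toy Diffie-Hellman over integers mod p (illustrative only).
    p, g = 23, 5                 # the public "common colour": everyone, including Eve, sees these

    a = 6                        # Alice's secret
    b = 15                       # Bob's secret

    A = pow(g, a, p)             # Alice's public value, g^a mod p  -> 8
    B = pow(g, b, p)             # Bob's public value,   g^b mod p  -> 19

    # Eve observes p, g, A and B, yet computing the shared secret still
    # requires a or b, i.e. solving a discrete logarithm.
    shared_alice = pow(B, a, p)  # (g^b)^a mod p -> 2
    shared_bob   = pow(A, b, p)  # (g^a)^b mod p -> 2
    assert shared_alice == shared_bob

With numbers this small Eve can of course brute-force the exponent; all the security comes from the size of the real parameters.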
This is a "beginners" article, and it's still too damn complicated for most people. I don't think I could send this article to my family, for instance, or even try to explain it to them without their eyes glazing over.
I'm not sure if the solution is to "hide" the confusing bits in the way that Signal/Enigmail/Mailvelope do, or if we need to scrap the whole concept and try again.
Why would the solution not be hiding the complexity? TLS "hides" the complexity. It works fine, and it's ubiquitous.
The "solution" to E2E encrypted email has been known for two decades: there should be a system for publicly announcing public keys and privately sharing private keys between trusted devices. It's not rocket science: see Telegram, Signal, Whatsapp, iMessage, etc.
But what those services have that email does not is a commercial incentive to make it all work seamlessly. Email is stuck with a system of half-implemented RFCs and patches for a system designed in the 80s with zero security considerations.
I feel like the only way we're ever going to make progress is if regulators require it. Have the EU mandate that email services with more than 10m customers need to have transparent E2E. Gmail, Outlook, and Yahoo will figure it out about three months after that law is passed.
Could someone explain to me the lore that email can't be made secure? It seems to me that all the steps described in this article could be automated behind the scenes by a good local email client.
* Email defaults and fails open to plaintext. Modern secure messengers only transmit securely. Even modern secure websites take pains, including extensions that had to be made to the HTTP protocol (such as HSTS), to prevent traffic from being sent unencrypted. No such mechanism exists in email.
* Email reveals large amounts of metadata, including message content (which is what a Subject: line is), which is a characteristic shared by no secure messaging system. The graph of who talks to whom and when is immensely important, and the best secure messengers (Signal, in particular) go to great lengths to obscure it.
* Modern secure messengers are forward-secure, so that a compromised key doesn't allow attackers to retroactively decrypt old messages (there's a minimal sketch of the idea just below this list). Since messaging traffic between selected peers is almost always low-volume, any reasonable threat model assumes every piece of ciphertext exchanged between peers has been archived by adversaries. With PGP, a key compromise at any point in time reveals the contents of every one of those messages. The best secure messaging systems are not only forward secure but also have a notion of "future secrecy" allowing them to recover from ephemeral key loss.
* Email clients are inhospitable to cryptography; in particular, many of the key features of clients (like search) rely on having a local archive of plaintext. The UX in email clients sets users up for catastrophic failure, like accidentally top-posting an unencrypted reply to an encrypted message.
All four of these problems are, in isolation, total dealbreakers for serious secure messaging.
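To make the forward-secrecy bullet concrete, here's a rough Python sketch of the simplest possible symmetric ratchet (nothing like Signal's actual Double Ratchet, which also mixes in fresh DH exchanges): each message key is derived from a chain key that is immediately replaced and forgotten.

    import hashlib

    def ratchet(chain_key: bytes):
        """Derive a one-time message key, then advance (and forget) the chain key."""
        message_key = hashlib.sha256(chain_key + b"\x01").digest()
        next_chain = hashlib.sha256(chain_key + b"\x02").digest()
        return message_key, next_chain

    chain = b"initial secret from some key agreement"
    for i in range(3):
        mk, chain = ratchet(chain)   # the old chain key is overwritten here
        # ... encrypt message i with mk, then discard mk as well ...

    # An attacker who later steals `chain` can't recompute earlier message keys,
    # because that would mean running SHA-256 backwards. A long-term PGP key has
    # no such step: compromise it once and every archived message opens.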
SMTP email is janky and outmoded and it's a little amazing that we all use it as much as we do; it's long overdue for a replacement. If you need to communicate securely with someone else, you should do so with a secure messenger like Signal, not with PGP and email.
I agree, email sucks. However it seems like a reasonable goal to improve it.
Signal started out with SMS, which was quickly abandoned. Your target would basically get an SMS message saying something like "someone is sending you an encrypted message, please install Signal to read it".
Seems like an IMAP proxy could do something similar. Use a DHT to track who has a public key. Send anyone who doesn't a "please install this plugin/proxy/whatever to read this message". Have Signal-like autogeneration of keys, tied to an email address instead of Signal's phone number.
Could even have a service like <username>@whisper.com to allow people to use their favorite email client (like Thunderbird) to send/receive email and leak less metadata. The IMAP proxy would use a Signal-like transport for anything matching @whisper.com, roughly as sketched below.
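A very hand-wavy Python sketch of that routing decision; every name in it (lookup_in_dht, send_via_secure_transport, send_invite, the @whisper.com convention) is hypothetical, just labels for the moving parts:

    # Hypothetical routing logic for the imagined proxy; none of these helpers exist yet.
    SECURE_DOMAIN = "whisper.com"

    def route_outgoing(message, recipient):
        if recipient.endswith("@" + SECURE_DOMAIN):
            # Signal-like channel between proxies: encrypted, and less metadata leaked
            send_via_secure_transport(message, recipient)
            return
        recipient_key = lookup_in_dht(recipient)          # has this address published a key?
        if recipient_key is not None:
            send_encrypted_smtp(message, recipient, recipient_key)
        else:
            send_invite(recipient)                        # "install the plugin/proxy to read this"
            queue_until_key_appears(message, recipient)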
I seem to recall a project like this being mentioned, even a Thunderbird plugin or some such.
Not as secure as Signal, but better than today's default.
It's hard to understand why we'd spend effort pushing for adoption of a Rube Goldberg machine that can only hope to approach the security of Signal when we could instead just have people adopt Signal.
Signal needs quite a bit of work to be as useful as email.
No choice of clients, a required cell number, no search, and no threads.
Hard to imagine someone who uses email heavily, with a large volume of mail, trying to cram that much information into Signal and then being able to find it again years later.
The vast, overwhelming majority of email users --- who also encompass the vast, overwhelming majority of users with serious privacy needs, contrary to the opinion of "privacy nerds" --- use whichever email client they were first introduced to, never change clients, and don't use any of the advanced features in those clients. Those are the users who are better served by Signal than by some janky attempt to make email, perhaps under the best circumstances, 60% as secure as Signal.
So why is it that people like Assange and Snowden, whose lives arguably depend on these matters, choose PGP over anything else?
I'm no expert in security, but my intuition is that the likelihood that my signed Ubuntu plus GPG leaks secrets is significantly lower than that of my who-knows-what-the-hell-it's-doing Android with Signal, right? You gotta agree with this at least.
Because contrary to their public images neither is in fact a subject matter expert on these topics.
Snowden in particular did much dumber things than using PGP: he used Lavabit, a "secure email provider" with an architecture any expert could have told you from 10 miles away to avoid, and some amount of the Snowden/Poitras/Greenwald coordination occurred over CryptoCat, which was grievously broken in several different ways --- it is almost certain that every message exchanged between those parties on CryptoCat has been recovered by SIGINT agencies.
People wonder why crypto engineers on message boards are so shrill about this stuff. This is why.
If anyone wants to tear out excess hair, watch Citizenfour's scenes with Snowden and the rest on their computers. The main argument against the omniscience of the intelligence agencies is that they didn't walk into that hotel room on the first day.
Besides technical issues, watching Snowden try to get Poitras and especially Greenwald to adopt security based on operational discipline and CLI *nix tools is both hair-raising (easier to tear out that way) and laughable. Clearly he never had to work with end users. IIRC, in his initial instructions to Poitras he tells her 'if you have trouble setting up that encrypted connection, just ssh to the ip address and use the tool of your choice.' (If you don't understand why that is laughable, you haven't worked with end-users either.) Greenwald seems completely at a loss - 'I made my password (to the data on which Snowden's life and the whole news story depends) simple because I have to use it all the time - is that ok?' Sure, why not at this point?
A good indication that someone is not an expert is them telling ordinary users to avoid mobile operating systems in favor of general-purpose desktop operating systems.
Experts I know recommend iOS above any other consumer platform (though that might have something to do with Apple's hardware and of course you can't use iOS without Apple hardware).
So the experts you know recommend a closed-source OS? They recommend something that no one has any idea what's inside above ANY OTHER consumer platform? You're joking, right? Can you put me in touch with these experts? I'd like to ask them about it directly.
I don't want to speak for him but I suspect for communication purposes 'tptacek prefers iOS to the FOSS alternatives.
When talking with the security folks I know, this is not a rare opinion. You can't trust the source of any comms platform, so you have to reverse engineer the binaries. Once you get into that game, the questions that matter more than open source are: do you trust the team? Is the protocol open and vetted? Does it rely on 'odd' implementation details?
The idea that the software license of the OS your comms platform runs on is important to its security characteristics does not seem to be something the experts I know believe. Quite the opposite: it seems to be a signal of people who don't do this sort of research for a living.
Personally, I strongly prefer FOSS for many reasons. IME, real security experts are not so focused on that issue. Saying that nobody knows what is inside is a large exaggeration and an over-emphasis on one factor.
This is a great article for beginners but I must take issue with the fact that gpg 2.1 has been released for a year now and this article still recommends non-elliptic keys. When will we switch?
You can certainly get the same number of "bits" of security with EC as with RSA (and with far smaller keys) if you choose the right parameters. There are also EC implementations (Ed25519 being an excellent example) that are incredibly simple, which makes them easier to audit and reason about.
Also, that simplicity makes it practical to implement some EC crypto in constant time, so there are no timing side channels.
The main reason not to use EC would be spottier hardware support (smart cards, keys, etc.) and potentially less software support in the wild (people could be stuck on old versions of GPG, for example).
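To give a feel for how small the Ed25519 surface is, here's a sketch using the Python cryptography package (this shows the bare primitive, not how GPG wraps it in keyrings and packets; having that package installed is assumed):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    private_key = Ed25519PrivateKey.generate()   # a 32-byte key: no prime generation, no parameters to choose
    public_key = private_key.public_key()

    signature = private_key.sign(b"an email worth authenticating")

    try:
        public_key.verify(signature, b"an email worth authenticating")
        print("signature ok")
    except InvalidSignature:
        print("signature invalid")

Compare that with RSA, where key generation, padding choice, and parameter sizes are all places to get it wrong.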
It is not practical until my non-programmer neighbor can use it. Or my uncle. Or my friends. The people I communicate with most frequently, whose privacy is most important to me.
[1] https://www.youtube.com/watch?v=YEBfamv-_do&t=2m25s