>"Consider your average investigative journalist or whistleblower, with windows or a mac, that they haven't updated because then their kids favorite game doesn't run anymore or they simply don't want windows 10.
....
This makes forward secrecy a mandatory requirement, as this implies that the malware has to be constantly active and thus also enhances chances of detection and mitigation."
This is a bit of a straw-man argument. Forward secrecy or not, if you can get root on the client device, you own everything. So if you are a journalist/whistleblower and have invested the effort to learn PGP, you should use Tails or something more appropriate for your job than Windows or a Mac.
Edit: This may be a good use case for hardware support for trusted execution (Intel SGX), along with all the other nasty features that it brings (DRM). The threat model for trusted execution is that the OS cannot be trusted whereas the app is sacrosanct.
This attitude of "either you really need the security, so make do with whatever inconvenient method is necessary, or just forget about it" is what keeps secure methods out of the mainstream, and thereby from widespread usage. This in turn means that anyone actually using those methods is immediately suspicious. Carrying multiple devices at the airport, or is there a Tails on that USB stick? Let's interrogate!
Also the notion that only journalists/whistleblowers by today's definition need this level of security is wrong. Your "normal", politically active person of today may be the whistleblower of tomorrow. E.g. after a regime change, or just by voting some crazy people into government. I'm living in a country where bloggers are sometimes arrested because of some alleged nonsense. But even in the West, it's nowadays all too easy to just be labelled "terrorist".
So the only sensible way forward is to popularize secure communication methods so that they are normally used by everyday people (including "professionals") with their everyday systems. This increased user base would likely increase the demand to close the existing holes and insecurities of the rest of the system, thereby creating a market force in this direction. And for this purpose, I agree with the author that PGP is not the way forward.
I don't have an answer to all your questions, but there will always be a fundamental conflict of interest between privacy & security (of the end user) and market forces. You can hope for more usable systems that are also secure. If you are put off by Tails or usb sticks and different devices, look at QubesOs. They also have a good approach to security, but even they are having a hard time finding hardware that meets their requirements. In the end, you, as an end-user, will have to pick your trade-offs.
My point is that as long as I have to choose non-mainstream OSs for secure communication, the tradeoff will not be worth it for me. As will be the case for 99% of "normal" users (most of whom don't even know that they can install alternative OSs, let alone on a USB stick). Things like QubesOs (which I had to Google) are interesting, but I think the general failure of Linux to become a mainstream desktop OS alternative proves that the chances of something like that gaining any relevant degree of popularity are minimal.
So yes I accept the risk that this will come back to me one day, because there's no other practical choice. Just as I decide to go out of the house and take part in normal life, even though there's a high chance of getting into a traffic accident one day.
>Things like QubesOs (which I had to Google) are interesting, but I think the general failure of Linux to become a mainstream desktop OS alternative proves that the chances of something like that gaining any relevant degree of popularity are minimal.
Their chances of success are anyone's guess, but it does offer you the ability to run Windows in a VM alongside other OSes. I don't know how good the Windows support is, or whether it's even a priority for them, but that is probably the best tradeoff you can hope for in a system that lets you run Windows and still be secure (assuming the user doesn't go out of their way to break the isolation between VMs).
> In cryptography, forward secrecy (FS; also known as perfect forward secrecy) is a property of secure communication protocols in which compromise of long-term keys does not compromise past session keys. Forward secrecy protects past sessions against future compromises of secret keys or passwords.
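For a concrete illustration, here is a minimal sketch of an ephemeral Diffie-Hellman handshake in Python (using the third-party cryptography package and X25519; the names and HKDF parameters are illustrative, not any particular protocol's). Because the per-session key pairs are thrown away after use, a later compromise of either party's long-term keys cannot recover the session key:

    # minimal sketch of forward secrecy via ephemeral Diffie-Hellman (X25519)
    # requires the third-party "cryptography" package; all names are illustrative
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives import hashes

    # each side generates a fresh, single-use key pair for this session only
    alice_eph = X25519PrivateKey.generate()
    bob_eph = X25519PrivateKey.generate()

    # both sides derive the same shared secret from the exchanged public keys
    shared = alice_eph.exchange(bob_eph.public_key())
    assert shared == bob_eph.exchange(alice_eph.public_key())

    # stretch the shared secret into a session key, then discard the private keys
    session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"session").derive(shared)
    del alice_eph, bob_eph  # once discarded, no later key compromise can rebuild session_key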
So for those of us who need to run Windows/OSX to run software, like Photoshop, for our job, we should just give up on PGP? Seems like a supporting argument for the article then.
No, you just keep your work computer separate from your whistleblowing/terrorism computer. Productivity and secure+anonymous communications have different needs, and an air-gap between them is pretty much the only way to achieve both.
Luckily, in the modern world, owning a second single-purpose device is really easy: your secure+anonymous communications can be done on a rooted Android phone or a Tails-reformatted Chromebook for ~$150.
Your argument comes down to the threat model. A journalist who also uses Photoshop is free to use whatever system they have sufficient trust in for the nature of the communication at hand.
If they're likely to be killed because someone reads the content of their messages, they should easily be able to weigh that cost against needing to boot off USB every now and again.
That said, if you use crypto-system X on a machine you cannot trust, crypto-system X will not be able to protect you very much.
I understand the threat model. However that notion implies that those of us who aren't at risk of death over our emails should give up PGP. Given that (I assume) most of us aren't at risk of murder by the nation-state, PGP is dead.
I would prefer a solution where everyone could reasonably get end-to-end encrypted email. Unfortunately, given the unfitness of PGP for this goal, coupled with recent work in the space, it looks like we will either get plaintext over decentralized email, or e2ee inside walled gardens.
There is nothing stopping you from having more than one identity. If you don't mind too much if malware has your PGP details, then all good. If you would like a more secure way to store these details for other sorts of communication, perhaps a different machine, a live CD, or a hardware token[1][2] can be justified.
I think it is also worth remembering that even the most liberal western democracies have laws that one way or another prevent people from keeping secrets from the law. If everyone has it, and uses it, and it cannot be back-doored, it will be banned rather quickly IMHO. And the people who don't care about privacy -- i.e. most everyone -- won't care about crypto being banned either after the right wing press is done with them. (Restated: we shouldn't force people to close their curtains any more than we force them to open them).
Traditionally, you should have a second cheap machine running Tails, and do your email through PGP there.
The idea isn't that PGP is compromised on a Windows box with outside software, it's that everything is compromised on that box. Changing algorithm or app doesn't matter if your hardware might be recording every key stroke and sending it through a side channel.
As another example, take the newer remote management "features" included in modern BIOSes that run on a separate microcontroller in the system and allow access to network, memory, peripherals, etc., even if the computer is shut down (but still on mains). Let's imagine someone had an exploit or the keys for that. With that hole, you could never fully lock down. Sneakernet over a wide airgap is a good policy.
Ugh, yep. I totally forgot about how many machines come with entire parallel communication systems. The Intel Management Engine, for example, is horrific. It's a machine-compromising threat built into the CPU and (as far as I know) totally unremovable. I haven't heard about any kind of compromise, but it's far from impossible. Well-meant, sure (why make two production runs at higher expense), but that's not very comforting. We already know the US government intercepts some machines in transit; silently enabling and owning the IME would be an elegant way to beat a physical inspection on delivery. It's only reasonable to assume China and similar players do the same.
(If the IME is disabled, will the CPU still complain if you physically destroy the thing? Does anyone know?)
There's also the rather nasty proof-of-concept attack where air-gapped machines output audio data at ultrasonic frequencies to beat the gap. Sure, it takes initial compromise to activate, but that's a plausible risk for someone running nation-state defense.
Sneakernet continues to be a solid policy. I don't remember who, but some notable security researcher talking about Lulzsec summarized the issue with "If I were crossing a government, my opsec would be a stolen library card in a city I don't live in."
> If the IME is disabled, will the CPU still complain if you physically destroy the thing? Does anyone know?
If I'm remembering correctly, if the CPU doesn't receive a heartbeat from the IME within 30 minutes, it shuts down the hardware.[1]
There are also different levels of the thing. There's a basic level that handles some kind of low-level system monitoring. The remote access stuff is usually marketed as "vPro", and it's sold as a premium feature. That's the one with direct access to the networking hardware, transmitting packets over the same physical interface, but using its own unique MAC address.
The "low-level system monitoring" is effectively a software implementation of any "active" ACPI logic (e.g. Intel's SpeedStep.) This used to be in a separate "platform controller", but putting it in an on-CPU-die coprocessor 1. allows the ACPI logic to be regular x86_64 code, and 2. allows much simpler motherboard layout wiring (because you don't need a bunch of duplicate pinouts running from peripherals to both the CPU and platform controller; they can all just go to the CPU.)
> The remote access stuff is usually marketed as "vPro", and it's sold as a premium feature.
AFAIK, in all modern Intel processors the IME is directly attached to the network hardware and gets its own SR-IOV virtual instance of the hardware (with its own MAC) to talk to. But not all motherboards support vPro: the onboard Ethernet controller has to be designed with awareness of the IME so it can recognize its packets as additional Wake-on-LAN packet types, and not all are (yet). So—at least if you're building custom—you can just buy a non-vPro motherboard to "wall off" your computer from secret probes of its ports. (Or, you know, use a router. It's not like these packets will make it to your computer from across the Internet.)
With how ASIC & firmware companies do things, it's best to assume they're all the same thing until proven otherwise. As in, the circuitry and software for doing any of that is there, but only made visible/usable by a configuration that changes depending on how much they pay. This is the most economical thing for hardware makers to do, as they build just one thing instead of several. This was also the trick some hard disk makers used, where they put the same platter in all the drives but a firmware option limited how much was visible to the user (less money = less visible).
So the next question would be whether it's possible to remotely reconfigure the "non-vPro" machine to activate the features, in which case it could be considered a direct equivalent.
And that's what we don't and can't know the answer to since it's a black box. Hence, that circuitry can't ever be present on a machine if one is worried about potential compromises or subversions of it.
Derefr's answer in this thread suggests that you can insulate against this regardless of how the vPro activation works; not all motherboards support the parallel packet stream, so you can get a vPro incompatible motherboard and trust that even if there is a "wake and activate" command it won't be received.
Examine the binary for a run-of-the-mill system sometime and check out the partition headers/manifest; see if you are convinced there are no networking functions in there. The processor in my laptop supports WoL (and I've tested that it works), but is not vPro (double-checked with Ark). I'm skeptical.
This slideshow discusses a rootkit in the old Intel ME chipset (which comes with most Core2Duo/Core2Quad processors).
The Intel ME can't fully be disabled because the CPU expects a heartbeat from it. There have been successful attempts to disable large portions of the Intel ME code, but not all of it.
These guys are looking to replace the Intel ME's firmware with a fully open-source version: http://me.bios.io/Main_Page
Tails can boot from USB. You don't need to give up Windows/OSX entirely; you just need to boot into Tails before using PGP.
Is it inconvenient? Yes. Almost all privacy/security is. Security is largely a series of tradeoffs between privacy and convenience. The more privacy and security you desire - the less convenient everything is going to be.
Burner phones, faraday cages, encrypted harddrives, using cash/bitcoins (which should be purchased with cash), using a VPN over Tor (and not Tor over a VPN). Making sure any phone calls are made from different locations, none of them near work or home, that each call is kept very short, and preferably in a short and precise coded language. Cycling your PGP key frequently, and scheduling all electronic communications so as not to give any hint of "active hours".
All of those are inconvenient, but necessary, if privacy and security are desired.
You don't have to run exclusively on any operating system. You can use multiple operating systems for different parts of your job, and there's a whole spectrum of convenience / security for how to go between the two that you can choose from.
> So for those of us who need to run Windows/OSX to run software, like Photoshop, for our job, we should just give up on PGP? Seems like a supporting argument for the article then.
If you're paranoid, run that stuff in a VM. If you're really paranoid, run it on a separate machine.
I think the title is a little inflammatory. The conclusion does not say we should stop using PGP, but that we should consider the weaknesses inherent in its operating model and assumptions when evaluating future replacements. I think it is fair to say that the world is still waiting for said replacement, and until that arrives, PGP still has a number of valuable properties, one of which is that it exists.
That's an impossible statement to prove. Signal is centralized with a concrete list of users; PGP is decentralized, with no possible way of knowing how many use it or don't.
according to http://keys.mayfirst.org/pks/lookup?op=stats there's currently 4594571 keys on the bulk of public keyservers. considering the rumors that facebook and others also do signal and their user base is around a rumored billion, the k that i glossed over is around 217. and even if we assume there are dark masses that never ever used a keyserver, that ignores the fact that out of those 4.5 million pgp keys most are expired, revoked or simply lost, so the active keys on the keyservers are probably far fewer, and thus k is even bigger.
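back-of-the-envelope, the factor works out roughly like this (both inputs are the rumored/approximate figures above, not measured values):

    # rough arithmetic behind the "k is around 217" claim
    keys_on_keyservers = 4_594_571         # keys.mayfirst.org stats quoted above
    signal_protocol_users = 1_000_000_000  # rumored whatsapp-scale user base
    k = signal_protocol_users / keys_on_keyservers
    print(k)  # roughly 217.6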
"I also do not recommend using a centralized service that keeps your keys on a smartphone. However I warmly recommend using the Signal Protocol whenever messaging is to be done. Signal can be a direct replacement for PGP someone just has to code up the whole thing (time to flesh out signal-cli)."
signal the app exists and is already much more used than pgp ever was. it's a different story that i don't recommend it myself, only the protocol.
My biggest point of contention with this is... what should replace it? PGP is the current and retroactive pseudo-standard for verification of everything from email to code to builds.
Any replacement would have to be at least semi-compatible, so as not to break the (likely) hundreds of solutions relying on and expecting PGP.
This is in fact the problem with trying to write eulogies for PGP. PGP is still a useful tool, just not for the application it was originally intended for: it's a very bad idea to try to retain secrecy among even small groups of people using PGP-encrypted email.
What Keybase does with their new key model and the way they send encrypted messages is pretty nice. The way they do the public directory is also nice, but there is still room for improvement.
I used to be skeptical about this... but if the Signal protocol sees more widespread adoption outside of the Signal app and Whatsapp, it could be a good fit.
I'm very open to hearing about reasons why this wouldn't be the case, though.
There are times when you want non-repudiable signatures, times when you want to be able to keep an archive, times when you want your messages to behave more like letters than like spoken conversations. PGP is still the best fit for email-like use cases and long-lived identities; Signal et al don't even try to address that use case.
the protocol is very generic and does not require anything related to phones. it is actually 3 parts: a key exchange, a signature mode and a ratchet. how you combine these is up to you. the app is one way to combine them with phones. there are other ways to use the protocol that could also be applied to emails.
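as a rough feel for the ratchet part, here's a toy symmetric kdf chain in python (standard library only; an illustration of the idea, not the actual double ratchet). each message key is derived from the current chain key, which is then advanced and the old one discarded, so compromising today's state doesn't reveal earlier message keys:

    # toy symmetric ratchet: a KDF chain illustrating the idea, not the real protocol
    import hmac, hashlib

    def kdf(key: bytes, label: bytes) -> bytes:
        return hmac.new(key, label, hashlib.sha256).digest()

    chain_key = b"\x00" * 32  # in reality this would come from the key exchange

    for i in range(3):
        message_key = kdf(chain_key, b"msg")   # key used to protect message i
        chain_key = kdf(chain_key, b"chain")   # ratchet forward; old chain key is dropped
        print(i, message_key.hex()[:16])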
There's no reason why the Signal app couldn't allow you to register mailto:joe@example.invalid as well as tel:+18005551212; it just, currently, doesn't.
And of course the protocol in general is much higher-level than that.
Perhaps I'm just out of touch, but I'm not familiar with any of the alternative tools they mentioned. If we retire PGP (and its GNU clone), what widely available tool should we use in its stead?
>but I'm not familiar with any of the alternative tools they mentioned.
The tinfoil hat part of me has been seeing this push for the Signal protocol come straight out of nowhere, and I'm a little worried it's being done by a state-level actor who knows something about it that we don't. It's much younger than PGP and as such has had fewer eyes on it. I also don't think Moxie Marlinspike, the founder of Signal's owner - Open Whisper - has the cred and trust Phil Zimmerman had, at least not yet. It's particularly worrisome as he seemingly is only known by a pseudonym.
I also would consider S/MIME, which is baked into most feature-heavy email clients including iOS, a practical alternative to PGP when the use case is email encryption. You and a friend can get certs easily, put them in your client via GUI, and be done with it. No command-line skills needed, if ease of use is the big complaint here. For the less technically inclined it's a pretty good solution, pun intended.
I think I've read about Phil's shortcomings, but in the end he made it happen both technically and politically. I'm ok with software being a community effort and the less talented being helped by the more talented. It's a group effort to me, and I don't believe the coder-superman mythos is required for good software. Not everyone can be a Linus or a Carmack.
opmsg does masquerade as gnupg on the cli. if you take this further you could create a gnupg chameleon which detects what keys are available, or what the input is based on auxiliary info, and then invokes the appropriate tool (which might be gnupg, or something else).
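a purely hypothetical dispatcher along these lines might look like the following; the detection marker is an assumption for illustration, not a documented format of either tool:

    # hypothetical "gnupg chameleon": sniff the input, dispatch to the right tool
    # the opmsg marker below is an assumption, not a documented armor format
    import subprocess, sys

    data = sys.stdin.buffer.read()
    tool = "opmsg" if b"-----BEGIN OPMSG-----" in data else "gpg"
    subprocess.run([tool, *sys.argv[1:]], input=data, check=True)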
on a different note: gnupg is not widely used, signal is.
Signal is for text messaging and fails as an email replacement. It also does not work for signing code and it requires you give over your contacts list to a third party to work.
Signal is actually quite sufficient as an email replacement. You can send media attachments, and even include several recipients. It is unquestionably a better encrypted messaging system than PGP.
"It is unquestionably a better encrypted messaging system than PGP." You're going to have to back that up. In no way is Signal more secure or private than PGP. Signal is simply easier to use for a very small use case, text messaging.
pgp does not provide anonymity, at least not in the widely accepted form. every message has the recipient keyid in plaintext, unless you use --throw-keyids, but then you run into incompatibilities and inconveniences that make the whole exercise user-unfriendly and widely unsupported.
> every message has the recipient keyid in plaintext
The keys are not required to be centralized in any particular location. There is no way to tie a key to an individual, unless that individual wants to be associated with that key id.
It's common practice to post anonymous, encrypted messages on mailing lists or newsgroups. All you can really tell in those cases is that the recipient is a member of that mailing list or subscribes to the newsgroup (though it's not for sure, with the use of remailers, etc).
It provides pseudonymity. KeyIDs exist but they're not inherently tied to e.g. your real-world location the way a phone number is (to an attacker with access to the towers). Whereas OpenPGP can reasonably be combined with e.g. Tor.
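The "recipient keyid in plaintext" point above is easy to check yourself. A rough sketch via the gpg CLI from Python (the --list-packets and --throw-keyids options are real; the key id and filenames are placeholders):

    # check whether the recipient key id is visible in the ciphertext packets
    # (key id and filenames are placeholders)
    import subprocess

    subprocess.run(["gpg", "--encrypt", "--recipient", "DEADBEEFDEADBEEF",
                    "--output", "note.gpg", "note.txt"], check=True)
    # the packet dump shows the recipient key id, no private key needed
    subprocess.run(["gpg", "--list-packets", "note.gpg"], check=True)
    # with --throw-keyids the key id field is zeroed out instead
    subprocess.run(["gpg", "--encrypt", "--throw-keyids", "--recipient",
                    "DEADBEEFDEADBEEF", "--output", "note2.gpg", "note.txt"], check=True)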
By the way, is it possible to use Signal to communicate with people using WhatsApp? Probably a stupid question, but since they both use the Signal protocol, phone numbers and so on, I would guess it should be possible to write a bridge that reveals to the MITM (FB) only your phone number and the fact that you are talking to the person they know, and blatantly lies about everything else, such as your contacts, seen messages (if desired), etc.
Larger than the 10 MB attachment limit of most email providers.
I'm unaware that Signal is geographically limited in any way.
Signal is as anonymous as your phone number is. PGP isn't anonymous either, so this seems like an irrelevant criticism.
Again, you can obfuscate the recipient as much as you can obfuscate a phone number. You can't obfuscate the email address you're sending your encrypted email to.
This depends on how they use contacts to match users. I believe they only collect a hash of the phone numbers you choose to share with them.
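A minimal sketch of what hash-based contact matching could look like (an illustration of the general approach, not a claim about Signal's exact scheme):

    # illustration of hash-based contact discovery, not Signal's exact scheme
    import hashlib

    def contact_token(phone_number: str) -> str:
        normalized = "".join(ch for ch in phone_number if ch.isdigit() or ch == "+")
        return hashlib.sha256(normalized.encode()).hexdigest()

    # the client uploads tokens like this; the server matches them against registered users
    print(contact_token("+1 800 555 1212"))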
PGP is, supposedly, "Pretty Good Privacy". Code signing has nothing to do with privacy.
It's true that PGP is a very good code signing system, but saying that sounds like an endorsement of PGP as a privacy system. (And there are plenty of other good code signing systems, from TUF to Authenticode to signify.)
You've intentionally reduced PGP to your narrow use case. PGP/GPG will also sign and not just encrypt. If you're going to debate the subject do so honestly.
Yes, that's my point. PGP/GPG's ability to sign things is an "also", but it's the only thing it's good at. Build a version of GPG that only does signing and verification, and nobody will be tempted to use it for privacy.
Signing is an integral part of a good encryption scheme, to protect against certain oracle attacks. The two functions are not so easily separated (if you're offering encryption).
Is the sort of signing you want to do for code signing related to the sort of signing you want to do as part of an encryption scheme?
This vaguely sounds similar to how RSA decryption/encryption and signing/verification are the same sets of operations, at the primitive level, making it easy to turn a tool that does one in to a tool that also does the other. But the actual high-level signing and encryption systems (e.g. RSA-PSS and RSA-OAEP) are not the same operations at all, and being good at one is no guarantee of being good at another.
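A small illustration of that distinction with the Python cryptography package (key size and message are arbitrary): the primitive is the same RSA key, but signing (RSA-PSS) and encryption (RSA-OAEP) are separate constructions with their own padding, and being good at one says nothing about the other:

    # same RSA key; RSA-PSS (signing) and RSA-OAEP (encryption) are distinct constructions
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    msg = b"hello"

    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    signature = key.sign(msg, pss, hashes.SHA256())       # signing construction
    ciphertext = key.public_key().encrypt(msg, oaep)      # encryption construction

    key.public_key().verify(signature, msg, pss, hashes.SHA256())  # raises if invalid
    assert key.decrypt(ciphertext, oaep) == msg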
This kind of PGP signing is also critical to the security of Linux software repos. Debian repos sign the contents of the manifest (which includes hashes of packages), and Apt repos sign individual files.
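Roughly, the signed manifest lists a hash for each package, which the client recomputes before installing. A simplified sketch of that last check (the filename and expected hash are placeholders; apt's real logic handles more fields and formats):

    # simplified version of the final link in a signed-repo chain of trust:
    # recompute a package's SHA256 and compare it to the signed manifest entry
    import hashlib

    def sha256_of(path: str) -> str:
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    expected = "0" * 64  # placeholder: hash taken from the signed manifest
    if sha256_of("example_1.0_amd64.deb") == expected:  # placeholder filename
        print("package matches the signed manifest")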
Signal is not much of a PGP replacement. There's a lot that PGP can do that Signal can't: signing, encryption of large blobs at rest, and key management.
I use PGP as part of my backup solution, encrypting my backups at rest with an asymmetric key. I can't do that with Signal.
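For reference, that kind of at-rest encryption is a single gpg invocation; a sketch driven from Python (recipient and filenames are placeholders):

    # encrypt a finished backup archive to a public key (asymmetric, at rest)
    # recipient and filenames are placeholders
    import subprocess

    subprocess.run(["gpg", "--encrypt", "--recipient", "backup@example.invalid",
                    "--output", "backup.tar.gz.gpg", "backup.tar.gz"], check=True)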
Signal does a better job of practically everything PGP does with regard to message encryption. Yes, PGP is more useful for encrypting files or signing updates. I agree that it's too early to write off PGP for those applications. But people should use Signal instead of PGP for message encryption.
> But people should use Signal instead of PGP for message encryption.
I won't disagree; when your usecase is in Signal's wheelhouse, by all means, use it.
But as someone who uses PGP regularly: the extent to which I could use Signal instead of PGP is limited to the occasional transfer of PII, passphrases, and private keys (something I couldn't use Signal for anyway, since these are typically sent between GUI-less hosts). A very tiny fraction of my PGP usage.
"Signal does a better job of practically everything PGP does with regards to message encryption."
Except for endpoint security. The ultra-portable, self-contained implementations of PGP can run on countless configurations of desktop or embedded systems. Transport methods also vary if they're running messages or files through other apps. All sorts of hardening or isolation techniques can be applied. Remote attackers have a lot to look at when trying to break or bypass GPG for an arbitrary user. Whereas the vast majority of Signal use relies on one app on two OSs.
The extra security tech and obfuscation you can layer on PGP/GPG is still an advantage in its favor until competition gets that.
> But people should use Signal instead of PGP for message encryption.
Signal the protocol or Signal the service? There does not appear to be a mature FOSS toolchain for the former that can replace gnupg and Thunderbird/Enigmail, and the latter is only available on Android and iOS smartphones.
Any chance you could elaborate on this? What about email is inimical to secret messages? Why are they less secret over email than through some other medium?
How would that work? Last I checked, Signal used Signal servers for key exchange (or whatever the equivalent in their lingo is). Is there any way to use Signal without relying on their servers?
There are quite a few things listed there, but I fail to see one that looks like a replacement for PGP. Many are some form of PGP-compatible encryption for different platforms: Android (AGP), Windows (GPG4Win, Fort), Apple (GPG Suite for Apple Mail). Some will only encrypt files with a password, others encrypt your disk. There might be some good software there, but I don't see anything that can replace PGP/GnuPG.
Using PGP/GnuPG isn't always easy, but it works for so many things. Signing code, encrypting files, emails, signing messages - replacing it is a fairly high bar.
I'd love to see encryption easy and used everywhere, but a few popular apps don't add up to replacing this old work horse.
Interesting project, but I'd personally want to see a lot more involvement and vetting from the crypto community at large. Especially when it creates new protocols like DH key exchange over email.
Another "I don't think PGP is good enough" and "here's all these things"
Yet none of them fully replaces PGP. Before you actually retire PGP, maybe you need one of these projects to finish a real, complete, reviewed and high-quality replacement ;-)
some of these tools actually fully replace pgp (see opmsg for example), and that is actually a very low bar to clear. it seems pgp is seen as a silver bullet handling all use-cases like a charm; this is far from reality. actually it fails in many cases, and specialized tools might fit the purpose much better, surpassing pgp in their special niche.
I started reading to find out what's wrong with PGP, but it quickly escalated into a discussion about making educated bets about cryptography as a whole. I think this is a hugely important topic, and it is a real shame it is not discussed more. Maybe security people are more conscious of it (I surely hope so), but the general public doesn't seem to be. And by "general public" here I actually mean self-proclaimed paranoids, not your grandma or a girlfriend.

We talk a lot about whether something is proclaimed secure by so-called experts, about theoretical weaknesses of Telegram or something, monitor important 0-days, buzz about how bad it is to give all your private data to Facebook or Google and how fucked we all are. But we rarely seriously talk about who our adversaries really are, what exactly we are trying to protect, and whether we're using the right tools for that. About making educated bets.

And at the end of the day, that is all it is actually about: making educated bets. Not all our data and not all our accounts are equally important, and they are not equally important to different kinds of adversaries. So the only way to be somewhat secure is to recognize that there is no absolute security and we cannot protect everything. Better to take it consciously and focus on what's really important.