Great summary, thanks. I was really surprised that the EFF left off a crucial item:
9. Raise awareness of the practical and moral risks of mass surveillance.
My own view is that the risk of surveillance is predicated on the reliability and intent of the Watcher (in this case, the USG through the NSA). If you had a Watcher who never makes an assessment mistake, and whose intentions are always "good" (in the narrowest sense of being in your personal best interests) - who could object to such a thing? But we are faced with a world full of Watcher governments who would oppress their people with this power, and already are. Citizens living under such regimes in places like China, Iran, and Saudi Arabia are people for whom the risks of mass surveillance are real, now.
Officially endorsed mass surveillance is a lost opportunity to emancipate resistance movements from the surveillance of their oppressive governments, which is a huge moral risk to us.
It's important to remind ourselves what the USG/NSA thinks they are giving up if they give up mass surveillance: they think that America needs to know everything because everything is potentially useful in assessing a threat (or an opportunity). Surveilling the world (and interpreting the data properly) theoretically minimizes a nation's risk of attack, and maximizes its effectiveness in virtually every field (diplomatically, economically, scientifically, etc). That's a lot to give up.
I think they should also be raising awareness of both the cost and the ineffectiveness of mass surveillance. It has failed spectacularly to stop some of the worst attacks, because all the resources were tied up playing video games on their databases, instead of old-fashioned boots-on-the-ground monitoring of suspects who, by and large, were already known to them. The sheer cost of building the interception infrastructure is on a nightmare scale; maybe people would care more if they knew how much it costs them personally.
> My own view is that the risk of surveillance is predicated on the reliability and intent of the Watcher
This will never be possible because the surveillance is global, there will never be a Watcher that is globally considered reliable and well-intentioned.
Disappointingly, there's no mention of money in here, i.e. EFF fundraising targets. They pull in about $9m a year and spend about $6m [1], and their primary external expense is litigation. For perspective, this is <<5% of what the NRA brings in and puts out every year. I think it might be worth extending this game plan to include straight-up lobbying activity rather than hoping that the tech companies will divert some of their lobbying budgets to promoting privacy (which in many cases would be directly at odds with their business models as far as consumer data is concerned).
I don't like lobbying much, but the fact is that our US political system is less a democracy than an auction. I don't see much prospect of 'a global movement that encourages user-side encryption.' I was all over PGP in the early 90s but honestly I don't remember the last time I bothered to encrypt something, and if you asked me for my public key I'd need to generate a new one. It's not like I don't know how, I just can't be bothered.
As a 501(c)(3) nonprofit, EFF could potentially dedicate 10% of funds to lobbying. They would need (and maybe already have without my awareness of it) a 501(c)(4) group to do heavy duty lobbying activity.
I would be curious to know more about EFF's fundraising activity. I can confidently guess that the NRA gets a lot of money through direct mail and employs dozens (hundreds?) of major gift officers to build relationships with high-net-worth prospects. These donors are probably majority old white conservative guys. I'm sure EFF's potential donor base has little overlap with them, and I'm not sure if the same strategy would work. They should still have MGOs to get high-dollar gifts from individuals, foundations, and corporations. But where the individuals are concerned, I'm not sure how they would prospect since direct mail probably doesn't work as well on people who care about EFF's issues.
Here in Europe, especially Germany, crypto parties have been popping up all over the place since the Snowden leaks. PGP & friends are more popular than ever, and receive more and more attention (although not yet "mainstream").
I'm personally banking on the idea that convincing more people to go data-only will be the game-changer. Apps like "Hushed" [1] already wrap phone/sms functionality from Twilio inside an amazing interface where you can pick up numbers anywhere on earth for any period from a week ($3) to a year ($30).
The thought is that when everyone is just seeking out the cheapest data deal, not only does it unbundle and disempower the giant telcos (a good thing), but it also gets users and developers into the mindset of a single data tier. This is where real security and privacy gains can be had.
For example, I've gotten some friends using TextSecure, but they've ditched it after being confused about why turning off their data prevented them from getting timely SMS messages. (Encrypted text messages are silently routed through data when the conversation partner has TextSecure, even though the identifier looks to the user like a regular phone number, just as with SMS messages.)
When users start using all-data plans, tools will be developed to help users go into a conservation mode akin to turning off data, just as they do now. Instead of turning off data, users will be able to use a simple firewall to block everything but voip/text messaging apps.
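The firewall mentioned above can be sketched with ordinary Linux netfilter rules. This is a hypothetical example: the UID 10123 stands in for whatever VoIP/messaging app you want to whitelist (on Android, each app runs under its own UID, which is what makes per-app rules like this possible).

```shell
# "Conservation mode": allow only the whitelisted messaging app
# (hypothetical UID 10123) and loopback; drop all other outbound traffic.
iptables -A OUTPUT -o lo -j ACCEPT
iptables -A OUTPUT -m owner --uid-owner 10123 -j ACCEPT
iptables -A OUTPUT -j DROP
```

Flushing the chain (`iptables -F OUTPUT`) would restore normal data use, so "turning data back on" stays a one-step action for the user.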
As an added bonus, any trend toward making data-only more convenient will also mean that using wifi-only devices and mobile hotspots will be equally convenient. Those who want the manual baseband separation of that setup (unavailable on current full-featured phones with radios), will not have to jump through crazy hoops to get it, and so security will take a big leap forward :)
Interesting. One concern I'd have is that I actually find the segmentation of data / sms / voice to be a feature because it corresponds well to my priorities. For example, I can turn off auto-sync to save battery power or to decrease distractions from my phone, but still have SMS and be available. If people really want to get a hold of me asap (and they're important enough to me to have my cell number) they will text me.
Of course there are alternate ways to allow users to do this--Google circles and Facebook lists come to mind--but they are all a pain to maintain and use.
Indeed. When CTOs and CEOs can't help but defer security fixes before anything else, since features matter now and security matters sometime later, how could the "common person" ever choose real security?
They might be scared into paying someone money to make the problem go away, like personal firewalls and anti-virus (which, as we know, break real applications more often than they block malware), or they'll do something therapeutic but useless, like some people used to run defrag every day.
I'm not sure exactly what will happen in the future, and besides ever bigger and more disruptive hacks, I'm sure we'll carry on somehow with sillier and sillier measures, adding layers to the ones that already don't really make sense if you think too hard. People always manage somehow.
I think this point is important to creating widespread adoption of encryption tools, and here you present one of the major barriers to adoption.
As much as I hate this term nowadays, some way to gamify the use of encryption might help spur people's adoption of the technology. Once enough people have adopted a specific app or piece of software, it's much easier to bring outsiders onboard.
Personally I think it's about making it easier for things to be encrypted. For example, when you save a document, perhaps the text editor should just automatically encrypt the file based on a password the user set up when they first installed the program.
Right now things like PGP are difficult for even "experienced" users to set up and maintain, let alone the average joe. Setting up encrypted volumes, or directories, etc... it just goes over most people's heads and/or exceeds the effort they're willing to put in, so they just don't do it.
I think switching from "you should consider encrypting things" to an "everything is encrypted by default" society, propelled forwards by software that makes it easy, automatic, and default... that's when we will make progress.
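As a minimal sketch of how such a tool might turn that install-time password into a file-encryption key (the password and parameter values here are illustrative; a real implementation would feed the derived key into an authenticated cipher such as AES-GCM from a vetted library):

```python
import hashlib
import os

# Derive a 256-bit file-encryption key from the user's install-time
# password. The random salt is stored alongside the encrypted files;
# the high iteration count slows down offline guessing attacks.
salt = os.urandom(16)
key = hashlib.pbkdf2_hmac(
    "sha256",
    b"correct horse battery staple",  # hypothetical user password
    salt,
    200_000,  # iterations
)
assert len(key) == 32  # 32 bytes = 256 bits
```

The point is that all of this happens once, silently, at install time; the user never sees a key, only the password prompt they already understand.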
Although for most people the risk of forgetting a password and thereby losing their documents forever is many orders of magnitude higher than the risk of the NSA reading those mostly harmless documents.
I think that's also the reason our front doors are easy for a locksmith to open without destroying them completely.
And in this case, the NSA has global keys to a lot of services designed like front doors. The difference is that the effort required to break open the front doors of every home in the world is unreasonable, whereas the effort to break into every back doored or security-free software solution is reasonable enough for the NSA to do it.
Keys are a better option than passwords anyway. Way too many services depend on human-memorizable strings, which we know all too well are a transient thing at best - every year, as computational power rises, the "average" password gets weaker, and nobody uses good passwords.
People just need a personal key to encrypt and decrypt with. They need to protect that key, maybe by password, maybe by printing it out. One good key can easily replace all the horrible password schemes in the world.
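To put rough numbers on that (a back-of-the-envelope comparison, not a formal analysis): an 8-character password drawn from lowercase letters and digits carries only about 41 bits of entropy, while a randomly generated key gives you the full strength of the cipher.

```python
import math
import secrets

# Entropy of an 8-character password over lowercase letters + digits
alphabet_size = 26 + 10
password_length = 8
password_bits = password_length * math.log2(alphabet_size)  # ~41.4 bits

# A randomly generated 256-bit key, by contrast
key = secrets.token_bytes(32)
key_bits = len(key) * 8  # 256 bits
```

No amount of rising computational power makes the 256-bit key guessable, whereas the 41-bit password is already within reach of offline cracking today.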
But software needs to be written to support this - especially save dialogs that make it easy to pick a key to encrypt with, and open dialogs that offer a history of keys used, plus a key browser for decrypting opened files.
Of course Windows will never have anything like that, but at least Dolphin and Nautilus can look to adopt those kinds of workflows.
Agreed. An impressive piece of transparent encryption technology is Apple's FileVault system, which encrypts and decrypts your files on the fly as you save and open them. Users don't need to know it exists at all; it just works.
It seems to me that encryption can be great in some ways, but too much could lead to a very closed system. Something that I view as the opposite of progress.
It has, and TextSecure is excellent. But Facebook's ownership, together with WhatsApp's previous security record, seems to have massacred its credibility amongst many users I've spoken to.
To the point where friends have been talking to friends on the technical merits of them, and some have chosen to use Telegram, which is… um, not of the same quality of design as TextSecure, I'll say politely. But it's not owned by Facebook, and apparently that counts for a lot because people really do distrust Facebook that much. (Despite saying that on Facebook, I gather. Irony.)
I have the feeling TextSecure/Signal is frustratingly close to perfect. If it gained the ability to do voice and video well, gained a good desktop client (a slim one, without ads!), and could use usernames instead of/as well as phone numbers, it'd be poised to replace most of the common uses of Skype.
Add metadata protection in some form (onion/garlic routing?) and a distributed network (Tox used DHT) and we'd really be onto something. Although metadata protection is a particularly hard set of problems, especially when you want essentially real-time communication and low battery life. Still, we're in interesting times, and we have tools to make a whole new set of interesting design tradeoffs!
> But Facebook's ownership, together with WhatsApp's previous security record, seems to have massacred its credibility amongst many users I've spoken to.
Which users do you speak to?
I have lots of friends who are not technology-minded and zero of them care that WhatsApp is owned by Facebook. Heck most of them don't even know. It doesn't have the FB logo anywhere. I think this is a non-issue.
What's more, my experience has been that most of my friends (on the rare rare occasions when it comes up) trust WhatsApp a lot more than most tech products, because it doesn't have any ads and asks them for money occasionally. Though lots of us don't actually seem to get charged. I keep being given free extensions.
WhatsApp has huge network effects at this point, it's the de-facto standard outside of the USA. So implementing the TextSecure protocol is a huge deal. Not only does it directly help lots of people but it sets a precedent that PFS is not only for nerds and geek products but can be integrated into consumer products too. It raises the bar for everyone else.
Obviously different users! I'm aware of the network effect, and I massively commend them for such a bold, privacy-oriented approach, particularly given their previous track record. (Well done, Moxie! How'd you talk them into that?)
Everyone I know who was aware WhatsApp was Facebook-owned brought it up as a strong, overriding negative; others who heard that followed suit.
Perhaps that says more about Facebook's perception amongst that demographic than WhatsApp. (The plural of anecdotes is not data, I hasten to add.)
If there's a lesson from this, perhaps it's: something new and better needs to come from, if not trusted people, then at least not mistrusted people.
I'm still wondering where XMPP and OTR failed. Because they apparently have - nobody uses or implements them. I have a perfectly fine XMPP client here that supports OTR, but nobody I know can or will use these services, despite them having most of what I would consider important - video chat, file transfer, persistence, logging, emoticons - and my client even downloads linked images and videos and displays them inline.
I use XMPP+OTR where possible, and it can be quite painful.
Support for multiple active endpoints isn't solved, at least for the clients I use, so I have to hope that whoever writes me picks the endpoint I'm actually at or I have to go through a bit of unencrypted back and forth to make them connect to the right one (I think XMPP actually has extensions for mechanisms that could solve this, syncing messages between endpoints, but I've never seen them implemented).
At least some mobile clients only maintain the OTR session while in foreground, leading to massive notification spam on the other end/delays while the session is recreated.
No encrypted offline messages. (Yes, I know, not possible while maintaining the full guarantees of OTR, but it adds complications to the workflow)
All these don't really make for great UX, and many people are not willing to put up with it.
Viber is also huge in the Philippines, as a telecom there offers free Facebook + Viber without needing prepaid credit/wifi. They could roll out a TextSecure layer too.
WhatsApp is also owned by Facebook, so... I doubt they'd cut off ways to get at the communications. I think it might use TextSecure's encryption, but they will also have a key.
Sorry but my experience is the opposite. Here in France most mobile phone contracts give you unlimited sms + mms. I never felt the need for whatsapp personally, neither do I know anyone using it. (but Viber over wifi is often used by people traveling/working abroad)
No need to be sorry, data > anecdotes. I use Whatsapp with a group of 20 friends to organize nights out and basically use it as a long-running IRC room. We also have different groups for movies, board games, etc. It's the group functionality that makes it better than SMS.
It's very popular in Europe, even more so (as far as I've heard) in South America, where it's used mainly instead of SMS. Heck, even my father uses WhatsApp, and I can't make him stop clicking on "You have 1 new message" ads. Though I deleted my WhatsApp as soon as it was acquired by Facebook.
That's why we have to build decentralized backend systems to follow them around and store their data. We also need to focus on ease of use on the frontend of those systems. It's not like they are going to do this themselves. We have to do it for them, for all of us.
A friend of mine used the Dropbox API to let his webapp's users store their own data, encrypted and decrypted with npm crypto. If I recall correctly, the user just had to OAuth with Dropbox on signup and sign in -- the rest was handled seamlessly.
I just wrote him to ask if he's blogged about this. Send me a message (in profile) if you'd like more info about this approach.
What made me abandon all hope was when Glenn Greenwald, presumably an expert journalist when it comes to surveillance, said he was at first reluctant to install cryptography software to communicate with Snowden.
Please don't. I realize this was said tongue-in-cheek, but the ramifications of using local encryption without understanding it are large. You have to be willing to take on key management, or you risk losing every bit of data or being just as insecure as you would have been without encryption (with a slower computer as an added bonus).
Unfortunately, it takes at least a little buy-in from the user to make encryption work. We need better tools, for sure, but that final piece of holding the key needs to be taught and the responsibility accepted.
15 years ago that might have worked, and even then 9/11 would have spooked all the level-headed people enough to get laws railroaded through that would take the sting out of all of the above.
Now the horse has bolted and the barn is empty.
> Pressure technology companies to harden their systems against NSA surveillance
Tech companies are only going to get the message in one way: by boycotting the ones that don't have their customers' best interests at heart.
> Create a global movement that encourages user-side encryption
User side encryption is great but with the number of exceptions and the way users have been trained to click 'ok' and 'accept' whenever presented with a dialogue box you'd need a major attitude change before that could become effective. In short, nobody cares because they believe they are not the ones in any kind of trouble or under suspicion.
Even here on HN where there is I'd hope a better basic understanding of these issues there are plenty of people in the 'I've got nothing to hide' camp as well as those that believe the state is holy and should be given every benefit of the doubt. This goes for the USA as much as for other countries.
> Encourage the creation of secure communication tools that are easier to use
That one might actually work. But there will always be a fundamental conflict between security and ease of use.
> Reform Executive Order 12333
Not enough data.
> Develop guiding legal principles around surveillance and privacy with the help of scholars and legal experts worldwide
Lots of paper will be produced, lip service will be paid and nothing will change.
> Cultivate partners worldwide who can champion surveillance reform on the local level, and offer them support and promotion
That's pretty vague but it might work and at least help some situations but for the most part the adversaries are nation-states and those have a lot more power and capabilities than you're going to be able to deal with from a local point of view. Bottom line: if they want your data they'll have it.
> Stop NSA overreach through impact litigation and new U.S. laws
The NSA will simply continue to do what it is doing today, only it will try harder to stay out of the public eye.
> Bring transparency to surveillance laws and practices
Good luck with that. Even today we don't have the whole story about what happened 40 years ago, let alone that we'll get transparency in such a timely manner that those directly affected by these activities will be able to make a stand.
As you can see from the above I'm not exactly bullish on this effort, I sincerely hope they'll make an impact but I think that the only place where you could really make change is at the ballot box and sadly that one does not figure into the plan at all.
An organized, carefully executed plan to thwart global surveillance is a good thing. But EFF laid its plan out, and critiques of that plan are fair game.
So: our industry has largely Google to thank for:
* The development and deployment of TLS forward secrecy, a technology that has very little operational importance to big companies but that is critically important for increasing the cost of NSA surveillance.
* The development and adoption of strong, modern elliptic curve cryptography in browsers (the Curve25519 CFRG recommendation has Adam Langley's name on it).
* TLS certificate pinning, which Google pioneered in Chrome, which not only drastically reduces browser susceptibility to CA-based attacks on TLS but also transforms the Firefox and Chrome installed base into a worldwide anti-spoofing surveillance system.
* EndToEnd, the Chrome Javascript implementation of PGP, the team for which includes Thai Duong, of BEAST, CRIME, and POODLE fame, who Google was smart enough to snap up.
* Years and years of the Chrome sandbox and runtime hardening, which has significantly driven up the cost of viable browser clientsides, which are probably the most important software security weapon in NSA's arsenal.
* Years and years and years of Chrome browser security work from people like Michal Zalewski --- see things like "Notes On A Post-XSS World" for a taste of the security ideas that will be banal and commonplace 10 years from now but that people will forget Google funded.
* One of the industry's best organized large-scale fuzzing and bughunting operations, shaking out hundreds and hundreds of bugs in things like video and image codecs.
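The forward-secrecy bullet above is worth unpacking: with ephemeral Diffie-Hellman, each session's keys are generated fresh and then discarded, so compromising a server's long-term key later does not decrypt previously recorded traffic. A toy sketch with deliberately tiny numbers (real TLS uses 2048-bit-plus groups or elliptic curves):

```python
import secrets

# Toy finite-field Diffie-Hellman. p and g are absurdly small here,
# purely for illustration; real deployments use large standardized
# groups (e.g. RFC 3526) or elliptic curves.
p, g = 23, 5

# Each side picks a fresh ephemeral secret per session...
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1

# ...exchanges only the public values...
A = pow(g, a, p)
B = pow(g, b, p)

# ...and both arrive at the same shared secret.
assert pow(B, a, p) == pow(A, b, p)

# Discarding a and b after the session is what provides forward
# secrecy: a later key compromise reveals nothing about past sessions.
```

That last point is exactly why forward secrecy raises the cost of dragnet collection: recorded ciphertext stays sealed even if keys leak years later.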
I think that first section in this plan could have been written more carefully. I do not see how it could have been written carefully and retain the sense of urgency that the rest of the document has. And that bothers and worries me.
Hey: on the other hand, maybe Google is just as happy with it as EFF is. After all, if they're setting a high bar for themselves, all the better if a bunch of other companies are required to clear that same bar. I know a fair bit about the technical work Google is doing, but virtually nothing about EFF's motives for phrasing things the way they do.
Yep, Google had the foresight to hire the best security people and it's paid off for them and the community.
Another example: privacy activists were saying we should all move to Firefox after the Snowden leaks, but ignored how far ahead of Firefox Chrome is in terms of security and exploit mitigation. So agreed, we shouldn't entirely dismiss Google.
Although I think the centralization and incentive structure that has been a result of Gmail/Google and the ad-based software economy are the big reason why the NSA is so powerful regardless of crypto/tech-breaking capabilities.
All they need is a warrant - handed out via secret courts - and they get everything on any person's life, from anywhere in the world. This results in a greater risk overall IMO, but Google is not entirely to blame for that.
These days people seem quick to defend Google's ad-tracking "because otherwise how would they make their money?!".
But I think Google only started doing cross-site ad-tracking a few years ago, and it only got serious about it when it "unified" its Privacy Policy 2-3 years ago. That was so it could track a single "persona" across all of its services. It wasn't to give you Google Now (or at least that wasn't the main reason).
So maybe we have 5 years at most of seriously intrusive ad-tracking. I think Google was doing pretty well financially before that, too.
I don't mind contextual tracking on the site, so they can show me ads based on what I'm reading on that page then. I don't find that particularly intrusive, although I could see how NSA can use that as well.
However, the part about tracking you everywhere and then combining that "anonymized" (but not really) data to create a "profile" (or dossier, if you will) of you is what's really creepy.
Google is pretty good at security, but pretty bad at privacy, and sometimes the two conflict quite directly, making the first worse for it - see no end-to-end encryption in Hangouts, yet Facebook's Whatsapp (supposedly) has it.
It's a pretty obvious evolution to their core business of advertising.
Step One: Advertise
Step Two: Target advertisement with history on the computer
Step Three: Target advertisement even better with history across several computers
It's a natural progression without any mental gymnastics, and "They were making enough money at Step Two, why would they need to go to Step Three?" seems like a very silly argument to me.
None of those things actually thwarts NSA surveillance, because most of the NSA collection methods occur inside the datacenter. Based on one slide, inside Google's data centers.
All the client side TLS improvements in the world don't change the fact that Google -- even just through Google Analytics and Gmail -- is set up to collect everything that everyone does online.
In fact, all of these developments (while super awesome, and not by themselves bad!) are entirely consistent with the goal of giving surveillance access to US interests, but not to anyone else.
My understanding was that the majority of the access was by tapping the private network connections between Google's, Yahoo's, etc. colocs, at points outside the coloc. This has prompted these companies to start encrypting data across links they would previously have considered secure. Another task should be listing all companies known to be actively working against their own paying customers, like AT&T and Cable & Wireless/Mercury, which could form the basis for a consumer boycott. Hit their partners, so that the cost of doing that business becomes too great.
To add a few details to the people section of this:
If you want to see a great résumé, check out Michal Zalewski's - it's on his website.
He is also responsible for American Fuzzy Lop, which is an incredible fuzzing tool for application software[1].
In fact, read his book, The Tangled Web, as well, because it should be on every developer's shelf, security engineer or otherwise. It has security best-practice checklists covering how the internet and browsers work, with enough technical detail for developers to safeguard themselves against the most common pitfalls.
Google also employs Eduardo Vela Nava on their security team, who is known for his research in browser encodings, legacy compatibility settings and how these lead to serious security flaws (see Web Application Obfuscation). If there is a way to trigger cross-site scripting on a web application, Vela Nava can achieve it (and he can find it as well). He is a leading expert on web browsers and their various (in)security mechanisms.
Then you have Chris Evans, who leads the Chrome Security team and singlehandedly wrote vsftpd, the primary ftp server shipped with Red Hat and SuSE. He built the Chrome security team into what it is today by hiring the best talent from around the world.
This list goes on, and on, and on. Google has attracted many of the greatest minds of this generation in information security.
>> critically important for increasing the cost of NSA surveillance.
I've said this before and I'll say it again: the focus should not be on increasing the cost of surveillance, because ultimately it's the taxpayer (us) that foots that bill.
Who are you quoting there? Unnecessary was definitely not what I heard. It's been very seriously discussed. In fact, the TLS Working Group had a draft - but then I think the chairs decided to consult the Cryptographic Forum Working Group at the IRTF for recommendations (despite CFRG already having said "yes, Curve25519 is basically fine" at the meeting before that).
In retrospect, this may have been a mistake. CFRG is not good at making choices. It led to months of delay and quite frankly bitter argument about recommendations for new curves. After a bit of discussion about choosing reasonable criteria, which seemed pretty straightforward at the time, it all went a bit pear-shaped.
(Disclaimer: the below is my own personal impression of events; an opinionated rant from someone involved with the process, not a precise factual recollection, which you can find - mostly - by looking at the CFRG list. Please do not quote me here.)
Microsoft Research came in with a we've-been-working-on-it-for-years 'nothing up my sleeves' curve generation method, all shiny and matching exactly what the TLS WG asked for… except amongst other things the curves didn't have complete addition laws like Curve25519 did, and their reference implementation's fast comb algorithm was very neat but they were very conveniently "unaware" that they'd filed a patent on it [or, at least, an algorithm that bears a stunning resemblance to it according to Mike, and with which I personally concur]. Despite, y'know, apparently doing a careful legal review to release it under Apache2? Hm. (Not that that'd ever impact another implementation, like say, everything not Apache2-compatible…?) We'd probably never have noticed, except for a very heated off-list thread I had the dubious pleasure to be involved in (…wow). But that patent wasn't about the curve generation algorithm itself, so apparently that was perfectly okay according to them, because you could always use a slower technique for the curves generated thus? - not that they were willing to confirm that the patent actually covered that comb algorithm, and they'll get back to us, but they never did, so Mike Hamburg disclosed it on-list just in case, and, well, yes.
Then we were going to have a nice performance bakeoff with SUPERCOP and the candidates (and to the extent that we did, Curve25519 still won at the ≈WF128 level, even with ECCLib using the patented comb).
Then the Brainpool guys and their pet hardware (with repurposed RSA multipliers) came in wanting random curves… basically, Brainpool. Pointing out they'd already got Brainpool, didn't really alleviate that, they wanted a new one and they wanted everyone to use that. That went down like a lead zeppelin. (Random curves run like pants if you don't have pet hardware with generic multipliers, by the way: grab OpenSSL 1.2 and test for yourself.)
Then agl, it seems, Had A Word With the MSR guys offlist; after that Discussion they did a revised curve generation draft along with the hardware people, and what came out the other end (not) coincidentally generated almost Curve25519 - a huge change in position for MSR. A little more discussion and then it actually matched Curve25519; later, agl did a draft merging that with the 25519 draft proposed earlier!
So we now have a draft adopted for us to argue about in the next phase, which seems right now to be in even more pointless bikeshedding about whether the wire format should be little-endian (because all the existing software for this curve does it) or big-endian (because meaningless traditions in honour of dead architectures; traditions we're explicitly allowed to ignore), which by the way the TLS WG already did with their draft, and decided 'keep it as it is, little-endian'. Or, more usefully, if there's value in wire formats beyond Montgomery-x - there's always going to be trade-offs there, perhaps it depends if we're going to have the same wire format for key agreement and signatures and other stuff, and we actually don't have to so maybe we can just make the best decision for each, if we can actually make decisions that is.
And this is before even getting into signatures, which is barely-discussed and a long way from reaching consensus.
Long story short - as long as you're not baking anything in unchangeably: yes, just go ahead and implement X25519 (that's the key-exchange algorithm formerly known as Curve25519, yes) exactly as djb did it, because it's what we ought to end up with anyway. Better still, use a good, well-tested routine already out there for it: the latest donna maybe, or libsodium, or NaCl. Better to use a temporary codepoint for it until the TLS WG assigns one. I'm guessing Rich and agl may be taking a broadly similar approach, but that's for them to say. It certainly wouldn't be too hard to patch in. I haven't got a clue where we'll end up with signatures, I'm afraid, so I'm just using Ed25519, like GnuPG and SSH have, which I think is Pretty Good (pun intended) but about which I know there's some disagreement.
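For the curious, the X25519 function itself is small enough to sketch in pure Python straight from RFC 7748. For study only: Python bigints are not constant-time, so this leaks timing information; use libsodium/NaCl/donna for anything real, as suggested above.

```python
# Pure-Python X25519 (RFC 7748, section 5). Study sketch only.
P = 2**255 - 19
A24 = 121665

def decode_scalar(k: bytes) -> int:
    # "Clamping" per RFC 7748
    b = bytearray(k)
    b[0] &= 248
    b[31] &= 127
    b[31] |= 64
    return int.from_bytes(b, "little")

def x25519(k: bytes, u: bytes) -> bytes:
    n = decode_scalar(k)
    x1 = int.from_bytes(u, "little") & ((1 << 255) - 1)
    x2, z2, x3, z3, swap = 1, 0, x1, 1, 0
    for t in reversed(range(255)):          # Montgomery ladder
        bit = (n >> t) & 1
        swap ^= bit
        if swap:
            x2, x3, z2, z3 = x3, x2, z3, z2
        swap = bit
        a, b = (x2 + z2) % P, (x2 - z2) % P
        aa, bb = a * a % P, b * b % P
        e = (aa - bb) % P
        c, d = (x3 + z3) % P, (x3 - z3) % P
        da, cb = d * a % P, c * b % P
        x3 = (da + cb) ** 2 % P
        z3 = x1 * (da - cb) ** 2 % P
        x2 = aa * bb % P
        z2 = e * (aa + A24 * e) % P
    if swap:
        x2, x3, z2, z3 = x3, x2, z3, z2
    return (x2 * pow(z2, P - 2, P) % P).to_bytes(32, "little")

# First test vector from RFC 7748, section 5.2
k = bytes.fromhex(
    "a546e36bf0527c9d3b16154b82465edd62144c0ac1fc5a18506a2244ba449ac4")
u = bytes.fromhex(
    "e6db6867583030db3594c1a424b15f7c726624ec26b3353b10a903a6d0ab1c4c")
out = x25519(k, u)
assert out.hex() == (
    "c3da55379de9c6908e94ea4df28d084f32eccf03491c71f754b4075577a28552")
```

Note the little-endian encoding throughout, which is exactly the wire-format point being argued about.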
Guh. If I sound frustrated, it's because it can be extremely frustrating sometimes! (Oh well, could be worse. Could be Wikipedia! <g>)
All this ignores the contradiction between the privacy-invasion business model and privacy needs. As long as major tech companies (most obviously Facebook and Google) rely on being able to read their users' communications, you will, as a consequence, enable an NSA or similar to intercept all the messages.
There isn't a legislative answer to this problem. There are theoretical technical answers, but they don't fit the business models in use today. To fix the problem, the market dynamics will have to change.
It's not just the "privacy invasion business model" (which sounds a bit tinfoil-hatty); it's the fact that protecting privacy is really hard. I work for Silent Circle, and the MO is to store the absolute minimum data possible, which makes it hard to use services most companies take for granted, such as analytics, error reporting, error logs, etc.
Doing your job is really hard when you can't use things like analytics services, detailed logging, or proper feedback. Everything has to be open source and self-hosted, and some things other businesses can easily do are just flat-out impossible. Any service that does something you want but requires data to be sent to it simply can't be used.
Few companies that don't explicitly have the word "private" in the description of their core product will be very inclined to jump through all these expensive hoops. Hopefully changing the legal situation so the NSA can't just jump in and grab whatever it wants will help this a lot.
Like everything else in security, there's a clear cost/benefit curve. It's actually dubious to make a distinction -- privacy is a form of security and its absence is a lack of security.
To get what I call hard privacy online, you must use full isolation and onion routing. There is no other way as far as I know. But we could go a long way toward making mass surveillance harder, less accurate, and more expensive by just deploying encryption, low or zero knowledge services, and by educating users to change their buying habits to favor more secure products. It would still be possible for a determined well-funded attacker to track you when using these tools, but it would raise the bar and that's a start.
While Facebook & Google get a marginal benefit from being able to read user-to-user communications, I doubt they're "reliant" on that. They have plenty of other explicitly-public, or intentionally-shared, signals to target their ads. For example, for now the only thing Google is absolutely 'reliant' on is seeing the queries you send their search engine.
Wait, so the standard by which we're deciding whether or not we're cool with unaccountable private entities reading our personal communications for their own (again, unaccountable) ends is whether or not they're "reliant" on it?
It's not that we're cool with it, it's that they don't need it. They can stop if customers decide it's sufficiently important. If all Google had to target ads with was the search terms for an individual search, and they otherwise had no idea who you are or any history on you, they would still be making megabucks. Progressive and All State are still going to pay a ton of money to show ads to anyone who types "car insurance" into the search box.
They might make less money than if they can target more accurately, but not so much less that their business model isn't viable. Which means the issue isn't the business model, it's how to actually design systems that preserve privacy.
People can and do sue Google all the time for violating laws and their own policies for Gmail. Turns out they rarely have a legitimate case, but the point is that Google is not unaccountable except in the imaginations of cynical paranoiacs who, by painting an exceptionally well-behaved company with the same brush as admitted adversaries, undermine incentives for companies to behave well.
No, but the idea advanced by ~fidotron, in the comment to which I was specifically replying, was that big companies like Google and Facebook will fight secure communications because they're dependent on viewing users' communications to make money.
They're not; other more-public or inherently-shared signals are far more important drivers of profit for them. Their businesses leave them plenty of room to be allies in a drive for secure communication.
(Pressure from their governments is of course another matter. But their business models are not a major problem.)
More precisely -- it focuses only on public sector mass surveillance. Private sector is in some ways more insidious since they are not bound by any laws, and are also in some ways even less transparent than the NSA.
Actually, it might be easier to simply treat them (NSA and Google) as the same entity, given that Google was created by the TLAs[1]. It seems the surveillance business model isn't really separate.
As for technical solutions - I seriously doubt that will stop much when you're facing what is effectively a new version of COINTELPRO. Especially when your enemy has access to not only the government purse, but also the giant pile of money at Goldman Sachs[2].
I fully agree there is a lot that needs to be fixed in terms of technical solutions (encryption must happen), but this is a case of government agencies going rogue, and they need to be reined in if we want to keep what's left of this country.
--
Serious question for everybody:
Many high-level people in the government seem to be explicitly violating the Constitution while generally acting like they know the law can't touch them, and the NSA and CIA seem to be very successful at controlling the situation (CIA: avoiding prosecution for torture when it should have been easy to at least get a trial; NSA: forcing bill amendments like the recent Section 309 mess, not to mention the blatant Fourth Amendment violation in most of their recent activities). These kinds of abuses should be addressed by prosecution and regulation, but those remedies seem to be growing less and less likely.
If the intelligence community has replaced some or all of the Constitution, at what point can we start calling this a coup? And at what point does actively trying to subvert the Constitution (the thing most of these people took an oath to defend) meet the very definition of "making war" against the country?
Pretty much every piece of IR research at US universities is funded in part by programs that take money from defense programs. That article goes a long, long way from anything supported by the facts the author claims.
I support the EFF, but this will not help our biggest (technical) problem: we no longer own our devices and need to make uncomfortable choices to get them back, i.e. to stop using proprietary operating systems and black-box hardware. RMS was never as unreasonable as he seemed; he just thought things through completely and arrived at the right conclusions.
Also, the EFF is available on http://smile.amazon.com: they'll receive about 0.5% of almost every purchase you make, as long as you buy through the smile subdomain.
I'd like to suggest another tool for fighting mass surveillance: mass chaffing. Most of the time our internet connections are idle. And when they aren't, we very rarely use all our bandwidth. What if, instead, we each used a small amount of our resources to crawl the web. We could easily generate 1,000 to 10,000 chaff requests for every genuine one — completely drowning out any signals from our browsing behaviour. This is particularly nice because it highlights the stupidity of trying to find a few needles by searching every haystack in the world. Let's all make hay.
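To make the idea concrete, here's a hypothetical sketch of the scheduling side of such a chaff generator. Everything here, names included, is invented for illustration; as the replies note, making chaff genuinely indistinguishable from real browsing is the hard part this sketch glosses over.

```python
import random

def make_chaff(real_request, url_pool, ratio=1000, rng=None):
    """Given one genuine request, produce `ratio` decoy requests drawn
    from a pool of crawlable URLs, copying the browser's identifying
    headers so the chaff isn't trivially separable from real traffic.
    `real_request` is assumed to be a dict like {"url": ..., "headers": {...}}."""
    rng = rng or random.Random()
    chaff = []
    for _ in range(ratio):
        chaff.append({
            "url": rng.choice(url_pool),
            # copy User-Agent, cookies, etc. so decoys match the real browser
            "headers": dict(real_request["headers"]),
            # jitter timing so decoys don't arrive in a detectable burst
            "delay_s": rng.expovariate(1.0),  # Poisson-ish inter-arrival times
        })
    return chaff
```

A real client would then drip these out over the connection's idle time rather than firing them all at once.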
This is a bit naive. Your ISP peering bandwidth isn't the complete sum of all its customer promised bandwidth. You have access to bursts of high bandwidth precisely because most of the time our internet connections are idle. Yes, this "burst" can actually be sustained 24/7 by some number of people, but not at a mass scale.
At first thought, chaffing seems like a pretty good idea, but isn't it vulnerable to analysis? By which I mean, if the spooks can demonstrate (based on tracking identifiable markers in your cookies or some other means) that you're at your office, shouldn't they be able to effectively ignore any traffic from your home internet connection?
Maybe if we were all running Tor exit nodes or something, but naïve chaffing sounds pretty ... well, naïve.
Obviously this "chaff bot" will need to copy your user agent string and tracking cookies. It'd also need to behave exactly like a web browser in every way that can be detected. But is chaffing itself a good idea?
If you're willing to use a lot of bandwidth for something like this, just set up a Tor exit node. Somebody else's actual traffic is going to be a lot more convincing than whatever some algorithm can come up with to fake it.
Trackmenot does this on some level, I'm using it currently to clutter up my search data. Doesn't matter much if it's just me, but if enough people use it then it's at least fighting back in some small way.
As software developers, we can help by building easier-to-use frontends. For instance, GPG is a great program, but the front end was lacking a bit the last time I used it.
We can also build systems designed from the beginning with users' privacy and data security in mind.
Front-ends for GPG (particularly: a good front-end that solely implements the 20% of GPG that regular people actually need) and for TLS certificate management are two of the most important privacy projects waiting to be done.
Last I checked there isn't even a good UI to perform simple file encryption on most OSes. It's awful. There are huge holes in the market. I would pay for that.
That's because simple file encryption is really hard, and when it works well, you don't even notice that it's working. Encryption is completely invisible to common people. It's not like locking your money in a safe, where you can feel the heaviness of the door and hear the clank of the lock.
"It's not like locking your money in a safe, where you can feel the heaviness of the door and hear the clank of the lock."
I think the perfect is being the enemy of the good here.
Personally I would pay for a really good UI for doing that -- just encrypting files so I can stash them safely in cloud storage or transfer really sensitive stuff. I'd love to be able to right-click and encrypt/decrypt with a GPG public key or symmetric passphrase.
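For what it's worth, the plumbing behind that right-click flow is mostly just argument construction around the stock gpg CLI. The flags below are standard GnuPG; the helper functions and their names are my own invention for the sketch.

```python
def gpg_encrypt_cmd(path, recipient=None):
    """Build the argv for encrypting one file with stock GnuPG.
    With a recipient: public-key encryption to their key.
    Without: symmetric encryption (gpg will prompt for a passphrase)."""
    if recipient:
        return ["gpg", "--encrypt", "--recipient", recipient,
                "--output", path + ".gpg", path]
    return ["gpg", "--symmetric", "--cipher-algo", "AES256",
            "--output", path + ".gpg", path]

def gpg_decrypt_cmd(path):
    """Build the argv for decrypting, stripping a trailing '.gpg'."""
    out = path[:-4] if path.endswith(".gpg") else path + ".out"
    return ["gpg", "--decrypt", "--output", out, path]
```

A context-menu front-end would just hand these lists to `subprocess.run()`; the hard remaining work is key management UI, not the encryption call itself.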
I have full disk encryption, but that's very coarse-grained. Unlock the machine and it's defeated. My hack right now is to use encrypted OS X .sparseimage files, but that's OS-specific and clunky. I also have these scripts:
They kind of suck but do the job in a platform-independent way. They could be made slightly better by trying to secure-erase the original source file, etc., but they work.
The other problem with invisible omnipresent encryption is that it lacks a quality that I call "situational awareness." I like knowing that something has in fact been encrypted. Seeing a file extension change, like gzipping a file, tells me that yes in fact something has happened.
A classic example I use of poor situational awareness in security is IPSec encryption setup between two boxes. The only way I know of to verify that the traffic is actually encrypted is to tcpdump the raw interface and look. The (piss-poor) IPSec tools do not really tell you this in a non-confusing straightforward way.
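To illustrate the kind of check that's missing from the tools: given raw IPv4 packets (from tcpdump or a pcap, say), you can at least verify the protocol field says ESP or AH rather than plaintext TCP. A toy sketch; the function names are my own.

```python
# Toy "is this actually IPsec?" check on raw IPv4 packets.
# Protocol numbers per IANA: 50 = ESP, 51 = AH, 6 = TCP.
ESP, AH, TCP = 50, 51, 6

def ip_protocol(packet: bytes) -> int:
    """The IPv4 protocol field lives at byte offset 9 of the header."""
    if len(packet) < 20:
        raise ValueError("too short to be an IPv4 packet")
    return packet[9]

def looks_ipsec(packet: bytes) -> bool:
    """True if the packet is ESP or AH, i.e. plausibly protected traffic."""
    return ip_protocol(packet) in (ESP, AH)
```

This is exactly what eyeballing `tcpdump` output tells you, just automated, so a monitoring script could alert the moment traffic between the two boxes falls back to cleartext.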
Talk to @adamcaudill on Twitter -- he's working on an encrypting camera app (photos are decrypted offline) and would probably love to collaborate on, e.g. frontends for gnupg or reop.
It's too far up the stack and too coarse-grained. If I save a file on my Mac, it is encrypted, but if I transfer it to Dropbox it is decrypted and then transferred.
Right now I solve this by using command line tools or encrypted .sparsebundle images. That sucks.
My thoughts are that we could really do with encryption being the default. If I upload a file via gmail, it should be encrypted with the recipients' public key. If I upload a file to Dropbox, it should be encrypted with my public key. Both should be overridable (e.g. public Dropbox link).
The key itself can be kept secure using hardware tokens for encryption/decryption/signing. The major problem with that seems to be access across multiple devices. Would it be possible to produce a wireless access token with a secure handshake that could be used for all devices?
I have no idea how we'd get there or how we'd convince people to go along with it.
Others have cited Mac options, and I know Kleopatra in KDE adds a context menu on every file to encrypt it, which I just tried, and it works fine with my PGP keys. It's like three clicks. It even supports compression.
There have been various efforts to solve this problem for e-mail recently.
Mailpile[1] is a great web-based mail client that you can host. It stores mail encrypted and has a very nice interface that makes handling mail encryption easier for users.
Protonmail[2] is both a mail service and a nice front-end for handling encrypted e-mail. When a user sends encrypted mail through the ProtonMail interface, the e-mail is sent with a link pointing to a webpage that asks for a shared password. They have raised more than half a million in their crowdfunding campaign.
Both are very user-friendly, very promising, and still in early beta.
I've always thought that the problem with the NSA behemoth is not so much that it's collecting everything, on everyone, but that it's the only one allowed to do anything with it.
I don't think the solution is privacy. I think the solution is to make the NSA's databank available to everyone. This is just as impossible a goal, I know, as getting rid of the NSA in the first place - but hear me out.
Just imagine what sort of world we would be living in if everything the NSA has been collecting and storing about our modern civilization for the last 15 years was available for anyone and everyone to access. I'd be perfectly okay with harvesting all data, everywhere, on everyone - if everyone had access to it, freely and without secrecy.
I think, were this the case, people would be a lot more careful, and more importantly aware, about the things they do and say online. The only reason this whole thing is so damaging is that it's shrouded in total secrecy. I have no idea what sort of 'bad data' is in the database about me, or those around me; were I able to access this data, it wouldn't have as much of a negative effect on me, personally. I could correct it, get it removed, and so on.
Teaching encryption is one thing. Teaching the value of open and honest communication is another thing entirely. Somehow, the solution to the problem of having masters and overlords who have immense power over me, is for me to have some of that power myself. If this were true for everyone - and not just an elite class of humans who have decided to rule humanity through secrecy and domination - then I think the problem would be a lot less relevant. I know as an artist, I'd sure love to be able to broadcast my art far and wide; it seems the NSA would be a perfect platform for that. Kind of like what the Internet was originally promised to be/do, before it became the playground of the corporate elite ..
But it's not just about "open and honest" communication; it's also about them tracking behaviour, and who gets to judge what counts as "open and honest" behaviour? Do you want somebody trawling through your search and browsing history?
As long as I know who is doing it, and where, I've got no problem with that - because then I can do something about it, like track the person down and ask them why they're doing it.
But of course it's not as simple as that; of course there are still nefarious no-good types out there who would use this information to their advantage, and that's exactly the problem we have with the secrecy. If I knew who was tracking me, because their activity is also open and available for scrutiny, it would be a lot easier to do something effective when I disagree with their doing so; it might also help me find others who are interested in the same things as me. It's a two-edged sword.
> "U.S. law and the Constitution protect American citizens and legal residents from warrantless surveillance. That means we have a very strong legal case to challenge mass surveillance conducted domestically or that sweeps in Americans’ communications."
-----
The EFF is living in fantasy land. The NSA has never considered itself bound to law because it's not a creature of statutory law but came into existence by decree.
> "no existing statutes control, limit, or define the signals intelligence activities of the NSA."
As long as you can surveil one person, you can surveil seven billion people. Computers automate processes. As processes digitize, they become automated. The NSA digitized their surveillance process, and automated it. At that point, they could surveil one person via software, so why not scale it?
Should we accept mass surveillance, like nuclear weapons, as inevitable? If so, it should follow that we adopt a "mutually assured destruction"-inspired doctrine, whereby multiple countries agree on the limits of surveillance. Perhaps the world would be more comfortable if constantly surveilled, but wholly informed. Who has my data? What have they done with it? Can I see it?
Or maybe, everybody should be able to surveil everybody. True openness, and therefore predictable accountability.
> Should we accept mass surveillance, like nuclear weapons, as inevitable? If so, it should follow that we should adopt a "mutually assured destruction" inspired doctrine, whereby multiple countries agree on the limits of surveillance.
It may be, but the MAD analogy doesn't hold up, because nuclear weapons have immediate and distinct effects; surveillance is hard to witness and hard to measure. MAD also uses the threat of nuclear use to keep other parties in check, while surveillance uses, what, the guarantee of use? It doesn't match up.
> Perhaps the world would be more comfortable if constantly surveilled, but wholly informed. Who has my data? What have they done with it? Can I see it?
I don't see that as particularly feasible.
> Or maybe, everybody should be able to surveil everybody. True openness, and therefore predictable accountability.
Perhaps we may move toward enlightenment about one another's secrets someday, but I'm personally skeptical of that path as a solution to surveillance.
Secrets act as a stabilizing force: keeping something unpleasant hidden can keep people in harmony. If the shock of a reveal is great enough, it can break down relationships and the institutions they form, and I wouldn't want to risk doing that en masse. Uncovering unpleasant truths makes us stronger, but we have to recognize that it's a process, and not all wounds should be opened at once.
But that's not wholly relevant anyway, because there wouldn't be a perfectly symmetrical reveal of information. It would be gradual and asymmetric, which means the holders of the information could use it to their advantage. You'd end up with a secret-holding class that could manipulate the other classes. It's not a pretty idea for believers in an equitable society.
Most people just don't care or gave up caring. It's too cumbersome to seriously protect privacy. If friends use Whatsapp and they have my address in their phone's address book, Facebook knows it. Same with photos. And as long as NSA has access to all Google and Facebook data, it's efficient for them.
Two interesting projects to mitigate this:
Terms of Service; Didn't Read:
https://tosdr.org/
I don't think the solution is going to be technological in nature. It can only be cultural.
We absolutely need a Peace movement that can overpower the current stance of the war movement. We absolutely need to make peace. Probably the only thing that will get us out of this situation is the hardest thing - the most impossible thing - for any of us to do: defeat our enemies by becoming friends with them.
We have to work harder to defeat intolerance, hatred, bigotry. These things fester, and are contagious - and they are the lynchpin of the argument being used by the powers to enact their heinous rules and laws, which enslave us all.
We need a platform of peace that actually really makes peace happen. Some way of getting the rabid ideological Christian crusaders at the same table as the extremist Muslim terrorists. Some way to bring these massive differences - which are utterly arbitrary and without real substance - to the point where the human beings, on the other side of the books and vitriol, see each other and treat each other as any two humans are capable of doing - with peace, with love, and with an honest and sincere desire to see the party across the table survive, flourish, and prosper.
We need to encourage people to be intolerant of intolerance. This is the biggest issue. The NSA wouldn't have any fish in the pond, if we were to gain a level of human understanding, compassion, and cooperation, between the warring factions.
However, this is a difficult task. Maybe we, the human species, won't ever be up for the task. Unless challenged by something else, from elsewhere ..
We need peace. We cannot fight evil for peace, just as we cannot fight a war against terror. Peaceful demonstration movements are destroyed by making them fight; it's really easy to do. The Vietnam war, for example, or the fight for the right to protest, or the fight against X, Y or Z, or Occupy being sucked into fighting for the right to demonstrate in a public place. I'm even talking about being proactive and demonstrating against evil and wrongdoing. Even famous campaigners and activists championing what is good and opposing political injustice cannot lead us to more peace. The only ideology I have come across that really can is nonviolence.
I also like the alien threat as incentive for global unity idea - it makes sense in terms of human group psychology (The Other). And I suspect that group psychology may actually make my first paragraph utterly false, as any movement, any group, any collection of humans may not be able to be peaceful. (individuals however could be...)
A logo we can place in all our websites? Together with some kind of sentence, like "I support privacy", or "healthy minds need privacy", or "privacy is a human need", or something better you can come up with?
I am genuinely impressed with the first few paragraphs, where they explain their motivation for doing this. Rarely in today's world will you see such noble intentions; I think this is one of the most beautiful things I have read in a long time.
I've run a search on this page for 'education' and got no results. Education, education, education (like "location, location, location" in real estate). Worth repeating: E D U C A T I O N is the game!
Some might find the game plan partly shared (in addition to the main angle on Snowden revelations) by Kim Dotcom and the Internet Party of New Zealand:
One of the most important things that could be done is for browser vendors to change their approach to self-signed certificates. At the moment it looks like a disaster when you visit a site with a self-signed certificate, and certainly no reputable business would use one. However, given what we know about how compromised certificate authorities are likely to be, I actually now consider a self-signed certificate more private than one signed by a CA. I would much rather everyone signed their own certs and we used pinning and other, more decentralised strategies to track authentic certs than any kind of centralised model.
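The pinning computation itself is cheap: the usual pin (as in HPKP and Chromium) is the base64 of the SHA-256 hash of the certificate's SubjectPublicKeyInfo DER. A minimal sketch, assuming you've already extracted the SPKI bytes (e.g. via `openssl x509 -pubkey`); the function names are my own.

```python
import base64
import hashlib

def spki_pin(spki_der: bytes) -> str:
    """HPKP-style pin: base64(SHA-256(SubjectPublicKeyInfo DER))."""
    return base64.b64encode(hashlib.sha256(spki_der).digest()).decode("ascii")

def pin_matches(spki_der: bytes, expected_pins: set) -> bool:
    """Accept a self-signed cert iff its key pin is one we've seen before
    (trust-on-first-use, like SSH's known_hosts)."""
    return spki_pin(spki_der) in expected_pins
```

Pinning the key hash rather than the whole certificate lets a site rotate certificates without breaking clients, as long as it keeps the same key.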
Is an always-encrypted IP standard part of this? I would think at this point public/private-key encrypted IP would be a huge selling point for companies. People will buy new hardware if they know it makes their communications more difficult to collect.
First I want to congratulate the EFF in all the great work they do. Thank you guys.
Second, I want to address a problem that concerns me a lot and that I see again in this discussion. I have a huge problem with equating the Facebooks and Googles with the NSA. Some people make it seem as if the NSA (and similar agencies) is the lesser, or at most an equal, evil compared to Google, Facebook and co.
I think this is madness. How can we compare companies that try to sell us ads with an organisation that sends police and military forces to people's houses all over the world and puts them in prison or even kills them? In what world are we more afraid of a company that offers us 10% off a laptop we thought about buying than of the serious danger the NSA poses to democracy, liberty and peace? I am unable to understand this position. I would rather have 100 Googles and 100 Facebooks than 1 NSA, and that's while living in a country on friendly terms with the US; if it were otherwise, I would rather have 1000 Googles or Facebooks.
I don't want to come across as a defender of big business; in a parallel world with no NSA and no mass government surveillance, I would happily join the Facebook bashing. I would happily lament the fact that Gmail has centralised email too much. But in this world, those are things I don't think about much, or rather think about only in the context of the NSA.
Some people argue that it's the fault of Google and Facebook that the internet is centralised, which makes it easy meat for the NSA; they then go on to argue that Google and Facebook are basically just as bad as the NSA because they helped in the process. This argument relies on many assumptions that are, at least in my mind, not at all valid.
First, you assume the internet would not be centralised without these companies. That is not at all clear; we see again and again that centralisation has benefits for many users and for the companies themselves. It is simply good business to gather data: every startup does it and every big company does it too. Any company that doesn't is very unlikely to become the next Google or Facebook. I love Open Whisper Systems and use their products, but they are not going to be the next Google.
Second, you assume that centralisation always makes spying easier, and I question this too. Compared to a cypherpunk future of peer-to-peer darknets with full crypto, onion routing, forward secrecy (add more crypto buzzwords here), that is of course the case; but compared to the likely alternative, it's not that bad. Even for very technical people like those in this forum, it is hard and very time-consuming to run your own mail server and use PGP. The likely alternative is people running un-updated boxes with old OSes and old tools, bound together by hand-knitted crypto (if any). We learned from XP and IE6 that you simply cannot rely on ordinary people to keep up to date with features, let alone security.
Without getting a whole lot better at open source and peer-to-peer, we will not create a digital revolution. I am still up for trying, will help where I can to make that future happen eventually, and I hope you will too. But in the meantime we have Google and Facebook. While I don't expect much from their morality (I do expect a little), I hope they are at least smart enough to protect their own data from the government and the competition. Facebook is shipping fantastic crypto in WhatsApp, and Google has started to use better encryption internally and is fighting on many other fronts. These technical changes will hurt the NSA; this is, after all, a fight about economics (the cost/benefit of mass surveillance), and 600 million encrypted chat users will cost the NSA, even if it can still get at the data by hacking every single smartphone on the planet.
The worst things Google or Facebook could do to me would be unpleasant; the worst thing the NSA could do is hard to even imagine. The NSA and secret services like it are among the biggest dangers to our freedoms, and it doesn't matter where you sit on the political spectrum: guns are at risk as much as healthcare.
tl;dr:
So let's first worry about evil governments with SWAT teams and drones, and then worry about too many ads.
>The entity that’s conducting the most extreme and far-reaching surveillance against most of the world’s communications—the National Security Agency—is bound by United States law.
This seems clearly wrong -- when people like Clapper can essentially lie under oath to Congress and suffer no legal consequences, it is rather dubious to claim the NSA is bound by US law. There are increasingly 'secret' courts, 'secret' laws, 'secret' budgets, 'secret' legal memos, and so on; why would you begin your game plan by stating something so clearly incorrect? The reason we know as much as we do about what is happening is whistleblowers, and most of those people are in hiding, on the run, dead, or being jailed by the US government. The EFF is out of touch with reality if they truly believe the NSA is beholden to the law rather than to those controlling and using it as they see fit.
Getting caught blackmailing a Congressman would be very, very bad for them. It's far more likely that money from a secret budget ends up in the campaign fund of friendly politicians, or that politicians who do what they want end up with lucrative consulting work after they leave office, much as it is with other industries.
But what do you propose to do about it either way? Go join Lawrence Lessig's anti-corruption campaign. In the meantime it's not like positive change is impossible, it's just harder than it ought to be.
Because bribing and blackmailing are not the same thing. The first one is, unfortunately, legal as long as it's done the right way (through campaign contributions and PACs); the second one is still a crime.
Except one leaves the congressperson happy and one leaves the congressperson angry. Do you think the angry or the happy person is more likely to anonymously disclose this information to a journalist?
So they'd spy on everyone, throw whistleblowers in jail, pass naked selfies of citizens around and more, but they'd never try to blackmail politicians or others.
There's no better time to get behind projects that aim to decentralize the web, which covers many of the bullet points in the EFF's list all in one go. The primary project IMO to keep an eye on is MaidSafe[0], and I'm actually quite surprised how little attention it gets from the community here. Whereas most service/companies/developers aim to build privacy/encryption into their own apps, the goal of the SAFE network is to create an internet backbone on which all apps are automatically secure, making it much easier to build secure apps for the average coder.
1. Pressure technology companies to harden their systems against NSA surveillance
2. Create a global movement that encourages user-side encryption
3. Encourage the creation of secure communication tools that are easier to use
4. Reform Executive Order 12333
5. Develop guiding legal principles around surveillance and privacy with the help of scholars and legal experts worldwide
6. Cultivate partners worldwide who can champion surveillance reform on the local level, and offer them support and promotion
7. Stop NSA overreach through impact litigation and new U.S. laws
8. Bring transparency to surveillance laws and practices
~~
Addendum: Laws & Presidential Orders We Need to Change
* Section 215 of the Patriot Act, Known as the "Business Records" Section
* Section 702 of the FISA Amendments Act
* Executive Order 12333
* The Funding Hack
~~~
Related cases:
Smith v. Obama [1]
Jewel v. NSA [2]
First Unitarian Church of Los Angeles v. NSA [3]
[1] https://www.eff.org/cases/smith-v-obama
[2] https://www.eff.org/cases/jewel
[3] https://www.eff.org/cases/first-unitarian-church-los-angeles...