Hacker News
Bruce Schneier: Privacy in the Age of Persistence (schneier.com)
42 points by anuraggoel on Feb 27, 2009 | hide | past | favorite | 18 comments



"Data is the pollution of the information age. It's a natural byproduct of every computer-mediated interaction. It stays around forever, unless it's disposed of. It is valuable when reused, but it must be done carefully. Otherwise, its after effects are toxic."

I agree with this idea; that's why I prefer the regulation of data collection and storage, rather than use. Allowing companies and governments to collect massive amounts of data about people that they aren't allowed to use in certain ways today is a ticking time bomb. This data is attractive to criminals who aren't bound by laws anyway, and corporate mergers or changing laws can retroactively harm privacy based on data that was previously collected.


I've been toying with a Creative Commons-like service, one where, as an organization, you can choose the criteria that meet your privacy policy. This would have a "visual vocabulary" akin to the CC badges or nutrition facts on food for different privacy models. It seems like a missing component is clarity and transparency when it comes to understanding the implications of several facets: collection, storage, and use being the big three.

This paper by Irene Pollach [http://portal.acm.org/citation.cfm?id=1284627] delves into some of the details and weaknesses in privacy policies and how legalese can weasel out of a real policy.

Schneier's article makes me even more concerned about storage -- which I think most people dismiss if the use argument is addressed.

I think a privacy vocabulary would be wonderful.


That sounds suspiciously like P3P. http://www.w3.org/P3P/Overview.html

Why did P3P fail and how can those problems be avoided in the future?


I think P3P never gained traction in part because it was too early. I don't remember people getting in huffs over privacy policies in 2000-2002. They weren't common then. Terms of use were -- deep linking policies. Heh.

One of the major P3P criticisms is the lack of enforceability. Whereas a Creative Commons license is applied to a work, if a P3P "contract" is applied to a site, how does one enforce it?

I think with the oversight a community provides (the Facebook ToS is a recent example), a community or communities could keep companies' policies _more_ honest. I'm currently formulating my thoughts on this; I'm not completely versed in privacy, or in previous attempts to clarify it and add trust, understanding, and accountability, like P3P.


I think it is wholly because it is unenforceable and unverifiable when it comes down to it. P3P allows website owners to assert their privacy policies, and some aspects of the TOS, in "machine readable" formats. This standardization was supposed to enable automatic filtering when you visit a site: you tell your browser you are only interested in sites that "collect cookies for the purposes of aggregate data collection" and "don't sell personal information to third parties", and the browser was supposed to warn you, or change its functionality, based on how your preferences matched the site's claimed assertions.

It doesn't work like that though, for the same reason the Firefox bad-certificate screen ended up being more annoying than useful: no one actually cared about security more than using the site. It's easier to override the settings and use the site. And you could never be sure whether the policy was ever followed until after the fact, when it's too late because the information is already out there.

And because of that and the way IE's default "internet zone" cookie policy worked, you pretty much had to, as a website, assert policies that were amenable to the IE defaults.
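For context on what those "machine readable" assertions looked like in practice: P3P defined a compact policy, a set of short tokens in an HTTP response header summarizing the full XML policy, which IE's cookie filtering keyed off. The tokens below are drawn from the P3P vocabulary, but this particular combination is an illustrative sketch, not any real site's policy:

```http
P3P: CP="NOI DSP COR CURa ADMa OUR NOR"
```

Roughly: NOI (no identified data collected), DSP (the policy references a dispute-resolution procedure), COR (violations will be corrected), CURa (data used for the current activity), ADMa (and for site administration), OUR (data shared only with the site itself), NOR (data not retained). Nothing in the protocol verifies that the site actually behaves this way, which is exactly the enforceability gap being discussed.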

This would only work when there is significant competition between interchangeable and interoperable sites anyway. Facebook asserts policies A, B, and C, while Myspace asserts policies X, Y, and Z. Well, those policy differences don't mean anything if I actually want to use Facebook because that's where my friends are. Privacy policies are only a differentiation point if the policies are different and the services are exactly the same, which is actually impossible (and not really in the individual sites' best interest anyway).

P3P has some use as a way to monitor the privacy policy and TOS on a site, and have your browser notify you of changes. I don't think this is necessarily better than what happened with Facebook TOS where someone was following it closely, actually read it, and raised hell about it. There's an emotional aspect tied to that, one that doesn't exist when your browser pops up a box with a warning you just want to dismiss and get out of your way.


This is a serious issue to consider, but one way we're not going to solve it is by trying to restrict data collection, or simply trying to hide our own digital footprints. The latest Facebook ToS episode tells us as much. To continue Schneier's automobile analogy, we're not solving pollution by un-inventing cars, but by coming up with even better clean technology. Similarly, reducing data collection doesn't seem to be a viable option; instead, we're going to have to come up with better technologies for access control & data anonymization.


This is exactly why easy-to-use programs that can encrypt or anonymize your data are good. People are willing to use something like Skype because it's easy to set up and run. Want to call someone? Type in a phone number and hit 'call'. The average user doesn't know or care that their conversation is being encrypted, but they benefit nonetheless.

I'm really glad to see that programs like Adium, used by the majority of my OS X friends, can encrypt conversations by default and interface perfectly with options like OTR ( http://www.cypherpunks.ca/otr/ ). I've managed to convince exactly one friend to use OTR - it's not an effective way to fix things because most people don't care. Getting programs to have useful defaults that protect you while still being easy to use is the key.

Now all we need is an open-source version of Skype ( http://www.qutecom.org/ ? ) that joe-anybody can use on their phone :P


Arguments for the importance of privacy seem to invoke either corruption (e.g. 1984, or the Cardinal Richelieu quote) or the risk of error (e.g. misidentifying a suspect because they share attributes). These remind me of the arguments used against artificial intelligence research. I see them as problems that can be worked around, not as a basis for more privacy measures.

Instinctively I feel that privacy is important, but I can't find any solid justification for the instinct. It bothers me slightly to know that I can be tracked by cell phone signals or a public-transit swipe-card, but I couldn't win an argument for the importance of privacy.


While corruption and error are certainly major factors in privacy, there are some things that are just private. I don't want people, even people I know and trust completely, to know the details of my sex life. Not because my sex life is crazy and deviant, but because it is private. I don't want to have clear bathroom stalls, not because I'm doing something wrong or worry about someone harming me, but because it is private. Voyeurism is an invasion of privacy; it doesn't hurt people in any other way, but it is illegal, and should be.

In short, privacy is important because of the instinct you mention. The desire for privacy is the reason for privacy. Privacy is like freedom: it is a basic human desire and requires justification to violate or remove, not to support.


Thanks, this gives me some things to think about. I do have to wonder though, is privacy really something inherently human, or are we conditioned by society to require privacy for comfort? I do think it's the former, but I still have to wonder.


I find privacy important because I do things which a sizable portion of society might find distasteful, but which I do not find distasteful. I value my ability to do those things without incurring the ire of other people.


There are many points involved, including (in no particular order):

1) It's subverting the usual innocent until proven guilty model - why not presume everyone is doing wrong and just keep watching until you find them? Because that's not just.

1.1) Cost - most people aren't doing wrong; if they were, everything would fall apart. Erosion-of-trust issues.

1.2) Well, everyone is doing wrong. And getting away with it. And that's OK. It's how things work. A good deal for a friend, an early warning, you scratch my back I'll scratch yours, nudge nudge, wink wink, say no more. On the small, local, personal level, it's fine - or at least, not always bad, which is practically the same thing. Besides, have you seen how many laws there are? You could be a lawyer all your life and still not even have skim-read once all the laws that you are legally obliged to follow at all times!

2) It's a power thing - someone is deciding what 'wrong' is, and it damn well isn't you or I. And what is 'wrong' will change year on year, government on government - but your records will always be there to be searched for the new kind of wrongdoing (wrongbeing, wrongthinking, etc).

3) It's very much a power thing. It's always presented in the form of a government or corporation watching the populace. It's never presented "If you've done nothing wrong, stream videos of your boardrooms and offices to the world".

4) Knowledge is power. Where there's a hill, a pressure difference or a voltage difference, there is power. Knowledge is no different - there is power in a knowledge difference as well. Building up your skillset is generating a knowledge differential that makes you more valuable. Industrial espionage is shorting a knowledge gap to weaken its power. Insider trading is illegal because people with high pressure knowledge have a strong advantage. Mass invasion of privacy by government weakens your power against government (and increases their power over you), by companies weakens your power as a consumer, by everyone weakens every knowledge difference. You don't get a better environment, you get everything-the-same heat-death brown, all the hills flattened and the valleys filled in, if I can mangle some geography, cosmology and colouring in.

5) "If you've done nothing wrong, you've nothing to hide" is a false dichotomy; it implies that you only hide wrongdoing. I hide birthday presents and surprise parties. I hide my bank details so people can't take money from me. When parking my car, I hide things in the glovebox so they aren't stolen. I hide my government bureaucracy papers so people won't steal my identity. I hide my interest in computing so people won't take the piss. None of these things are 'wrong' but they are still valid things to keep quiet about, IMO.

6) It implies anonymity is wrongdoing, yet we have anonymous voting because otherwise it would be vulnerable to more corruption - checking whether you voted for the 'right' party.

7) People are obliged to wear clothes to cover genitalia, because having visible genitalia is a social 'wrong'. This has elements of hiding something because you are doing wrong (or being wrong), yet the hiding is the right thing to do in the current social climate. To some extent, right and wrong are fuzzy (See PG's essay on Thoughts you can't Think) yet you expect me to believe that there's a simple line around 'doing wrong' that someone else could see if only they could watch me more?

8) Judgement takes knowledge and experience. Do you really want every Tom, Dick and Harry government employee jumping down your neck with accusations of wrongdoing because they have no clue about your work or industry? "Why are you sitting there doing nothing while still on the clock? Scamming the customer, are you?" "No, I'm thinking of an algorithm that will scale to all their documents" "Thinking of an amalgam... amalgamation? You're not going to the dentist or buying a company, get back to work!"

9) Government and companies are paid for by taxpayers and customers. Do you want to foot the bill for them to watch you in case you are doing wrong?


I really appreciate that you took the time to write this, it's the sort of thing I've been searching for online and trying to figure out for myself to no avail.

I realize now that my question was overly broad, and the different types of privacy (from government, from corporations, from friends, from strangers...) can't be lumped together for the sake of argument. And yet you still managed to cover all of them in your points, which is great.


Seconded, thanks for writing this. I especially like point 5, because I hear people and corporations use the "if you've done nothing wrong" argument all the time.


Every time I read something by Schneier, I'm impressed with just how well he's able to put things. He's somehow able to impart the gravity of things without coming across like he's fear-mongering. I wonder how he's able to write so eloquently and accessibly on security, which is usually hard to do - I'd like to be able to write that well.


Agreed. He knows the material intimately, so can speak to it with ease.


Imagine if they had had the internet, with Google, the Internet Archive, etc., during the McCarthyism era in the US. That's the type of thing I am most concerned about.


Reading this, I had one of those -- hey! "I invented that first!" entrepreneur moments, since I had blogged on the same topic a week or two ago. http://www.whattofix.com/blog/archives/2009/02/who-was-i-aga...

Of course, like all big ideas, this stuff is "in the air" at a certain point in time and lots of people are channeling it. I think of it as a process sort of like waking up: usually you'll have outliers who warn of problems years or decades ahead of time without any traction, then suddenly everybody's thinking and talking about it. E-commerce was like that, and so is social networking. Who knows? Maybe Twitter is the next big change.

This is a society-changing trend, no doubt, and worthy of all the attention we can give it. While my post was overly lyrical, historical, and elliptical -- Bruce drives a truck right through the reader with direct analysis. I hope to see more writers take this on.

If I had to put the problem into one semi-poetic line, it would be:

Every detail. Easily recorded. Rarely noticed. Never forgotten.

Our species has never existed in a world where nothing was forgotten. Not only is the ability to forget a key part of remaining sane, it may be a key part of a functioning society.

We don't know -- we're in uncharted territory. But I do know that the matter is credibly huge and will not go away simply by us ignoring it.

And no, this is not a privacy issue. To think of it as just privacy is to miss the point. Even if we were the only ones able to access the data about us, is it healthy to have a life in which all the details are remembered forever? I don't think so. This isn't about ownership of the data; it's much more encompassing than that: it's about whether people are machines or evolving organisms. Machines don't care for the past. Evolving organisms are always forgetting and remaking the past in order to emotionally move forward. We may be reaching a "wet-ware limit" where our information systems are simply operating at too high an efficiency level for the interface to work properly with us sloppy, emotional, forgetful, slow, and illogical hominids.



