Hacker News
Facebook Further Reduces Your Control Over Personal Information (eff.org)
70 points by jlhamilton on April 21, 2010 | hide | past | favorite | 34 comments



This weekend one of my friends (IRL) asked me why I wasn't on Facebook. She was shocked, as many who know me are, to learn I'm not on Facebook. They see me as an internet guru. How can an internet guru not be on Facebook? What they don't know is that most internet gurus I know are also not on Facebook.

The reason I am not on Facebook is quite simple: I do not want to give Facebook my private information, because I do not trust Facebook. I do not trust Mark Z. I do not believe they have individual users' best interests at heart.

What I do believe is that they are interested in getting more users, sharing more information, and making more money. None of these are in my best interest. In fact, as Facebook gets more users, the quality of Facebook will continue to decline, as will their treatment of those users.

It takes a lot of money to operate 10,000 servers (or however many they have at this point). It takes a lot of ad revenue. It takes a lot of people and investors. All of that means continued focus on the bottom line to ensure survival.

Facebook's only value is in the information people put into it, and they will do whatever they need to with that information to keep powering those tens of thousands of servers.


I wish I could upvote you twice, as this is the essence of what 'normal' people do not understand and the internet gurus do: no big social network can survive without selling out its user base in some way. The rest is just mechanics.


I would say giving information to Facebook is giving information to the world. No profile means no information for anyone to link back to you, which means no one can screw you over in any way. Oh... and you also realize that there is a sun out there, that the grass is green, and that it actually has a smell.

If you have profiles up on the internet (not just FB), share only what you are comfortable losing. Once it's in different hands, it is not yours anymore.


"What I do believe is that they are interested in getting more users, sharing more information, and making more money."

I don't see why sharing more information will make more money in the long term. Keeping the users happy and engaged would seem more profitable to me. What is the logic?

"In fact, as Facebook gets more users, the quality of facebook will continue to decline, as will their treatment of those users."

I don't understand why treating the users badly would increase the value of the company.


This is an obtuse argument. The obvious point being made by the parent is that Facebook, as all companies do, will at some point or another put its own interests above those of its users. When you're selling widgets this is usually no big thing and you just start to fade away or have to reinvent yourself, but when you deal in personal information it's a different story.


As wmeredith replied below, I'll elaborate here:

Facebook has only one product, information. When companies want to grow, they have to produce more product. Facebook can produce more product in only three ways: getting more users, getting users to add more info, or sharing more of the info they've already added. There is no other way.

When new user accounts plateau, and when users are content with the amount of info they've added or the additional value of that data declines due to signal-to-noise, the only method left is to share more of the information they already have, in the form of greater public visibility.

Remember, Facebook's customers are not the users; they are the advertisers. If you want to establish a relationship with a company designed to make a profit, profit should be the foundation of the relationship. Facebook's users do not profit. They provide labor to improve the value of Facebook with no compensation in return, so the relationship is unbalanced. Even factory workers are paid, albeit not much comparatively, and someone else somewhere making even less will migrate to the factory to fill the jobs people quit.

In Facebook's case, turnover can't be filled. The people won't come back, and there is a diminishing quantity of new users to pick from. Eventually, users will stop joining because the knowledge we are sharing here will reach them before they join, and they'll have lost their naive innocence -- a requirement of any sharecropper.

The reason more sharing of information makes more money is that money is made through the information: advertisers can better target ads. By treating users badly, I mean acting in an increasingly selfish manner by doing things their users don't appreciate. Users don't want someone unilaterally deciding how much of what they consider private information is shown to third parties -- millions of other parties -- like the FBI, police, the IRS, or potential employers.


"Facebook's users do not profit. They provide labor to improve the value of Facebook with no compensation in return, so the relationship is unbalanced."

I think this is demonstrably false. It's true, my brother doesn't get paid by Facebook to spend time on the site. Google doesn't pay me to use GMail either, and yet I use it every day all day. Money isn't the only way to provide value. And there's no way Facebook could have reached 400 million users without providing value to a lot of those people.


I agree that in the short term Facebook could turn its current information content into money by disclosing it in ways that will displease the people who create the content. And it is possible that foolish management of the company could make this happen. But there seems to be a long-term destruction of the value of the company in doing that, because the content creators will migrate to more pleasant sites. The current information content of Facebook is a fixed resource that will lose value over time. What they need is to have people continually adding new content. Otherwise their visitors will decline and their advertisers will have no audience.


Just FYI, they operate around 30,000 servers: http://www.datacenterknowledge.com/archives/2009/05/14/whos-...


So the future of Facebook is Myspace? It'll grow and decline in quality until one day some small upstart that goes back to the basics of social networking takes the crown, and history will repeat itself. Unless, of course, in the future cheaper processing power, storage space, and bandwidth make it less expensive to maintain a very large social network.


That Facebook's concern for its users' privacy is secondary to more "practical" matters has been clear for some time now -- though perhaps most evident with Zuckerberg's memorable, "privacy can no longer be an expectation." And this recognition is not limited to HN-types and "internet gurus." General users, college students especially, also realize this, and so a pattern has developed in which younger users (speaking from the perspective of a college student) exercise a degree of caution (probably less than sufficient) while in college and then, upon entry into the workforce, remove the Wall and photos from their profiles; their Facebook profiles thus become little more than a prettier LinkedIn. If this pattern reflects more than just my experiences, Facebook seems poised to be your online passport at the expense of its capabilities as a social communication hub.


Wow, this is pretty lame. The various stuff I've become a "fan" of (apparently now renamed "like") was under the understanding, explicitly stated by Facebook, that it wasn't part of the public information. Seems they changed that without warning or opt-out? Is there at least an easy way to quickly remove all the hundreds of things I've become a fan of before Google starts indexing them, if I'd prefer them not to be indexed?

What I'm getting from this is that you should assume that all your Facebook information is publicly available, and act accordingly, because they could make it so at any time. At this rate of trustworthiness, I wouldn't be that surprised if in 2 years status updates were made retroactively public with no opt-out (and maybe without even telling you).

Edit: Is this even legal? Consider the following scenario: Someone signed up for a Facebook account 3 years ago, and entered some of this information, at a time when their privacy policy explicitly promised that the information would not be shared. They have not logged in since, so cannot be said to have even implicitly agreed to a change in the privacy policy (and Facebook has not mailed out any notice of the change). Now their information is made public, in violation of the privacy policy. Not sure how easy it'd be to enforce, but at the very least it seems sleazy.


I deleted my response to this by mistake.

Here's what I'd posted, in essence:

http://blog.facebook.com/blog.php?post=382978412130

Users can edit which connections they want to add to their profile. They can also hide some connections on their profile, and remove others if they so choose.

Disclosure: I work for Facebook. My opinions are my own.


> you should assume that all your Facebook information is publicly available

Indeed you should, I'd say this applies to any website.


That would certainly make mint.com pretty interesting...


mint.com is a much more absurd idea than facebook. And the fact that it has users is even more mind-boggling.


And online banking for that matter...


http://news.ycombinator.com/item?id=1281750 pointed to a Time article about http://suicidemachine.org/ which can be used to delete your whole Facebook account if you want.


That's exactly what I did. Merely complaining about this is a bit pointless, because they've repeatedly demonstrated that they have no interest in users' privacy.


And now Facebook puts TOS and privacy policy changes on a Page (Facebook Site Governance) that you have to be a fan of/follow/like in order to be aware of anything.


[deleted]


Hmm, if it's opt-in that's much less problematic. EFF's description of "removed its users' ability to control who can see their own interests and personal information" certainly doesn't sound opt-in! Are they talking about different things, or just mistaken?

I don't seem to see any dialogs on my own page...


I don't think it's fully rolled out yet.


yup, doesn't work for me yet either. But thanks for the fast response and showing that you care.


Well wait, this doesn't say anything different than the EFF article, does it? Yes you can still remove connections, but you also remove the interest as well right? (I have no idea; I, like many of the above, have zero trust for FB and deleted my acct months ago.)


"Yes you can still remove connections, but you also remove the interest as well right?"

Ok, then. If I later decide I really want to repost that info (and keep it friends-only), I'll probably stick it in "Favorite Quotations" or "About Me." If they scrape that too, maybe I'll repost it in captcha-like text in my photos section.


But why should you have to keep playing catch-up? Yes, it's not HARD to put stuff there, but you can't argue it's not bad UX to constantly pull the rug out from under users like this. Captcha-like text? Seriously?


Sure. At that point, participation in Facebook would be nothing more than a game I play on occasion. If I get annoyed with it, I can put it away for a bit. This just makes the "post information" part more difficult. Except I don't much care about that part anyway. If I won't have my information protected, I won't post it, and I won't feel like I've lost much by not posting it anyway.

Really, this just solidifies Facebook's role in my life as nothing more than an occasional diversion. My reaction is mostly amusement at Facebook's self-destruction.


Unless I am misunderstanding this http://bit.ly/c4NFgO, it appears that this article misrepresents the changes in a couple of important ways:

1) Facebook says this is opt-in. From their announcement: "Opt-in to new connections: When you next visit your profile page on Facebook, you'll see a box appear that recommends Pages based on the interests and affiliations you'd previously added to your profile. You can then either connect to all these Pages—by clicking 'Link All to My Profile'—or choose specific Pages. You can opt to only connect to some of those Pages by going to 'Choose Pages Individually' and checking or unchecking specific Pages. Once you make your choice, any text you'd previously had for the current city, hometown, education and work, and likes and interests sections of your profile will be replaced by links to these Pages. If you would still like to express yourself with free-form text, you can still use the 'Bio' section of your profile. You can also use features and applications like Notes, status updates or Photos to share more about yourself."

They reiterate this further down the page: "If you don't want to show up on those Pages, simply disconnect from them by clicking the "Unlike" link in the bottom left column of the Page. You always decide what connections to make."

2) If you choose to opt-in you can control whether your friends see the connection on your profile by using the privacy settings.

So: it is opt-in, there is a simple way to remove yourself if you change your mind, you can control the visibility on your own profile, and if you don't want to faff around with the connections, you can just list this stuff as text in some of the available text-only fields.

What is the issue?


"As Facebook's privacy policy promised, 'No personal information that you submit to Facebook will be available to any user of the Web Site who does not belong to at least one of the groups specified by you in your privacy settings'."

This is STILL true, and it's publicly known how to maintain privacy. You can keep all your information private to everyone who isn't a friend, and you get to control exactly who your friends are.

If you become a "Fan" of something or post on someone's comment/Wall, you're sharing some of your information with that page. No shit it's public. Why is a misunderstanding of, or lack of education about, Facebook's policies equivalent to that site being sneaky and/or evil?


You're misunderstanding the issue. Today's change takes previously private or limited exposure profile information and uses it to create ad-hoc groups that people never explicitly acted to join. It's very similar to the Google Buzz fiasco, except with interests being exposed instead of contacts.


It's opt-in.


Barely. The explanation of what you're opting into is terse and vague enough that most people won't really understand what's going on, and if you decide not to opt in, FB wipes out almost your entire profile.


How would you support the claim that most people won't understand what is going on?

Edit: I guess the point I'm trying to make is that Facebook actually tests the UI and copy in experiments and in the lab before deploying it; so when you make a claim that it's not working, I'd like to see some supporting evidence.


Because I have friends and family who ask me all kinds of crazy questions about how things work, and I'm guessing the experience I had on FB this morning won't be enough to let me dodge those questions. It's a personal opinion, but one I think is pretty well grounded in my own experiences.

What isn't opinion is that opting out results in wiping out almost everything from your profile. That's a fairly punitive action to take against someone who doesn't want to opt in to an "opt-in" feature.

And saying that "Facebook actually tests the UI" means pretty much nothing to me. Do they publish the tests and the results? How confusing can a feature be before it's scrapped? Does the importance of the business function the feature supports affect that threshold? On top of that, given FB's history of privacy screwups and subsequent rollbacks, why should I have any faith in the company at all in this area? FB has always seemed more than comfortable with the "ask forgiveness rather than permission" model.



