This is what you get from centralizing your communications medium. Sure, we can all easily talk to each other, but now you have to assume the host will find something you do or say a liability and remove it as quickly as you post it. Saying that Facebook shouldn't do this conflicts with protecting shareholder value (which also means avoiding illegal content per local laws). So you can't have it both ways: either a corporation defends its shareholder value, or you nationalize it all and get NPR. I just wish people would realize that corporations aren't our friends; they're here to make a profit, and profit isn't always what's ethical. I think the better solution to the problem as it stands is to force people to start hosting their own content (which is why I support net neutrality and anti-metering laws). That way, you're in charge of your content and responsible for getting people to view it. We shouldn't have to return to the days of AOL (which we have largely done) to get eyeballs. People still go to websites, so why depend on Facebook to distribute your content?
Yep, it's not a public utility, it's not governed by a neutral third party, it's not open, transparent or free. What did you expect? It's a surveillance and marketing platform that you subjected yourselves to out of convenience, operated for its own benefit and indirectly to the benefit of its customers, certainly not its users.
I've never really understood social media in this form, nor its takeover as the primary communication channel. I'm not going to go full RMS or expect anyone else to have the same values as me, but if people do share those values, and do want better than facetwitsnapagramblr (without any judgement on whether they should), then they should have participated in building and using something better instead of deluding themselves.
> Yep, it's not a public utility, it's not governed by a neutral third party, it's not open, transparent or free. What did you expect? It's a surveillance and marketing platform that you subjected yourselves to out of convenience, operated for its own benefit and indirectly to the benefit of its customers, certainly not its users.
With the possible exception of "surveillance" everything you just said is also true of newspapers.
If you're going to start splitting hairs then you don't "have to" use any one particular method of staying in touch with people. Supreme convenience is not force.
In theory, it is not. In practice, it is. Not only that: whenever there is a fight between almost anything (privacy, security, ...) and convenience, the latter will always win at scale.
Facebook is also the town square where people are selling the newspapers, Facebook is the streets that newspaper delivery trucks drive down, and Facebook is the sidewalks on which employees at the newspaper walk to work.
There is no public space on the internet, just a network of private residences.
Facebook has positioned itself as the public square. The freedoms of the press, speech and protest are predicated on equal access to these public forums and they clearly do not exist in our current media landscape.
> I think the better solution to the problem as it stands is to force people to start hosting their own content
I'm encouraged that Tim Berners-Lee is working on this exact problem with Solid (Socially linked data) "a way for you to own your own data while making it available to the applications that you want to be able to use it."
You kinda already have an instance of IPFS from the creator of BitTorrent; it's called BTSync. Totally P2P. Though somebody needs to build a 'Solid' layer on top of it.
As far as social networks are concerned, I'd settle for a middle-ground network like Ello, with irrevocable privacy and anti-censorship protections built in from the founding document itself.
The optimist in me would like to believe that we are still in a "Napster" world in terms of content, and that we will eventually get back to a more BitTorrent-centric environment. What we lack is a killer UI that I, my kids, and my mother-in-law will all use.
>I think the better solution to the problem as it stands is to force people to start hosting their own content
Does this mean people need to start moderating their own content, also? I don't want to have to decide on rules of what to keep/delete, and I don't want to just let anything go, either. If I wanted to manage my own site, I'd build one -- I almost always just want to use a service that someone else maintains for me, including determining what content to show/hide. If I wanted to manage all that stuff myself, I'd just build my own Facebook.
The way I look at it is this: self-hosting should be the norm today, not the exception. We have plenty of hardware solutions for this, but the consumer end of the problem is largely due to ISPs not wanting to be what they are: utilities. If we could politically fix that problem (regulate Comcast and company properly), then I think the rest should be obvious for people. The rest comes down to making hardware solutions as simple as pressing a button at home.
>I just wish people would realize that corporations aren't our friends, they're here to make a profit.
I wish people would stop saying that corporations are some separate entity rather than a collection of people with a defined purpose, like our friends and family members. People just like you, in other words.
Corporations don't have a mind of their own, nor do they pursue profit regardless of ethics.
>The whole point of a corporation is to be a separate entity from the collection of people that have interests in it.
It doesn't have a mind of its own; any direction comes from real live people making decisions every day. Or can you point to a single instance where this isn't the case?
Corporations are institutions, and like all institutions there are legal and social norms which regulate them. For-profit corporations have one norm that trumps all others: the profit motive. As this is their primary function, with all else secondary, you should expect them to act in terms of it. Appeals to Zuckerberg's and the board of directors' own consciences are not going to work every time. In fact, it's the sloppiest way to deal with this problem. Say that tomorrow Mark and the board agree 100% with the anti-censorship thesis for Facebook; what then? How do they override the other shareholders? What happens when the shareholders sue Mark and the board, forcing them into court with an injunction to maintain the status quo at Facebook? See where this goes? It goes right back to what we have, and that's that.

You can't just say "Oh, screw the shareholders" when our own laws protect them. One of the SEC's biggest jobs is supposed to be protecting shareholders, which includes you and me through our retirement accounts. Again, institutions are shaped by NORMS, not individual consciences. I wish appealing to the better nature of billionaires could change the world, but it doesn't, because billionaires don't make the final decision; institutions like governments, courts, AND corporations do. Don't like how this works? Too bad, it's the real world. We don't live in loose tribes like we did 100k+ years ago, so individualistic expectations don't fit the actual world.

Sorry for coming at you with this rant, but it annoys me to no end when I see people naively thinking you can overturn a complex system of institutions by default. You have to work with what things are, not with what you want them to be.
I'm having a tough time understanding what you're saying. You're saying that the decisions of a company are made according to cultural norms, not individual consciences? And that Zuckerberg doesn't actually have any power, because the shareholders would out-vote him? And the shareholders make their decisions based on cultural norms? Then where do those norms come from? What drives them?
I legitimately don't understand what you're saying.
I'm saying that there's more at work than the individual choices of each person on Facebook's board. They have legal obligations to their shareholders, nebulous legal obligations to users, and many other concerns I couldn't list here since I don't know them all. But just the tension between the legal obligations to shareholders and those to users results in the shareholders having the advantage here. There's little in US law that prevents Facebook from doing what it does, since Facebook isn't a broadcast company or a utility (per the legal definitions). So the default mode for Facebook's board is to side with investors and protect the value of their stock, not to protect users' privacy or their other civil liberties. You can't just throw away the law whenever you like. At some point, the law has to trump our feelings on an issue like this. And when it's inadequate to handle these edge cases, we should take action to reform it. But that takes time. You can't expect an individual CEO, president, or regulatory chief to make that choice alone if the law doesn't grant them that authority.
The law says corporations are some separate entity, except when they want protections given only to people, in which case the law can be convinced that corporations are made of people.
Of course not, as you mentioned they are composed of human beings who make the decisions about how a company operates.
I am sure you will agree that a significant number of humans will disregard ethics when it comes to personal enrichment. Let me provide one example who comes immediately to mind: Pablo Escobar (thanks, Netflix). US prisons do contain a large number of white-collar criminals.
> nor do they [companies] pursue profit regardless of ethics.
This does not follow. Companies are composed of people, and people do disregard ethics.
??? You absolutely have the right and ability to host your own content on your own page. Facebook is incredibly popular, but it's not the "centralized" communication medium. If you want to make a political statement or whatever, you absolutely can-- presuming you can get people to actually come see what you have to say.
People use Facebook or other social media because it's significantly more convenient to do so. And distributing content on Facebook/Twitter/etc. is much more effective than trying to get people to go to your own personal website.
No doubt, but the fact that file-sharing networks were and still are largely self-hosted proves that people will have no issue doing the same to get a copy of some movie or music album. So self-hosting your FB-like wall or Twitter-like timeline shouldn't be much harder. If anything, I think it would be easier, since the remaining data to store would be relatively small (think Vine-like videos, photos like you have on Instagram, etc.) with the rest just being hyperlinks. Hell, I think you could make them all hyperlinks and put the storage in some other distributed protocol if you had to. In any case, it's not a technical problem in my opinion; it's just a problem of making such a platform as easy as installing an app on your phone, without a second thought beyond whether you are JamesBond007SexyTimes123 or whatever your silly handle would be.
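To make the self-hosting point concrete, here is a minimal sketch (in Python, with a hypothetical wall.json file, made-up field names, and port 8080) of what a self-hosted "wall" could look like: a static JSON feed of posts, mostly hyperlinks, served from a home machine with nothing but the standard library. It illustrates how little infrastructure the idea needs, not a finished platform.

    # Minimal self-hosted "wall" sketch: a JSON feed of posts served from home.
    # Everything here (wall.json, the field names, port 8080) is hypothetical.
    import json
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    posts = [
        {"id": 1, "text": "Back from vacation!", "photo": "https://example.org/me/photo1.jpg"},
        {"id": 2, "text": "Good article on net neutrality", "link": "https://example.org/article"},
    ]

    # Write the wall as a static file; any client or friend's reader can fetch it.
    with open("wall.json", "w") as f:
        json.dump({"owner": "JamesBond007SexyTimes123", "posts": posts}, f, indent=2)

    if __name__ == "__main__":
        # Serve the current directory (including wall.json) over HTTP on port 8080.
        HTTPServer(("", 8080), SimpleHTTPRequestHandler).serve_forever()

The hard parts alluded to above (discovery, NAT traversal, making this one-tap simple on a phone) are exactly what this sketch does not solve; the data format itself is trivial.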
Freedom of speech and freedom of expression mean that the government can't put you in prison or punish you for saying or believing what you do. Facebook aren't the government, they're a private entity and don't have to host anything they don't like- including hosting photos that they don't like. It's a walled garden, and it's their walled garden, and if you don't like it you're welcome to leave.
And on the other hand: it's the only garden. If your friends are in that garden, they can't share with you, interact with you, etc, without you also being inside. Facebook has created a 'with us or not with us' distinction that has a very sharp boundary. And it's worked: they've won the social network wars. A billion people are on it.
The question is: as the social networking champion, does Facebook have to keep the public's interests in mind, or just its own bottom-line profit margin? As a public company, the shareholders will fire its leadership if it doesn't choose the bottom line. As the major social network of the world, the public will denounce it for actions like this.
The PM posted a criticism saying Facebook should "review its editing policy".
And then Facebook deleted that.
I can concede that there could be good reasons to pull the photo. But I don't see good reasons to pull the PM's post. I don't like any organization having so much power to make criticism about itself disappear.
> Erna Solberg, the Conservative prime minister, called on Facebook to “review its editing policy” after it deleted her post voicing support for a Norwegian newspaper that had fallen foul of the social media giant’s guidelines.
> Solberg was one of a string of Norwegian politicians who shared the iconic image after Facebook deleted a post from Tom Egeland...,
Shared the image? Yes. Shared the image in the same post? Not clear.
That's all I was trying to say. The way the two cited sentences are written — with "her post" in a different sentence from "who shared the iconic image" — does not say the image was shared in the same post as the one being talked about.
What's the opposite of 'drawing from incomplete evidence'?
I hate the power Facebook, as a behemoth and a monolith, holds and abuses. But when accusing, we should be fair. The statements cited as evidence are from a publication, not an offhand comment by a random person on Facebook with no regard for correctness or clarity. And even so, there isn't clarity about what is being alleged.
So, if pointing out a lack of clarity in an accusation is called nitpicking, so be it.
I find these quotes very confusing. Did they delete any words of criticism and support as well as the repost of the photo, and if so, were they posted separately or together with the photo?
It seems like purely reposting a photo without comment could now be called a post of support. While FB users can use whatever slang they like, I expect media coverage to assume we don't know about these behaviors in the walled gardens and to tell us what specifically was censored.
(Perhaps this seems like splitting hairs to FB users, but as a non-user of a communication network, I would tolerate a network that bans pictures of blue but not a network that bans criticism of its ban of pictures of blue. The latter requires government intervention.)
Sure. Facebook is not a history book. It's a social network with kids on it.
Napalm Girl is a powerful photo. So are photos of Auschwitz. So too are photos of trench warfare. Saddam Hussein's hanging is historically important, as are all the images of children killed in Aleppo, or drowned families washing up on shores in Italy.
That doesn't mean we need all these things on Facebook.
I'm sensitive to censorship, and Facebook is a powerful and important platform for sharing, but I'm not prepared to say every photo of historical significance must necessarily be allowed.
Who are "we" and why "we" gets to decide what I need and don't need? I mean, when you say "Facebook is a private company and can do whatever the hell they please, including deleting random content because their left foot felt itchy this morning" - I get that. Completely. It's their service, if I like it, I post my cat pictures there, if I don't, I post them elsewhere.
But if you claim the mantle of a "we" that is supposed to have special privileges in deciding who needs what on Facebook, I'd really like to know how you got this privilege and where it comes from. And no, "think of the kids!!!" is not a good answer.
Facebook is a publishing and communications platform and if I can't use it to discuss politics or history with my list of acquaintances, then it's useless to me.
Also, in case you have children, it would be best if they didn't have FB accounts with strangers on their friends list. Just saying: historical photos are probably the least of your problems.
Since Facebook's entire point is that feeds are receiver-specific, "think of the children"-style arguments based on "...with kids on it" may provide a good reason to establish controls that keep certain content out of children's feeds, but they have a much harder time justifying preventing the content from being shared on Facebook at all.
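As a crude sketch of the distinction being drawn here: because feeds are assembled per viewer, flagged content could be kept out of children's feeds without being removed for everyone. The post fields, the 'sensitive' flag, and the age threshold below are all hypothetical, purely to illustrate per-viewer filtering versus platform-wide deletion.

    # Per-viewer filtering vs. platform-wide removal (hypothetical fields).
    def visible_feed(posts, viewer_age, adult_threshold=18):
        """Return the posts a given viewer should see, deleting nothing."""
        feed = []
        for post in posts:
            # Posts flagged as sensitive are hidden from young viewers only,
            # not removed from the platform for everyone.
            if post.get("sensitive") and viewer_age < adult_threshold:
                continue
            feed.append(post)
        return feed

    posts = [
        {"id": 1, "text": "Cat picture", "sensitive": False},
        {"id": 2, "text": "Iconic war photograph", "sensitive": True},
    ]

    print(visible_feed(posts, viewer_age=12))  # a child sees only post 1
    print(visible_feed(posts, viewer_age=35))  # an adult sees both posts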
Honestly, many of those things would maybe be okay on Facebook. The subject of this photo in particular was the controversy. From the article...
A Facebook spokeswoman said: “While we recognise that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others. We try to find the right balance between enabling people to express themselves while maintaining a safe and respectful experience for our global community."
> it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others
No it's not. 99% of photos of nude children are fine. Children are born nude. Most very young children play nude for many hours a day. They don't have, or at least the ones I know don't have, any matching sense of self-consciousness or modesty for the misguided cultural copy-paste being applied here.
If a photograph is pornographic, then someone will report it immediately, and facebook can take action.
And no matter how difficult it might be in some strange edge cases to decide whether something is "pornographic" (in reality, these cases are probably very rare), this photograph is obviously not pornographic in any sense of the word.
This. If anyone were degenerate enough to pursue that path, pedo-porn is easily available (and these people should be rehabilitated, not denigrated or 'removed' from society entirely, depending on your perspective).
When I say 'easily available' people might misconstrue the statement; so, again, it is relatively easy to acquire child porn and stay anonymous (i.e. no repercussions) if you wanted to do so.
We as a society should not allow censorship on this premise; what is the underlying ideology? That once this photo is accepted into the mainstream, people will start posting pedo-porn obfuscated as war crimes?
Spoiler: this photo has existed for quite a while and that has unsurprisingly not happened.
People are meta. You can't optimize against them in this authoritarian manner. Most humans just want the truth; hiding a child in pain from her environmental circumstances just because she's naked is not the solution, and it looks Orwellian.
Well, that's easy for _you_ to say, you're not the one who gets your door kicked in and hauled off to jail and handed over to the inmates with a "he's a child molester, deal with him" if you happen to guess wrong.
> While we recognise that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others.
Just because distinguishing between mere nudity and inappropriate nudity may be difficult in some cases doesn't mean it shouldn't be done, or, moreover, that it isn't necessary in order to "find the right balance between enabling people to express themselves while maintaining a safe and respectful experience".
Incidentally, protecting people from things that they neither need nor want protection from can be at least as disrespectful as exposing them to things that they wish to avoid.
> protecting people from things that they neither need nor want protection from can be at least as disrespectful as exposing them to things that they wish to avoid.
In a setting where respite from a resource is a click away, I think it's always more disrespectful to "protect" people from things from which they ask not to be protected.
It's impossible to draw the line between "mere" and inappropriate nudity, a line that will satisfy all people. This is simply an instance of Facebook making a choice that is just a bit out of balance compared to what the majority thinks, but make no mistake, it's just as much of an arbitrary choice as all others.
What is wrong with a kid seeing that particular photo? The only truly troubling things are the expressions on the people's faces (something which would take a parent about 30 seconds to explain), and the context (which is hard to infer from the photo directly).
If your friend (on Facebook or in person) is showing you photos, saying things you find offensive, isn't your issue with your friend, not the mechanism by which they chose to do this?
Isn't that the same argument we use for things like BitTorrent?
Good news is that you can filter what you see in your feed. Why do you think you or Facebook have a say in conversations between people you have no connection with? Why are you ok with taking down that conversation because there's a disturbing image that you don't even see, because it is not in your feed?
This post is exactly why Facebook will eventually fail.
People don't want a centralized authority controlling free speech. Democracy is what's needed in this context, and it's going to be messy (read: human; we're all messy).
>I'm sensitive to censorship, and Facebook is a powerful and important platform for sharing, but I'm not prepared to say every photo of historical significance must necessarily be allowed.
False. If you think 'powerful' means big enough to censor things for everyone, then you are not sensitive to censorship by any rational definition of censorship. We all deserve reality.
What is your argument against this photo? Too 'powerful' for a child to see? Are parents no longer held responsible for their own children? Are we all considered children now?
Who shouldn't see this, and why? Explain it exactly, not with 'feel good' stupidity.
I don't want to see those things either. I don't go on Facebook for a historical reality check, or to be reminded of the horrors happening around the world. Every news organization censors itself and doesn't show the full extent of the horrors it reports on, because it's counterproductive. It's upsetting and it ultimately just desensitizes you. I think this photo has already been overused and has turned into something like an art piece. If you had a similar new photo of a traumatized naked girl after, for instance, a recent school shooting in the US, people could feel a lot more shocked than when they see this photo for the Nth time - and people wouldn't object to it being removed.
You have to draw a line somewhere. No naked children or gore seems reasonable.
>You have to draw a line somewhere. No naked children or gore seems reasonable.
You, as a user, can try to draw the line by controlling your feed, that's your responsibility. But you agreed to be exposed to other people's content when you joined the site - that's the entire point of it.
We, as a society, have no responsibility at all to "draw a line" anywhere to cater to you, and neither does Facebook (nor should it, in my opinion, except where the law forces it to.) Freedom of expression is a right, but freedom from offense in a public space isn't.
"No naked children or gore" is not at all reasonable, on a site with over a billion users, and a billion different use cases.
In the U.S., a law called COPPA creates additional requirements for websites that interact with children under the age of 13. So, to avoid those complications, Facebook sets their ToS to 13 and up, like a lot of other sites.
Thus it seems unlikely that they can admit that there are a lot of under-13 kids on their service, without creating the chance that some enterprising fed will take a look at their COPPA compliance. This will hopefully have the side effect of reducing the value of "for the kids!!" arguments.
Facebook is a news source, maybe the news source, for many people. Sure, children could also be watching CNN, BBC or Fox News... Does that mean they should not report on the real world?
It's worth noting that the freedom of expression is generally believed to apply more broadly than just to governments.
For example, from the UN Universal Declaration of Human Rights, Article 19:
> Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.
It's my opinion that Facebook is infringing on that right, and that it's going to come back to bite them at some point.
I'm afraid this analogy is off-base. To echo Ivan's comment, I'm not offering my house as a freely available communication medium to the world, from which I profit. If I were, then this would be a more valid line of argument.
Nobody's spraypainting on Facebook's house here. They're using it as it was intended: as a method of one-to-many communication for messages which contain nothing illegal.
A more apt analogy might be our phone networks for person to person communication.
The public thrives on the basis of such "dumb carriers" as the internet itself.
I can foresee either regulation or self-imposed, ubiquitous, clear CATEGORIES of speech rights for users on sites and applications, similar to software licensing.
You would have a Facebook under the <Proprietary license>
A Facebook under the <Academic Free License>
A Facebook under the <GNU General Public License>
A Facebook under the <MIT License>
either under a parent company or as a distinct entity. Some are free, some cost money, some may receive tax benefits or government subsidies, all according to how they align with net neutrality regulations and how they are staffed and moderated.
The borderline is the existence of a service contract. If I allow anyone to write anything legal on the wall of my house (the service I offer), I cannot later restrict "anyone" to "white males", or "anything legal" to "something I'll decide later", etc.
I'm answering the generic question. Regarding Facebook, their terms of service are questionable from this standpoint, because they restrict protected freedom of speech.
Fortunately you don't have to come up with your very own made-up example: We have a very concrete case that is the subject of this entire story submission.
There isn't a line; there are poles. Even laws are interpreted by judges, who have moods and whatnot. Why quibble about imaginary lines in an instance where it's not anywhere close? Could such a line even exist? What is the precise length of the coastline of any island? It depends on the resolution, and if you have high enough resolution it also depends on the precise moment you measure.
Can you name any "line", at all, in any sort of dilemma? Obviously it's okay for you to gently tip on the shoulder of someone blocking your way in certain situations, but in other's it's rude, not to mention "physically interacting" with them with more force and different intentions. Where do you draw "THE LINE"? Between being polite and being impolite, as well as being impolite and assault? How would you write the rule book to determine it in any situation for a minimum wage Facebook employee or algorithm so they wouldn't have to apply any of their own judgement? And just because you couldn't, just because nobody including God could, would you still refrain from judging even extreme cases with some certainty?
I'd argue there's a very undefined grey line where companies cross over from just being a small scrappy startup doing something with tech to being a significant influencer of the public with a significant degree of control and impact. At that point it's in the interest of the general public to no longer treat them entirely as a private walled-garden enterprise but as something bigger.
We saw with the Microsoft antitrust case, the government stepping in where a business had grown to the point where it had a dangerous level of control and influence financially; I'm not sure if there's a social equivalent?
Social networks pose a really interesting problem to the usual capitalist model, where competition (y'know, theoretically) ensures the best outcomes for consumers over the long run. If you build a definitively better automobile than General Motors, even only slightly better, you can compete and bring the price down/quality of life up for the customer. If you build a definitively better social network than Facebook... no one cares. They're only on Facebook because their friends are. You need to vastly exceed their experience before they'd even consider it.
When the normal checks of a single business dominating a field fail, we need to find new solutions, especially when that field is communication. We don't want a future in which Facebook is non-democratic, but everyone gets their media from it and can't see a good reason to leave. I would hope that over the long term, we figure out public ownership of social media, and enshrine control and understanding of the systems that bring you information as an important right.
> Social networks pose a really interesting problem to the usual capitalist model
I wouldn't call the network effect so much a "really interesting problem" but more like "one of the many well known failure modes in the fantasy that market effects lead to the 'best' outcomes for any definition of 'best' that isn't tautologically defined as whatever market effects cause to happen."
We regulate lots of things already to stop private entities from roaming free and against the interest of the general population.
So just playing the "but it's their walled garden" card does not cut it for me.
For me the question is whether Facebook has a significant influence on the population or not. I think they already do when it comes to news, even more so than traditional news organizations. On top of that, they claim to provide an open space for exchange and even discussion.
Personally, I'd therefore like to see Facebook regulated to the point where they cannot simply refuse "service" to people of certain completely legal opinions discussing legal topics, the same as they and other businesses should not be allowed to refuse service to certain classes of people.
Instead of meddling in the extant gardens, we should make it simple for people to move to one with better rules.
The current state of copyright and tech access law makes that inordinately difficult. If we update these laws, it will be much simpler for competitors to challenge the lumbering incumbents like Facebook and Twitter, and then it won't matter so much what Mark Zuckerberg or Jack Dorsey want to allow or disallow.
The free-exchange format the internet offered is threatened by walled gardens becoming the home of 99% of online activity. Even something like Google struggles in light of this, since it isn't allowed in to make intelligent calculations about the popularity of online resources (which used to be roughly determinable from links mined out of the open internet).
>> they can't share with you, interact with you, etc, without you also being inside.
I hear this a lot and wonder what people did before social media, and before FB was around. It's the same thing. I closed my account probably 5 years ago and don't miss a thing. My friends are on FB, but I still know what's going on because they use multiple platforms, we text, and we still use email - so not having FB isn't a deal breaker.
The downside to this is people have lost the ability to effectively communicate with each other. If the only way you have to communicate with your friends and family is FB, then this is the onset of the decline of Western Civilization.
I also thought about that. But a society sets the rules businesses operate under, ranging from environmental laws to other regulations. I think it's perfectly valid to also set anti-censorship rules if society thinks they are warranted.
Freedom of speech includes the freedom to not say something. It would require a constitutional amendment for the gov't to come in and start demanding things from Facebook.
Well, I don't know if that's the case in good ol' 'Merica.
But I don't think there are major obstacles to introducing such a law in Germany/Europe. We already have media laws and special press laws which regulate how stories must be published; for example, we have a "journalistic duty of care" which, amongst other things, mandates that unconfirmed statements be marked as such. The law follows the codex of the German Press Council.
Facebook is the de facto government of online social activity. No other entity comes close, and they have compared themselves to countries before when emphasizing how large their userbase is.
I understand that facebook is free to decide what content it hosts. But I think deleting a post by the prime minister of Norway is extremely disrespectful to the people of Norway. I am shocked by their behaviour and won't be using their service until they apologize.
"Freedom of expression" isn't specific to government-protected or limitations on government-censorship actions. It's a general principle. It's also not generally protected.
Specific legal structures protecting speech rights are government grants, within those legal constructs. The legal doctrine of the US First Amendment free speech tradition is that it is a broad, but not absolute, limit on government prior suppression of speech rights. It specifically doesn't apply to private individual or corporate limitations on speech.
The history of free speech protections in the US is complex and surprisingly recent. It wasn't until 1919 that Oliver Wendell Holmes in a Supreme Court dissent laid the basis for what would become seen as a very broad free-speech protection. Note that as a dissent, his views didn't carry at the time.
It wasn't until further cases in the 1930s and 1960s that the present regime of protected speech -- protected from government interference -- fully came into being in the US.
Well, something to consider: during the time that the US Constitution was being written, the most common site to exchange information (usually pamphlets) was the local tavern or coffee house. Sure, if the proprietor was being a prick you could always bring your friends to the shop across the street. But that environment influenced the founders' ideas about how information is exchanged and debated, and it's the format the law of the land was written for.
Facebook is a global coffee house. Everyone can be there at the same time. But that also means it is very hard to use an alternative, especially as Facebook buys relevant competitors of information exchange like Instagram and WhatsApp.
Suddenly, we are in a situation where more people can connect than ever before, but for the vast majority of people, because of convenience and the lack of tenable other options, there is hardly any other choice. Other social networks that simply tried to clone Facebook never got off the ground because everyone was already on Facebook. I would truly love to close my account and leave but it's hard to stay in touch with friends from outside the United States, especially if Instagram and WhatsApp are out of the question.
Given Facebook's overwhelming presence in modern daily life, it is necessary to regulate what it can or cannot do for the benefit of the common citizens. The degree of regulation will always be debated, and rightly so, but to pretend that a private entity of immense social might is free to do as it pleases is to willingly give away power and authority from the state, and in a democracy, from the people themselves.
EDIT: wow, downvoted to hell for this one. It would be so fascinating if just one person could describe what makes my view so wrong.
You're quite right. Democracy and consumerism don't always push in the same direction, but when they conflict, democracy has more moral legitimacy.
Arguing in favour of democracy does mean limiting some people's opportunity to make money. That can make it an unpopular view.
(Independent of regulatory capture and governmental corruption, of course. The point being that the people are in charge, not what they do with their wallets, because otherwise we end up with an oligopoly.)
I suspect information exchange and debate were not meant to be enforced on private property when the constitution was written. If I owned a tavern, I do not suspect the founding fathers' intent was for me to allow every kind of debate and information exchange to take place in my establishment. If anything, I suspect the opposite to be true.
If I own a coffeeshop today, I would have the right to restrict what is hung on my walls (e.g. advertisements, local events), and what is announced (i.e. someone loudly giving an announcement) to my patrons. I may not be able to control small group conversation though.
That being said, posting something publicly to Facebook is greatly different than DMing a person / group of friends. If Facebook deleted the image sent from one friend to the other, this would be a different conversation. I think your analogy and inspection of the constitution's intent, even further justifies that Facebook has right to do what they did.
I would only agree with your interpretation of my argument if you stated that you owned a tavern franchise in every town in the country, and every patron had to have a (free) membership to...
Ok you know what? This analogy falls apart because the mediums of exchange are so vastly different there really isn't any comparison, so I really don't see how Facebook is in the right on this. I already said people can go to other "taverns," but comparing physical locations and Internet social networks is like comparing apples and Orange County, California.
> But that also means it is very hard to use an alternative, especially as Facebook buys relevant competitors of information exchange like Instagram and WhatsApp.
It's not very hard. There are literally thousands of popular forums that are not facebook.
> Given Facebook's overwhelming presence in modern daily life, it is necessary to regulate
No it's not. The fact that you are frequenting a particular site does not give you the power over other people, neither should it.
There are not literally thousands of forums that have Facebook's reach. Network effects, corporate acquisitions, walled-garden approaches, and similar mechanisms make an equivalent reach outside of Facebook difficult.
Facebook sees 7h 45m monthly, vs. 1h 47m for Google, on roughly the same monthly users (153m vs. 138m).
Note that Google, a company with a $525 billion market cap, and a platform with a nominal 3.2 billion accounts, still manages to net only 5-20 million users with one or more monthly public posts. A topic on which I can claim some expertise. And the network doesn't even register above.
I highly doubt you need to reach 1.6 billion people in your daily life. If you need to talk to your relatives and friends, there are thousands of options, including, you know, talking to them. If you need to find people with similar interests, there are thousands of places where you can do it, unless your interests require talking to a billion people at once - at which point you are probably an advanced AI which is destined to rule us all and need not bother talking to a puny human like me.
Of course, if your goal is to see cat pictures on the biggest social network in existence, then you have no alternative but Facebook. But I question whether this goal has any value at all. Facebook may have millions of posts every second, but you will never know about them, will never see them, and will never interact with their creators in any way.
If your purpose is to spread your ads to as many people as possible, a billion-sized site also provides reach second to none. But I personally have very little interest in the optimal spread of advertising, and neither do the roughly 99.999% of people who do not work in the ad business.
In short, I see nothing that makes Facebook more than a) a Schelling point that can be easily replaced with another one, and b) a so-so platform for wasting one's time because why not.
Psy's "Gangnam Style" video has been viewed 2.6 billion times (it broke the YouTube counter). Mind that this is YouTube rather than Facebook, though much of the video's sharing was through Facebook. So the capacity to reach some billions of people is possible and has been attained.
I'm noting this as an example of a widely-shared piece of content that I'm aware of. I'm also having trouble turning up widely-shared content Facebook-specific stats.
"An Anatomy of Large Facebook Share Cascades" mentions a case of 600,000 re-shares. That's a substantial fraction of total Facebook users.
What I've looked at is the distribution of posts on specific topics of intellectual interest, as compared with nonintellectual discussions. Total community size matters, though it's not a total determinant. Smaller communities with good discussion characteristics (say, Reddit) come close to matching Facebook on actual discussion, and very small communities such as Metafilter actually have very high ratios of relevant discussion, though these may omit specific domain or topic experts.
On a similar basis, if you want to find any one particular subset of people, you're in general going to find them in a larger rather than a smaller network. Of course, if you know which specific group you're trying to target, then there may be other networks of specific interest. In the case you make, of seeking out family, friends, or shared interests, in general, Facebook is that network, and the attempts of other networks to challenge Facebook on the basis of the social graph have typically failed. Challengers who have succeeded have generally done so on other bases: interest graph (Reddit), specific subset networks (e.g., Slack), or 1:1 communications (Snapchat).
Of material I've posted for which I have available stats, I'm aware of reach on the order of thousands to tens of thousands of unique readers, and no, not on Facebook (I don't make use of the platform). That's pretty good, though I find that where there are very specific people I'm interested in corresponding with, reaching out to them specifically (usually through email) is the most effective means.
> Psy's "Gangnam Style" video has been viewed 2.6 billion times
So? How is it relevant to the discussion at hand? Of course YouTube is a great platform for pop artists. But we aren't pop artists. And there were platforms before YouTube and there will undoubtedly be ones after.
> In the case you make, of seeking out family, friends, or shared interests, in general, Facebook is that network,
I have varied interests, and virtually none of them did I find on Facebook, nor do they depend on Facebook in any significant capacity. True, I can find some material related to my interests on Facebook, and some people sharing those interests are present on Facebook, but none of this makes Facebook irreplaceable. In fact, if Facebook disappeared this second, it would be no more than a minor inconvenience for any of the interests I share. Beyond a certain point, unless your interests are either extremely rare or extremely broad, the size of the network matters very little. If another billion people join Facebook, it won't change my life even a little.
It may make the next Psy earn more millions or billions, and make ad makers' profits soar to the skies, but to my life - and, I am sure, the lives of about 90% of Earth's population - nothing would happen.
> "An Anatomy of Large Facebook Share Cascades" mentions a case of 600,000 re-shares
So what? Yes, cat pictures get shared a lot. A picture of Obama winning the election, probably even more. The information content of either is very close to zero, and neither is necessary for anything - Facebook is a crappy platform for being informed about political matters (even leaving the question of confirmation bias aside), and cat pictures are not something one can't live without anyway.
> Smaller communities with good discussion characteristics (say, Reddit) come close to matching Facebook on actual discussion
Facebook is one of the crappiest platforms in existence for actual discussion. Only Twitter is worse, and that is because it's impossible to hold a conversation there at all, by design. Reddit is way better, but that's a low bar to clear - most discussion on Reddit, by volume, is garbage, with small pockets of excellent content if you can find them.
The comparison of YouTube and Facebook is one of noting the reach of specific platforms. You seem to be pointedly failing to recognise the fact that 1) a video platform without YouTube's reach would not have served Psy, and that 2) Facebook was a key component in the propagation of the video itself. Both relate directly to the question of platform size.
I wasn't aware of counter/hoax status, though I've bumped into int limits myself, so found it plausible. Thanks for the information.
Your personal experience on any social network is almost never a solid basis for extrapolating to others' experience. That's something my own experience in quantifying the size of active user communities and range of conversation across networks has established quite clearly. It's an argument I've long since tired of.
On G+, the median number of followers for a non-publicly active profile -- 94% of all profiles -- is two. For profiles with any public posting activity at all, it's slightly higher: five.
Your original contention was that "There are literally thousands of popular forums that are not facebook." My response was that there are at best very few platforms which can match Facebook for general discussion. Something I've studied directly, in large part because I'm interested in finding good general discussion.
I'm also not a Facebook user, because I find the morals and practices of the company and its founder despicable. A frequent affliction of mine, as a brief perusal of my G+ profile may indicate.
My larger points are these:
* On any given topic, for any given discussion, odds are far higher of finding a relevant discussion group on Facebook than on virtually any other platform, with the possible exception of Reddit.
* For any mass media online message, such as, say, the Prime Minister of Norway might care to engage in, Facebook is, hands down, it. Nothing else, not Twitter, Tumblr, Reddit, Usenet, not nothin', will compare. (You've been, I'll note, peculiarly silent on offering specific alternatives.) I'm not saying that this is my specific use case. I am saying that if it were, FB is reigning king.
Your observations on the specifics of conversation on Facebook as opposed to other platforms matches my own direct quantified research to a large extent. You might care to look again (or for the first time?) at the link I posted above. "Tracking the Conversation: FP Global 100 Thinkers on the Web":
Yes, Reddit beats out FB, though Metafilter knocks out both on the basis of relevant discussion, albeit at vastly smaller scale. Blogs in particular score exceptionally highly for content, but have a discoverability problem -- again, FB can address that.
And, being written by corporations, it could allow corporations to sue countries if they don't like their IP laws or enforcement policies... oh, wait...
It's similar to having the right to assemble peacefully in pseudo-public spaces like malls. But I would say the argument for freedom of speech on someone else's website is weaker than that; it's easy to just go to another website and say something there instead. Well, at least it's easy today. If Facebook succeeds at establishing a Facebook-only internet in parts of the world like India and Africa, then you might have a real reason to enforce that freedom, since those users wouldn't have any other choice.
I use Path with my family for photo sharing. Everything else gets put on my blog.
I find Facebook usage to be a tell for personal insecurity. It's ok- everyone is insecure in their teens. But when you get past 35, if you are still a heavy Facebook user, it indicates personal problems.
If you're a Facebook user and you are unhappy with the way the company strongarms, censors and manipulates its audience, the most effective way for you to express this dissatisfaction is to close your account, block social media bugs and encourage your friends and family to do the same.
Facebook doesn't care how you feel when you use their service; their bottom line simply depends on your contribution to the statistics they use to sell ads. Apathy, or even outrage, are perfectly acceptable provided you express it through channels they control and profit from.
As far as I'm concerned, as long as this conversation is couched in trying to appeal to Mark Zuckerberg's imagined sense of ethical responsibility it will lead nowhere.
Absolutely not. Government should have no say what a private entity can/can't do in their walled garden. The fact that over a billion people use it is a different issue.
A private individual, sure. I'll accept that argument.
But a corporation should have no such leeway. Companies exist for the mutual benefit of the citizens, and their shareholders. Just because we've lost that sense of ethic doesn't make it not true.
Now, the predominant view, that "a company can fuck over whomever it wants, as long as it lies within the confines of the law", is preposterous. Public corporations are already required to be sociopathic, given the legal requirement of profit (exception given to benefit corps).
Or in much more shallow words, "I'll believe corporations are people when they execute one in Texas."
Well, it's an interesting idea to consider a corporation more like a derivative of the state, since it comes into existence via state power and is in effect immortal until its existence is ended via state power, rather than like the derivative of a person. There may therefore be a compelling argument that, as the 14th Amendment extends the Bill of Rights requirements to all of the states to uphold, a corporate charter extends some narrow portion as well. For instance, a corporation can abridge the free speech of employees acting as spokespeople for the company, since a consistent message of what the company "says" is important to keep clear. But abridging the free speech of its customers seems a lot more problematic: a broad brush to censor anyone using the service for any reason.
I doubt such treatment will come to pass anytime soon, though. It's Facebook's infrastructure, and the EULA everyone agreed to gives Facebook, contractually, the power to engage in this censorship. If you don't like it, then maybe you shouldn't have agreed to their EULA, which basically says all of your content is not yours, it's theirs. And this is something that people who actually read these EULAs warned about, and most people just gave it a hand wave as if it would never actually be an issue.
A corporation crosses the line when it has a single person who is protected by statute from the legal actions of the corporation (i.e., a shareholder). At this point the quasi-libertarian theory of a corporation as "just a bunch of guys" completely breaks down, as there is a special legal protection for shareholders.
A corporation isn't a partnership. A partnership is just a group of people who have agreed to do business together. All of them are jointly legally liable for the criminal actions and liabilities of the partnership.
A corporation, on the other hand, has legal protection for shareholders. A VC can bankroll eMurder.com and the VC isn't liable when the officers of the corporation go on a shooting spree. The options are: either investors get no special legal protection, and creditors and the law can pursue them and their personal assets for the liabilities and crimes of the corporation, OR a corporation really is a special instrument of the State, and only has particular standing because we as a society believe the corporate veil provides social benefits.
The state can still attack people in the corporate structure individually. For example, if VP Jones decides to go execute a competitor, VP Jones can be brought to trial for murder.
But if that company decides to, say, dump toxic chemicals in the waterways, the fine is only a few hundred thousand dollars, versus millions to dispose of them properly. Or if the company decides to refuse doing a recall because the cost of recall doesn't equal or exceed its liability, well, too bad.
Corporations can, and do, cause a great deal of harm. And many of those behaviors are collective across multiple levels and people. And yet, the most we do is a hand-slap of a fine.
> Or if the company decides to refuse doing a recall because the cost of recall doesn't equal or exceed its liability, well, too bad.
Is that wrong (I think that's what you're implying)? I'm serious. More money can be used to make things safer, but there's a point where the benefits become marginal compared to the costs.
People can, and do, cause a great deal of harm. There's nothing in a corporation that is not the actions of the people who constitute it. Corporations are not able to do anything by themselves; there's always some person who takes a specific action. A "company" never decides to dump toxic chemicals; there's always a person deciding to do it, and always a person doing it. At least until we invent AI, then we may have something else. Until then, it's people all the way down.
People seem to lose sight of that: corporations are REQUIRED to be as ruthless as they can legally get away with, or they are literally in violation of the law in terms of making money for their owners.
The law is the ONLY thing that curbs their bad behavior, other than publicity so bad that it (temporarily, until forgotten) causes a sales drop.
Yeah, I'd like to see more corporate death sentences handed out, as well. It may or may not deter any other bad corps, but it would cut down on second offenses :-)
This is false. There is no such law that requires nothing but ruthless profit optimization. I invite you to disprove my claim by pointing to a relevant piece of legislation.
First, your "claim" is loaded with a false question to begin with.
Laws can be either created (by the legislature) or interpreted (by the judiciary). The basis of corporate law has been founded upon the judicial branch, as well as upon requirements to shareholders.
> The case still most often used in law schools to illustrate a director’s obligation is Dodge v. Ford Motor (1919)—even though an important 2008 paper by Lynn A. Stout explains that it’s bad law, now largely ignored by the courts. It has been cited in only one decision by Delaware courts in the past 30 years.
> Oddly, no previous management research has looked at what the legal literature says about the topic, so we conducted a systematic analysis of a century’s worth of legal theory and precedent. It turns out that the law provides a surprisingly clear answer: Shareholders do not own the corporation, which is an autonomous legal person. What’s more, when directors go against shareholder wishes—even when a loss in value is documented—courts side with directors the vast majority of the time.
Real world example: Tim Cook said outright, in an Apple shareholder meeting, that he did not consider the ROI on every decision. He's still CEO; to my knowledge no one has even sued over the statement.
Do shareholders not elect a board of directors? Do the directors not have hire/fire authority over the executives?
Not saying this chain of authority micromanages every decision, but I suspect it's the rare CEO who goes out of his way to upset the board. Or is there some law that allows the CEO to say "You cannot fire me" (outside of anti-discrimination laws or such)?
(I'm not a lawyer so take this all with a pinch of salt.)
Fiduciary duty is not profit optimization. It basically means you need to be responsible with the company's money. You can't spend it on hookers (well, unless that's your company's business). It's basically a "don't waste money" rule, not an "earn lots of money" rule. Basically, the idea with corporations is that the shareholders own everything and the management is taking care of all the assets; since it's not the management's money, the management has to be careful not to waste it.
For example, legal guardians have a fiduciary duty.
> Dodge v. Ford Motor Co.
"The Michigan Supreme Court held that Henry Ford could not lower consumer prices and raise employee salaries. In its opinion, the discretion of the directors is to be exercised in the choice of means to attain that end, and does not extend to the reduction of profits or the nondistribution of profits among stockholders in order to benefit the public, making the profits of the stockholders incidental thereto. Because this company was in business for profit, Ford could not turn it into a charity. This was compared to a spoliation of the company's assets. The court therefore upheld the order of the trial court requiring that directors declare an extra dividend of $19.3 million. It said the following." (from https://en.wikipedia.org/wiki/Dodge_v._Ford_Motor_Co.)
"Among non-experts, conventional wisdom holds that corporate law requires boards of directors to maximize shareholder wealth. This common but mistaken belief is almost invariably supported by reference to the Michigan Supreme Court's 1919 opinion in Dodge v. Ford Motor Co.[5]
Dodge is often misread or mistaught as setting a legal rule of shareholder wealth maximization. This was not and is not the law. Shareholder wealth maximization is a standard of conduct for officers and directors, not a legal mandate. The business judgment rule [which was also upheld in this decision] protects many decisions that deviate from this standard. This is one reading of Dodge. If this is all the case is about, however, it isn’t that interesting.[6]"
> eBay v. Newmark
"When eBay refused to sell, Jim and Craig deliberated with outside counsel for six months about how to respond. Finally, on January 1, 2008, Jim and Craig, acting in their capacity as directors, responded by (1) adopting a rights plan that restricted eBay from purchasing additional craigslist shares and hampered eBay's ability to freely sell the craigslist shares it owned to third parties, (2) implementing a staggered board that made it impossible for eBay to unilaterally elect a director to the craigslist board, and (3) seeking to obtain a right of first refusal in craigslist's favor over the craigslist shares eBay owns by offering to issue one new share of craigslist stock in exchange for every five shares over which any craigslist stockholder granted a right of first refusal in craigslist's favor. As to the third measure, Jim and Craig accepted the right of first refusal [7] offer in their capacity as craigslist stockholders and received new shares; eBay, however, declined the offer, did not receive new shares, and had its ownership in craigslist diluted from 28.4% to 24.9%."
" eBay asserts that, in approving and implementing each measure, Jim and Craig, as directors and controlling stockholders, breached the fiduciary duties they owe to eBay as a minority stockholder of the corporation."
I didn't understand this completely, but it looks like craigslist's directors were trying to restrict eBay's use of its stock, and that's a breach of fiduciary duty because eBay is a minority shareholder of craigslist, so their actions harmed a shareholder.
Perhaps it's an "emergent property" of the system of laws, and culture, that are in place?
How much "charity", percentage wise, do you think corporate management could actually take part in? (I don't mean P/R posing as charity costing 1% of net profit, I mean things like deciding to do something cleaner/safer which would cut net profit on a major product in half)
Those can also be looked at as cost reduction measures: medical premiums, lawsuits and regulatory fines result from blatant negligence. (as well as other indirect effects such as dissuading anybody competent from staffing your death-trap)
> Government should have no say what a private entity can/can't do in their walled garden.
I have some sympathy with this point of view. I would like to see a world where anyone can build whatever website they want, and the amount of attention the website gets is determined by its "worth" as defined by how many people choose to use it.
However...
> The fact that over a billion people use it is a different issue.
...there's the rub. The usefulness of a social network depends on how many people are already on it. Network effects mean that it's very, very hard to build a competitor to Facebook or Twitter.
One possibility might be to require social networks, once they reach a certain size (say >10 million users), to put all their public data on standard APIs (such as RSS feeds) under terms that allow anyone to re-use them.
This would make it a lot easier for anyone to piggyback a service on top of Facebook/Twitter/etc, and if their censorship or other policies got too cumbersome, there would be much less stickiness preventing people from moving to a competing service.
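To make the idea concrete, here is a minimal sketch (in Python, standard library only) of what publishing a user's public posts as an RSS feed could look like. The post fields, names and URLs are invented placeholders for illustration, not any real Facebook API:

    # Sketch: expose a user's public posts as an RSS 2.0 feed.
    # The `posts` list and its fields are hypothetical placeholders.
    import xml.etree.ElementTree as ET
    from email.utils import format_datetime
    from datetime import datetime, timezone

    posts = [
        {"title": "Hiking trip", "url": "https://example.social/posts/1",
         "body": "Photos from Saturday's hike.",
         "published": datetime(2016, 9, 9, tzinfo=timezone.utc)},
    ]

    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Public posts"
    ET.SubElement(channel, "link").text = "https://example.social/alice"
    ET.SubElement(channel, "description").text = "Alice's public posts"

    for post in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["title"]
        ET.SubElement(item, "link").text = post["url"]
        ET.SubElement(item, "description").text = post["body"]
        ET.SubElement(item, "pubDate").text = format_datetime(post["published"])

    print(ET.tostring(rss, encoding="unicode"))

Any competing service could then consume such feeds with an ordinary RSS reader, which is exactly what would reduce the lock-in.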
> One possibility might be to require social networks, once they reach a certain size (say >10 million users), to put all their public data on standard APIs (such as RSS feeds) under terms that allow anyone to re-use them.
I'm genuinely glad that someone has made this point about making public data accessible to other people after reaching critical mass.
I'm equally sad that it might never happen.
PS: There are other discussions going on about having a common standard/protocol for social media communications, so that it can be picked up by any provider, like email, without locking in the people using a particular service. I'm now wondering what is stopping this from happening. FB and other private monopolies could be forced to adopt the protocol, and data could then be shared between services the way it is between email providers. How FB would react to this is a whole other matter.
> Government should have no say what a private entity can/can't do in their walled garden.
Except in matters of de-segregation, food and drug safety, safe and healthful working conditions for working men and women, a minimum wage, waste disposal and environmental protection, truth-in-advertising, Employment and Labor discrimination, privacy of employees and customers, and licensure of trades...
Absolutely yes. Facebook has more members than entire countries and is the primary method that many of those people use to communicate. It has long surpassed any kind of protected walled garden and government absolutely has a say in how Facebook operates.
You don't think there should be laws over obscenity, terrorism or monopoly ownership, then?
The reason I'm asking is that if a Facebook user decided to send out full instructions for how to make a pneumonia virus with home DNA equipment, for example, how would you feel about that?
Let's say they were in another country where the law couldn't or wouldn't stop them? Is it OK for Facebook to keep distributing the information in its walled garden?
> Government should have no say what a private entity can/can't do in their walled garden.
If an entity has no say what another entity can do within some domain, then the first entity is not the government with respect to the domain, and (assuming that no other entity has such a say), the latter is the government.
Facebook has become a de-facto political "agora" for the politicians and high government officials of my country (I live in Eastern Europe). More exactly, that's where they now first express their views on the current political, social and economic affairs affecting my country. Nowadays I would say that most of the important stuff first gets posted on FB, and only then does it get re-published by the local media (whatever is left of it at this point).
So, even if I wanted to "get off" FB and decided to ignore it completely, I would just have an "inferior" experience as a citizen of my country, i.e. I would not have a first-hand account of what's shaping the things around me. It's not that I like what Facebook has become, but there's nothing I can do to change it at this point.
So once your platform gets popular enough, the government can tell you what to do with it? Why? It's not Facebook's fault that politicians started using it.
Because with great power comes great responsibility, and companies like to cherry-pick only one out of two. It's in the interest of society that when something becomes mainstream enough to be considered "infrastructure" - something with so little competition that ignoring or bypassing it is infeasible given the state of the market - that this infrastructure works within frameworks that we've built and refined over centuries.
Rent-seeking is bad, it's facilitated by ownership of too large a share of the market. In return for allowing Facebook to leverage its size and influence for rent-seeking behavior, we as a society, a.k.a. the government representing us, have the right and responsibility to set the ground rules for what we get in return.
That's not about fault or entitlement, it's about how we can continue to uphold the values that are important to us. Free speech and the ability to publicly disagree are a crucial part of that. If your platform gets popular enough that strengthening or suppressing your voice has a material impact, profit and shareholder interests can't be the only thing that decides how things are supposed to work. There's a lot more at stake here.
Exactly yes. Once your platform becomes this popular, you wield real power over human society. And society should absolutely start to dictate to Facebook what it can and cannot do to prevent it from abusing its power. How many times do we need to go over this?
Any organization, when it becomes sufficiently large, essentially becomes a government.
>And society should absolutely start to dictate to Facebook what it can and cannot do to prevent it from abusing its power.
We have no good mechanism to prevent the government from abusing its power.
Those here defending FB are not arguing that FB can do no wrong, they are arguing that fixing a problem _with another broken system_ is not an improvement, and is, in fact, a step backwards.
I can opt out of FB, and I have. Facebook exerts no influence over me, compared to the influence the government exerts over me.
The problems of power abuse and effective communications are ancient ones. There's nothing that prevents private individuals or corporations from abusing power either, and there's a very long history of them having done just that.
Government, for better or worse, is a vehicle for channeling and aggregating power, in a way that at best benefits society as a whole, expresses the preferences of the majority, and respects the rights of the minority. It's far from perfect. But it's better than most alternatives.
Facebook is granted rights by governments, ultimately also to benefit the public at large. Facebook is not itself a government, though it transacts the online communications of a population larger than all but the very largest countries.
Along these lines, the argument says "Facebook is large, and should therefore be subjected to regulation".
Government is far larger than Facebook, and far more insulated from the people than is Facebook. I can't imagine why we extend trust to the government to fix a situation where we don't trust facebook.
It's all run by people, with their own competing goals.
And let's not forget that Facebook could collapse overnight if enough people decided to stop using it. I don't think a population could exert that level of influence over a government.
>Government is... far more insulated from the people than is Facebook
Is it? Facebook doesn't hold elections. It seems entirely unaccountable to me, with no system of checks and balances to rein it in.
>facebook could collapse overnight if enough people decided to stop using it. I don't think a population could exert that level of influence over a government
They could and do, all the time. See: any successful rebellion ever. See also: the collapse of the Soviet Union.
Any given API change, content promotion or censorship decision, or freebooting support (or restriction) represents a taking or reallocation of usufruct amongst various parties, with little or no recourse for them.
Does the victim of pollution, or of a highway rerouting, or a relocation of a major traffic draw for brick-and-mortar foot traffic, have any contractual claim to their previously-enjoyed benefit?
Power and rights are far more about the ability to effectively press a claim than specifically detailed binding pacts.
The matter was whether or not Facebook could effect a taking, not the rights of recourse of others. You're moving the goalposts.
I disagree that anything on Facebook is "public" in the sense of "public ownership". Facebook is a private platform. Just because things are (maybe) publicly visible doesn't mean they're public property. Should the government be able to force people to put up or take down posters in public-facing windows?
If I buy or rent space for a billboard I am not allowed to put anything I wish on it. That is no different than putting a poster in a public-facing window. It gets more interesting when a website in one country is offensive to another country.
I'm pretty sure that if you put a two-story-tall porn image on your property it's not going to last long. I'm pretty sure that if you post a hugely racist message that big, or some other things, no matter how public that space is, you'll run into some problem with the law, whether at the local or federal level.
Whether you're a private or public platform is not as relevant here as the fact that it is a public forum, it practices certain forms of censorship, and it is a medium of information. Other platforms get regulations applied to them in many respects too.
Governments do in fact regulate posting of notices, including both commercial and political speech, routinely. This includes both forbidding specific types of notices in some instances, and compelling them in others.
> once your platform gets popular enough, the government can tell you what to do with it
I'd say there is considerable public interest, yes. Facebook without the users is nothing, but the users without Facebook are the same society, just using another software.
I don't think Facebook has changed society in a way that isn't repeatable. It's a low risk to the system when google+, or any other competitor, could meaningfully take its place given enough pressure.
Huh? What are you talking about? I'm defending the right of everyone to be free of excessive government interference, whether it's an individual or a popular company.
They can tell you what to do with it and not use it too. This is the way it should be. Think of the harm that can be done in all sorts of abusive ways if there was no regulation of sites and services.
Facebook arguably has a monopoly position in the way that your local mall, movie theatre or newspaper does not.
If I don't like the Times, I can read the Guardian instead. If I don't like what's on at one cinema, I can go to another (granted this may be harder in a small town with only one).
Of course I can close my facebook account and move to diaspora, but then I'd lose everything from keeping in touch with my friends' latest accomplishments to the schedule of my local hiking club.
I already find that if I'm away hiking for a weekend without internet access, and can't be bothered to catch up on Facebook on Monday morning because there's already a pile of e-mails to deal with, then at coffee break it's like I've missed some important social information that everyone else knows. Team member A has done something and posted about it on Facebook, everyone else has been liking and commenting on it, and everyone expects you to be up to date with this. Without a Facebook account you may as well wear a badge that says "antisocial".
I personally think governments should regulate media outlets in proportion to their influence on people's lives. FB is the new IE.
It's perfectly possible to be happy and healthy without a Facebook account. You might lose out on one particular kind of "social" connection, but that doesn't mean you lose actual social connections. You just catch up with people in person, via telephone, via email, via chat, or a million other ways.
Personally, I don't really find the "social" information Facebook has to offer all that valuable. It's just a stream of trivia from extended not-really-friends.
The parent poster just said that he felt like he missed out on some important communication that he was expected to have been aware of. It's not possible to be happy without FB; it's not possible to be happy with FB either.
It is so possible to be happy without FB. Just decide your life is fine, despite the occasional time lags in you discovering certain information.
FWIW, I deleted my FB account months ago, and my quality of life has gone up. If someone expects me to be available via FB, I give them my email address.
>I personally think governments should regulate media outlets in proportion to their influence on people's lives. FB is the new IE.
IE went into a decades-long decline, because superior products came to market. No regulation required.
Who regulates the governments in proportion to their influence on people's lives?
I wish my government didn't bomb other countries. Can I stop paying taxes, so I can stop supporting the murder of people around the world?
You're asking an organization that backs its wishes with the threat of violence and jail to regulate an organization that lets you keep up with your friends hiking accomplishments.
Facebook could be irrelevant in five years. The laws you're asking to be implemented would be in place for another 100. Are you sure this is a good idea?
> IE went into a decades-long decline, because superior products came to market. No regulation required.
The decline followed behavioral changes in response to regulatory action and ongoing litigation (even if it began before the resolution of the litigation), so I'm not sure you can conclude from it that regulation was unnecessary to the outcome.
Because it is becoming the predominant way people consume media. If it were used at only 10% of its current level, it shouldn't be regulated, but at current levels it should.
As Stalin said "quantity has a quality all its own".
.. both of which are subject to different kinds of government regulation. But Facebook is definitely a medium. In some ways like a telecom or transit operator.
Start criminally charging Facebook for every criminal act they fail to remove. If they're playing the censor, then they need to be punished when they fail in that role.
The other choice, is not to censor anything, other than by the requirements of the law. Like the telephone system does.
It may be "more effective" to implore the government to crack down on media policies you don't like, but it's also as much a threat to free expression as Facebook's ridiculous and overt censorship of this image.
Free expression already has limits (No "Fire!" in a crowded theater when no danger exists), I'm absolutely fine with government saying what Facebook isn't allowed to do (remove journalistic posts without due process).
Yes free expression obviously does have limits; It probably doesn't have that silly "Fire in a theater" restriction.
Holmes (the supreme court justice you are quoting) said this with respect to this case: https://en.wikipedia.org/wiki/Schenck_v._United_States where they prosecuted Schenck for anti-war pamphlets. This is something that would be protected by the constitution today. This quote is a pretty lousy excuse for suppressing free expression considering Holmes's track record of dismissing free speech.
With regards to Facebook removing posts, why should there be a "due process" for a (specific) company to remove posts. What sense does this make?
FB isn't just a specific company. FB has grown to such a size that it has become a large part of the media/press. So it has to be regulated like press.
The first amendment says: Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press...
Where do you get that we should regulate the press?
Obviously you could argue that the constitution is wrong, and that's fine with me. It's just a document, but when you say "it has to be regulated..." what is the mechanism you're referring to?
For example, there are laws that restrict defamation, incitement to violence and guarantee the right of reply. They can't put false information out, like "The president of X has said Y", or use a cooked up image as real. They have to keep secret the names of minors when they are involved in police-related stories, they are required not to divulge privileged (state secrets) or private information, such as was the case with Gawker. All of these have their exceptions, but in general press is not quite free to do as they like, and it's better that way.
Asking the government to protect freedom of speech is the exact opposite of a threat to free expression. It's analogous to congress passing a law saying that newspapers cannot edit letters to the editor. Some newspapers might complain about having to publish things they find disagreeable, but nobody is being censored anymore.
Imagine if a paper newspaper were told that they had to publish every letter sent to them or no letters at all. They would stop publishing letters entirely. There are a lot of ways to construct laws such that they have the opposite of their nominal effect. It's important to note that the US First Amendment bars the government from passing laws respecting both freedom of speech and freedom of the press (among other related rights).
Wrong analogy. A newspaper does not offer a service of publishing customer letters. It offers a service of receiving and reading letters on the premise that it may choose which letters to publish. It's a different story.
If a newspaper published all letters, then government could act on behalf of citizen to protect freedom of speech, by making sure there's no arbitrary censorship of publications.
The relevant detail is that you can indeed censor a publication by adding a rule that says "you can't censor your publication in the following ways". I don't personally use Facebook at least in part because I don't want to use a centralized, censored system, but the fact of the matter is that it's a bad idea (not to mention likely an unconstitutional one) to tell Facebook what they cannot publish and indeed what they must publish.
Although it's irrelevant to the point, I also don't totally understand the nature of your objection. If you are saying that any publication gets to define what it does and does not publish (i.e. Facebook says "I am a company that publishes non-nude photos" or "I am a company that publishes censored material"), then they merely have to re-define their service in order to censor people while complying with whatever scheme you have in mind.
Facebook is not a newspaper, because it does not produce the content that it makes available - freedom of speech cannot be applied to Facebook itself. FoS applies however to the content producers and limits the terms of contract that Fb may offer to them, by banning the arbitrary censorship (the distinction between artwork and offensive content is arbitrary and decided by Fb moderators). Also, by saying "unconstitutional" which constitution do you apply to it? It's multinational and has the business in many countries - each one can regulate Fb within its sovereignty.
If you measure everything purely by effect, there are more efficient ways of restricting media organizations from doing certain things. E.g., many news organizations avoided republishing the famous Muhammad cartoons, not because their parliaments were prohibiting it, but because they were afraid of violent retaliation. If you are OK with abandoning morals, you can be very efficient in preventing people from doing things. Mafia can be very efficient in preventing witnesses from talking, for example.
Now, you can say, employing threat of government-mediated violence through congressional law is not the same as employing the threat of direct violence. But the only real difference I can see is that you can claim "it's all for common good". Oh, wait, they do the same too...
That's a very dangerous path on which to travel. China passes some strong laws about what media organizations can and can't do. Free speech isn't a right given by governments, but one cultivated and protected by the people.
The marketplace solves these problems. Nobody is forced to use Facebook. If so many people care, perhaps they ought to vote with their dollar and seek alternatives. If they are unwilling to do that, it's on them and not the company that would support censorship.
Don't like tacos? Don't buy tacos. The government ought not tell taco vendors they can't sell tacos but instead reduce regulation to make it easier for a pizza guy to start a business. Less regulation by definition, increased freedom.
How does the market solve the problem that it is morally wrong (or at least dubious) to delete an iconic picture that exemplifies the cruelties of the Vietnam War?
Do you believe that billions of Facebook users will switch to another social network because they are some sort of moral saints who recognize Facebook's mistake and act accordingly? Or do you instead believe that the majority of Facebook users are that much of a moral authority that we should readily concede that Facebook is right, because almost none of their users complain?
"the market solves this" is a funny meme, but it's often plain wrong. Markets can solve certain optimization problems and find price equilibria, but they cannot somehow magically solve the vexing moral question of how much (if at all) a social media company should censor their user's content and who should control this in which way.
There are a few problems with your examples, I think.
There is a difference between bad regulation and good regulation. Some regulation is necessary in many circumstances. China's strong laws over media organisations are an example of over-reach; by contrast, prohibiting people from calling emergency crews to their house just to complain about how they think tax dollars are being misused isn't a bad thing to regulate.
The marketplace solves some problems, but definitely not all of them. Discrimination is a lot harder to solve just relying on the market, for example. Regulation comes in to help.
> Perhaps they ought to vote with their dollar and seek alternatives
The problem with alternatives is the same problem folks have with seeking alternative utilities. The alternatives are few and far between and offer an inferior service comparatively. In fact, Facebook is more like a telecom that refuses to connect to other telecoms at this point. Your basic options are to participate... or not.
It isn't like buying tacos at all - tacos have a market that competes to an extent, and this just isn't such a thing. And to get to the point where it is easier for other folks to start the business, you are knee-deep in regulation, including standards so the different services can communicate with each other easily and other such monopoly-busting sorts of regulation, which will probably last for many years.
It's not dangerous, it's the only path that society can choose. Facebook is dominant in the social networking market and is subject to anti-trust regulations, which must be adapted to the nature of the service. The same can be said about Google in some countries.
It's almost impossible today for a person to switch social networks, because for this to happen it has to be a group decision - most of one's contacts must make the same switch. I have accounts on Path, VK and a few more "yet another" social networks, but they are useless without a social graph, and it's beyond my control to move this graph there. It has to be regulated, because having freedom of speech in the hands of a single corporation is wrong.
For sure. A few dozen bureaucrats should be dictating what is and what isn't acceptable content and the punishment for not adhering to their sense of appropriateness.
That's not really true. I don't use Facebook (I use WeChat, never had anything censored, and the privacy controls are awesome), but I don't think Facebook wakes up in the morning and thinks "why doesn't Hugh use Facebook? We should make it more like WeChat". If you want change, the most effective way to get it is to ask for it.
> is to close your account, block social media bugs and encourage your friends and family to do the same.
Much easier said than done.
First, there is no way to 'close' your account - you can only 'deactivate' it.
I did it in July, but then I noticed that a lot of social activities - like concerts, festivals, parties, etc, are organised using Facebook so if you're not on it, you can't participate.
This is very frustrating and wrong, but that's what the world does.
I personally think that Facebook is breaking the Internet and we're just starting to see the first signs of the bad things to come out of it.
Back to account 'deactivation' - if you log in to your deactivated account, Facebook conveniently 'reactivates' your account automatically, so you can't just log in to look at your data, which of course is not yours and is the currency which Facebook exchanges for real cash.
Telling friends and family to do the same is useless - most don't care even for 1 second about 'privacy' or things like that.
So 'closing' your account is more than just stopping to use a web application. It's a lifestyle choice - do you want to stay secluded, excluded from a lot of social activities and considered an 'introverted loner' or do you go with the flow and get trapped more and more into this social experiment ...
> I did it in July, but then I noticed that a lot of social activities - like concerts, festivals, parties, etc, are organised using Facebook so if you're not on it, you can't participate.
They are organised this way because it is an effective way to organise them; so the only sure way to stop them being organised this way is to make it ineffective (by not participating, by urging others not to do so, and, crucially, by making your reasons known to the organisers). It is true that not using Facebook is an inconvenience, but there is no guarantee of a right to protest without inconvenience!
You can delete it; there are, what, only 5 datacenters... finding your data and cleansing it should not be too hard, just a little repetitive ;) But the more effective way to make Facebook listen is to hit them in the pocketbook. Keep your account and use an ad-blocker; if enough people did that long enough...
They'll still profit from additional ad revenue when other people, like your friends and family, view your posts and use facebook longer than they would have because they are interested in you and your life.
> You can delete it; there are, what, only 5 datacenters... finding your data and cleansing it should not be too hard, just a little repetitive ;)
I don't know if your comment was tongue-in-cheek, so here goes. Actually, it's not about the number of data centers Facebook has, but the number of CDNs and edge caches around the world and how FB manages those, including third party companies (like Akamai) that provide this service for Facebook. Plus, Facebook has had a lot of trouble, in a very shameful and absolutely incompetent kind of way, in removing the visibility of photos that were "deleted" by users. See this saga spanning from 2009 through 2012 as reported by Ars. [1] [2] [3] [4] [5]
> But the more effective way to make Facebook listen is to hit them in the pocketbook. Keep your account and use an ad-blocker; if enough people did that long enough…
Facebook is already trying to push more users to use its mobile and desktop apps so it can have more control over the content (read as "ads") shown and collect more information that's not easily wipeable by end users (like cookies, cache, etc.). We will see a time in the coming years when there won't be a browser interface for the platform, and the cat and mouse game between ads that look like content and ad blockers (to block FB ads that look like content) will continue on. Depending on the platform, people may start needing content blockers on their routers (or an internal proxy server) to deal with this.
For many users, abandoning Facebook is like abandoning the web. So no matter what shit Facebook will pull they're not gonna leave, unless something better appears.
On the other hand, this whole situation is a bit ludicrous. The journalist doesn't seem to understand how Facebook works and is directly attacking Zuckerberg as if he explicitly asked to remove the picture. Come on guys, it's just an algorithm that detected nudity and decided to censor the pic. You can't write a rant like that and not understand how the damn thing works in the first place. It's poor journalism. And then you make the assumption that just because Mark has power he's using that power to manipulate the world. How the fuck is that objective reporting?
Or the journalist understands that it is automatic, but does not want a society where algorithms blindly determine what is allowed or not. After all, "it's the algorithms" is a very convenient excuse for Facebook that allows them to be a faceless entity beyond criticism. It is Facebook that decides which algorithms they use to censor content, and they have chosen a rather prudish set of policies they think will avoid offending anyone in any culture, and are very quick to remove controversial content. This is something Facebook has chosen to do, and to a much lesser degree something that is enforced on them.
The picture was removed by moderators, not by an algorithm. So yes, even though Zuckerberg did not delete the picture himself, he is responsible for the acts and policies of the company.
Technically it's possible to have an index of all images published in some registered media and assume they are already vetted by the editors of that media. Then you compare the uploaded picture against the index and voila - you do not have this problem at all. Of course, it adds some costs, but it's exactly what M.Zuckerberg can decide to spend money on (before delivering this service to people in Africa and other places where it's easy to abuse Facebook's censorship mechanisms - ask the Russian opposition how Putin's media warriors take down their posts).
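One plausible way to build such an index (a sketch of the general idea, not of anything Facebook actually runs) is perceptual hashing, so that resized or re-encoded copies of an already-vetted image still match. This assumes the third-party Pillow and imagehash Python packages; the file paths and the 5-bit distance threshold are illustrative guesses:

    # Sketch: whitelist images already published by vetted media outlets,
    # using perceptual hashes so re-encoded or resized copies still match.
    # Requires the third-party Pillow and imagehash packages; the paths and
    # the threshold of 5 bits are assumptions for illustration.
    from PIL import Image
    import imagehash

    def build_whitelist(paths):
        """Hash every already-vetted image (e.g. pages of registered media)."""
        return [imagehash.phash(Image.open(p)) for p in paths]

    def is_whitelisted(upload_path, whitelist, max_distance=5):
        """True if the uploaded image is a near-duplicate of a vetted one."""
        h = imagehash.phash(Image.open(upload_path))
        return any(h - vetted <= max_distance for vetted in whitelist)

    whitelist = build_whitelist(["vetted/nyt_front_page_1972.jpg"])
    if is_whitelisted("uploads/user_post.jpg", whitelist):
        pass  # skip the nudity filter: this image was already editorially vetted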
I think that it's much safer to prohibit all nudity than to nitpick what to allow and what not. There are TBs of pictures uploaded daily to Facebook; it's not a trivial task.
What's wrong with this image? Of course it should be allowed. I see no difference with David by Michelangelo, for example. Banning all nudity is just stupid censorship that has no moral ground.
I believe that a lot of conservative people would find a picture like that offensive. We have to keep in mind that Facebook's audience is in the billions.
It's not the reason for censorship. Facebook may offer parental control and display explicit content warnings to serve this audience. After all, people who find nudity offensive may limit their subscriptions to accounts which do not post it - it's not that they are forced to look at it.
There's no need to jump straight to closing your account. There's a wide range of escalating actions you can take when companies dissatisfy you. Going straight to firing a company every time there's a problem doesn't really help.
Facebook absolutely cares how you feel when you use their service, because certain feelings are associated with less engagement (and thus less revenue) or people quitting the service. Expressing your dissatisfaction to them is a way to let them know that you're moving in that direction.
Do you really think that anyone in a position of power in Facebook is personally opposed to this photo, at least to the extent of feeling moved to change policy if it is shared a lot? (Also remember that, if they do feel so strongly about it, then Facebook has the power to remove it at will.)
On the other hand, don't you think that everyone in a position of power at Facebook cares if they lose users, especially en masse?
Don't make a dent in Facebook. Make a change in your own life, and maybe lives of your friends.
There are a number of large things that you might not like. The smallest effective thing you can do is to not be a part of such things, to prevent them from consuming your mental resources and from defining any bits of your agenda.
But I'll need something else to replace it with. Even if we ignore the network effect, it will be owned by some other corporation whose policies I won't agree with. A government-owned one would be worse for simple reasons. Do we have a truly free system which is open, easy to use and still safe enough to let a 13-year-old use it?
>Do we have a truly free system which is open, easy to use and still safe enough to let a 13-year-old use it?
Of course not. "safe enough to let a 13 year old to use it" can only be accomplished through censorship. You can't have a truly free system that also respects cultural norms or legal authority - any system that does so isn't free.
It's too useful to be ignored. Soon, interaction with businesses will be done through their platform. It's easier to set up a FB page with services + marketing included than to create a website.
I closed my account back in 2010-2011, and I haven't missed it in the least. The only problem is that the board game group at work organizes through a Facebook group, but we overcome that by talking to each other.
If a company can only communicate with me through Facebook then it's probably not a company I want to communicate with in the first place.
Any centralized social network is subject to moderation because if it's centralized, it can be attacked, fined or shut down by a court. So facebook can't escape that rule and must decide what is acceptable or not and have to anticipate any flak they can get.
In the end, moderation is a gruesome job and nobody really wants to do it, and it will be subject to how moderators anticipate public perception, so it's a PR race.
So of course you will have those situations where Facebook makes bad choices, but it doesn't only depend on their moderation team, it also depends on political correctness. That's why decentralized networks are better, because nobody is really responsible, and they can hardly be attacked.
You can decide to either have a politically correct website and get investments, or disagree with political correctness and be like 4chan.
It's not great, I'm sure people realize that, and the internet will eventually go back to decentralized systems.
That's a false dichotomy, especially since they're getting more flak for deleting the controversial photo than they would have for leaving it up. It's pretty easy to distinguish between a historically significant photo, to keep, and 4chan-style offensive trolling, to delete. There is a gray area, but this was clearly on the right side of it.
This particular picture was on the cover of the New York Times. Surely a human reviewer from Facebook must consider this as reason enough to revert the censorship?
Sadly, the human reviewer at Facebook is quite likely to not have even been alive in 1972 (or not old enough to even remember the photo) and so it is likely they are just following the "official facebook policy" without thinking, and thereby hitting the "censor" button.
Note - this does not excuse their actions, but may put some context around 'why' it is happening.
I think Facebook uses bots at one point or another to automatically moderate things. The crux of the problem is how users can dispute a moderation decision, and whether Facebook really bothers to validate automated moderation.
Just imagine the quantity of content passing through Facebook, and imagine the number of people required to review all those reports that come in. I'm sure they've offloaded a big chunk of it to some sort of AI.
Was the Norwegian Prime Minister's post removed because she posted the image again in that post? This is a crucial question and not clear from the article. If Facebook censored only words then this is a much larger issue. If they censored the whole post (including the photo) then, while debatable, this is Facebook's policy, i.e. a blanket ban on such imagery, irrespective of history.
Edit: I don't find the journalism clear, but the fact is there:
Solberg was one of a string of Norwegian politicians who shared the iconic image after Facebook deleted a post from Tom Egeland
So the post was removed because it had the image, not because she had dared to criticize FB.
In the article itself the PM gives Facebook credit for at least being consistent, w.r.t removing posts that contain that photo. So yes, it seems likely it was removed because of that picture.
However, you probably want to think twice before removing a post by any PM. Having done so only fuels the fire, which arguably can be a good thing in this case.
Yes... I don't like her policies, but she has her good moments. While I don't think they'll keep reposting it, as they'll have more important things to do, the Minister of Culture has also posted it and had her post taken down, and she has already indicated she very much wants to get Norwegian newspaper editors and Facebook together for a meeting. They'd not do that unless they're very displeased.
This iconic picture was not only a Pulitzer Prize winner, but was also on the cover of the New York Times. Surely this will help the anonymous "Facebook spokeswoman" determine which side of the thin red line of "censor" / "do not censor" it lies on?
Why doesn't fb just blur the content that users find disturbing like "Viewer discretion.., flagged by our users". Then you can click to view or adjust the sensitivity in your account settings.
At Google certain images are considered EDSA (Educational, Documentary, Scientific or Artistic). I wonder if this would have been considered EDSA vs Facebook's decision to say it's against ToS.
That said, it totally makes sense that they have a consistent policy. Whether you find their overall abuse ToS objectionable should be the main consideration here. It's OK to me that they seem to have decided that imagery containing nude children should be hard-banned. It's a decision couched in the desire to protect children, not some heavy-handed censorship.
I think censoring the PM's complaint is a bad move by Facebook. Regarding censorship of the photo, I think it should be left to Norwegians to decide whether it's appropriate or not - I think different people might have different views on this.
As much as I'd love to see some alternative to Facebook that provides generally the same basic features (easy sharing of original content and links/reposted media with a large group of contacts) I feel that they'll all face the same issues for the foreseeable future.
Same issue with the ones you mentioned or even ones like G+ that are backed by another big player: critical mass of users combined with no standardized protocol.
I've made the analogy plenty of times before but basically I compare and contrast with email, another service where you can "roll your own" but most general users just stick with a service, either paid or ad-funded, from another company. The thing with email is that it can be 2001 and you're happily using your @aol.com email account but down the line you decide to switch over to Yahoo. Then maybe later you switch to Hotmail and later still, to Gmail. All of those switches are up to the user and nothing stops you from even setting up your own email server if you have the desire.
Regardless, none of the people you email need to do anything or change anything. And it's a good thing too because it would suck if in order to move off AOL or Yahoo you need to convince every one of your friends, relatives, business contacts, and potential email recipients to switch over as well. It might be feasible to get that 5% of your "techie" friends to switch services based solely on some new feature or whatever but you won't have much luck getting Aunt Mabel, your kid's 2nd grade teacher, your barber, or your potential clients to all switch over to some new UI and remember some new account address.
But since email is based on a common protocol it doesn't matter. No matter how popular one provider is, you can switch to any other one and not lose the ability to email. Contrast this with centralized social media and you're basically screwed if you want to switch to another one.
This really struck home when G+ came out. I found it to be a better Facebook for the most part. Faster mobile app, more granular sharing permissions, no stupid game/app spam, better media hosting, and better text/video chat. Maybe 10 of the people I generally connect with on social media felt the same way but the other couple hundred in my friends/contact list? They found it annoying to learn a new interface or they weren't already using Gmail so they'd need to sign up for another account or they just didn't like something else about it.
And that's legit. Variety in services means there will be preferences. But in this case it didn't matter because the end result was the same: unless everyone moved over to G+ (or Diaspora or whatever) en masse, you'd either be cutting off contact with most of your list or you'd be maintaining multiple profiles and sharing links and pics on two sites.
So G+ only got used for more specific groups and niche interests while Facebook kept more general social networking. And now Google seems tired of trying to build a better Facebook and is de-Plus-ifying most of their services. I seriously wonder how anyone will conceivably succeed at this where massive companies like Google can't pull it off.
Instead, just as centralized social media replaced mass email chains for most people, I don't see Facebook going anywhere until something entirely different (not just new and improved) comes along and grabs mass user attention in the same way.
Something that "gateways" to Facebook would be great, but of course Facebook would fight it with every weapon possible if it were to get any kind of traction.
Let's say you built a distributed social network where you can add plugins to communicate via different protocols... and one of them happens to know how to interact with Facebook's APIs or website, so that you would not need to care whether your contacts are Facebook users or non-Facebook users.
The same way we have IM clients that can talk multiple protocols.
Facebook almost certainly would try to shut it down if it got any kind of traction, because the set of people you could interact with on this new platform would be a strict superset of Facebook users. It'd be bound to have clunky usability issues (e.g. group communication would be hard to get right with groups split across platforms), but it'd enable users to start extracting themselves from Facebook more and more if/when more of their friends decide to try it out.
I'd love to see someone try. I won't try myself for two reasons: I don't use Facebook enough to know what features would be necessary, and secondly it's likely to be a game of cat and mouse with Facebook both technically and legally.
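As a rough illustration of the plugin idea above, here is a Python sketch of a protocol-adapter layer in the spirit of multi-protocol IM clients. Every class and method name is invented for the example; a real Facebook adapter would have to work against whatever API or scraping access Facebook tolerates, and would likely be blocked as described:

    # Sketch of a protocol-adapter layer for a multi-network client, in the
    # spirit of multi-protocol IM clients. Every name here is hypothetical.
    from abc import ABC, abstractmethod

    class NetworkAdapter(ABC):
        """One adapter per network (Facebook, Diaspora, plain email, ...)."""

        @abstractmethod
        def send(self, recipient: str, body: str) -> None: ...

    class DiasporaAdapter(NetworkAdapter):
        def send(self, recipient: str, body: str) -> None:
            # Placeholder: a real adapter would speak the federation protocol.
            print(f"[diaspora] to {recipient}: {body}")

    class FacebookAdapter(NetworkAdapter):
        def send(self, recipient: str, body: str) -> None:
            # Placeholder: in practice this would need Facebook's API or
            # browser automation, and Facebook would likely try to block it.
            print(f"[facebook] to {recipient}: {body}")

    class UnifiedClient:
        """Routes each contact to whichever network they happen to be on."""

        def __init__(self) -> None:
            self.adapters: dict[str, NetworkAdapter] = {}
            self.contacts: dict[str, str] = {}  # contact name -> network name

        def register(self, network: str, adapter: NetworkAdapter) -> None:
            self.adapters[network] = adapter

        def message(self, contact: str, body: str) -> None:
            self.adapters[self.contacts[contact]].send(contact, body)

    client = UnifiedClient()
    client.register("diaspora", DiasporaAdapter())
    client.register("facebook", FacebookAdapter())
    client.contacts = {"alice": "facebook", "bob": "diaspora"}
    client.message("alice", "See you at the hike on Saturday?")

The point of the design is that the unified client only knows the adapter interface, so adding or dropping a network never requires your contacts to change anything.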
I totally agree with the concept of a "gateway". Kind of like a social protocol similar to SMTP (which is for email), through which Facebook/G+/<some-new-player> inter-operate. That's the best solution, and the only solution, to fix this situation.
If needed, Facebook must be forced to open up. It's the wrong kind of monopoly.
>I'd love to see someone try. I won't try myself for two reasons: I don't use Facebook enough to know what features would be necessary, and secondly it's likely to be a game of cat and mouse with Facebook both technically and legally.
The SMTP protocol[1] is created and maintained by the IETF. Ditto for social. I wonder why Google, which didn't succeed with G+, doesn't just submit a 'social' protocol to the IETF already.
The biggest problem I have with Diaspora is the lack of width of types of people on it. Almost everyone there is an anti-corporate type or someone who is almost certainly only there because they've been banned everywhere else.
I suspect a large part of this isn't so much an attempt by Facebook to impose US cultural norms on the rest of the world as an attempt to avoid financial burden by simply applying the ban stick as bluntly as possible.
After all, being multicultural, providing good editing suitable for several countries' acceptable norms, while trying to advance/modify them...
Well that might be viewed as admirable work or cultural imperialism.
The point is it's not work that they want to do, nor do I think is it work that they feel they can get paid for.
It is kind of scary to see how countries are powerless when it comes to Facebook. I know that this article and the whole discussion here is not about that, but I get an eerie feeling reading about it.
Theoretically, the PM could find herself having trouble with her social media reach in the lead-up to the next election.
The primary criticism that gets leveled toward Facebook is about privacy, but I think the degree of control they can exercise over communication is the more worrying issue.
If you want something from FB: its reach, you need to play by its rules, however arbitrary they may be. If you wish to change the laws of physics, go and get yourself your own planet. It is much easier in this case: just choose a different forum.
Having said that, this incident should teach Norwegians (and the countrymen of any country) a thing or two about where they stand on the totem pole of power.
Facebook > Every other country on the planet
Facebook is a country because it acts as an independent sovereign state which is not answerable to anyone at this point. Apparently, it already makes up its own taxation laws[1]. I expect them to release their own flag, maybe a national anthem?
But of the many truly troubling things I see with FB's policies - their alarming intrusiveness and ruthless exploitation of our need to be social - choosing its own censorship policy is not one of them, especially if it is applied consistently. I would rather see them made answerable for privacy violations.
Maybe the censorship team was too young to know the significance of the picture. I am guessing the average Facebook employee is under 30, probably younger than that. The Vietnam war ended over 40 years ago; most American students learned about it and knew that picture, but I don't know how much the Vietnam war is taught in other countries. It might have been a combination of age and where the person grew up that contributed to deleting the picture.
"Most American students learned about it and knew that picture..."
I'm 38: We didn't learn much about the Vietnam war, honestly. The sections on everything from the 50's onward were very short, and sometimes glossed over due to time constraints. I think we had one teacher tell us that it was simply because not enough time had passed since it happened, which always seemed ridiculous to me.
It is kinda like we knew it happened, but didn't really learn much about it formally. Lots of hearsay and rumors and things.
49 here, and High-school history class 'finished' not much past WWII/Korean War era. Very little if any touching of the Vietnam war. And at that time it was the same reason. End of school year arrived, and that was as far forward in time as the teacher had achieved. And so that was that.
I'm 48, Italy, and I know about the Vietnam War but not because of school. Same problems you experienced.
I have few memories of it from TV news back then. I remember katyusha missile launchers firing and general happiness when the war was over. But many movies kept remembering everybody about that war. I read books, visited the memorial in Washington when I was there on vacation. I browsed wikipedia in the last years. I studied many wars at school (European history is pretty long) but I know less about most of them than about the Vietnam War. It's pretty easy to get informed even if school didn't care much about that war when we were little. Same about the Korean War.
If your only real exposure to history was through primary and secondary school, you really owe it to yourself to explore the space yourself.
It's not just late 20th century history that's elided: I'm finding (despite some exposure) that my knowledge of Greek and Roman history, of the post-Enlightenment era in Europe, much of English history, the late 19th century, 20th century issues of labour, race, drugs, culture, immigration, women's rights, and more, was tremendously elided.
* James Burke's Connections and The Day the Universe Changed are good introductions to histories of technology and philosophy. From these, continuing the PBS series mode, I'd recommend Kenneth Clark's Civilisation and Jacob Bronowski's The Ascent of Man. These are truly excellent productions.
* Daniel Yergin, The Prize, whacked me over the head with just how tremendously momentous the discovery of petroleum in Western Pennsylvania was, and the impacts across the last quarter of the 19th century and all of the 20th. Follow this with a more technical exploration in Vaclav Smil's Energy in World History and Manfred Weissenbacher's Sources of Power.
Really? We got a decent amount on Vietnam, though not so much in high school. I remember in middle school one of our teachers had us watch some of the actual newscasts, as well as exposing us to some other things like this picture. I'm 35 btw.
Yeah, really. Different books, perhaps, or you had a teacher that made sure to teach it. I remember watching stuff from the first Iraq war in school, but basically everything between WWII/Korea and whatever they said were current events was basically rushed and glossed over.
I'm 30, we definitely did. However, (I assume) both our parents were draft-age at the time of Vietnam. We also went to school during a time prior to the overly structured curriculum. Part of it may simply have been that the curriculum was generationally influenced.
I think my dad was a touch too young at the time; he was born in 1956, I think.
>We also went to school during a time prior to the overly structured curriculum.
This is probably why there's so much variation.
>Part of it may simply have been that the curriculum was generationally influenced.
This is probably a huge piece of why you and I got a lot on it. For many people that were teachers when I was young, this was a huge moment in US history, and warranted the attention.
I can't remember if I took AP or advanced history (I don't remember if they offered AP in that class at that school or if I simply didn't do the test). It would have been the 94-95 school year since seniors took government and econ. Considering the lag time for updating books, they might not have gotten through much of Reagan's stuff. It was during Clinton's first term, so Reagan would have been out of office less than 8 years. Short section if anything. Timing is everything.
I remember 3 books for summer reading, and a different project was keeping a somewhat historically accurate journal of someone moving west. She taught with a college book, and we had to buy it instead of renting it.
But the same thing. Emphasis was much earlier, and by the end of the semester or school year, the stuff was glossed over. The information about things so current became more of an overview without so much information.
"A Facebook spokeswoman said: “While we recognise that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others. "
>it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others.
I'm assuming this is just PR, but if not, what a dumb statement. I'm sure there are cases where this might be true, but given the photograph we're talking about, give me a break. You really can't create a distinction between this and most other nude photos?
Unless she meant algorithmically. I could see that being a problem.
And that justifies the initial removals. It does not justify not having an effective appeals method to someone who isn't just blindly applying their rules, and a means of white-listing things that have been approved once.
This is not a new problem that they've never been exposed to before.
> Unless she meant algorithmically. I could see that being a problem.
"The posts would have been reported by a user to Facebook’s community standards team, who would then have made the decision to remove them, rather than being removed automatically by algorithm."
I find that violence doesn't really feel real to most Americans, and I think one of the main causes of this is that the US is so isolated from world events.
WWII, Vietnam, etc. all happened "out there somewhere", and aren't really considered by those who didn't actively participate. Veterans of such conflicts are acknowledged as being deeply affected, but the rest of the populace effectively shrugs as if to say "sorry, I don't get it", and go about their lives.
Terrorist attacks and Mexican cartel atrocities affect less than 0.00001 % of the population. (Assuming perhaps 3,200 people per year, which is quite a high estimate.) To the rest, it's just a special effect display or soap opera drama on TV. Street violence in poverty-stricken communities is the closest thing people have to experiencing violence in the US, but that only affects a relatively small portion of the country, and those with little influence.
tl;dr: I think violence is acceptable in US media because most Americans have no true comprehension of violence. It's all a video game or a movie to them.
>Television brought the brutality of war into the comfort of the living room. Vietnam was lost in the living rooms of America - not on the battlefields of Vietnam.
-Marshall McLuhan
The Vietnam War was the only one which was so honestly broadcast to the American public. Since then, the media has realized that real violence is not good for ratings, or the war effort.
The way it's depicted, especially in more modern productions, must contribute a lot too. Little wonder it seems like a game when gun battles (and often fight scenes and even car chases) now all look like parody.
For instance, some critics of extremist Islamic ideas are regularly banned on Facebook because their posts are reported as Islamophobic (even if the critics themselves are Muslims). This is not an American norm.
Yeah because all of Europe sucks at making good web tech for some reason. Seriously - look at the top 100 websites in the world; not a single one produced by Europe[1].
Edit: Downvote all you want. I'll redact my statement if someone can show me that Europe does not suck at making web tech. But you can't, because Europe is not competitive in the web tech scene. I don't know why, but that's the way it currently is.
> look at the top 100 websites in the world; not a single one produced by Europe[1]
Did you actually read that link before you posted it? On that list there are sites hosted by companies based in Cyprus (xHamster), the Netherlands (booking.com), Poland (onet.pl, xnxx), and the UK (BBC).
It may surprise you to learn that all of those countries are, in fact, in Europe.
It's a bit skewed to assume e.g. Google produces everything in the US; Google and many of those companies are multinationals, with both legal and development entities all over the world, including Europe. There's also the fact that a lot of European-based startups incorporate themselves in the US because of legal issues - it's easier to sell a US company's software in Europe than the other way around. A lot of US companies are required by law to only use US-made software, too. Probably leftovers from the Cold War.
> it's easier to sell a US company's software in Europe than the other way around
Well maybe this is the reason Europe's software is not generally competitive? If Europe discriminates against the sale of its own software, it's no wonder.
Debatable. Everyone loves to take credit for the invention of any revolutionary innovation. And indeed, credit can be spread all around.
It's like Europe claiming sole credit for the invention of the PC because Alan Turing was British even though Americans invented the integrated circuit and actually built the first viable PCs.
First you state an absurd claim as absolute fact, but now it's "debatable". It looks to me like you are backtracking. HTTP came out of CERN... what exactly is it you want to debate?
It's not an absurd statement, it's the truth. Europe can't compete with American web tech (and probably with American software in general). Think about it: in your day-to-day dealings with computers, what percentage of the software you use is American versus European? I'm betting it's something like 99:1 (unless you try to make the claim that Linux is Finnish, even though the majority of its developers are American and the creator himself is an American citizen). Compare that to cars, which are more evenly split between countries (Japanese, European, American, etc.).
I'm not backtracking on that argument.
OP made a poor attempt to excuse Europe's lack of competitiveness in the software/web tech scene by waving their hands and saying "We invented all of it" (which is highly debatable). Claiming that Europe invented the web because HTTP came out of CERN and some European person envisioned a global network is bogus. Those are only tiny parts of what make up "the web". You conveniently sweep ARPANET, TCP/IP, and other core web technologies under the rug.
Most software and languages I deal with are open source, and the names I see associated with them seem very European to me. I'm also pretty sure these are the same tools that the fellow American developers you're putting on a pedestal use as well. Just walk away; you really are backing yourself into a corner.
I would say about half of the stuff on this list was written either in Europe or by Europeans who learned to code in Europe... and I didn't have to try very hard or pretend, but by all means keep clinging to your silly nationalistic pride.
https://www.quora.com/What-is-Facebooks-architecture-6
It's not competitive. Obviously every once in a while Europe will be able to make a popular web app. But for every European example I can give you 20 American examples.
To some degree that's a compromise. There are people who want neither nudity nor violence on TV, but they've lost the violence fight (somewhat: there are still limits, see http://screencrush.com/avengers-rated-r/ ), while winning the nudity fight.
Often when laws don't make sense, you'll find the law was forged as a compromise rather than created by some logical overmind. The Constitution is a clear example of this.
Also odd is censoring a few spoken words on TV while gore and other violent scenes are not a problem. I find it weird because, even when a word is bleeped, as a viewer you already know what the character was meant to say.
Gore is a problem, yeesh. The whole "Americans love violence" thing is overrated. The love of violence is a worldwide thing. It's not as if John Woo, Dario Argento, or Takashi Miike are American.
I didn't say Americans love violence. My point is that if your TV show is targeted at adults (due to its violent nature, for example), then how is allowing those words uncensored a problem?
Edit: This just made me realize you can bump G-rated content up to PG-13 just by adding beeps and blurs. I bet that's how most reality TV content is generated.
I'm always late to the party, but I get there eventually :) I'm curious though: if this were officially released on TV in the modified form, would it actually get a different rating? If so, it would mean the beeps and blurs themselves are viewed as a form of suggestive dialog.
Um, plenty of violence is deemed inappropriate. I doubt much of USA broadcast TV even comes close to a hard R rating. I'd also point out that Italy in particular has had little problem with making hyper-violent content like giallo movies, in addition to showing copious nudity, and Norway has quite a few violent horror films under their belt.
It's more that broadcast standards originally prevented anything, whether nudity or extreme violence, from being shown, and mature content was exiled to cable.
There's an episode where a man's skin is flayed off his back and turned into metaphorical wings, which are then used to hang him from the ceiling.
In depicting this from behind, you could originally see the man's buttocks. NBC censors objected, so the "fix" was to digitally add a massive amount of dripping blood to cover them up.
You may be right. It's just interesting that on Sunday afternoon they have no problem showing movies where the hero kills dozens of people with pleasure but even the slightest hint of nudity or swear words is not OK.
To be fair it is child nudity in this case, which has different legal ramifications in some situations. Also the views on nudity vary a lot from person to person in the US.
Only if deemed to be pornographic. I think it's fair to say this picture doesn't meet any of the requirements to be deemed as obscene (at least in a sexual sense; using napalm on people is another matter entirely).
All the messages from Facebook seem pretty clear that it's the nudity in the image that's the problem. Have you seen something else from Facebook claiming it's the violence?
> We place limitations on the display of nudity to limit the exposure of the different people using our platform to sensitive content. Any photographs of people displaying fully nude genitalia or buttocks, or fully nude female breasts, will be removed. Photos of women actively engaging in breast feeding or exposing reconstructed nipples for awareness are allowed. We also make allowances for digitally produced content for educational, humorous, or satirical purposes, and for photographs of real world art. We understand that these limitations will sometimes affect content shared for legitimate reasons, including awareness campaigns or artistic projects, and we apologize for the inconvenience.
> Therefore I ask you to remove or pixelize this picture.
Honestly to me it's more Facebook in this case; of the larger social networks, they tend to be one of the more restrictive when it comes to nudity in general. This hard-nosed attitude has come back to bite them on occasion. (See: the long-running "breastfeeding pictures censored" controversy.)
I wonder how much Facebook relies on algorithms to do its filtering these days. This is one of those cases where I can easily see an algorithm scanning the image, detecting "child nudity!", and triggering the takedown. Algorithms aren't great at knowing context yet. (Not every human reviewer would know the context either, even though it's pretty easy to run a reverse Google Image Search and find out that, oh, this is a Pulitzer Prize-winning photo...)
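As an aside on the earlier point about white-listing images that have already been approved: below is a minimal sketch of how a service could recognize a previously cleared image on re-upload, using a simple 8x8 average perceptual hash built with the Pillow library. The file name, threshold, and overall design are hypothetical illustrations, not a description of Facebook's actual moderation pipeline.

    # Hypothetical white-list check: compare a tiny average hash ("aHash") of each
    # new upload against the hashes of images a human reviewer has already cleared.
    from PIL import Image

    def average_hash(path, size=8):
        # Downscale to a size x size grayscale thumbnail and hash each pixel
        # against the mean brightness.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = "".join("1" if p > avg else "0" for p in pixels)
        return int(bits, 2)

    def hamming_distance(a, b):
        # Number of differing bits between two hashes.
        return bin(a ^ b).count("1")

    # Hashes of images a moderator has already approved (file name is made up).
    APPROVED_HASHES = {average_hash("napalm_girl_1972.jpg")}

    def is_whitelisted(path, threshold=5):
        # Treat an upload as pre-approved if it is near-identical to a cleared image.
        h = average_hash(path)
        return any(hamming_distance(h, known) <= threshold for known in APPROVED_HASHES)

A real system would need something far more robust to crops, borders, and re-encoding, but even a crude check like this suggests that "the same iconic photo keeps getting re-flagged" is a solvable engineering problem rather than an inherent limitation.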
Citation? The Puritans thought sex was a wonderful thing within Christian marriage, and on at least one occasion excommunicated a man for neglecting his wife sexually, based on the Biblical command that a husband's body belongs to his wife and vice versa. Contrast this with Catholics, who still require celibacy of priests in 2016, and who imagine the Virgin Mary remaining a virgin her whole life, despite being married, because this, in their view, is part of her holiness.
> The Puritans thought sex was a wonderful thing within Christian marriage
Ok.
> Contrast this with Catholics
Who also think sex is a wonderful thing within Christian marriage.
> who still require celibacy of priests in 2016
"Celibacy" means "not being married", requiring priests to be celibate (which, strictly speaking, Catholics do not [0]) is orthogonal to beliefs about whether sex is wonderful within Christian marriage.
[0] The Catholic Church does not permit priests to marry, but does not bar married men from ordination to the priesthood (it does bar married men from ordination to the episcopate, however.) The Latin Rite of the Catholic Church has a particular discipline which requires that only unmarried men may be ordained to the priesthood, a discipline that was adopted to deal with particular disciplinary problems that arose during the Middle Ages, but which has never been held to be theologically necessary (and which isn't applied within the Church outside of the Latin Rite.)
I've been curious how much impact they have had on our current culture; a people who (to paraphrase Robin Williams) were so uptight they got kicked out of England.
Christian mores definitely still drive our cultural mores, and while there has been some easing of the restrictions, sex is still much less acceptable than violence. As a simple example, attempts to reduce the violence in high school football are as vehemently opposed as attempts to provide more than "abstinence only" sex ed.
"Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed"
This is a place where I do not agree with Facebook's decision but I agree they have a right to decide who and what can be on their platform. Freedom of speech does not give me the right to come into your home and say whatever I like without being asked to leave. I'm free to do so in the public park across the street though. Your property rights trump my free speech.
It's particularly troubling because facebook is primarily about communicating with your own friends and acquaintances. Censoring public content is troubling, but removing content that is private and only available to people who took the step to friend you on Facebook is really really crappy.
Facebook is very arbitrary in its censoring and account deactivation decisions. Many cases I have read about are instances where Facebook is in the wrong and does not provide users a way to get things resolved (perhaps these instances surface online more often or more prominently).
Every time I read about Facebook's decisions, I feel extremely frustrated and downright angry. Humans need an alternative to Facebook that's not as evil and can get better traction (no, this does not mean everyone closing their FB accounts and switching to email or text messaging). I'm waiting for that to happen.
It would be neat to have a decentralized social network simply to avoid the editorial demands of the walled garden. I think we'd have a lot more unsavory content making its way to people's eyes though. There'd need to be more sophisticated ways of filtering information than just "unfriend", I suppose. And people would need to have tougher skins for it to work.
I can accept why they need to draw a line on naked-child images and be done with it. Like most Silicon Valley companies, they want everything automated, with as little human customer service as possible.
However, if they aren't going to do that editorial job, then they need to stop trying to be a news source while abdicating the responsibility it entails by claiming they're just a tech company.
On the subject of whether speech protections apply to the government only: it's all well and good to apply a legal analysis to free speech issues, but if you're looking to the law to tell you what's right and wrong, you're trying to buy milk at a hardware store.
Facebook needs to be broken up like Ma Bell was. It's too big to manage well and network effects are preventing alternatives from gaining ground. The world needs more diversity in policy than it has with this mediated communications juggernaut.
The bigger problem is that the new media is US-controlled, and you're going to have culture conflicts. Maybe legislative action could force Facebook to federate users' content.
I'm not on Facebook. I read in the spokeswoman's comment that the distinction cannot be made by their robotic rules, so I believe this illustrates a limitation of their AI, and that they care more about their algorithms than about people. Just an opinion.
The ones afforded to us by our libertarian utopia, obviously. How could anyone possibly consider lobbying to take away Facebook's rights, as a person, to do as it pleases?
People who work there are, but that's irrelevant. The right of freedom of press isn't based on the press themselves being human. The people who live in society have a right to access information that is violated through government censorship of media outlets.
There is free competition between social networking sites, while governments are definitionally regional monopolies.
Facebook can publish what they wish to publish, and not publish what they fundamentally disagree with. They are a private entity. That is pro-freedom, not anti-.
I'm not sure it is accurate to describe a situation where two companies have a couple billion users between them and the rest don't come close as "free competition". Facebook and Twitter compete, sort of, between themselves. Does anyone really compete with them?
Sure. Off the top of my head: LinkedIn, Google Plus, WhatsApp, Instagram, WeChat, VKontakte, and Yik Yak. I can keep going all day. Just because they're winning in the largest markets doesn't mean they have no competition. In contrast, I can't just decide "yeah, I'd rather not be subject to US law, I'm going to declare myself a citizen of Petoria." You can always leave a social network; it's just that nobody WANTS to.
I feel it incumbent upon me here to mention that two of the companies in your list belong to one of those here under discussion. Something perhaps you've overlooked.
I didn't overlook it, it's irrelevant in this context. The Surface Pro and the surface book are both made by Microsoft, they are still competing products that consumers have to choose between.
There's the rub. It can be hard to tell whether theory doesn't have any bearing on reality at the moment due to circumstance, or whether the theory is inherently flawed.
> First some background. A few weeks ago the Norwegian author Tom Egeland posted an entry on Facebook about, and including, seven photographs that changed the history of warfare. You in turn removed the picture of a naked Kim Phuc, fleeing from the napalm bombs – one of the world’s most famous war photographs.
> Tom then rendered Kim Phuc’s criticism against Facebook for banning her picture. Facebook reacted by excluding Tom and prevented him from posting a new entry.
You just have to read the article; it's stated in the second paragraph. A basic defense of freedom of speech was why.
"Erna Solberg, the Conservative prime minister, called on Facebook to “review its editing policy” after it deleted her post voicing support for a Norwegian newspaper that had fallen foul of the social media giant’s guidelines."
Does anyone here know him or some of his close friends and want to ask? I might be wrong, but I'm pretty sure someone here is at least a friend-of-a-friend of his.
Why are we thinking of FB as some monolithic entity? Isn't the most likely explanation that some low-wage contractor in the Philippines saw a picture of a naked girl and flagged it? That contractor may not even know the historical significance of the picture.
You're in a low-wage job and have to look at horrifying shit all day, every day. Are you going to let the one image through that maybe will cost you the job that you really need?
Unlikely. A lot of publicity was generated when the original photo was removed. Not only was a photo removed, but the newspaper employee who submitted it, IIRC, had their account suspended. So that's at least two levels of decision making. Then (as linked to in the OP), the newspaper's editor made a public appeal that was well-read in media and tech circles.
And now this, involving the Norwegian prime minister. I don't think the higher levels of Facebook are ignorant to the issue at this point.
The Guardian is pretty trashy for tossing that picture up twice in one article. The article isn't even about napalm or the war; the picture is being used as snuff shock. Show some respect for human dignity.
> However, the realisation came to her she did not have to remain an unwilling victim. The photo was, in fact, a powerful gift that she could use to help promote peace.
> "I realised that now that I have freedom and am in a free country, I can take control of that picture," she says.
She should simply publish it somewhere else, such as her own blog or some other website. When she signed up to Facebook.com she ticked a box agreeing to their terms.
I never signed up, so I couldn't care less, but aren't most people on Facebook talking about what they had for breakfast and how awesome stuff is? I'm not sure where the napalm girl photo fits in with that culture, except maybe "awesome war photography - thumbs up!!".
Facebook wants to be a place for news as well, with things like their "Instant Articles". It's a historically significant photo and was actually on the front page of the NY Times when it came out. So, would they censor a post from the NY Times if it had the same photo?
The US government makes us legally complicit in child pornography if we don't have automated processes to take this stuff down. People keep blaming corporations for censorship of porn-like (but not porn) content, song lyrics that get mistaken for terrorist threats, and overly zealous takedowns of anything that might infringe on IP. Do you think we want our users to get angry at us over this shit? Look at the US child porn laws, the numerous governments spying under the banner of the war on terror, and laws like the DMCA. Our hands are tied and you are blaming the wrong people.
I'm starting to get tired of making this point in some of the many discussions of this topic (most recently https://news.ycombinator.com/item?id=12457371), so the last time for a month, I swear:
You tell me when I can open a segregated lunch counter, or refuse to bake a gay wedding cake, and I'll consider your argument to have merit. Otherwise, it's only a matter of time until Facebook and company are brought to heel under that sort of "public accommodation" approach.
I still don't understand the gay wedding cake argument.
If I'm a baker, why can't I refuse to make certain cakes? I mean, I would never turn away a customer, but if the customer asked me to write obscene things on the cake, don't I have the right to refuse their request?
"I'd like to have a rainbow cake" "ok" "with little penises sticking out of it for my gay wedding" "no, sorry."
What's wrong with that? It's no different from refusing a straight man's request to have a cake with the outline of a nude woman on it.
Well, in that case I agree. You should not be able to refuse service based on the person, but you should be able to refuse service based on the request.
> Longstanding Colorado state law prohibits public accommodations, including businesses such as Masterpiece Cakeshop, from refusing service based on factors such as race, sex, marital status or sexual orientation. Mullins and Craig filed complaints with the Colorado Civil Rights Division (CCRD) contending that Masterpiece had violated this law. Earlier this year, the CCRD ruled that Phillips illegally discriminated against Mullins and Craig. Today’s decision from Judge Robert N. Spencer of the Colorado Office of Administrative Courts affirms that finding.
Federal law, however, prohibits discrimination against those seeking housing, employment, credit, or disaster relief.
To be clear, the case in question involved a baker turning away the customer purely because of who the customer was, nothing about the cake they requested.
This is obnoxious, but probably doesn't need to be illegal, so I suspect we agree.
The difference is whether you belong to a protected class of people as dictated by the government. Strangely, often the difference in those cases is also how you respond.
"I want a cake for my gay wedding." "Sorry, I'm too busy to fulfill your request." Likely nothing comes of it.
"I want a cake for my gay wedding." "No, I don't support your lifestyle and I'll pray for your soul." Lawsuit.
Granted, that's an extreme example of a response. The lesson is: just say no, but don't say why.