Hacker News

Please tell me why a private company should be forced to host content that they fundamentally disagree with.

Why should Google be forced, legally, to carry Chinese state propaganda, for example? Why should Google be forced, legally, to carry ISIS propaganda?

There are alternatives to Google; they should host that content there.




Why do we not allow telephone companies to censor our conversations? Why can't the power company choose not to provide power to the offices of political campaigns they disagree with?

We afford monopolies certain privileges, but we also require monopolies to have certain restraints.

Many of these large tech companies have become natural monopolies. I think it's reasonable to expect similar restraints.


I agree with you on restraints on natural monopolies. That should come in the form of limiting their anti-competitive behavior, and probably also their harvesting of user data without consent for profit.

But regarding limiting these companies from being able to decide which speech they don't want to host: I don't think you've thought this out fully.

What speech should they be forced to host? Art? Disinformation? Propaganda? Porn? Spam? Terrorism? For illegal speech: Whose laws should they be forced to obey?


With respect, no speech should be illegal, regardless of how absurd, abhorrent, or inaccurate the speech is to anyone. Free speech is fundamental to a truly free society, full stop.


You disagree with your own statement and you don't even know it!

"No speech should be illegal" - should I be able to threaten people then? That's speech. Should I be able to detail my plans for how I'm going to commit a crime? That's speech.

Should I be able to scream at the top of my lungs in a public space? Should I be able to use a loudspeaker to broadcast my voice (or an advertisement) to drown out all other sound in a public space?

It sounds nice what you're saying, but it's not what you actually believe, so I kindly ask you to argue less in bad faith.


You're right. We should make exceptions to free speech only for recklessly, knowingly, or intentionally:

- making unreasonable noise and continuing to do so after being asked to stop
- disrupting a lawful assembly of persons
- speech made for the principal purpose of creating panic
- inciting a violation of the law that is both imminent and likely

My solution to the original problem would be to make it technologically unfeasible to solve. The architecture of the communication platform should be constrained in such a way that operators & owners are unable to make choices about which speech to include. This is because nobody is equipped to solve this problem, and nobody should be forced to do anything to try and solve it.
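One ingredient of such an operator-proof architecture (a sketch of a general technique, not this commenter's concrete proposal) is content addressing, as used by systems like IPFS: each item's identifier is a hash of its bytes, so a host can refuse to serve an ID but cannot silently alter or substitute the content without clients detecting it. A minimal illustration:

```python
import hashlib

def content_id(data: bytes) -> str:
    # The identifier is derived from the content itself,
    # so the same bytes always map to the same ID.
    return hashlib.sha256(data).hexdigest()

def fetch_and_verify(store: dict, cid: str) -> bytes:
    # A client re-hashes whatever the host returns; a host that
    # swapped or edited the content is caught immediately.
    data = store[cid]
    if content_id(data) != cid:
        raise ValueError("host returned tampered content")
    return data

post = b"an unpopular opinion"
store = {content_id(post): post}
assert fetch_and_verify(store, content_id(post)) == post
```

Note that content addressing alone doesn't stop a host from refusing to serve an ID at all; making removal infeasible additionally requires replication across independent operators.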


Disinformation and propaganda are just specific types of protected free speech. I disagree with socialism, but communist propaganda is protected speech that should be allowed. Disinformation can be dangerous too, I understand that. But I'm not willing to let Google or my government decide what counts as disinformation.

As for everything else: you should clearly follow the laws of the country you are doing business in.


> Please tell me why a private company should be forced to host content that they fundamentally disagree with.

Prior to the civil rights act, companies didn’t serve certain people based on their race because they disagreed with that race.

It’s not that Google should be forced to carry stuff, they should be forced to not discriminate because they don’t like it. ISIS propaganda is illegal and should be taken down for that reason.


> It’s not that Google should be forced to carry stuff, they should be forced to not discriminate because they don’t like it. ISIS propaganda is illegal and should be taken down for that reason.

Bright-line rules like that worry me. That is, a lot of stuff is subjective -- the line between "legal" and "illegal" isn't as clear or immutable as one might naively guess. So if something more binary -- like host-or-remove -- is tied to such a fuzzy, dynamic determinant, it'd seem to give rise to all sorts of problems.

For example, say we forced big companies to host all legal content, but remove all illegal content, and then we want to know if something controversial is legal (e.g., taxes on Bitcoin back when it was newer). Then someone could post two images: one telling people to pay taxes on Bitcoin, and another telling people to not pay taxes on Bitcoin. Then the hosting-company would have to remove exactly one of those. By contrast, a hosting-company could normally just remove stuff they're unsure about because they're not required to host legal content, sparing them the burden of having to properly determine the legality of everything.

Basically, the problem is that we'd be stripping hosting-companies of their freedom to operate in safe-waters, forcing them into murky areas and then opening them up to punishment whenever they fail to correctly navigate those murky waters.


I don’t think it’s perfect, I think it’s just better than the current system.

I trust society’s laws for legal/illegal more than Google’s arbitrary decisions of info/misinfo.


If hosting-companies become responsible for determining what's legal/illegal, then they'll have reasonable cause to become an authority on the topic.

It'd probably make them more influential and powerful rather than less, because their judgements would carry the implication of legal-determination, and in popular perception, be law.

They'd essentially be elevated to the status of being lower-courts.


I wasn’t thinking they would be responsible for determining legal or illegal. That would be determined by courts and the legal system. For example, libel/slander would require a judgement, not the provider saying they think it’s libel/slander.


It sounds like you're proposing that they have to remove illegal-content, but host legal-content, right?

For example: say someone's advertising a new drug of questionable legality (say, Δ-8-THC). Presumably a hosting-company, unsure of legality, would just take that down for violating their policies -- without necessarily asserting that it's illegal.

But if you're proposing that they can't do that, then presumably they'd be forced into making a clear determination on its legality (as they must host it if legal, and must remove it if not). Right?

---

Actually, just to mix crypto in:

Say someone posts a time-locked encrypted-file (a file encrypted such that it'll open after a few hours/days/weeks/months/whatever), and there's reasonable suspicion that it may contain illegal content -- but a hosting-company isn't sure yet, because it was just posted and no one's completed unlocking it yet. Should they be forced to host it?

Now say that an entire community springs up around this: many of the files end up being perfectly legal, while others turn out to be very illegal. How should a hosting-company react?
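For concreteness, the "time-locked file" in this scenario is usually built on a time-lock puzzle in the style of Rivest, Shamir, and Wagner: the decryption key is blinded by a value that takes t sequential modular squarings to recover, so even the host cannot peek inside early without doing the work. A toy sketch (tiny Mersenne-prime parameters for illustration only; real puzzles use a large RSA modulus and an enormous t):

```python
# Toy time-lock puzzle (illustration only, not production crypto).
# The creator knows p and q, so they can compute a^(2^t) mod n
# cheaply via Euler's theorem; everyone else must perform t
# squarings that are inherently sequential.

p, q = 8191, 131071               # secret primes (tiny for the demo)
n = p * q
phi = (p - 1) * (q - 1)
t = 10_000                        # number of sequential squarings
a = 2                             # base of the puzzle (coprime to n)

# Fast path (creator only): reduce the exponent 2^t mod phi(n).
key_fast = pow(a, pow(2, t, phi), n)

# Slow path (everyone else, including the host): t squarings.
key_slow = a
for _ in range(t):
    key_slow = (key_slow * key_slow) % n

assert key_fast == key_slow
# This value would mask the file's symmetric key; the host can't
# learn the contents early except by grinding through the squarings.
```

The relevance to the moderation question: for such a file, the host genuinely cannot determine legality at posting time, no matter what rule we impose on them.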


> ISIS propaganda is illegal and should be taken down for that reason.

What specifically is illegal about pro-ISIS speech?


Calls to violence, images of chopped off heads, etc.

If it’s literally just assholes saying “ISIS is great” then that shouldn’t be taken down. Just like if two ISIS-lovers are IMing each other messages about how much they love ISIS and nothing else illegal it should be allowed. I think.


In the US, images of chopped-off heads aren't illegal. Calls to violence are illegal, but only in very particular circumstances that ISIS propaganda videos are unlikely to meet.


Exactly this. A lot of the conservative angst about having their speech moderated on private platforms would vanish if they realized the baggage that came along with what they were asking for. The first amendment is extremely permissive, only very narrow limits are allowed and 99% of pro-ISIS speech is perfectly legal.


Are you arguing that we should pass legislation that forces all US companies to host all legal content?


Not at all. But I would like to see legislation (or some strong rule) that forces huge corporations or companies over a certain market share to host all legal content.

Similar to how television broadcasters have regulations that force them to provide equal time to all major candidates.

I think there’s some reasonable threshold that doesn’t require small providers to host everything.


Equal time for TV hasn’t been a thing for decades. And it was justified because TV broadcasts used the public airwaves. I don’t know how you could write a law that would pass constitutional muster, and it seems very unlikely that you could get a constitutional amendment through on this topic.


Yes. Any company with a user base or influence above a certain threshold should not get to make moderation decisions unilaterally, without the input of society at large. That input is called "the law".


The argument would be that the magnitude of their impact on how a member of society can search for, view, or transmit information is too large for Google to be deemed a private company.

If a private company impacts a nation-state's democracy to such an extent that it rivals it in power, it ought to be classified as something else.


So, in your opinion, what should a government force the company to do when they are classified as what you describe? Force them to host all speech? Does that include art? Disinformation? Propaganda? Porn? Spam? Terrorism?

I don't think you've thought out the consequences of what you're advocating for.

If you have thought it out, please explain exactly what speech they should be forced to host and what speech they shouldn't be.


Right. I personally never thought it was that complicated.

Once a company is classified as a public utility (I believe Google is) it should be forced to host all legal content. You tell me what's illegal and I can safely tell you it can't be hosted by Google.


By whose laws? Even within the US, there are plenty of different laws. Should it be an intersection of all laws, everywhere? Only content which is lawful around the world? Or regionally? Should people outside those regions be segmented off from content that isn't in their region?

Moreover, you're saying that spam should be forced to be hosted by these companies, just like our snail-mail system does -- even if it takes up exabytes of storage.

Should people who have their content removed be able to sue these companies for removing it?


They are American companies; they therefore fall under American law.

Just look at Twitter: we already know content is regulated by region. Setting your location to Germany will prevent you from seeing certain content from the U.S. There are many more such regional cases.

The spam example would be a problem, but it's more of an annoyance to solve than a basic human-rights case. You simply cannot have a democracy where segments of the population are barred from interacting with public officials online -- especially when public business (advertising, fundraising, making political arguments) is now a core part of online communication.


If you're saying US companies (that meet whatever classification you're defining) should be forced to carry all legal speech, no matter how terrible, cruel, or provocative it is, I'd be okay with this -- but that means literally all spam, and that admins would not be able to moderate any legal speech. If it's anything less than this, I'm not okay with what you're advocating for.

And effectively this would turn these sites into platforms so filled with trash that they would be unusable. The chaotic part of me would love to see that happen. But it basically means the end of these companies' ability to function.

Realistically, I think we should keep to the standard we've had in the past: we can't compel companies to host speech they disagree with, and we should take strong measures to limit their anti-competitive behavior and break them up into competing companies if necessary (like we did with telecom).


I don't want to keep arguing; this has been a mostly informative exchange.

I would say, though, that Twitter's model from around 2012 was extremely open compared with today's (remember the Arab Spring?), and it was in no way an unusable, trash/spam-laden platform.


Don't be disingenuous. The problem is viewpoint discrimination. Spam isn't a viewpoint. Porn isn't a viewpoint. Libel isn't a viewpoint. We can limit the ability of tech companies to arbitrarily censor points of view while still keeping the platform free of spam.

How? Create a cause of action whereby if a tech company removes someone's content, that person can go to court and ask that a judge determine whether that content removal is some kind of anti-spam operation or viewpoint censorship. You don't let the company have the final say.


The first amendment is going to be a problem. According to the Supreme Court, those companies have the same first amendment rights that you do. Compelled speech is frowned upon.


> According to the Supreme Court, those companies have the same first amendment rights that you do.

No, those companies don't have first amendment rights. Commercial speech has always been more limited than personal speech. If the law worked like you claim, common carrier laws for railroads would be unconstitutional because they'd violate railroad company freedom of association. These laws are, in fact, constitutional, and so will be the laws that stop big tech censorship.

Besides: corporations? Rights? Total bullshit. This country ought to be run for the benefit of its citizens, not abstract entities like big tech companies. A corporation is an artificial construct that can exist only because society --- made up of humans --- determines that the corporation is in society's best interest. When a corporation is no longer in the best interest of the humans that make up the world, screw the corporation.


According to Citizens United v. FEC, they absolutely DO have first amendment rights.


I'm sorry, but no one from a reasonable standpoint is going to look at Google not hosting "misleading content" and then think democracy is threatened.


I do. A candidate for political office being barred from posting campaign clips to YouTube is a threat to our democracy.

Most average people view YouTube as the de facto video portal of the entire internet.


Once again, a candidate not being able to post to Youtube is not a threat to democracy. Nothing is stopping this candidate from posting this on their campaign site, or an RNC affiliated site or even Facebook where most of their supporters are likely to be.


> Once again, a candidate not being able to post to Youtube is not a threat to democracy.

Yes it is. Youtube is a huge conduit for communication.

Imagine if ABC banned a candidate. The argument that CBS and NBC are still available is not relevant, as a major media outlet is favoring a candidate by blocking their opponents.

For small outlets it's not an issue. But YouTube is the biggest video provider on the planet; not allowing a political candidate on it would be detrimental to democracy.

Even if that candidate says stupid stuff like "the world is flat." People have to make their own decision, and as long as we're a democracy, that choice should be individual.


Don't NBC and ABC and CBS ban lots of candidates?

If you don't have a big enough chance at winning, you don't go to debates

Mind you, news networks recently had a very strong preference for the incumbent president, aligning their news to match the president's talking points and having nightly calls to align their messages.


They don’t ban candidates from running ads.

They only air debates from candidates that meet some shared threshold.

Once the candidates are locked in, they can’t give preferential air time to one over the other.


Of course they can. Where did you get this idea? Do you think Fox News doesn’t give preferential treatment to certain candidates?


YouTube would literally be interfering with a democratic election. If "threat" is too strong a word, fine. But you can't deny that they are actively participating in public elections. Do we want that? Do we want to privatize democracy? I don't.


Yeah, Google de-indexing all pages criticizing the Democrats and the company is also not a threat to democracy. They must provide us with a curated set of sound information vetted by the politicians, FAANG, and the State Department.


No one is curating anything. If you want your right-wing search engine, then create it. Google is under no obligation to give you top page rank.


Can we at least agree on some basic facts? Google does in fact curate search. Where a page is listed does not depend on how many times the link was clicked. Yes?


>If you want your right-wing search engine, then create it.

And your true intentions come out.

If Google delists AOC from all their properties in the next election and replaces the results with those from her opposition, I'm sure you'll be the first to say it's totally fine and she should just host her own search site if she doesn't like it.


> Please tell me why a private company should be forced to host content that they fundamentally disagree with. Why should Google be forced, legally, to carry Chinese state propaganda, for example? Why should Google be forced, legally, to carry ISIS propaganda?

They already host such content. Cloudflare, the biggest CDN, despite its words and claims, routinely censors sites while defending hosting terrorist sites as free speech.

https://www.fastcompany.com/90312063/how-cloudflare-straddle...

> the company serves at least seven groups on the U.S. State Department’s list of foreign terrorist organizations, including al-Shabab, the Popular Front for the Liberation of Palestine (PFLP), al-Quds Brigades, the Kurdistan Workers’ Party, al-Aqsa Martyrs Brigade, and Hamas.

> CEP has sent letters to Cloudflare since February 13, 2017, warning about clients on the service, including Hamas, the Taliban, the PFLP, and the Nordic Resistance Movement. The latest letter, from February 15, 2019, warns of what CEP identified as three pro-ISIS propaganda websites.

CF claims terrorist organizations' websites are free speech:

https://blog.cloudflare.com/cloudflare-and-free-speech/

As for the whole "private company" argument: so Rosa Parks should have just started her own bus company too? Discriminating against people based on race was legal, after all. And if race is a "different" topic, then how about religion? Religion is based on ideas. These tech companies claim to censor religion-based offensive content too. But almost all LGBT content is against all religions. Isn't that offensive too, and shouldn't it be censored too? And just like religion is based on ideas, political opinions are ideas too.

Should railroads, telecom, electricity, and water companies be able to refuse service too?

Are you against the FDA, EPA, FCC, FEC, COPPA regulations, regulations of fire insurance rates etc?

How about Net Neutrality? Private businesses should be able to charge whatever they want and for whatever content they want right?

How about the government-forced lockdowns that forced private businesses to shut down and go bankrupt?

And how about the baker who refused to bake cake for the gay couple for religious reasons?

How about the current administration banning menthol cigarettes and flavored cigars?

How about the government banning incandescent light bulbs?

How about Fauci's emails with Zuckerberg (some of which were redacted)? Fauci is part of the government, and he worked with Facebook on building their "COVID dashboard," which censored many people, especially those discussing the lab leak theory and Ivermectin. Is that not government-enforced censorship?

There's a whole community on TikTok (you can find them here on Twitter, too) that scorns people as "fatphobic" for encouraging fitness and weight loss. They falsely say obesity isn't unhealthy. Should that be allowed, considering it's also "misleading/misinformation"? Are the social media companies that allow it "killing people" (Biden's words from today)? Are social media companies also guilty of "killing people" if they allow content encouraging people to be obese, to consume fatty junk food, or content which glorifies cigarette smoking, heavy alcohol consumption, and a sedentary lifestyle?

Seems like the "it's a private business" crowd is totally okay with government-enforced regulations and lockdowns when it's to their political benefit, but when it comes to the political speech of their opposition, they suddenly discover the "private" business.

SCOTUS Justice Clarence Thomas opined on big tech censorship quite extensively a couple of months ago. The case concerned whether President Trump was allowed to block people on Twitter, and whether doing so was a 1st amendment violation. While the case was declared moot because President Trump had left office, Justice Thomas took the opportunity to discuss censorship: how politicians like President Trump aren't allowed to block users on big tech platforms, yet big tech is able to block and ban government employees, and how this creates a weird power dynamic. Here are a few excerpts:

> "But whatever may be said of other industries, there is clear historical precedent for regulating transportation and communications networks in a similar manner as traditional common carriers. Candeub 398–405. Telegraphs, for example, because they “resemble[d] railroad companies and other common carriers,” were “bound to serve all customers alike, without discrimination." ... "Internet platforms of course have their own First Amendment interests, but regulations that might affect speech are valid if they would have been permissible at the time of the founding. See United States v. Stevens, 559 U. S. 460, 468 (2010). The long history in this country and in England of restricting the exclusion right of common carriers and places of public accommodation may save similar regulations today from triggering heightened scrutiny—especially where a restriction would not prohibit the company from speaking or force the company to endorse the speech." ... "The similarities between some digital platforms and common carriers or places of public accommodation may give legislators strong arguments for similarly regulating digital platforms. [I]t stands to reason that if Congress may demand that telephone companies operate as common carriers, it can ask the same of ”digital platforms." ... "For example, although a “private entity is not ordinarily constrained by the First Amendment,” Halleck, 587 U. S., at ___, ___ (slip op., at 6, 9), it is if the government coerces or induces it to take action the government itself would not be permitted to do, such as censor expression of a lawful viewpoint. Ibid. Consider government threats. “People do not lightly disregard public officers’ thinly veiled threats to institute criminal proceedings against them if they do not come around.” Bantam Books, Inc. v. Sullivan, 372 U. S. 58, 68 (1963). The government cannot accomplish through threats of adverse government action what the Constitution prohibits it from doing directly. See ibid.; Blum v. Yaretsky, 457 U. S. 991, 1004–1005 (1982). Under this doctrine, plaintiffs might have colorable claims against a digital platform if it took adverse action against them in response to government threats. The Second Circuit feared that then-President Trump cut off speech by using the features that Twitter made available to him. But if the aim is to ensure that speech is not smothered, then the more glaring concern must perforce be the dominant digital platforms themselves."

> "As Twitter made clear, the right to cut off speech lies most powerfully in the hands of private digital platforms. The extent to which that power matters for purposes of the First Amendment and the extent to which that power could lawfully be modified raise interesting and important questions. This petition, unfortunately, affords us no opportunity to confront them."

The last two points are important, as Justice Thomas is basically saying, "give us a case that raises these two questions and then we will have a deep look."

I would highly recommend reading his opinion:

https://www.supremecourt.gov/opinions/20pdf/20-197_5ie6.pdf

One can even cite Amazon's recent censorship of Justice Clarence Thomas's own documentary, as well as Eli Steele's documentary, as examples:

https://archive.is/aNv3B

Also, based on recent revelations -- Press Secretary Psaki admitting that the administration is flagging content *FOR* Facebook and advocating for banning a person from all social media if they are banned on just one, as well as Fauci's emails showing him actively corresponding with Zuckerberg on COVID news, which led to the censoring of anyone who brought up the lab leak theory -- these companies are arguably state actors, not private companies.


> routinely censors

Do they routinely censor? As far as I can tell they do not 'routinely' do this. As a matter of course, they routinely do the opposite. There are only two examples I can think of where they have censored.

https://arstechnica.com/tech-policy/2017/12/cloudflares-ceo-...


They also took down the image CDN for TheDonald earlier last year.


By her own standards, Psaki should be banned for misinformation. She said the Hunter Biden story was Russian disinformation: https://twitter.com/jrpsaki/status/1318382779659411458?lang=...

It was later found to be true: https://www.wsj.com/articles/the-hunter-biden-laptop-is-real...



