Hacker News

> Please tell me why a private company should be forced to host content that they fundamentally disagree with.

Prior to the Civil Rights Act, companies refused to serve certain people based on their race, because they "fundamentally disagreed" with that race.

It’s not that Google should be forced to carry stuff; they should be forced not to discriminate against content just because they don’t like it. ISIS propaganda is illegal and should be taken down for that reason.




> It’s not that Google should be forced to carry stuff, they should be forced to not discriminate because they don’t like it. ISIS propaganda is illegal and should be taken down for that reason.

Bright-line rules like that worry me. That is, a lot of this stuff is subjective -- the line between "legal" and "illegal" isn't as clear or immutable as one might naively guess. So if something binary -- like host-or-remove -- is tied to such a fuzzy, shifting determinant, it'd seem to give rise to all sorts of problems.

For example, say we forced big companies to host all legal content, but remove all illegal content, and then we want to know if something controversial is legal (e.g., taxes on Bitcoin back when it was newer). Then someone could post two images: one telling people to pay taxes on Bitcoin, and another telling people to not pay taxes on Bitcoin. Then the hosting-company would have to remove exactly one of those. By contrast, a hosting-company could normally just remove stuff they're unsure about because they're not required to host legal content, sparing them the burden of having to properly determine the legality of everything.

Basically, the problem is that we'd be stripping hosting-companies of their freedom to stay in safe waters, forcing them into murky ones and then opening them up to punishment whenever they fail to navigate those murky waters correctly.


I don’t think it’s perfect, I think it’s just better than the current system.

I trust society’s laws for legal/illegal more than Google’s arbitrary decisions of info/misinfo.


If hosting-companies become responsible for determining what's legal/illegal, then they'll have reasonable cause to become an authority on the topic.

It'd probably make them more influential and powerful rather than less, because their judgements would carry the implication of legal-determination, and in popular perception, be law.

They'd essentially be elevated to the status of being lower-courts.


I wasn’t thinking they would be responsible for determining legal or illegal. That would be determined by courts and the legal system. For example, libel/slander would require a judgement, not the provider saying they think it’s libel/slander.


It sounds like you're proposing that they have to remove illegal-content, but host legal-content, right?

For example: say someone's advertising a new drug of questionable legality (say, Δ-8-THC). Presumably a hosting-company, unsure of legality, would just take that down for violating their policies -- without necessarily asserting that it's illegal.

But if you're proposing that they can't do that, then presumably they'd be forced into making a clear determination on its legality (as they must host it if legal, and must remove it if not). Right?

---

Actually, just to mix crypto in:

Say someone posts a time-locked encrypted-file (a file encrypted such that it'll open after a few hours/days/weeks/months/whatever), and there's reasonable suspicion that it may contain illegal content -- but a hosting-company isn't sure yet, because it was just posted and no one's completed unlocking it yet. Should they be forced to host it?

Now say that an entire community springs up around this: many of the files end up being perfectly legal, while others turn out to be very illegal. How should a hosting-company react?
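To make the scenario concrete: one minimal, hypothetical way to build such a time-locked file (a sketch, not any particular poster's scheme) is a sequential-hashing puzzle. Deriving the decryption key requires a long chain of hash operations that can't be parallelized, so anyone -- including the host -- who wants to inspect the contents must spend the wall-clock time first. All names and parameters below are illustrative, and the XOR "cipher" is a toy stand-in for a real AEAD scheme:

```python
import hashlib

def slow_derive_key(seed: bytes, iterations: int) -> bytes:
    # Sequential hashing: each step depends on the previous one, so the
    # work can't be parallelized -- unlocking takes roughly
    # iterations * (time per hash) regardless of hardware budget.
    digest = seed
    for _ in range(iterations):
        digest = hashlib.sha256(digest).digest()
    return digest

def xor_stream(data: bytes, key: bytes) -> bytes:
    # Toy stream cipher: XOR with a SHA-256-derived keystream.
    # (Illustrative only -- a real design would use an AEAD cipher.)
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(x ^ y for x, y in zip(data, stream))

# The poster publishes seed + iteration count + ciphertext; anyone can
# recover the plaintext, but only after doing the sequential work.
seed = b"public-puzzle-seed"
iterations = 200_000  # tune this to set the unlock delay
key = slow_derive_key(seed, iterations)
ciphertext = xor_stream(b"contents unknown until unlocked", key)

# A host who wants to inspect the file must redo the same work:
recovered = xor_stream(ciphertext, slow_derive_key(seed, iterations))
```

The point of the hypothetical: until someone finishes the sequential work, the host literally cannot know whether the plaintext is legal or illegal, so a host-all-legal/remove-all-illegal rule gives them no safe action at posting time.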


> ISIS propaganda is illegal and should be taken down for that reason.

What specifically is illegal about pro-ISIS speech?


Calls to violence, images of chopped off heads, etc.

If it’s literally just assholes saying “ISIS is great” then that shouldn’t be taken down. Just like if two ISIS-lovers are IMing each other messages about how much they love ISIS and nothing else illegal it should be allowed. I think.


In the US, images of chopped-off heads aren't illegal. Calls to violence are illegal, but only in very particular circumstances that it's unlikely ISIS propaganda videos would meet.


Exactly this. A lot of the conservative angst about having their speech moderated on private platforms would vanish if they realized the baggage that comes along with what they're asking for. The First Amendment is extremely permissive: only very narrow limits are allowed, and 99% of pro-ISIS speech is perfectly legal.


Are you arguing that we should pass legislation that forces all US companies to host all legal content?


Not at all. But I would like to see legislation (or some strong rule) that forces huge corporations or companies over a certain market share to host all legal content.

Similar to how television broadcasters have regulations that force them to provide equal time to all major candidates.

I think there’s some reasonable threshold that doesn’t require small providers to host everything.


The fairness doctrine hasn’t been a thing for decades, and the equal-time rule was only justified because broadcasters used the public airwaves. I don’t know how you could write a comparable law for the internet that would pass constitutional muster. And it seems very unlikely that you could get a constitutional amendment through on this topic.


Yes. Any company with a user base or influence above a certain threshold shouldn't get to make moderation decisions unilaterally, without the input of society at large. That input is called "the law".



