> There is no way to run a targeted ad social media company with 40% margins if you have to make sure children aren’t harmed by your product.
More specifically, Section 230 cares less about whether your product harms someone than about the content you host and whether you are acting as a publisher (liable for that content) or a platform (not liable for it). The quote supposes what would happen if Section 230 were overturned. But in fact, there is a way companies would protect themselves: simply don't moderate content at all. Then you act purely as a platform and never have to worry about being treated as a publisher. Of course, this would turn the whole internet into 4chan, which nobody wants. IMO, that's one of the main reasons Section 230 remains in place.
Also want to note the inverse solution companies could take: be overly draconian in moderating content and take down anything that could come back on them negatively. In that case the role of publisher is assumed, so moderation has to be robust enough to cover the company's ass.