That's the difference between this case and a monolithic electronic bulletin board like HN. HN follows an old-school BB model, very close to the ones that existed when Section 230 was written.
Winding up in the same place as the defendant would require making a unique, dynamic, individualized BB for each user tailored to them based on pervasive online surveillance and the platform's own editorial "secret sauce."
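To make the contrast concrete, here's a minimal Python sketch (all names and scoring functions are hypothetical, not any platform's real code) of the difference between one shared, editorially ranked front page and a feed whose ranking depends on per-user tracking data:

```python
def shared_front_page(posts, score):
    # One editorial product: every visitor sees the same ranked list.
    return sorted(posts, key=score, reverse=True)

def personalized_feed(posts, user, affinity):
    # The ranking itself depends on who is asking: per-user signals
    # (demographics, watch history, tracked interactions) drive the score.
    return sorted(posts, key=lambda p: affinity(user, p), reverse=True)

posts = [{"id": 1, "points": 60, "topic": "python"},
         {"id": 2, "points": 90, "topic": "cats"}]
alice = {"watched_topics": {"python"}}

# Shared view ranks by raw points; the tailored view reorders the same
# posts because Alice's surveillance profile boosts topics she watches.
print(shared_front_page(posts, score=lambda p: p["points"]))
print(personalized_feed(posts, alice,
                        affinity=lambda u, p: p["points"] * (2 if p["topic"] in u["watched_topics"] else 1)))
```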
The HN team explicitly and manually manages the front page of HN, so I think it's unarguable that they would be held liable under this ruling if the front page contained links to articles that caused harm. They manually promote certain posts they find particularly good, even ones that didn't get a lot of votes, so this is even more direct than what TikTok did in this case.
It is absolutely still arguable in court, since this ruling interpreted the Supreme Court's decision as pertaining to “a tailored compilation of videos for a user’s FYP based on a variety of factors, including the user’s age and other demographics, online interactions, and other metadata.”
In other words, the Supreme Court decision mentions editorial decisions, but no court case has yet established whether that covers editorial decisions in the HN-front-page sense (mods make some choices, but nothing is personalized). Common sense may say that mods making decisions are editorial decisions, but it's a gray area until a court case makes it clear. Precedent is the most important thing when interpreting law, and the only precedent we have is that this applies to personalized feeds.
> Because TikTok’s “algorithm curates and recommends a tailored compilation of videos for a user’s FYP based on a variety of factors, including the user’s age and other demographics, online interactions, and other metadata,” it becomes TikTok’s own speech.
HN is _not_ a monolithic bulletin board -- the messages on a BBS were never (AFAIK) sorted by 'popularity' and users didn't generally have the power to demote or flag posts.
Although HN's algorithm depends (mostly) on user input for how it presents the posts, it still favours some over others and still runs afoul here. You would need a literal 'most recent' chronological view and HN doesn't have that for comments. It probably should anyway!
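For illustration, here is a small Python sketch contrasting the two views. The ranking formula is the commonly cited public approximation of HN's front-page score, not an official spec:

```python
import time

def hn_rank(points, age_hours, gravity=1.8):
    # Commonly cited approximation of HN's front-page score:
    # votes push a story up, age steadily sinks it. Not personalized,
    # but it still favours some posts over others.
    return (points - 1) / ((age_hours + 2) ** gravity)

def newest_first(items):
    # A literal chronological view: no weighting at all.
    return sorted(items, key=lambda it: it["posted_at"], reverse=True)

now = time.time()
items = [
    {"id": 1, "points": 150, "posted_at": now - 6 * 3600},  # older, popular
    {"id": 2, "points": 5,   "posted_at": now - 600},       # new, few votes
]
by_rank = sorted(items,
                 key=lambda it: hn_rank(it["points"], (now - it["posted_at"]) / 3600),
                 reverse=True)
print([it["id"] for it in by_rank])              # [1, 2]: popularity-weighted
print([it["id"] for it in newest_first(items)])  # [2, 1]: plain chronological
```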
@dang We need the option to view comments chronologically, please
Writing @dang is a no-op. He'll respond if he sees the mention, but there's no alert sent to him. Email hn@ycombinator.com if you want to get his attention.
That said, the feature you requested is already implemented but you have to know it is there. Dang mentioned it in a recent comment that I bookmarked: https://news.ycombinator.com/item?id=41230703
To see comments on this story sorted newest-first, change the link to
> HN is _not_ a monolithic bulletin board -- the messages on a BBS were never (AFAIK) sorted by 'popularity' and users didn't generally have the power to demote or flag posts.
I don't think that kind of feature was all that unknown. Per Wikipedia, the CDA passed in 1996 and Slashdot was created in 1997, and I doubt the latter's moderation/voting system was that unique.
Key words are "editorial" and "secret sauce". Platforms should not be liable for dangerous content which slips through the cracks, but certainly should be when their user-personalized algorithms mess up. Can't have your cake and eat it too.
Dangerous content slipping through the cracks and the algorithm messing up are the same thing. There is no way for content to "slip through the cracks" other than via the algorithm.
You can view the content via direct links or search; recommendation algorithms aren't the only way to reach it.
If child porn gets shared via direct links, that is bad even if almost nobody sees it, but it is much, much worse if you start recommending it to people as well.
Everything is related. Search results are usually generated based on recommendations, and direct links usually influence recommendations, or include recommendations as related content.
It's rarely if ever going to be the case that there is some distinct unit of code called "the algorithm" that can be separated and considered legally distinct from the rest of the codebase.
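A hedged Python sketch of that entanglement (everything here is hypothetical, not any platform's real code): engagement signals produced by recommendations end up inside the search ranker itself:

```python
def search(query, corpus, engagement):
    # Even a "plain" search ranker blends text relevance with engagement
    # signals, and those signals are themselves shaped by what was
    # recommended. The pieces don't separate cleanly.
    def relevance(doc):
        return sum(doc["text"].lower().count(w) for w in query.lower().split())
    def score(doc):
        return relevance(doc) * (1 + engagement.get(doc["id"], 0.0))
    return sorted((d for d in corpus if relevance(d) > 0), key=score, reverse=True)

corpus = [{"id": "a", "text": "cute cat video"},
          {"id": "b", "text": "cat care tips"}]
engagement = {"a": 3.0}  # clicks earned largely via past recommendations...
print(search("cat tips", corpus, engagement))  # ...now reorder search results too
```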