
This turns on what TikTok "knew":

"But by the time Nylah viewed these videos, TikTok knew that: 1) “the deadly Blackout Challenge was spreading through its app,” 2) “its algorithm was specifically feeding the Blackout Challenge to children,” and 3) several children had died while attempting the Blackout Challenge after viewing videos of the Challenge on their For You Pages. App. 31–32. Yet TikTok “took no and/or completely inadequate action to extinguish and prevent the spread of the Blackout Challenge and specifically to prevent the Blackout Challenge from being shown to children on their [For You Pages].” App. 32–33. Instead, TikTok continued to recommend these videos to children like Nylah."

We need to see the cited appendix, "App. 31–32", to know what TikTok "knew". Could someone find that, please? A PACER account may be required. Did they ignore an abuse report?

See also Gonzalez v. Google (2023), where a similar issue reached the U.S. Supreme Court.[1] That case was about whether recommending videos that encouraged viewers to support the Islamic State's jihad made Google liable for an Islamic State attack in which the plaintiffs' relative was killed. The Court rejected the terrorism claim and declined to address the Section 230 question.

[1] https://en.wikipedia.org/wiki/Gonzalez_v._Google_LLC




IIRC, TikTok has (had?) a relatively high-touch content moderation pipeline, where any video receiving more than a few thousand views is checked by a human reviewer.

Their review process was developed to hit the much more stringent speech standards of the Chinese market, but it opens them up to even more liability here.

I unfortunately can't find the source articles for this anymore; they're buried under "how to make your video go viral" flowcharts that elide the "when things get banned" decisions.


> Their review process was developed to hit the much more stringent speech standards of the Chinese market

TikTok isn't available in China. They have a separate app called Douyin.


They are saying that the reason TikTok also has high-touch moderation is that it grew out of Douyin.


Should be pages A31–32 of the appendix to the appellant's brief, I think:

"The TikTok Defendants Knew the Deadly Blackout Challenge Had Killed Multiple Children"

https://storage.courtlistener.com/recap/gov.uscourts.ca3.118...

Rest of the filings here:

https://www.courtlistener.com/docket/67500541/tawainna-ander...


I don't think any of that actually matters for the CDA immunity question, but it is definitely material to whether they are found liable, assuming the case can proceed at all.



