"But by the time Nylah viewed these
videos, TikTok knew that: 1) “the deadly Blackout Challenge
was spreading through its app,” 2) “its algorithm was
specifically feeding the Blackout Challenge to children,” and
3) several children had died while attempting the Blackout
Challenge after viewing videos of the Challenge on their For
You Pages. App. 31–32. Yet TikTok “took no and/or
completely inadequate action to extinguish and prevent the
spread of the Blackout Challenge and specifically to prevent
the Blackout Challenge from being shown to children on their
[For You Pages].” App. 32–33. Instead, TikTok continued to
recommend these videos to children like Nylah."
We need to see another document, "App. 31–32", to see what TikTok "knew". Could someone find that, please? A PACER account may be required. Did they ignore an abuse report?
See also Gonzalez v. Google (2023), where a similar issue reached the U.S. Supreme Court.[1] That case was about whether Google's recommendations of videos promoting the Islamic State made it liable for a terrorist attack in which the plaintiffs' relative was killed. The Court rejected the terrorism claim and declined to address the Section 230 claim.
IIRC, TikTok has (had?) a relatively high-touch content moderation pipeline, where any video receiving more than a few thousand views is checked by a human reviewer.
Their review process was developed to hit the much more stringent speech standards of the Chinese market, but it opens them up to even more liability here.
I unfortunately can't find the source articles for this any more, they're buried under "how to make your video go viral" flowcharts that elide the "when things get banned" decisions.
I don't think any of that actually matters for the CDA immunity question, but it is definitely material to whether they are found liable on the merits, assuming the claim can proceed at all.
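For what it's worth, the gating logic being described is trivial to implement; here's a minimal sketch, with the caveat that the threshold, names, and structure are all my guesses rather than anything from TikTok's actual system:

    # Hypothetical sketch only: names, the 5,000-view threshold, and the
    # structure are my assumptions, not TikTok's actual pipeline.
    from dataclasses import dataclass, field

    REVIEW_THRESHOLD = 5_000  # the "few thousand views" trigger, assumed

    @dataclass
    class Video:
        video_id: str
        views: int = 0
        flagged_for_review: bool = False

    @dataclass
    class ReviewQueue:
        pending: list[str] = field(default_factory=list)

        def record_view(self, video: Video) -> None:
            video.views += 1
            # Route the video to a human reviewer exactly once, the
            # first time it crosses the threshold.
            if video.views >= REVIEW_THRESHOLD and not video.flagged_for_review:
                video.flagged_for_review = True
                self.pending.append(video.video_id)

    queue = ReviewQueue()
    clip = Video("example-clip")
    for _ in range(REVIEW_THRESHOLD):
        queue.record_view(clip)
    assert queue.pending == ["example-clip"]

A real system would of course be distributed, with approximate view counting rather than an exact per-video counter, but the gate itself is simple.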
"But by the time Nylah viewed these videos, TikTok knew that: 1) “the deadly Blackout Challenge was spreading through its app,” 2) “its algorithm was specifically feeding the Blackout Challenge to children,” and 3) several children had died while attempting the Blackout Challenge after viewing videos of the Challenge on their For You Pages. App. 31–32. Yet TikTok “took no and/or completely inadequate action to extinguish and prevent the spread of the Blackout Challenge and specifically to prevent the Blackout Challenge from being shown to children on their [For You Pages].” App. 32–33. Instead, TikTok continued to recommend these videos to children like Nylah."
We need to see another document, "App 31-32", to see what TikTok "knew". Could someone find that, please? A Pacer account may be required. Did they ignore an abuse report?
See also Gonzales vs. Google (2023), where a similar issue reached the U.S. Supreme Court.[1] That was about whether recommending videos which encouraged the viewer to support the Islamic State's jihad led someone to go fight in it, where they were killed. The Court rejected the terrorism claim and declined to address the Section 230 claim.
[1] https://en.wikipedia.org/wiki/Gonzalez_v._Google_LLC