Personally, I wouldn't want search engines censoring results for things explicitly searched for, but I'd still expect social media platforms to be responsible for harmful content they push onto users who never asked for it in the first place. Push vs. pull is an important distinction that should be considered.
Did some hands come out of the screen, pull out a rope, and choke someone? Platforms shouldn’t be held responsible when one out of a million users wins a Darwin award.
I think it's a very different conversation when you're talking about social media sites pushing content they know is harmful onto people who they know are literal children.
Does TikTok have to preempt this category of videos in the future, or simply respond promptly when notified that such a video has been posted to their system?