That's just branding. It's called Home on Facebook and Instagram, and it's exactly the same thing. It's a form of discovery tailored to the user, just like ordinary searches are (even on Google, Bing, etc.).
Indeed, regardless of the branding for the feature, the service is making a decision about what to show a given user based on what the service knows about them. That is not a search result with no terms; the user is the term.
Now for a follow-up question: how does any website surface any content when they're liable for that content?
When you can be held liable for surfacing the wrong (for some unclear definition of "wrong") content to the wrong person, even Google could be held liable. Imagine if this child had found a blackout video on the fifth page of her search results for "blackout". After all, YouTube hosted such videos as well.
TikTok is not being held liable for hosting and serving the content. They're being held liable for recommending the content to a user who provided no search context. In this case, they chose to surface this video because the visitor of the site was a young girl, and there was no other context. The girl did not search "blackout".
> because the visitor of the site was a young girl that they chose to surface this video
That's one hell of a specific accusation - that they looked at her age alone and determined solely based on that to show her that specific video?
First off, at 10, she should have had an age-gated account that shows curated content specifically for children. There's nothing to indicate that her parents set up such an account for her.
Also, it's well understood that TikTok takes a user's previously watched videos into account when recommending videos. It can infer traits about people based on that (and from personal experience, I can attest that it will lock down your account if it thinks you're a child), but it has no hard data on someone's age. Something about her video history triggered displaying this video (alongside thousands of other videos).
Finally, no, the girl did not do a search (that we're aware of). But would the judge's opinion have changed? I don't believe so, based on their logic. TikTok used an algorithm to recommend a video. TikTok uses that same algorithm, with a filter, to show search results.
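To illustrate what "the same algorithm with a filter" could mean, here is a minimal sketch in Python. Everything in it is a made-up assumption (titles, scores, the `recommend` function), not a claim about TikTok's actual system; it just shows how one personalized ranker can serve both a feed (no query) and search (query acts only as a pre-filter).

```python
from typing import Callable, Optional

def recommend(videos: list[dict],
              score: Callable[[dict], float],
              query: Optional[str] = None) -> list[dict]:
    """Hypothetical ranker: same scoring for feed and search."""
    candidates = videos
    if query is not None:
        # Search mode: restrict candidates to those matching the query first...
        candidates = [v for v in videos if query.lower() in v["title"].lower()]
    # ...then order them with the exact same personalized scoring function.
    return sorted(candidates, key=score, reverse=True)

# Invented example data and a stand-in for a per-user model output.
videos = [
    {"title": "Dance tutorial", "personal_score": 0.9},
    {"title": "Cooking hack",   "personal_score": 0.6},
    {"title": "Guitar lesson",  "personal_score": 0.4},
]
score = lambda v: v["personal_score"]

print([v["title"] for v in recommend(videos, score)])                   # feed: no query
print([v["title"] for v in recommend(videos, score, query="cooking")])  # search: filtered, same ranker
```

If something like this sketch were accurate, the only difference between the feed and a search result would be the presence of the filter step, which is why it's hard to see the judge's distinction holding up.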
In any case, a tragedy happened. But putting the blame on TikTok seems more like an attack on TikTok than an attempt to rein in the industry at large.
Plus, at some point, we have to ask the question: where were the parents in all of this?
«Had Nylah viewed a Blackout Challenge video through TikTok’s search function, rather than through her FYP, then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content.»
You can of course choose not to believe the judges when they say it matters to them, but then it becomes a very different discussion...
> That's one hell of a specific accusation - that they looked at her age alone and determined solely based on that to show her that specific video?
I suppose I did not phrase that very carefully. What I meant is that they chose to surface the video because a specific young girl visited the site -- one who had a specific history of watched videos.
> In any case, a tragedy happened. But putting the blame on TikTok seems more like an attack on TikTok and not an attempt to reign in the industry at large.
It's always going to start with one case. This could be protectionism, but it could just as well be the start of reining in the industry.
Is the set of such things offered not still editorial judgment?
(And as an addendum, even if you think the answer to that is no, do you trust a judge who can probably barely work an iPhone to come to the same conclusion, with your company in the crosshairs?)
I'd say no, because it averages over the entire group. If you ranked based on, say, most liked within your friends circle, or most liked by people with a high cosine similarity to your profile, then it starts to slide back into editorial judgment.
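To make that distinction concrete, here's a minimal Python sketch. All names, like counts, feature vectors, and the user profile are invented for illustration (this is not how any real service scores videos): ranking by global popularity produces the same list for every visitor, while ranking by cosine similarity to a per-user profile produces a different list depending on who is looking.

```python
import numpy as np

# Hypothetical catalog: each video has a global like count and a feature vector
# (e.g. topic weights). The user profile lives in the same space and would be
# built from watch history. Every number here is made up.
videos = {
    "dance_clip":  {"likes": 90_000, "features": np.array([0.9, 0.1, 0.0])},
    "cooking_tip": {"likes": 50_000, "features": np.array([0.1, 0.8, 0.1])},
    "stunt_video": {"likes": 70_000, "features": np.array([0.0, 0.1, 0.9])},
}
user_profile = np.array([0.2, 0.1, 0.7])  # assumed to come from watch history

def rank_by_popularity(videos):
    """'Averages over the entire group': identical ordering for every user."""
    return sorted(videos, key=lambda v: videos[v]["likes"], reverse=True)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_by_similarity(videos, profile):
    """Personalized: the ordering depends on the specific user's profile."""
    return sorted(videos, key=lambda v: cosine(videos[v]["features"], profile), reverse=True)

print(rank_by_popularity(videos))                # same for everyone
print(rank_by_similarity(videos, user_profile))  # changes with the profile
```

The first ordering treats every visitor identically; the second is where, as argued above, it starts to slide back into editorial judgment, because the service is choosing what to show a particular person.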