
There are far too many videos for manual review, and the relatively rare false positive (one high-profile case every few months versus how many videos uploaded every day?) is an acceptable cost for the gain in efficiency. But there should certainly be a better manual review process for when the AI messes up.



Perhaps this is true: it's quite plausible that there are too many videos uploaded to YouTube for any manual review process to scale cost-efficiently.

However, to me this does not negate YouTube's good-faith responsibility to clarify what kind of product they are and what they intend to do with their service as the years go by. These mistakes are extremely costly and potentially offensive to the end user, no matter how rare they may be in daily practice.

YouTube made the foolhardy decision to encourage the ingestion of any and all video content from just about anyone. This normalized mass video production and gave AI-directed content moderation a false sense of success in its early days of deployment. As content on the platform has grown more complex and nuanced, the AI systems have failed miserably to keep up with the changing ecosystem. It's all an (IMO) futile effort by YouTube to make good on a reckless company roadmap without specific people having to take personal responsibility for the blunder.


There is way more than one rare false positive per month. These cases are so frequent that they usually aren’t even considered news anymore. A few months ago some creators tried to organize, but I haven’t heard anything from that effort recently. Google isn’t a government; there isn’t much you can demand from them. They aren’t accountable to you.



