> This legal inconsistency – the fact that child sexual abuse imagery must be reported, but reporting child sex trafficking is not legally required – is a major problem, says Staca Shehan, vice-president of the analytical services division at NCMEC.
This seems to be a pretty significant legal oversight.
This seems like an obvious first step to combat the problem. Much of the activity is public, too — it's not all in private messages. The article does make a good point that it's much easier to detect CSAM than trafficking, but still, I'm sure Meta and other platforms could be doing so much more to detect and report this activity.