
This report [0] is all I could find. I would have expected this to be more widely reported, and I'm surprised it isn't.

[0]: https://www.europarl.europa.eu/doceo/document/TA-9-2021-0319...




That link shows the existing legislation, approved in July. As it stands, it doesn't mandate that anybody scan anything, but it effectively _allows_ service providers to scan for CSAM, if they wish, by defining "combating online child sexual abuse" as a legitimate reason to process personal data within the limitations of the EU's various privacy and data-protection laws (the GDPR being the obvious one, but there are others too).

It's worth noting that this was already legal everywhere else in the world (Google is free to scan a US user's Gmail for CSAM, if it wants to).
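
For context, "scanning" in this sense typically means matching hashes of uploaded content against a database of known material (supplied by clearinghouses like NCMEC), rather than anyone reading messages. Here's a minimal sketch, assuming a plain SHA-256 hash set; the single entry below is just sha256(b"test") standing in for a real list, and production systems use perceptual hashes such as Microsoft's PhotoDNA so that resized or re-encoded copies still match:

    import hashlib

    # Hypothetical set of hashes of known-bad files; the single entry
    # is sha256(b"test"), purely for illustration.
    KNOWN_BAD_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def matches_known_material(file_bytes: bytes) -> bool:
        """Return True if the content's hash is in the known-bad set."""
        return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_HASHES

    # A provider would run a check like this on each upload or attachment
    # and escalate matches to human review and reporting.
    print(matches_known_material(b"test"))         # True
    print(matches_known_material(b"holiday pic"))  # False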

The motivating concern was that service providers who do already scan for this kind of abuse material (everywhere in the world) were at risk of having to disable such scanning entirely for EU citizens due to privacy laws. This is 'temporary', in that it's a quick fix to avoid service providers immediately disabling all those systems, until more concrete rules on how/when/if service providers should scan content for CSAM are put in place.

That part is not a big problem (imo). The risk is that future legislation goes much further.

The article here seems to be largely lobbying and raising general awareness of the issue rather than reporting any real news (not that that's necessarily bad). So far there's no concrete proposal AFAICT, let alone a planned vote on such legislation. The EU Commission (which proposes legislation, which is then voted on by the Parliament etc.) has previously said it's looking into mandatory scanning, but given the pushback it's unclear whether that's still the plan.

In the end, it's probably not widely reported because the mandatory part doesn't appear to have a concrete proposal in motion yet. Articles like this are appearing anyway because such legislation _might_ be proposed soon (there are mentions of Autumn 2021), and it's important to mobilize early, informing the public and making their opinions visible as soon as possible rather than waiting until the last minute.


>> The motivating concern was that service providers who do already scan for this kind of abuse material (everywhere in the world) were at risk of having to disable such scanning entirely for EU citizens due to privacy laws.

I'm curious whether this was a situation where providers might have decided that scanning was the more important concern, given the legal implications of processing CSAM. It seems like this gives the authorities a way to influence any data processor.

The Apple news last week seemed to come out of the blue, and it felt a bit like a trade-off to get out from under the authorities to some degree. This is just conjecture, though.


Thank you for the great summary.



