> Over a novel, non-standard definition of child pornography.
The problem is, there are jurisdictions in which loli and similar content can be classified as CSAM [1]. There are also jurisdictions, such as Germany, where e.g. the operator of a federation server can be held liable for facilitating access to such material, or for failing to block access to it, even if the material in question isn't hosted on that server (the case in question [2] was about software piracy, but the general legal principles apply just the same).
Legal systems worldwide haven't even begun to catch up with tech developments, and I think it will take at least a decade until regulations adapt.
[1] https://en.wikipedia.org/wiki/Lolicon#Legality_and_censorshi...
[2] https://rsw.beck.de/aktuell/daily/meldung/detail/bgh-kein-un...