I have some experience with this from a previous job -- one of the commenters here is right: going back to before smartphones, there have always been cut & paste nudes/porn made by kids. From magazines, through jpgs, now even mkvs.
However, even back in the day there was quite the spectrum of just how bad this kind of thing is: one image of a crush, who is 17 (technically underage), shared with a best friend? Inappropriate, but not across the line of "law enforcement action" even if the crush's parent is very offended.
A Google Drive link with 100s of pornographic deepfakes of a single person (or sometimes many students) from a school, many of them 12-16 years old, shared extensively across the internet? This is a properly serious problem. Kinda like the difference between throwing a rock at a bus vs detonating a bomb in a train station. Both are some amount of kinetic force, but the magnitude is wildly different.
I'm not pretending that physical violence is the same as making pornographic imagery, but my experience in that field (no jokes plz!) shows a much stronger link than many would like to admit, from one to the next. It's useful for teenagers to learn where these boundaries are in their particular society - life is often more pleasant when ~everyone agrees on the local standards of behaviour and they are (mostly) followed.
As for police action where I lived, it too followed a broad spectrum: on one end, there is arrest and court action which on occasion is definitely warranted (another pun, sorry). On the flipside, there's educating kids about the reality of the world they live in: If you share a nude, it might get shared again -- consider not including your face and your reproductive organs in the same picture. If someone generates a fake picture of you, it'll feel hurtful and you will be wronged, but what if they wrote something mean (and fake) about you? Eventually you will need to deal with the fact that some people are arses.
What's needed is some combination of protecting victims (but also helping them grow a thick skin in this world) and educating kids on what is appropriate, so they can know there is some boundary where things move from 'funny' to 'wrong'. They usually already have a sense of 'private' vs 'public', though that distinction is becoming a lot blurrier/different for current young generations.
And nothing above has anything really to do with 'AI'.