If they'd cut out their classmate's head from their yearbook photo and pasted it onto the body of a pornstar with Elmer's glue, would police have gotten involved? This type of bullying is as old as dirt.
Bullies suck, and the kids who did this should get in some kind of trouble with the school and their parents, but this is not something police need to be involved in.
The moral panic over deepfakes is ridiculous, but the endless push to give literal children police records for acting like dumb kids is what's actually dangerous.
> glued on photo will always be uncanny, but deepfake will not

A deepfake is more detectable as a fake than a good photoshop, and people have been using that for this sort of thing for decades, but ultimately it doesn't really matter. The fact is that giving children police records doesn't improve the situation in any way. These kids need education and parenting, not jail time, court dates, and sex offender status.
Eh, this is still a civil matter. Unless the picture was made for the purpose of harassment, there is no ill will we can presuppose the perpetrators had towards the victim. Just immense immaturity and a lack of judgement and empathy. Judgement and empathy can be taught. (If it was done with the intent of causing distress, i.e. cruelly, it's a matter for the police.)
> Did you read the article? It states they created the images and were sharing them.
I'd need to see more to conclude harassment, i.e. proof of intent to cause psychological harm, versus teasing without any clue about the impacts of one's actions.
> 'boys will be boys' mentality
I don't think this is equivalent. This is immature kids playing with new technology and pushing the boundaries of decency with it. Without proof of intended malice, or evidence of actual harm, it's not a matter for criminal procedure.
> Teasing, whether you have a clue or not, is intent to cause psychological harm
As someone who teased and was teased, was bullied and may have crossed into bullying, I disagree. These behaviours, when done in a group, are about seeking social approval; the victim is closer to irrelevant than targeted. Where it crosses into bullying or harassment is when the victim themselves becomes the point.
Fair enough. But this is like 1/100th of the transgression of spreading actual photos from gym class showers, which is a thing that happens quite a lot since everyone has a camera nowadays.
You don't hide it. You take a camera and photograph some girl or guy far down in the pecking order.
I guess the FBI might knock down my door soon after trying to Google for some report or stats about it. Just some random news articles. Maybe I overestimated the problem?
Well, yes, that's clearly true whether or not it is also a criminal matter.
> Unless the picture was made for the purpose of harassment, there is no ill will we can presuppose the perpetrators had towards the victim
That is almost tautologically true, but on what basis do you conclude that it is so certain as not to warrant inquiry that there was no intent to harass?
> on what basis do you conclude that it is so certain as not to warrant inquiry that there was no intent to harass
I conclude it is uncertain enough to warrant solely civil engagement on the basis of them being kids. Otherwise, every five-year-old pantsing their sibling becomes a matter for armed law enforcement.
Yeah, no, that doesn't follow. "Kids" aren't a uniform category; there's a reason that kids under 7 are generally viewed as incapable of the mental capacity needed for criminal accountability, and that between 7 and 14 most states apply a presumption of incapacity that must be overcome on top of the usual proof requirements for a crime. So, no, you don't have to treat acts by minors in their late teens as irrefutably noncriminal, without investigation, just to avoid treating acts by five-year-olds that way (and that's before considering how the act you suggest for the five-year-old differs from the one at issue for the older minors in the article, which is another axis on which treatment could reasonably be differentiated).
The act of creating something is usually just irrelevant tinkering. Discussions like this on HN tend to overemphasize the importance of creation, probably because most of us have jobs where we are supposed to create something. But the real-world consequences of creation usually don't come from what was created or how it was created, but from distributing the creation.
And that's what laws should be concerned about. Creating something for your private consumption is usually a private matter. Distributing it should be judged according to the potential harm, as well as to potential legitimate uses.
> Distributing it should be judged according to the potential harm
I'll admit that showing your buddies a deepfake/photoshop/cut and paste job of that one kid you hate from social studies is very different from creating a website or fake social media profile where the whole world can see it. In this case, it seems like some pics were shared with classmates who deleted the photos when they were caught and now the police can't even find copies to see what all the fuss was about, but the internet adds an opportunity for harms that extend far beyond the classroom.
There might be some instances where a situation requires intervention beyond parents and teachers, but I see no evidence that this was one of them.
Dicey grounds here, but it should be legal to imagine _anything_. Thoughtcrimes are always wrong, and there never should be any law on what you can think or imagine.
If it were a crime, we'd likely have to repeatedly arrest every single student until they got out of high school. The curiosity of children and the hormones of puberty make the crime of imagining a certainty.
Every single human adult alive today has imagined minors naked, and particularly in a sexualized way, including you. We were all teenagers once, and in the company of other teenagers whom we were attracted to.
I seem to recall there was a case where a man was convicted of child pornography because he cut out pictures of children's faces and glued them onto the bodies of adult porn actors.
That seems kind of wrong to me too, but the main thing that differentiates it is that the person in that case was an adult. We know that kids are going to do stupid things and childhood is the time to teach them to be better. It's not helping anyone to ruin a kid's life for acting like a child.
The spread of a sheet of paper is different from that of a digital sheet of paper, which is different from that of a digitally stored video.
The moral panic might be ridiculous, but taking some time to research the actual impacts so far should be a little sobering before piling a generalization on top of an interpreted generalization.
In this case, even a narrowly drawn law might go a bit further than trying to suppress or control the technology.
The article describes them as teens, so they're almost certainly old enough to face more serious repercussions for sexual harassment. It's the old-as-dirt attitude that this is just “acting like dumb kids” and should slide that helps perpetuate this sort of thing.
> It's the old-as-dirt attitude that this is just “acting like dumb kids” and should slide that helps perpetuate this sort of thing.
I never said it "should slide". This sort of behaviour is common in children and childhood is exactly the time to teach them to be better. This is a matter for the school and the parents to handle. There is zero reason for police to be involved in any of it.
You might just shrug it off as “bullying”, but I'm not convinced this behaviour is so innocent, and I suspect that at some point between the ages of 13 and 19, as in this case, there should be more serious repercussions.
> I suspect that at some point between the ages of 13 and 19, as in this case, there should be more serious repercussions
Child development doesn't advance on a set schedule. Kids at the same age can be at vastly different levels of maturity and even older kids often make stupid choices carelessly. It's part of growing up. Their brains are a work in progress and that applies most to the parts that help them make good choices (https://www.aacap.org/AACAP/Families_and_Youth/Facts_for_Fam...).
That doesn't mean that kids can't make good choices, but it does mean we need to expect them to make terrible ones from time to time. As adults, it's our job to show them how they screwed up and set them right, but destroying their lives when their brains let them down isn't helping anyone.
Children need teaching and guidance as they grow up, not prison sentences or sex offender status. We gain nothing by treating children like criminals for acting like kids.
Dude, the number of people who want some boys to go to jail for what amounts to a doodle made with advanced drawing technology is crazy. Glad to see some sanity left in the world.
Great, we should assess the child on a case-by-case basis as part of any investigation. That said, within those age ranges, depending on where you are, you can pass both the age of consent and the age of majority. As people age towards adulthood, we should increase their responsibility and accountability, particularly when they hurt someone. Sexual harassment isn't this fluffy “nothing” thing; other children got hurt.
That depends on the extent of the harm. Some bullying is “mild” and recoverable. Assault is another matter. Reaching for the law as the first step is not a good idea, but excluding the possibility simply because they’re minors would be a mistake.
I have some experience with this from a previous job -- one of the commenters here is right: going back to before smartphones, there have always been cut & paste nudes/porn made by kids. From magazines, through jpgs, now even mkvs.
However, even back in the day there was quite a spectrum of just how bad this kind of thing could be: one image of a crush, who is 17 (technically underage), shared with a best friend? Inappropriate, but not across the line of "law enforcement action", even if the crush's parent is very offended.
A google drive link with 100s of pornographic deepfakes, of a single person (or sometimes many students) from a school, many of them 12-16 years old, shared extensively across the internet? This is a proper serious problem. Kinda like the difference between throwing a rock at a bus vs detonating a bomb in a train station. Both are some amount of kinetic force, but the magnitude is wildly different.
I'm not pretending that physical violence is the same as making pornographic imagery, but my experience in that field (no jokes plz!) shows a much stronger link than many would like to admit, from one to the next. It's useful for teenagers to learn where these boundaries are in their particular society - life is often more pleasant when ~everyone agrees on the local standards of behaviour and they are (mostly) followed.
As for police action where I lived, it too followed a broad spectrum: on one end, there is arrest and court action which on occasion is definitely warranted (another pun, sorry). On the flipside, there's educating kids about the reality of the world they live in: If you share a nude, it might get shared again -- consider not including your face and your reproductive organs in the same picture. If someone generates a fake picture of you, it'll feel hurtful and you will be wronged, but what if they wrote something mean (and fake) about you? Eventually you will need to deal with the fact that some people are arses.
Some combination of protecting victims (but also helping them grow a thick skin in this world) and educating kids on what is appropriate, so they can know there is some boundary where things move from 'funny' to 'wrong'. They usually already have a sense of 'private' vs 'public', though that distinction is becoming a lot blurrier/different for the current young generations.
And nothing above has anything really to do with 'AI'.
Our society will quickly need to adapt to the idea that seeing a photo of something no longer means it really happened. There are going to be fake videos, fake photos, fake articles, etc.
Rather than burying our heads in the sand and making it illegal to produce those, I think the better long term approach is to teach our children to be skeptical, and to emphasize that we can't believe everything that's shared on the internet.
Speaking generally here, but I'd hoped that decades of Photoshop would have been enough to inspire some healthy skepticism for pictures, and Hollywood/Instagram filters enough to do it for video.
On the plus side, the advancing technology can provide plausible deniability if a genuine intimate photo/video gets leaked, although as long as we're hoping for humanity to evolve a little, maybe we can collectively get a little less prudish about the human body and sex too.
> Our society will quickly need to adapt to the idea that seeing a photo of something no longer means it really happened.
Yup. We're already sort of getting there.
I'm already living in a world where I can't trust what I read. Soon I won't be able to trust what I see or hear online. Face to face interactions will be required for anything important.
Eventually, society is going to have to come around to the fact that this technology can't be legislated away. It's here to stay and it's only going to improve.
I hope we get to a place where the automatic assumption upon seeing any digital image is that it's probably fake.
Sure, and I agree that it's harassment, and reprehensible, but there will always be people who do it because there are a lot of shitty people out there.
We can criminalize adults for that. We're not supposed to criminalize children for acting inappropriately, we're supposed to teach/parent them. Either way, there should be no shrugging of shoulders and allowing harassment to go unaddressed.
It's not just acting inappropriately, it's hurting other people. We wouldn't (well hopefully) excuse rape or murder in the same way. Children very much can commit heinous acts and should, with appropriate investigation into what they understood of it, be held accountable. These are also teens rather than children. A lot of places grant people in their teen years all sorts of rights, like the legal ability to consent to sex. I'd suggest they can understand, and be culpable for, sexual harassment at that age.
> We wouldn’t (well hopefully) excuse rape or murder in the same way.
If these kids had raped or murdered, I'd agree that that's an issue teachers and parents need outside help to address, but that's not what we're talking about.
> These are also teens rather than children.
Teens are adolescent children and their undeveloped brains make them highly susceptible to making poor choices and failing to consider the consequences of their actions. This is the time of their lives when they're naturally going to do things that they shouldn't.
There's a reason we don't lock schoolyard bullies up for theft when they steal someone's lunch money. We don't arrest them for doxing when they write a classmate's name and phone number next to "for a good time call" on the bathroom stall. That's stupid kid shit that children have been doing for ages, and the solution isn't prison, it's parenting, so that those kids can learn from their poor choices and become better people.
Teens can be held accountable for their actions without involving law enforcement.
> society is going to have to come around to the fact that this technology can't be legislated away
Neither can fraud or murder, yet we still ban them. The notion that if a problem cannot be 100% solved by the law it shouldn't be banned at all is an umbrella for the insufferable.
By all means ban it. I'm not suggesting that 100% is required. I'm suggesting that banning it won't really accomplish much because it's pretty hard to enforce what you can and can't generate on your own machine without trampling on people's civil rights.
That’s not comparable though since the deepfake is going to be photo-realistic and the image wasn’t just made but shared. As noted in the article it seems that making these sorts of images of real people is highly likely to be made illegal in its own right and the behaviour could be illegal already under different laws.
> That’s not comparable though since the deepfake is going to be photo-realistic and the image wasn’t just made but shared
You can make a photorealistic drawing too, one that can pass as a real photograph (and with enough skill it will be better than whatever generative models can produce). The difference here is the lower barrier to making the images.
But I think that even a photorealistic drawing, if made and distributed to others, should be considered a form of harassment.
> But I think that even a photorealistic drawing, if made and distributed to others, should be considered a form of harassment.
I agree, but critically, I think it's the harassment that matters, not the drawing.
If someone draws something in the privacy of their own home (maybe or maybe not with the assistance of an AI), and the police find it in an unrelated search, they should have nothing to prosecute. Drawing is one way humans work through their thoughts and emotions; the act of putting pencil to paper—including digital pencil to digital paper—should never be illegal, lest we effectively create thought crime.
Yes, when I was in college I took an art class where we had to cut a magazine picture in half, glue it to a piece of paper, then draw the other half. Mine was good enough that unless you got up close, you couldn’t tell which was the photo and which was my drawing. If someone had taken a picture of it, you wouldn’t have been able to tell at all. Photorealistic drawings are very possible.
I agree there's no difference between making a fake photorealistic image by hand or by AI. A crudely rendered drawing could constitute harassment as well, though probably in different circumstances.
I think you may be underselling what's going on here, because the entire point of AI-generated artwork is that it is based on some chain of relationships to the source material, with no actual creativity involved.
So the AI is going to take what you give it and combine that with what others have given it, based on a bunch of similarity weights. If the AI is any good at its job, it will produce an image that is remarkably correlated to the undisclosed original, since that is exactly what AI is doing every time it's asked to do anything at all: "I have an incomplete thought/image in my mind, please produce a written or visual completion of it based on a model built on several billion best guesses from real-world data." And if the victim of the harassment has had the misfortune (or lack of foresight) to have their actual nude images somehow slip into the AI's training data, it is likely the AI will prioritize surfacing them in its output.
I do not think we have to seriously entertain the thought that a random 15 year old will have nudes in the first place, much less part of a training dataset.
Either way, the AI will now give you a body that is roughly correlated with what your head looks like.
That's going to make for an interesting Supreme Court case when it ends up there.
You can develop fake images of someone from pretty much purely public data. In the past, you would have needed to take an actual picture of the person, and if you had taken a picture of the person, there's generally a way to handle this scenario under copyright law.
If someone shows up in court and says it's not a picture of the person suing, and just a generated image of someone who looks rather similar (people looking similar to each other being a rather common occurrence), and that the image is unlikely to cause imminent lawless action, it would be really interesting to watch the court try to pick apart what's going on.
So if a teenager is skilled enough at drawing on their iPad to make something look photorealistic, and they draw a naked picture of a classmate and share it, should that be illegal too?
I'm not trying to say it's ok, but just like the person you replied to, I truly find these things comparable.
I think it’s a matter of degree. If you share it behind closed doors, with a few friends, it wouldn’t turn any heads (mostly because the target is unaware).
If you share it with the entire class? In the school building itself? It becomes impossible for the target to not be aware.
What if you share it with one person, and that person shares it with one person, and so on, until the whole class sees it? Has no crime been committed?
> So if a teenager is skilled enough at drawing on their iPad to make something look photorealistic, and they draw a naked picture of a classmate and share it, should that be illegal too?
If the person is recognizable, then in many cases it would violate existing nonconsensual/revenge porn laws, which very often are not concerned with provenance but with whether it's explicit, a recognizable depiction, and nonconsensual.
(Depending on where it is shared it might, as others note, be sexual harassment as well.)
Yeah if they can draw something that photorealistic then sure. Poorly rendered drawings could also constitute harassment depending on what they were and how they were used.
It is reprehensible to be cruel to others; that is, to increase the net amount of grief in their life in order to derive some satisfaction for yourself. It does not matter how you go about doing it or under what cover you can obfuscate it.
The latter (the law) originates as a formalized description of the former (the moral wrong). That despicable and selfish people frequently exploit the delay between the two does not excuse their behavior, and it certainly does not dismiss the misery, however small, their actions contribute to the world.
It's not just drawing, though. In this case it's creating a fake that's realistic enough to pass for the real thing, which can be psychologically taxing and damaging for a person, or a potential career/safety issue. I get the teenage impulse to do it; it's just a bad idea for everyone involved (any halfway decent teen would regret doing it as an adult).
I would assume the girls - as classmates - are of a similar age to the "teen boys". There's unlikely to be a significant age differential, and it doesn't really change the ethical implications.
> While some legal authorities argue against the legality of these AI-generated depictions, there is no established precedent for prosecuting their creation.
It's possible that what they did was illegal regardless of the age of the girls. These laws, however, exist to protect children. Treating a bunch of hormonal teenagers the same as a 50-year-old adult is unlikely to be a net positive for society.