
I'm on the fence on this one. If I draw a stick figure and put your name above it, is that unethical? What if I add a speech bubble saying something risqué? What if I'm a talented artist and I draw something pretty lifelike? I don't see how the lines can be drawn clearly here without clamping down on all expression.



None of these examples are really anywhere close to deepfake porn videos.

- Porn is made with consent, and its participants are reasonably aware that it will be published

- Porn tarnishes reputations

If you pasted Daisy Ridley's face onto someone in a crowd in, say, China, doing everyday stuff, no one would rightly care, because there's no real potential for harm unless you are doing it for some ulterior motive.


I think how convincing a piece of media is lies at the crux of this issue. Photoshopped heads of celebrities on porn actors' bodies have been around for some time now and aren't, to my knowledge, illegal. The key difference here is that people hold video to be more trustworthy. Given the advancements in CG, and now more recently with deepfakes, perhaps that's what needs to change: people should stop trusting video footage.


> Photoshopped heads of celebrities on porn actors' bodies have been around for some time now and aren't, to my knowledge, illegal

They have been illegal since inception, in the sense that they represent a tortious act. They might also be illegal in the criminal sense, depending on the jurisdiction, even in the US, absent meaningful context bringing the fake under the protection of the First Amendment as a form of speech.


I agree, since the everyday person's idea of what is possible is entirely cultural, and that will change and advance. Although it doesn't necessarily need to be illegal for it to be pushed out to the seedier parts of the internet and off of the more public platforms, where influence is easier to gain.


What if I paint a specific person being physically attacked?

What if I make a fake video of a specific person being physically attacked?

The person didn't consent to being attacked, and it may tarnish their reputation. Is it unethical to do any of the above?


I was also on the fence, but now I think I disagree with Reddit's decision. I do not think there's a good argument for putting it in the same class of phenomena as nonconsensual porn.

Here's why:

1. r/deepfake exists, draws novelty, and has appeal in part because it's explicitly identified as fake. It's part of the name. So not only is there no claim to it being real, it's explicitly identified as fake. It's hard to argue that there's anything nonconsensual when the parties involved in the deepfake porn all agree and understand that it's fake.

2. Let's say that someone posts deepfake porn as real porn. Now this is a different issue, one closer to libel, because there's some misattribution of something to the potential victim. The victimization comes from the assertion that it really is the individual (as opposed to a deepfake creation, where the opposite is being asserted).

3. Let's say that, out of curiosity, you, in the privacy of your own home, on your own hardware, create deepfake porn involving your spouse (who fully consents and wants you to do so because it's arousing to them) and publicly available imagery. You do not distribute it. By the "nonconsensual porn" logic, though, you have now engaged in something akin to sexual assault, by engaging in nonconsensual photography. But this is absurd, because the public figure has suffered nothing, and nothing was obtained from them without their consent. It's your (and your spouse's) creation.

4. Let's take this a step further, and say that a year from now you create software that can generate a simulation of a person solely from its knowledge of what humans look like. Let's say that its output happens to look like a celebrity. Have you now created nonconsensual photography?

5. Let's say you find a person who is the doppelganger of a celebrity, a dead-ringer lookalike. You film them in porn. (This has actually been done.) Is that nonconsensual porn, because the celebrity's likeness is being used without their consent? The porn actors/actresses consented, though, so why is a deepfake any different, when there's nothing to consent to? Why shouldn't the porn actors'/actresses' consent supersede that of the celebrities whose likeness they resemble?

The logic behind this reddit (and pornhub, etc.) decision is full of holes as far as I'm concerned, and it creates a very dangerous precedent concerning consent. It essentially gives people power over others' likenesses due to their popularity.


In point 3 you skip over the public humiliation that a published deepfake can cause. By creating something and keeping it to yourself, you are basically demonstrating the behaviour Reddit wants. They want you to keep that "content" offline in your basement, not on Reddit being spread and commented on.

If that fake sex gif you made of your wife got spread around her workplace, you'd better believe it would be an extremely humiliating and emotionally traumatic experience, exactly the kind of experience these rules are meant to prevent.


> I'm on the fence on this one.

What's there to be on the fence about? It's terrible and an infringement of free speech, but Reddit has the right to ban it as a private company. It just makes Reddit look terrible and hypocritical.

But considering they are planning on IPOing, I guess they have to sell out.



