Hacker News

I agree it’s gross, but I’m not sure how to articulate why. It’s making real something people have always done in their minds: imagining someone with no clothes on. These pictures aren’t the actual subject naked. Nothing is being discovered or disclosed. It’s pure fiction. But it still bothers me.



It is fiction, yes, but if it is lifelike enough, does the difference matter? Even setting aside the ick factor of making porn of someone without their consent, it would be easy to destroy someone's career or relationship by making these deepfakes and spreading them around, especially once the tech gets more lifelike and loses the current tells of AI generation.

The same would apply to doing this the old-fashioned way in Photoshop, but you have to admit that going from "needs special software and experience using it" to "just upload their image to a website and get back AI-generated nudes" is a huge change in how accessible this is.


> it would be so easy to destroy someone's career or relationship by making these deepfakes and then spreading them around.

If nude pics can get you fired, work culture needs to change.

Same with relationships.

Basically, people need to learn not to trust digital media at all without some kind of authentication, and to be a little more tolerant of nude human bodies when they do pop up.
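To make the "authentication" idea above concrete: one approach is for a trusted source to attach a cryptographic tag to media at publication time, so any later edit is detectable. This is a minimal illustrative sketch using Python's standard `hmac` module with a shared secret key; the key name and payload here are hypothetical, and real content-provenance schemes (e.g. C2PA) use public-key signatures rather than a shared secret.

```python
import hashlib
import hmac

# Assumption for illustration: publisher and verifier share this key out of band.
SECRET_KEY = b"publisher-signing-key"

def sign(media_bytes: bytes) -> str:
    """Produce an authentication tag over the raw media bytes."""
    return hmac.new(SECRET_KEY, media_bytes, hashlib.sha256).hexdigest()

def verify(media_bytes: bytes, tag: str) -> bool:
    """Check that the media bytes match the tag (constant-time compare)."""
    return hmac.compare_digest(sign(media_bytes), tag)

photo = b"\x89PNG...original pixels..."   # placeholder for real image bytes
tag = sign(photo)

assert verify(photo, tag)                  # untouched copy verifies
assert not verify(photo + b"edit", tag)    # any modification fails
```

The point is not this particular scheme but the workflow: media without a valid tag from a source you trust is treated as unverified by default.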


Potentially the opposite. It may become more difficult to use such images to harm someone's career or relationship. Perhaps nothing will be believable in the future.


I’d like to believe that this is the future. Already, with the rise of digitally native relationships, nudes have become commonplace (even Jeff Bezos has sent some). Now, with these widely accessible deepfake generators, any leaked nude photo can be chalked up to digital malfeasance!


>it would be so easy to destroy someone's career or relationship by making these deepfakes and then spreading them around.

Only because this tech isn't yet well known. People will simply learn, correctly, not to trust anything they see online.


The average IQ doesn't allow for that.


Once it's on TV and people are aware of it, it will sink in.

It's only voice cloning that really scares me.


I’d also add that knowledge of what’s possible diffuses to abusers a lot quicker than it diffuses to, say, the grandparents of victims.


Through all that pain, humanity adapts.




