Hacker News

Like all tools it can be used for good and evil. It could be installed directly in cameras to sign videos. And people with the power to turn it off could make AI fake videos that much more believable.
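The in-camera signing idea can be sketched roughly as follows. This is a minimal illustration, not any real camera's scheme: the key name and the use of HMAC are placeholder assumptions, since an actual device would hold a hardware-protected private key and use a public-key signature (so anyone could verify without the secret).

```python
import hashlib
import hmac

# Hypothetical device key for illustration only; a real camera would keep a
# private key in secure hardware and publish the matching public key.
DEVICE_KEY = b"example-device-key"

def sign_video(video_bytes: bytes) -> str:
    """Hash the footage and sign the digest (HMAC stands in for a real signature)."""
    digest = hashlib.sha256(video_bytes).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify_video(video_bytes: bytes, signature: str) -> bool:
    """Recompute the signature and compare in constant time."""
    return hmac.compare_digest(sign_video(video_bytes), signature)

footage = b"raw sensor frames..."
sig = sign_video(footage)
print(verify_video(footage, sig))         # True: footage unmodified
print(verify_video(footage + b"x", sig))  # False: any edit breaks the signature
```

The point about "people with the power to turn it off" maps directly onto this sketch: whoever controls `DEVICE_KEY` can sign anything, including fabricated footage, and the signature proves only which key signed, not what the sensor actually saw.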



I would argue that these AI safety initiatives produce messaging that muddles and confuses the public about a simple fact: they should not, under any circumstances, use a video or image as proof or assume its veracity. When I tell someone this, a common response is something like "aren't they working on things to detect if a video is fake?" I think this idea, that video content can still be trusted and that {COMPANY} is being responsible, is the real goal of the money pumped into these watermarking techniques. The techniques will not actually help people; images and video will continue to be used for disinformation. The only thing that can stymie that is a broad cultural shift toward distrusting photographs and video footage by default, treating them the way you might treat a painting or an animated cartoon depicting an event: maybe an accurate portrayal, but just as easily totally fabricated. The responsible thing for companies to do would be to spread messaging to that effect, but they would rather engage in safety theater and score points while keeping users dumb and easily fooled.


"they should not, under any circumstances, use a video or image as proof or assume its veracity"

This is just silly. Courts never assume the validity of evidence; it is treated as invalid unless it can be shown not to have been tampered with. Photos have been editable for over 100 years, yet they are still used as evidence: the person who took the photo signs an affidavit and/or testifies in court that it is real. And AI videos are going to be easily detectable for a long time.


I'm talking about your average person, not the court system. I'm asserting that culturally we need to shift to acknowledging that photos are not proof, rather than pretending that some fancy counter-model or watermarking will somehow allow us to maintain an already-misplaced trust in the veracity of images.



