
It's pretty abhorrent to digitally unclothe someone without their full consent, and more often than not the content produced this way gets shared online with countless other people.

The thing is, you cannot guarantee that everyone will know it's fake, and when it's weaponised against people who can't tell the difference, there will be consequences.

We already see people fired from their jobs when they're found to be producing adult content online. But what if a boss who can't spot a fake gets sent a pile of deepfakes of an employee and is falsely told they do that kind of content on the side? What if a doubting partner suddenly receives deepfakes of their partner with someone else? What if a political rival starts spreading deepfakes with wildly rabid claims about you, which the media then picks up and reports on? Bullshit has spread around the world by the time the truth has tied its shoelaces.

Think about the absolutely inane content people believe on Facebook. Think about the misinformation that spreads on Twitter just because the fake version is funnier. Think about the things you've seen online that turned out not to be real.

Even though the generated images aren't real, the subject can't convince the world of that, and they have no way to stop the violation of their self-image.




> It's pretty abhorrent to practically digitally unclothe someone (unless they fully consent to it),

I'm having a hard time wrapping my head around this. Is the same true of doing so mentally?

My post specifically posits that the images are known to be fake and asks whether THAT causes any harm. Obviously framing people for things is a harm.




