
We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now - nz
https://motherboard.vice.com/en_us/article/bjye8a/reddit-fake-porn-app-daisy-ridley
======
Everlag
If you're looking for a quicker summary of what's happening, there's a write
up on r/cyberpunk[0] that was posted 3 days ago. There's a very specific
flavor to the subreddit and the comments on that write up that might be
satisfying for those that appreciate cyberpunk.

If it wasn't immediately evident, this is very NSFW.

[0]
[https://www.reddit.com/r/Cyberpunk/comments/7sexm6/](https://www.reddit.com/r/Cyberpunk/comments/7sexm6/)

~~~
propman
Wow, I just went through this and the deepfakes subreddit, and this has SERIOUS
potential for abuse all over it. How soon until high school kids bully peers
over this? Or even teachers?

Blackmail, revenge porn, bullying, harassment. I'm a bit shocked right now at
how real it is. I saw one link that was almost undetectable even with very
complicated movements. We need some regulation of this as soon as possible to
prevent abuse.

~~~
commandlinefan
Actually I was thinking it would do exactly the opposite - even if a
legitimate tape is leaked, now the victim can say, "that's fake" and it will
be completely believable.

~~~
mcphage
> now the victim can say, "that's fake" and it will be completely believable

Believable, perhaps—but no less humiliating.

------
tbirrell
This is probably NSFW and should be flagged as such. While it is a technical
subject (the video "photoshopping" technology), the examples used make the
article difficult to view at work, especially if company firewalls are doing
text searches for keywords.

~~~
krapp
People shouldn't be surfing Hacker News at work anyway, and the title should
make the nature of the content obvious.

~~~
chrisbennet
Back to work you slackers! Who said you could surf the Internets while you eat
lunch at your desk!!!

------
tlb
Someone should build the app that replaces one actor's face with the viewer's
own face, rather than a celebrity. It could 3D-scan the viewer and
automatically find videos with sufficiently similar actor bodies. Then you
could watch yourself interacting with the other actors.

Depending on the personality of the viewer, it would be either the best or
worst thing they'd ever seen.

------
benmarks
I fear a future in which recorded evidence of misconduct casts an equally
large shadow of doubt.

~~~
kowdermeister
It just means that image forensics will be a much more profitable business in
the future than we probably thought.

\+ don't forget that if the source "vehicle" video is found, it's proof
against whoever faked it, so it might not be worth the risk.

~~~
kls
I was thinking the same thing; at least in the US it will widen the shadow of
a doubt, since any sign of doctoring could render evidence inadmissible.

------
breakingcups
I can't wait for the damning Donald Trump video to finally come out only to
now be loudly rejected by him and his followers as fake news because "anyone
can do it with an app!"

------
88e282102ae2e5b
Maybe this could finally bring the use of cryptographic signatures into the
mainstream. An additional incentive might be that they could even be used to
"cheat" when unfavorable news comes out: an organization that routinely signs
all of its press statements and internal documents could plausibly disavow any
unsigned leak.
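The sign-everything-and-disavow-the-rest workflow described above can be
sketched in a few lines. Note the key name and statement text are made up for
illustration, and Python's standard library has no public-key signing, so this
uses an HMAC as a stand-in; a real deployment would use an asymmetric scheme
(e.g. Ed25519) so outsiders can verify statements without holding the secret:

```python
import hmac
import hashlib

# Hypothetical press-office signing key (illustrative only). With an
# asymmetric scheme, only the public half would be published.
KEY = b"press-office-secret"

def sign(statement: bytes) -> str:
    """Produce an authentication tag for an official statement."""
    return hmac.new(KEY, statement, hashlib.sha256).hexdigest()

def is_authentic(statement: bytes, tag: str) -> bool:
    """Check a statement against its tag in constant time."""
    return hmac.compare_digest(sign(statement), tag)

official = b"Official statement, 2018-01-25: no comment."
tag = sign(official)

assert is_authentic(official, tag)            # signed release: verifiable
assert not is_authentic(b"leaked memo", tag)  # unsigned/altered: disavowable
```

Anything the organization did not sign simply fails verification, which is
exactly the "we can disavow unsigned leaks" property the comment describes.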

------
hungerstrike
They didn't show or even mention the highest-quality fakes of huge
celebrities that have been seen on /r/deepfakes, like Katy Perry or Scarlett
Johansson.

I wonder why? I thought those names were way bigger than Daisy Ridley or
Jessica Alba.

