
App Undresses a Photo of Any Woman with a Single Click - longdefeat
https://www.vice.com/en_us/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman
======
parliament32
If anything, I'm surprised that this wasn't created sooner.

Additionally, entire communities exist around shopping celebrities' heads onto
naked bodies; this has been a thing approximately forever but no one really
cared. Now that it's automated, it's "absolutely terrifying"? Like the author
said, anyone with a few hours of Photoshop training can do the same; it's just
faster now.

That being said, news articles like this will be key to generating enough fear
to pass legislation about what we can and can't do with software. Not sure if
that's a good thing.

~~~
QuickToBan
This software is just a precursor to what could be possible in the future. You
could conceivably tell the software a caption and it would generate a video
for you. How can you control what software someone runs offline locally? You
can't. I find it irrational to even think of such legislation.

Posting such generated content online is another matter though. It would
probably already violate some preexisting laws.

~~~
Izkata
> How can you control what software someone runs offline locally? You can't. I
> find it irrational to even think of such legislation.

This is what DRM and Content ID are. They're not perfect, but plenty are
trying, and it does stop most non-technical people.

~~~
QuickToBan
I don't think you have thought it through. Those work using a blacklist only.
First, you can't force people to run DRM on their systems. Even if you could,
an ML-based generator will always produce something new and random that foils
the DRM. You would basically need to force people to run an ML model on their
system serving as an intelligent, dynamic DRM, and that dictatorial idea is
acceptable nowhere.
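To make the blacklist point concrete, here is a minimal sketch (hypothetical; a plain SHA-256 hash stands in for a Content ID fingerprint): a filter built from known files catches exact copies, but any freshly generated variant misses the list entirely.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Exact-match fingerprint, a stand-in for a blacklist entry."""
    return hashlib.sha256(data).hexdigest()

# Blacklist built from one known image (toy byte string here).
known_image = b"\x89PNG...known-bad-content..."
blacklist = {fingerprint(known_image)}

def blocked(data: bytes) -> bool:
    return fingerprint(data) in blacklist

# The exact file is caught...
assert blocked(known_image)

# ...but a generated variant (even one byte different) sails through.
variant = known_image + b"\x00"
assert not blocked(variant)
```

Real systems use perceptual rather than exact hashes, but the underlying limit is the same: a generator that produces novel output every run has nothing for a precomputed list to match against.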

------
thelazydogsback
I don't get the horror -- it's _not_ the woman's actual body. And even if it
were, it's _just a body_ and maybe we shouldn't freak out about it. If we
could somehow embrace "radical transparency" then it wouldn't make a
difference if someone saw you naked, knew what gender's clothes you like to
wear, what medical conditions you have, or anything else. If not that far,
let's at least get rid of this puritanical nonsense where we can't see naked
bodies, but it's fine to see extreme violence, etc. I agree that deep fakes
will perhaps move us a step closer to not being shocked by images, and not
believing anything we see or read online, which is probably a good thing. Of
course soon we'll all have to be naked save for gobs of SPF-1000 anyway...

~~~
lostmsu
I totally agree with this comment. "Ethics" needs disruption. We only have
problems with nudity because it is a taboo.

------
lacker
They sure have a lot of fake nude-but-censored photos in this article. I think
one or two would have sufficed to show how the technology works.

~~~
Fjolsvith
I only saw two.

------
chlorophyl
I see huge consequences, as you can pretty much frame any woman and there's
nothing she can do about it. The problem is that perception matters. Many
people who are not exposed to technology would not understand or believe that
such an app exists.

For example, if a person decides to grab a picture of a young teacher and post
fake nudes, the teacher may be able to say that the pics are fake, but she
might get fired anyway due to decisions made by the school board or PTA. This
type of news might be picked up by the local news network and the story will
be blown out of proportion. Regardless of what she says and whether it's true,
her reputation would already be tarnished. We've seen a lot of this happen
with celebrities. It will be worse when it happens to normal people without
the money to hire a PR firm or go to court.

------
apo
> "This is absolutely terrifying," Katelyn Bowden, founder and CEO of revenge
> porn activism organization Badass, told Motherboard. "Now anyone could find
> themselves a victim of revenge porn, without ever having taken a nude photo.
> This tech should not be available to the public."

Or maybe the opposite?

The transition to ubiquitous deep fakes will be very jarring to those who
haven't been paying attention. In time, the shock value (and indeed interest)
should fade into obscurity.

Pop stars upset about appearing in one of these deep fakes should be much more
horrified about another possibility: eventually their talents will no longer
be needed at all because it will be possible to produce content free of human
placeholders and much more effective at generating money.

------
sandworm101
A photo of "any woman"? I think not. All the examples are of female
celebrities who are 90% naked in the original photo.

Taylor Swift in a skin-tight see-through strapless dress? Kardashian in a
bikini? I'll call this thing dangerous when it can work from headshots or
images of people in winter clothing.

------
RickJWagner
What a great advertisement for the app. They were sure to identify the app and
mention the cost.

Shocking, simply shocking.

------
krapp
You think this is terrifying, wait until it works in realtime with an
augmented reality app.

~~~
buboard
There's desensitization. Try a nude beach, for example.

~~~
krapp
Nude beaches imply consent - most places are not nude beaches.

------
bitforger
> DeepNude also dispenses with the idea that this technology can be used for
> anything other than claiming ownership over women’s bodies.

That's not true! We can also use it for undermining democracy, making
everything look like a Van Gogh painting, and...

> This algorithm is similar to [...] what self-driving cars use to "imagine"
> road scenarios.

I understand the fear here, but I don't appreciate rage-bait sub-headlines
that contradict the rest of the story. Obviously GANs can be used for many
things.
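The model family the sub-headline alludes to is image-to-image translation. A toy numpy sketch of the paired-training idea behind such systems, with the adversarial (GAN) part omitted for brevity and the "translation" reduced to a per-pixel brightness inversion fit by least squares (all names and numbers here are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Paired training data: input "pixels" x in [0, 1], targets y = 1 - x
# (brightness inversion). Real systems learn far richer mappings from
# paired images, e.g. sketch -> photo or day -> night road scenes.
x = rng.uniform(0.0, 1.0, size=(200, 1))
y = 1.0 - x

# Fit y ~ w*x + b by ordinary least squares.
X = np.hstack([x, np.ones_like(x)])          # add a bias column
w, b = np.linalg.lstsq(X, y, rcond=None)[0].ravel()

# The learned map translates an unseen "pixel" correctly.
test_pixel = 0.25
print(w * test_pixel + b)                    # close to 0.75
```

A GAN adds a discriminator network on top of exactly this kind of paired setup, replacing the fixed least-squares loss with a learned "does this look real?" signal, which is why the same machinery shows up in everything from road-scene prediction to this app.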

Also, I'm sure there are a few bad actors who will use this for things like
revenge porn (which is Not Good), but I suspect the driving demand behind this
application is horny teenagers who think Daenerys from GoT is hot, rather than
malicious people who want to "oppress women." Although to be fair, this is the
effect the technology has, whether intended or not.

------
thrill
"I was high when he pitched it!"
[https://www.youtube.com/watch?v=FFNhP-dBK4E](https://www.youtube.com/watch?v=FFNhP-dBK4E)

------
likeclockwork
Two thoughts:

1. Doesn't this make any genuine nude photo deniable?

2. I wonder if this is the killer app for augmented reality.

~~~
mcphage
> 1. Doesn't this make any genuine nude photo deniable?

Deniable, sure, but what will denying it get you? If people wanted to rub it
in your face, saying “it’s not real” won’t stop them. If your workplace is
going to fire you because of it, denying it won’t stop them from firing you.

~~~
drink_bepis
I think it will eventually make nude photos as a whole a non-issue. You'll
only get fired and shamed for them while this tech is still underground;
once it's widely known, "nudes" won't matter any more than any other
photoshopped image.

------
Scoundreller
> and only works on images of women.

Gamification shows up in some unlikely places

------
buboard
Nice Vice spam. What's horrifying, again, about a person being nude, or about
seeing a nude person? I get the novelty, but this is not "absolutely
terrifying".

~~~
gtf21
While I would agree in principle that there's nothing shameful about nudity,
in most societies, this is simply not the case. Having images circulated of
oneself in the nude would be shaming, and I imagine that having these images
faked makes one feel a loss of control over one's image.

The fact that this was only made to effectively "unclothe" women is also
disturbing (if unsurprising).

Not sure it qualifies as "spam".

~~~
buboard
> Not sure it qualifies as "spam".

The sheer number of fake nudes on the page. Clearly yellow, attention-grabbing
journalism.

------
cannonedhamster
People only see the downsides, but there could be some really cool uses for
the technology in image repair: think old paintings, damaged photos, etc. So
while, yes, this tech starts as unwilling pornography, it could turn into
something generally useful to society.

~~~
lostmsu
Obviously, they took an existing technology, and applied it to a different
goal.

~~~
cannonedhamster
Yes, obviously every other comment is about how terrible it is, but no one
focuses on the tech that's available unless they're already in the field. Not
sure what about my comment set people off; I suspect it's bleed-over from a
different thread, since there are no responses other than yours. The tech
itself is pretty cool, using data sets to model similar body types. I remember
seeing it when it first came out and it was being used to fake President
Obama's speech. I'd expect this to be used to better model missing children,
and it might lead to computerized assistance in facial and body
reconstruction. The tech is pretty interesting.
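The image-repair idea can be sketched without any ML at all. Here is a crude diffusion inpaint (hypothetical, numpy-only; learned inpainting models are far more capable): each missing pixel is repeatedly replaced by the mean of its four neighbours until the hole takes on the surrounding structure.

```python
import numpy as np

def inpaint(img, mask, iters=200):
    """Fill pixels where mask is True from their surroundings.

    A toy diffusion-based repair: missing pixels are repeatedly set to
    the mean of their four neighbours (np.roll wraps at the image edges,
    which is harmless for this interior hole).
    """
    out = img.copy()
    out[mask] = out[~mask].mean()            # rough initial guess
    for _ in range(iters):
        avg = (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
               np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 4.0
        out[mask] = avg[mask]                # only touch the damaged region
    return out

# A smooth horizontal gradient with a 2x2 hole punched in the middle.
img = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
mask = np.zeros((8, 8), dtype=bool)
mask[3:5, 3:5] = True
img[mask] = 0.0                              # "damage" the image
repaired = inpaint(img, mask)
# The hole is refilled with values matching the gradient around it.
```

ML-based versions learn what plausible content looks like instead of just averaging neighbours, which is what makes them useful for old paintings and damaged photos, and equally usable for the app in the article.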

