
Adobe Photoshop AI Detector - dsr12
https://www.fxguide.com/fxfeatured/adobe-photoshop-ai-detector/
======
sam_goody
Now the team in charge of "Face-Aware Liquify" has to improve their algorithm
so that the edits won't be recognizable.

That would mean better warping ("more real looking"), which is overall a great
thing, even if it just ups the ante for the "Edit detector" team.

Basically, the red team has to hack into the image instead of a program -
definitely a good way for Photoshop to evolve.

(Though I realize that this detection would be useful for much more than just
the liquify tools.)

~~~
gjm11
It's a GAN with an unorthodox (and rather slow) training scheme!
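
A toy sketch of that adversarial dynamic, with all numbers invented for
illustration: a "forger" (the warping team) lowers its detectable artifacts
each round, while the detector team moves its threshold toward them. A real
GAN trains two neural networks against a shared loss; here the "networks" are
two engineering teams, hence the slow training scheme.

```python
# Toy model of the red-team/blue-team loop described above.
# All quantities are made up for illustration.

def is_caught(artifact_level, threshold):
    """Flag an image as edited if its artifacts exceed the threshold."""
    return artifact_level > threshold

artifact_level = 1.0   # how detectable the forger's edits are
threshold = 0.5        # the detector's current decision boundary

for round_no in range(5):
    # Detector team: move the threshold toward the forger's level.
    threshold = 0.9 * threshold + 0.1 * artifact_level
    # Forger team: reduce artifacts to slip under the new threshold.
    artifact_level *= 0.7
    print(f"round {round_no}: artifacts={artifact_level:.3f} "
          f"threshold={threshold:.3f} "
          f"caught={is_caught(artifact_level, threshold)}")
```

Each side only improves because the other does, which is the GAN analogy.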

~~~
rm_-rf_slash
Generative Adversarial Teamwork.

------
blauditore
>Trust in what we see is increasingly important in a world where image editing
has become ubiquitous – fake content is a serious and increasingly pressing
issue.

I keep reading such statements, but I still don't feel like this is a real
(or increasing) problem. Pictures could already be faked in a believable way
100 years ago, and I cannot recall a single instance where new, fancy
AI-driven fakes were any more of an issue than classic photoshopping. In my
opinion, it's all just FUD.

~~~
tachyonbeam
100 years ago you would have needed an expert, specialized equipment, and
several hours. 10 years ago, you still needed an expert and probably an hour
at least. Soon, there may be tools anyone can use: feed in some photos of a
person and a description of what you want to see. It will be easy to generate
fake profile pictures that look like you, for instance. Your voice will be
easily cloned on the phone as well, and eventually, even video chats could be
faked.

Scammers could find out when you're on a trip from social media, reach out to
your family, ask for emergency money because of a stolen wallet, etc. Still
some effort for the scammers, but eventually, you might be able to automate
the whole scam with a bot. The bot doesn't need to be that smart, it just
needs to follow a script with a handful of variants, like a preplanned
dialogue tree.
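
The "preplanned dialogue tree" can be as simple as a dictionary of canned
lines keyed on a few anticipated reactions. A minimal, purely hypothetical
sketch (nothing here is from a real system):

```python
# Hypothetical sketch of a scripted scam bot: canned lines plus a
# handful of expected replies, exactly the "dialogue tree" idea.

DIALOGUE_TREE = {
    "start": {
        "say": "Hi, it's me! My wallet was stolen, can you wire me money?",
        "next": {"who is this?": "identity", "how much?": "amount"},
    },
    "identity": {
        "say": "It's really me, I'm calling from a borrowed phone.",
        "next": {"how much?": "amount"},
    },
    "amount": {
        "say": "About 1000 USD should cover a ticket home.",
        "next": {},
    },
}

def run_bot(victim_replies):
    """Walk the tree, branching on each reply; bail out if off-script."""
    node, transcript = "start", []
    for reply in victim_replies:
        transcript.append(DIALOGUE_TREE[node]["say"])
        node = DIALOGUE_TREE[node]["next"].get(reply)
        if node is None:   # unanticipated reply: a human would take over
            break
    else:
        transcript.append(DIALOGUE_TREE[node]["say"])
    return transcript

print(run_bot(["who is this?", "how much?"]))
```

The bot needs no intelligence at all: anything off-script just ends the call.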

~~~
blauditore
This could be done before as well, simply by impersonating someone. The voice
just needs to sound believably similar, which is not too hard over a
sufficiently bad connection. And yet such attacks don't seem common at all.

~~~
e12e
Sure, you can impersonate _someone_ (or even _anyone_). A botnet can
impersonate _everyone_, and make robocalls to _everyone_.

If you're doing a targeted attack, you might need a high chance of success -
but if you can target 10 000 users, you can get away with a 1% success rate
and still make a significant amount (roughly 100 payouts of 1 000 USD each in
"emergency" funds).

------
tpetry
Apply an AI like this, with more detection types, to a typical Instagram
"models" stream and you'll see that almost all pictures are altered. This
could be a good wake-up call for the young generation, which sees their own
bodies as problematic because they are not as perfect as other people's on
Instagram.

~~~
mosselman
It would be a great idea to legally force social networks to automatically
apply this kind of AI and show the results in the feed for the reasons you
mentioned.

~~~
brokenkebab
Why stop here? Let's legally forbid makeup. Think about tone-evening creams,
for example, what they do to truth!

~~~
mosselman
Clear strawman fallacy here. I did not say 'forbid'. I said that it should be
made clear that the images are doctored.

Besides, there is a big difference between make-up, which anyone could
theoretically apply to look the same in real life, and what people use
Photoshop for in the images we are talking about.

~~~
brokenkebab
I would call it reductio ad absurdum. But that's not the point: I wasn't
refuting any arguments you made (you didn't make any), but making fun of what
I see as a knee-jerk reaction of calling for a new law on every minor
occasion.

Btw, your final paragraph: do you mean that only privileged people can apply
"beautification" effects to their photos? FWIW, make-up is much more
expensive.

------
classified
Ha, I misread that as "AI Dictator". Possibly a bit too early, but it's just
a question of time until that complimentary update ships.

------
kelvin0
AI vs AI ... so the race has begun.

------
rndgermandude
Here is their website including code to play with:
[https://peterwang512.github.io/FALdetector/](https://peterwang512.github.io/FALdetector/)

(and yes, somebody already turned this into an app)

~~~
ralfd
> and yes, somebody already turned this into an app

This here?

[https://www.reddit.com/r/iOSProgramming/comments/e08sps/woke...](https://www.reddit.com/r/iOSProgramming/comments/e08sps/woke_up_to_my_app_54/)

Costs 3,49 Euro in my local App store.

~~~
rndgermandude
That's the one I saw too. They do the processing on a server, which might
explain the price tag, and which is a bit problematic privacy-wise.

