
Online Demo of DeepWarp: Photorealistic Image Resynthesis for Gaze Manipulation - orless
http://163.172.78.19/
======
teleclimber
While the technology is amazing, I am a bit bothered by all these picture and
video modifying algorithms.

The issue is that we can't know what's real any more. It used to be if you saw
a video or a photo depicting an event you could be pretty sure that what
you're looking at actually happened.

Now, if you see a video of a prominent politician saying something awful in
your twitter timeline (or whatever), they may have actually never said
anything remotely close. It could be a completely fictional video that looks
perfectly realistic[0], made by some teen in Macedonia.[1]

I realize photography and video have always been used to trick people into
thinking things that aren't true, but this technology enables nuclear-grade
deception.

I am wondering: is there a use-case for such an algorithm that is practical
and good for the world?

PS: I know an eye-rolling algo is quite innocuous but I've had this thought on
my mind about these in general and needed to air it out.

[0]
[https://www.youtube.com/watch?v=ohmajJTcpNk](https://www.youtube.com/watch?v=ohmajJTcpNk)
[1] [https://www.wired.com/2017/02/veles-macedonia-fake-news/](https://www.wired.com/2017/02/veles-macedonia-fake-news/)

~~~
captainmuon
> It used to be if you saw a video or a photo depicting an event you could be
> pretty sure that what you're looking at actually happened.

That is only true because we are in a weird transition period. Before
photography, there was no way to create an accurate snapshot of an event.
Since then, it has been easy to take a photo but hard to fake one. For the
first time in history we were able to produce visual proof of events. Of
course, people have been trying to fake photographs since the beginning;
that has mostly been detectable, but the fakes are getting better all the time.

(This also gives us a certain fetishism for unedited images in journalism. I
understand where it comes from, but I think it is objectively weird that
certain manipulations are allowed while others are considered scandalous.)

I believe we are moving into a period where we can no longer consider photos
to prove facts. People will resist for quite some time (as they are resisting
the fact that the internet allows us to copy information for free). There will
be legal push-back, image editing will be scandalized... but eventually the
technology will become so good and ubiquitous that nobody can rely on
pictures anymore.

I also believe it is a good thing! We are becoming more and more nervous about
our image on the internet, about private photos leaking out there, ruining our
employability, etc. What happens when everybody can say, "OK Google, make a
photo of Jason where he is drunk and riding a donkey" and get a convincing
fake? Eventually we must adapt, and these worries will go away.

We will have to learn to judge pieces of information not by their origin, but
only by their content.

For a current example: you got a picture of President Trump with prostitutes
in Moscow? I don't care whether that _really_ happened or not - I'm not a
prude, and it doesn't have _immediate_ relevance. Let him have his fun. What I
do care about: does the story fit my image of him? Do I think he is capable of
doing that? Does it have explanatory power? Note that it might as well be a
painting or a blog post instead of a photo.

In my old age, it seems, I am going full-on postmodern...

~~~
pegasus
Using trusted computing and related technologies, it should be entirely
feasible to build tamper-proof cameras that produce provably real photos,
including a trusted timestamp. It should be a small step, for example, for
Apple to add such a feature to future iPhones, given the layers of security
they already have in their hardware.
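(To make the idea concrete, here is a minimal sketch of the signing/verification flow such a camera might use. All names are hypothetical, and a symmetric HMAC stands in for what would, in a real design, be an asymmetric signature made with a key sealed inside the device's secure hardware.)

```python
import hashlib
import hmac
import time

# Hypothetical device key. In a real scheme this would be an asymmetric
# private key held in a secure enclave and never exported; verifiers would
# use the corresponding public key.
DEVICE_KEY = b"key-burned-into-secure-hardware"

def sign_capture(image_bytes: bytes, timestamp: float) -> bytes:
    """At capture time, bind the raw sensor data to a trusted timestamp."""
    payload = hashlib.sha256(image_bytes).digest() + str(int(timestamp)).encode()
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()

def verify_capture(image_bytes: bytes, timestamp: float, signature: bytes) -> bool:
    """Check that neither the pixels nor the timestamp were altered after capture."""
    payload = hashlib.sha256(image_bytes).digest() + str(int(timestamp)).encode()
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

photo = b"\x89PNG...raw sensor bytes..."
ts = time.time()
sig = sign_capture(photo, ts)
print(verify_capture(photo, ts, sig))            # untouched photo verifies
print(verify_capture(photo + b"edit", ts, sig))  # any change breaks the signature
```

Note this only proves the bytes left the camera unmodified; as the reply below this comment points out, it cannot prove the sensor itself saw a real scene.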

~~~
loa_in_
You are talking about secure infrastructure, but the image itself can still be
faked. And no such infrastructure is safe, in the end, when it is handed
entirely to the end user (modify the camera sensor, or transfer a signed image
to an intact device).

------
orless
We had it a few months ago (see
[https://news.ycombinator.com/item?id=12164728](https://news.ycombinator.com/item?id=12164728))
but now there seems to be a demo available online.

Sample result: [http://imgur.com/a/nyG4Z](http://imgur.com/a/nyG4Z)

Abstract:
[http://sites.skoltech.ru/compvision/projects/deepwarp/](http://sites.skoltech.ru/compvision/projects/deepwarp/)

~~~
rrherr
"Machine Learning Algorithms that _Matter_ , like Machine Translation, Spam
Filter or Cat Generation, will find its way inside a Web Browser."
[https://twitter.com/hardmaru/status/834607972923748353](https://twitter.com/hardmaru/status/834607972923748353)

------
avenoir
I find the side-to-side motion nearly natural. However, the up-down movement
seems to introduce clipping between the lower eyelid and the iris, in addition
to a slightly smudged upper lid. Still very cool technology. And here I
thought Skolkovo was long dead.

------
Mao_Zedang
Can't wait for this to be used for political purposes.

