Show HN: Create a Deep Fake Video from Any Image (myvoiceyourface.com)
31 points by soheil 27 days ago | 41 comments



This just looks like a simple web app wrapper around First Order Model: https://github.com/AliaksandrSiarohin/first-order-model

The license for the code specifically states "Non-Commercial".

Is your code something else? Also seems like you've double posted it.

Edit: Your other posts also have a similar look - are you testing the waters for a viable business model by building web-app wrappers around ML papers whose authors have published their code on GitHub? Not being accusatory or anything, just trying to understand.


Quite brave leaving random internet-generated content on the front page of the site.


Interesting 1st observation


Haha sorry my day job is in security so my mind goes directly to abuse.

The transparency is much appreciated. Nice job putting it together; hope you get some actually constructive feedback (unlike mine) from HN.


There's a huge qualitative difference between the example video and the 'recent videos' below.

The example video shows the face being stretched mostly around the mouth, and without extreme or sudden changes. Sure, it looks fake, but it seems coherent. It's almost the kind of thing a human would make in a comedy setting. But in the generated videos below, I see a lot of the issues we often get with generative models -- eyes or mouths flickering in and out of existence, a disappearing neck, parts of the background getting mixed with the face, etc. Sort of nightmare-ish, but not at the level of deep-dream dog-worms from 2014 or whenever.

Were these even made with the same method? Or is there something about the user-submitted photos that meaningfully differs from those used during training?


The instructions weren't clear: both the video and the image need to contain a person's head for this to work.


Is this related to avatarify at all? I tried that a few times and the result was very similar to what is in the recent videos section.


It would be really cool if it actually worked. Saw the recent videos and they are pretty much all a mess.

Does anyone know of something like this that works? Links? Thank you.


Looks like people were uploading odd videos/images due to the lack of instructions, so I just added some:

Select a headshot video of a person speaking and an image that you would like to bring to life. Make sure both the image and the video are cropped so that mainly the face is visible (no rotated images/videos).


Your pricing plans:

https://myvoiceyourface.com/pay/

$20/mo for 20 videos. The technical output you've achieved is notable, but just curious: who is your target audience? How do you see these videos being used?


It works with line drawings: https://myvoiceyourface.com/video/?id=355fb6de15


The website is not opening for me.


I think it is being hugged to death atm, can't load it either.


it's back now


Yeah, this deep fake is flawed enough to be amusing, but we're probably only 1-2 years away from producing extremely convincing deep fakes on a whim.

In my mind, blockchain is the only thing that can counterbalance this tech. We need blockchain-signed photos and videos to ensure authenticity.


I see blockchain-verification hype all over the place, and no one has ever been able to convince me how it would actually work in situations like this. Could you elaborate? How would you differentiate between "a 'legitimate' video that has been taken by a camera" and "a video that has been edited, and then had its provenance metadata edited to look like it is raw"? Given that there cannot possibly be any consensus protocol that verifies that "data I created myself" is legitimate, how can you prove that anything isn't legitimate?


One thing you could do is have the camera hardware sign the raw image, then produce a zero-knowledge proof that the original photo was signed by real hardware and that the only edits applied to it were simple things like cropping, color changes, etc. (minimal sketch at the end of this comment).

Seems reasonable to coordinate the updates with a blockchain (though not necessary for use cases where you trust the platform / coordinating entity).

Disclaimer: I work at a zkSNARK / blockchain company :)
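
For anyone wondering what the first half of that (hardware signing the raw bytes) could look like, here's a minimal sketch in Python. Everything in it is hypothetical: the device key, the helper names, and the use of the 'cryptography' package are just illustration, and the zero-knowledge proof-of-edits part is left out entirely because that's the genuinely hard piece.

    # Minimal sketch: the camera's secure element signs the raw image bytes,
    # and anyone holding the camera's public key can later check the file is untouched.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # In reality this key would live inside the camera's secure element and never
    # leave it; generating it here is purely for illustration.
    device_key = Ed25519PrivateKey.generate()
    device_pubkey = device_key.public_key()

    def camera_sign(image_bytes: bytes) -> bytes:
        """What the hardware would do at capture time: sign the raw sensor output."""
        return device_key.sign(image_bytes)

    def looks_original(image_bytes: bytes, signature: bytes) -> bool:
        """What a viewer or platform would do: verify against the camera's public key."""
        try:
            device_pubkey.verify(signature, image_bytes)
            return True
        except InvalidSignature:
            return False

    raw = b"...raw sensor bytes..."                 # stand-in for the captured photo
    sig = camera_sign(raw)
    print(looks_original(raw, sig))                 # True
    print(looks_original(raw + b" crop", sig))      # False: any edit breaks the signature

The last line is exactly why the zero-knowledge part matters: even a benign crop breaks the raw signature, so you'd need a proof that the edited file was derived from a signed original by an allowed transformation.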


So like most solutions involving blockchains, no blockchain is actually required.


Sigh... why is blockchain the only answer?

Blockchain, or any signing system (PGP, shared key), is only as good as the person/entity doing the signing.

Who will perform that signing that you trust... CNN? Fox News? Google? THE US GOVERNMENT?!


Isn’t that the whole point of blockchain? A distributed ledger with no central authority so no single entity has the power to manipulate the history...


Let's say you can't manipulate history (manipulating it is possible, btw).

History is only as good as the one telling the story. In this case, who inserted the initial video? The FIRST person to insert it gets to say "this is the truth".

With Bitcoin, what's being traded existed from the beginning: it was "mined" from the original pool, and we all collectively said, "we trust that original pool not to be shifty". Because we trust that original source, we can trade these "items" confidently.

Here the question isn't "proof of work", it's "proof of source", which I currently cannot see a good solution for.


https://en.m.wikipedia.org/wiki/Proof_of_authority

Proof of Authority. What matters is the reputation of the source. You can have varying degrees of trustworthiness based on a source's reputation, which is built up on the chain.

For example, Craig Wright has faced a lot of backlash for claiming to be Satoshi Nakamoto, and there had been no way to prove him wrong. Recently, someone signed a transaction with an address that hadn't moved Bitcoin since 2009 (implying it was an early adopter or contributor) and called out Wright's lie (he had claimed to own that block).

So Wright lost credibility. That's what we need: the ability to develop reputation on the blockchain and a way to punish that reputation economically for posting invalid data.

EDIT (I'm out of HN replies): Reputation can be established based on a staking transaction. You create a smart contract for, say, independent journalist Tim Pool. He signs it, stakes some amount, and publicly declares that address to be his. He posts all his videos with an embedded signature that can be validated against that contract. The hardware itself (with algorithms signed by the chip manufacturer) could also sign the video, and an open-source app could sign as well. If you've got three signatures on a video (HW, SW, and individual, associated with the smart contract address) plus a public blockchain timestamp, you've got a high degree of trust in that footage (rough sketch below).
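
Roughly what that three-signature check could look like, as a sketch only: the key handling, parameter names, and the idea of returning a set of "markers" rather than a verdict are my own invention, not an existing standard.

    # Sketch of the "HW + SW + individual signature + public timestamp" idea.
    # Real verification would fetch the individual's declared key from their staking
    # contract and the hardware key from the chip vendor; here they're just arguments.
    from typing import Optional
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def _valid(pubkey: Optional[Ed25519PublicKey], sig: Optional[bytes], msg: bytes) -> bool:
        if pubkey is None or sig is None:
            return False
        try:
            pubkey.verify(sig, msg)
            return True
        except InvalidSignature:
            return False

    def trust_markers(content_hash: bytes,
                      hw_sig: Optional[bytes], hw_key: Optional[Ed25519PublicKey],
                      sw_sig: Optional[bytes], sw_key: Optional[Ed25519PublicKey],
                      author_sig: Optional[bytes], author_key: Optional[Ed25519PublicKey],
                      chain_timestamp: Optional[int]) -> dict:
        """Report which independent markers check out; the viewer decides how much
        weight each deserves (e.g. hardware-signed + timestamped + known author)."""
        return {
            "hardware": _valid(hw_key, hw_sig, content_hash),
            "software": _valid(sw_key, sw_sig, content_hash),
            "author": _valid(author_key, author_sig, content_hash),
            "timestamped": chain_timestamp is not None,
        }

Footage missing all three signatures, or timestamped only after the story broke, would score accordingly.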


Can I reply twice? Replying to the edit:

That's a social proof token[0][1].

But again, think about bad actors gaming the trust token and then inserting their own deep fakes. It's not even far-fetched.

Also, who 'gives' those tokens? What if a whole bunch of Republicans give a lot of social proof tokens to Alex Jones? At one point in time, that would have made quite a bit of sense. So now you've got a blockchain-backed authority saying Sandy Hook was a Democrat red herring.

So then a bunch of Democrats come in to push that Alex Jones vote back down. The same system that pushed him down can now push others back down in a weaponized fashion.

Separately, if you just want to say "I created this work", regardless of who I am, PGP does that perfectly well. The whole blockchain infrastructure isn't necessary; it doesn't add anything to the validity of the video itself.

[0] https://ght.dtsgroup.co.nz/ [1] https://www.hubtoken.org/images/hub-white-paper.pdf


I'm not looking at it as a token per se. I'm mainly interested in a Decentralized ID (DID), which could represent an organization like FOX or CNN. Never mind the political biases of those orgs. Each org can establish a DID that it publicly claims as its own and sign all its video footage with that DID. Theoretically, an open-source camera and microphone chip could also sign an MP4 or MP3 file in a verifiable way, so you've got different levels of authenticity verification (toy sketch below). This is just to prevent someone from taking this AI tech and making it look like an anchor said something they didn't actually say. The timestamp of signing a transaction on the blockchain is important for proving when something occurred.

PGP depends on a certificate authority and is quite centralized in that sense. Blockchain allows multiple open standards to exist and develop organically. AI will continue to evolve, and it's important that verification tech evolves along with it in an open-source manner. The blockchain community, especially with its recent work and research on zero-knowledge proofs, is better prepared to adapt to those changes; its whole industry is based on establishing consensus in a permissionless way.
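
To make the DID-plus-timestamp idea concrete, here's a toy, in-memory stand-in for what the on-chain registry would store. The DID strings, function names, and dispute check are all invented; on a real chain this would be a contract's storage plus the block timestamps.

    # Toy attestation registry: maps a content hash to (DID, anchor time), mimicking
    # what a contract would record when an org signs and anchors its footage.
    import hashlib
    import time

    registry = {}   # content_hash (hex) -> {"did": ..., "anchored_at": unix time}

    def anchor(video_bytes: bytes, did: str) -> str:
        """What e.g. a newsroom would do at publish time: record hash + DID + time."""
        h = hashlib.sha256(video_bytes).hexdigest()
        registry[h] = {"did": did, "anchored_at": int(time.time())}
        return h

    def check(video_bytes: bytes, claimed_did: str, claimed_no_later_than: int) -> bool:
        """What a viewer would do: does this exact file match an attestation by the
        claimed DID, anchored no later than the time being claimed for it?"""
        h = hashlib.sha256(video_bytes).hexdigest()
        entry = registry.get(h)
        return (entry is not None
                and entry["did"] == claimed_did
                and entry["anchored_at"] <= claimed_no_later_than)

    clip = b"...mp4 bytes..."
    anchor(clip, "did:example:newsroom")
    now_plus = int(time.time()) + 60
    print(check(clip, "did:example:newsroom", now_plus))              # True
    print(check(clip + b"tampered", "did:example:newsroom", now_plus))  # False

Any re-encode or edit changes the hash, so a tampered copy simply has no attestation to point at. Note this says nothing about whether the originally anchored content was truthful, which is the objection raised downthread.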


Again, a decentralized ID is all well and good: an extension of PGP/GPG which, as before, works great for "this person signed this video."

The problem of deep fakes still exists. Deep fakes aren't just about altering footage; they're also about creating whole new videos of things that never happened using a person's likeness, like in the OP's link.

If I upload that and sign it with my decentralized ID, and CNN and Fox pick it up and co-sign it with their decentralized IDs: great! My deep fake is in the wild, picked up by millions, all while being completely verified.

But that won't happen because no one would believe you, irjustin. Right, isn't that the crux? Who do we trust?

(If you're thinking I can't just sign any random video because it's gotta be signed with my phone: don't worry, let's go back to the direct feed into the camera.)

I'm not sure what else to say to help you understand. If you're not willing to take a hard look at your own solution, then there's nothing I can do to convince you: the blockchain is really good at doing some things, and preventing deep fakes is not one of them.


You're never gonna solve it with 100% veracity. But look, if the irjustin smart contract was created less than 24 hours ago, posted the fake photo or video with no prior reputation, and the video also lacks hardware signatures, then you've got the markers to assess trustworthiness. That should give people enough reason to doubt it.


You've got to look at it one level deeper. Why do you trust the guy who could sign using the original BTC?

That trust is built on two things: PGP is trustworthy, and BTC is worth listening to.

I replied to your other post, but it's embedded in "why is BTC worth listening to"?

BTC/Eth is trustworthy because, fundamentally, it is a self-contained math problem that can be self-tested by anyone.

Today, there is no way to make video self-testable in a trustworthy fashion because you will always be asking "Who uploaded/created this?"


When you see a video recorded by someone's phone on Twitter, it is probably the first and only time you will see a video from that person. What use is tracking trust if you have to accept things from unproven people all the time?


Any public, open-source blockchain, whichever one the market rallies around (probably Ethereum), because it's a consistent point of reference. The apps/phones could sign using zero-knowledge proofs, so you don't have to give away personal details to verify.

At the very least this needs to be done for public figures in particular, since a convincing fake video could incite a riot or cause a war.


In effect, you're telling me you are willing to trust every/any phone to self-sign a video and upload it to the blockchain as truth?

So as a bad actor, I'll just take a video of a deep fake video with my self-signing phone to circumvent your trust?

Ah but it's easy to spot a video of a video!

But what if I just replace the camera with a direct feed system?

That's an absurd amount of money/work; why would you do that?

Because I'm a state level actor and I really want this war to start.

BTC and Eth are trustworthy because anyone can test the math the original pool is built upon. In fact, it doesn't matter who uploaded BTC/Eth; the system is a self-contained math problem.

An uploaded video does not have such self-testing ability because it always comes down to "who" took it and who uploaded it.

"Trust at the source" or "Who to trust?" is now the problem. Blockchain cannot solve that alone. I don't know any system that could.


Why blockchain? You just need the original video and its metadata to be signed by the camera. A blockchain would be just as useless here as in every other place a blockchain has been applied.


I use Ethereum every week. I buy things with Bitcoin at least once a month. I invest and bank part of my money in decentralized applications, in stablecoins pegged to the USD. Sorry, blockchain is more than useful, and your rhetoric is out of date.

The use of blockchain is important as a global timestamp that can be trusted.


The point is that everywhere you are using blockchain, other solutions are superior, which makes it useless. If I wanted to inconvenience myself, I could buy things by bartering paperclips too, but that doesn't mean that paperclips are useful as currency. The difference between paperclips and blockchain is that paperclips solve one problem better than others, while blockchain has zero real world uses where it is better than alternatives.


What stops me from signing my own deepfaked video and putting that on the blockchain?


If you want to incriminate yourself with a deep fake, go for it. But the point is that video evidence will cease to be a reliable source of truth by the end of this decade. That's stunning, and the implications are drastic.

Digital signatures are needed to validate the source and recording method.


^^ this


What stops me from writing a check to myself and cashing it?


a) It is absolutely normal to write a cheque to yourself, for example, to move cash between bank accounts.

b) The bank has a big database and a system for clearing cheques that ensures you cannot double-spend money by writing yourself a cheque. If you try to kite cheques by abusing the latency in the system, they notice, because it turns out you can just write that down in a central database too, and it's pretty effective.


I remember a friend doing that. Even funnier, another friend wrote an "I owe you $5" note, deposited it, and took out $10, the minimum allowed.

What happened was that the person lost the ability to deposit and withdraw.

There is a level of trust but things will catch up to you.


For those who aren't following along, the answer is "nothing", and the parent comment completely misses the point of being able to attribute something to a person.


I'm trying to figure out if this entire thread is a troll or serious




