I have some technical knowledge of the field, and this is not true. The ability to do this is brand new and is reaching the public's hands at essentially the same time as everyone else's.
For hard evidence of this, see Russia's use of deepfakes a few months ago to impersonate Zelensky and attempt to make Ukraine think their leader was surrendering.
The deepfake was technically advanced but also laughably bad.
What would be an interesting and difficult question is whether this state of things can largely be attributed to the AI community's commitment to making advances in AI open to public knowledge and use, or if there is some stronger factor at work.
It is not brand new. The ability to replace actors in film was developed as a manual process for stunt-double replacement, and efforts to automate it have been halted by immature male pornographic greed. Seriously. I patented a feature-film-quality process back in '08, and investor fascination with pornography halted further progress.
How did interest from the pornography industry halt further progress? I would have assumed the high demand for the product would have driven innovation, much like how the porn industry decided the winner in the VHS/Betamax format war.
We had viable non-porn applications that would have made serious bank:

1) brand-name advertising with a celebrity telling you (you're in the video advertisement too) how smart you are for using their brand;
2) film trailers where you're in the action too;
3) educational media for autistic children (serious work here);
4) aspirational/experiential media for psychological therapy;
5) personalized photo-real video game characters.
Basically, every time we formed an investor pool, after a while one of them would "suddenly realize" actor replacement applied to porn, and then he'd fixate on the idea. He'd talk the other angels into it, and then they'd insist the company pursue porn. We'd explain the higher-valued non-porn applications and the danger of porn psychologically tainting public perception of both the technology and its creators. Plus, we had VFX Oscar winners in the company; why the hell would they do porn?
Why not? It sounded like you had a good amount of investor interest. Was it just moral objections? Because while items 2-5 seem fine as business ideas, I would have assumed anyone trying to sell item 1 wouldn't have cared about porn if money was involved.