They say that "media literacy is poor" in Pakistan, but it's not at all clear that education would solve the problem. We've seen in more "media-literate" countries a free-floating distrust of media of all kinds that is exacerbated by a keener awareness of deceptive techniques. Niklas Luhmann has written about this "suspicion of manipulation": https://monoskop.org/images/6/6c/Luhmann_Niklas_The_Reality_...
The fact that advertising works, despite many claiming they're not susceptible to it, is good evidence that media manipulation works amazingly well - even on people who think they understand it and would not be manipulated.
Deepfakes are being weaponized against all politicians in Pakistan - not just women.
FYI, Pakistan is also in the process of rolling out its own version of China's Great Firewall, and has heavily leveraged the "AI safety" argument to justify it [0][1].
Also, it's quite rich that a PMLN minister in Punjab is complaining about electoral safety when her party viciously cracked down on the PTI.
I would say that using a person's photographs and media without their consent to create AI-generated images and videos that are sexual or pornographic, or that depict criminal acts, is morally wrong. It will, however, only get easier to create and manipulate images. It is going to become automatic, and the barriers to entry will keep falling. The social default will be to assume that everything is AI-generated or manipulated. It will not matter whether the content is real.
Making film-backed images from edited digital pictures is not beyond the ability of a motivated individual, let alone a state entity. Even something as rudimentary as a film camera pointed at a slightly out-of-focus 4K screen can be made believable, especially on a film stock with heavy grain.
I suppose with enough resources, faking something has always been possible. But keeping it expensive means fewer people can do it. The advent of AI, and the ease it brings, is what is dangerous.
Perhaps images taken by digital cameras in the future need to come with an IR depth map as well, which would make them harder to fake.
While the main fact of this story (that deepfakes were created) is true, the article overall seems like a propaganda piece by a highly unpopular government to create a soft image. Ignore the other stuff in the article.
Just FYI, to those not in Pakistan who might not be aware. The minister mentioned in the article (yes, deepfakes are a form of sexual criminality!), is a direct underling of another female chief minister (who won despite her party receiving less than 20% of the votes - yay military rigging) who ... get this ...
runs an extortion ring where 100s of judges, politicians etc are honey trapped & copious videos are made. She (& her husband) have OPENLY bragged about this, FOR YEARS. 1 judge (whose video was released by her team) committed suicide 3 years ago. The rest of the judges all toe the line (hence the laughable "lawfare" vs Imran Khan).
She makes J Edgar Hoover seem like a novice. Just keep that in mind ...
A few brave souls (eg Senator Swati) have openly admitted to this attempted blackmail (his was an explosive, tearful press conference 3 years ago).
She even does it to the military top brass, and they do it to her & her goons. In short, they're all low-rent garbage, so save your tears for the more worthwhile 1000s of political prisoners (MANY of whom are female) of this military-led, crime-family infested, American State Dept sponsored gang of thieves & cutthroats (eg: witness recent Islamabad massacre, helpfully ignored by the mainstream US media)
If the State Dept threw out IK, that was wrong - but why has everyone forgotten IK begging the "umpires" to throw the govt out when he was out of power, before he was elected (with some help) the first time?
Whether deepfakes are or are not a form of sexual criminality depends entirely on the jurisdiction. Either it is a crime there or it isn't. That's independent of whether individuals or the population agree it should be, whether it's moral, whether it's seen as overreach, or any other consideration beyond what the law there says.
Yea, sure - being gay or having anal sex is still a crime in some places too.
We can argue it shouldn't be considered a crime and I'd argue deepfakes should not be a crime. I'd also argue considering them so is close to "thought police".
And I'd argue fighting them is a losing cause. They get easier and easier to make daily. There were lots of papers at the most recent SIGGRAPH on how to take a single photo and turn it into a deepfake - oh, I mean an avatar for a game, a model for virtual clothes fitting, etc...
To me, it's just another form of fiction. It's as much a crime as writing a story where you base a character on someone you know and subject that character to sexual acts. Sure, it's depraved, but it doesn't pass the criminality threshold for me, not by a long shot.
And no, I do not see the use of real photos and video as material to this specific discussion.
I think this issue is a great way to evaluate the values and integrity of a society.
Also, what if these people amass outside your house and demand you be tried for blasphemy? Or just try to kill you? In Pakistan, for example, women, religion and sexuality are a heady mix. Not everybody has the luxury of not caring.
I would also like to point out, again, that people still developing and productizing deepfake software, or the tools that underpin it, should take the only morally correct path: stop, and retract their work. It's a considerable net negative for society.