Well, then you have a "definitely real" AI-generated picture. I don't think that problem has a technological solution, only a regulatory one: each generated picture must be marked somehow (e.g. via a watermark that is invisible to humans but difficult for a program to remove) as AI generated. If a person or a company fails to do so, massive fines are on their way.
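For what it's worth, here's a toy sketch of the "invisible to humans" part, using naive LSB embedding in Python (payload and sizes are made up). To be clear, this toy is trivially stripped by re-encoding; production schemes like SynthID work very differently, and making the mark genuinely hard to remove is exactly the unsolved part:

```python
# Toy sketch: hide a payload in the least significant bits of pixel values.
# Flipping an LSB changes a pixel by at most 1 out of 255, so the mark is
# invisible to humans. It is NOT robust: any re-encode destroys it.
import numpy as np

MARK = "AI-GENERATED"  # hypothetical marker payload

def embed(image: np.ndarray, payload: str = MARK) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(payload.encode(), dtype=np.uint8))
    flat = image.flatten()
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite LSBs
    return flat.reshape(image.shape)

def extract(image: np.ndarray, length: int = len(MARK)) -> str:
    bits = image.flatten()[: length * 8] & 1
    return np.packbits(bits).tobytes().decode()

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # fake image
marked = embed(img)
assert extract(marked) == MARK
assert np.max(np.abs(marked.astype(int) - img.astype(int))) <= 1  # invisible
```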
I feel like that horse has already fled the stable.
Stable Diffusion has been kicking around in the wild for a couple of years now, and probably a million people are running it. You can't really criminalize that.
But key generation and storage can live in a hardware module tied to a unique device, so the keys would be quite difficult to hijack -- and, if you did, you'd only get very limited use out of them, since a single compromised device's key can be revoked. So there might be a technological solution. Maybe not a perfect one, but one that's at least plausibly useful.
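A minimal sketch of that idea in Python, assuming a C2PA-style flow where the camera signs each capture. The software-generated key here stands in for one locked inside a secure element, and the certificate-chain and revocation checks a real verifier would do are omitted:

```python
# Sketch of device-attested capture. On a real device the private key would
# be generated inside a secure element (TPM / Secure Enclave) and never be
# extractable; generating it in software here is purely for demonstration.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

device_key = Ed25519PrivateKey.generate()  # stand-in for the hardware key

def attest_capture(image_bytes: bytes) -> bytes:
    """Sign the image digest at capture time, inside the camera."""
    digest = hashlib.sha256(image_bytes).digest()
    return device_key.sign(digest)

def verify_capture(image_bytes: bytes, signature: bytes) -> bool:
    digest = hashlib.sha256(image_bytes).digest()
    try:
        device_key.public_key().verify(signature, digest)
        return True
    except InvalidSignature:
        return False

photo = b"...raw sensor data..."  # placeholder for the captured image
sig = attest_capture(photo)
assert verify_capture(photo, sig)
assert not verify_capture(photo + b"tampered", sig)
```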
Even easier with video: hash each frame in a way that depends on the previous frame's hash. A hash chain, if you will.
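A rough sketch of that chain in Python, with placeholder frame bytes; the device would then only need to sign the final chain value to commit to the whole video:

```python
# Each frame's hash mixes in the previous frame's hash, so reordering,
# dropping, or replacing any frame breaks every hash after it.
import hashlib

def chain_hashes(frames: list[bytes], seed: bytes = b"\x00" * 32) -> list[bytes]:
    hashes, prev = [], seed
    for frame in frames:
        prev = hashlib.sha256(prev + frame).digest()  # depends on all history
        hashes.append(prev)
    return hashes

frames = [b"frame-0", b"frame-1", b"frame-2"]  # stand-ins for encoded frames
original = chain_hashes(frames)
tampered = chain_hashes([frames[0], b"deepfaked", frames[2]])
assert original[0] == tampered[0]    # prefix before the edit still matches
assert original[1:] != tampered[1:]  # everything after the edit diverges
```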
Totally impossible with text, of course. AI text can be, in theory, genuinely undetectable.