Human rights groups and the like can request permission for a designated user to have a "special administrator" permission. Anyone with that permission will see a permission toggle when they look at someone's user account. That toggle will allow them to give or revoke permission to view these "forbidden" videos.
That would then allow these organisations to grant access to anyone they deem suitable to help them police these videos. There would need to be some sort of safeguards to make sure that permission isn't accidentally given to a minor or something.
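A minimal sketch of what that two-tier permission might look like, assuming a hypothetical `User` model and `set_restricted_access` function (all names are illustrative, not any platform's real API): a "special administrator" can toggle a viewing flag on other accounts, with a crude age check as the safeguard against granting access to a minor.

```python
from dataclasses import dataclass

MINIMUM_AGE = 18  # assumed safeguard threshold


@dataclass
class User:
    name: str
    age: int
    is_special_admin: bool = False
    can_view_restricted: bool = False


def set_restricted_access(admin: User, target: User, allow: bool) -> bool:
    """Grant or revoke permission to view restricted videos.

    Returns True if the change was applied, False if refused.
    """
    if not admin.is_special_admin:
        return False  # only designated special administrators see the toggle
    if allow and target.age < MINIMUM_AGE:
        return False  # safeguard: never grant access to a minor
    target.can_view_restricted = allow
    return True
```

So an NGO's designated special administrator could flip the flag for a vetted volunteer, but the same call would fail for a minor or when made by an ordinary user.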
This should solve a lot of the problems I reckon. Thoughts?
If anyone needs this as evidence, it's prosecutors, and they can likely already get it. Deleted from user-facing YouTube and Facebook and the like doesn't mean it's deleted from their storage...
Plus the title is misleading: they're not deleting evidence of war crimes, they're deleting upsetting and distressing videos from their open platform, and that's a good thing! Nothing proves they don't already pass these videos to law enforcement.
Indeed, who watches the watchmen? The SPLC was recently embroiled in sexist and racist controversies. No human organization is exempt from corruption. We all have to watch each other while simultaneously being charitable and willing to forgive.
> Plus the title is misleading: they're not deleting evidence of war crimes, they're deleting upsetting and distressing videos from their open platform, and that's a good thing!
Debatable. There's no such thing as a right to be "not distressed" or "not upset". Being a member of any human society means being exposed to stresses of all kinds, and we arguably all have the stressful duty of guarding our human and civil rights.
However, all the debate aside, the simple truth is that governments are ordering this content removed, so until we can have those debates it would probably be a good idea to have some way to allow the content to be viewed by some people. Arguments that it's probably already available to law enforcement are all very well, but law enforcement has limited resources and can't possibly review all the flagged content. If there are people willing to volunteer their time to review it looking for evidence of crimes, then we should probably have a way to enable that.
As for there being some bad actors in these organisations - that's pretty much unavoidable; even law enforcement has some bad apples.
It's about compromising and making the best of what we have.
The idea is that the platform providers can comply with their legal obligation to remove such content from public view while it can still be made available to lawmakers and armchair detectives.
However, I'm sure all these videos still exist somewhere in backup. I doubt they are really deleted.