
So, here's an idea. Take the case of YouTube: instead of its algorithm deleting this content, how about marking it with a flag instead? Normal users are completely unable to view, or even know of the existence of, content marked with this flag. It doesn't come up in searches, and even a direct link will just show some "content unavailable" message.

Human rights groups and the like can request that a designated user be granted a "special administrator" permission. Anyone with that permission sees a toggle when they look at someone's user account, which lets them give or revoke that person's permission to view these "forbidden" videos.

That would then allow these organisations to give access to anyone they deem suitable to help them police these videos. There would need to be some sort of safeguard to make sure that permission is not accidentally given to a minor or something.
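
To make the idea concrete, here's a rough sketch of the flag-and-grant model I have in mind (Python, and all the names like flagged_graphic, can_view and grant_reviewer are made up for illustration, not anything YouTube actually exposes):

    from dataclasses import dataclass

    @dataclass
    class Video:
        video_id: str
        flagged_graphic: bool = False   # flagged instead of deleted

    @dataclass
    class User:
        user_id: str
        is_special_admin: bool = False  # e.g. a vetted human rights group member
        can_view_flagged: bool = False  # granted by a special administrator
        is_minor: bool = False

    def can_view(user: User, video: Video) -> bool:
        # Flagged videos are hidden from everyone except approved reviewers.
        if not video.flagged_graphic:
            return True
        return user.can_view_flagged and not user.is_minor

    def grant_reviewer(admin: User, reviewer: User) -> None:
        # Only special administrators can toggle access, and never for minors.
        if not admin.is_special_admin:
            raise PermissionError("only special administrators can grant access")
        if reviewer.is_minor:
            raise PermissionError("cannot grant flagged-content access to a minor")
        reviewer.can_view_flagged = True

Search and direct links would just route through something like can_view(), showing "content unavailable" to anyone who fails the check.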

This should solve a lot of the problems I reckon. Thoughts?




Why would human rights groups be entitled to have this censorship privilege? So they'd now get to decide what's good or bad to display on YouTube?

If anyone needs this evidence, it is prosecutors, and that is likely already possible: deleted from user-facing YouTube, Facebook and the like doesn't mean it's deleted from their storage...

Plus the title is misleading: they're not deleting evidence of war crimes, they're deleting upsetting and distressing videos from their open platform, and that's a good thing! Nothing proves they don't already pass these videos to law enforcement.


> Why would Human rights groups be entitled to have this censorship privilege?

Indeed, who watches the watchmen? The SPLC was recently embroiled in sexist and racist controversies. No human organization is exempt from corruption. We all have to watch each other while simultaneously being charitable and willing to forgive.

> Plus the title is misleading: they're not deleting evidence of war crimes, they're deleting upsetting and distressing videos from their open platform, and that's a good thing!

Debatable. There's no such thing as a right to be "not distressed" or "not upset". Being a member of any human society means being exposed to stresses of all kinds, and we arguably all have the stressful duty of guarding our human and civil rights.


I am in absolute agreement with you that whether or not such material should be removed is subject to debate. There are good arguments on both sides. I am generally against censorship, but I can recognise some merit in protecting children from videos of people being shot, hanged, beheaded, etc. Not that I particularly want to go down the "think of the children" route.

However, all the debate aside, the simple truth is that governments are ordering that this content be removed, so until we can have those debates it would probably be a good idea to have some way to allow the content to be viewed by some people. Arguments that it's probably already available to law enforcement are all very well, but law enforcement has limited resources and it's impossible for them to review all the flagged content. If there are people willing to volunteer their time to review it, looking for evidence of crimes, then we should probably have a way to enable that.

As for there being some bad actors in these organisations - that's pretty much unavoidable; even law enforcement has some bad apples.

It's about compromising and making the best of what we have.


I fear I may not have explained myself properly as you appear to have misunderstood. I was not proposing that these groups be able to decide what does and does not get shown on YouTube. I was proposing that they have the facility to allow selected people to view content that has been flagged so those people can review that content for evidence of war crimes, etc... The content would still be unavailable to the general public.

The idea is that the platform providers can comply with their legal obligation to remove such content from public view, while it can still be made available to lawmakers and armchair detectives.


How do you sell this to YouTube, or enforce it on them, so that they first accept and then prioritise this functionality? As a for-profit company, what benefits do they gain, or what downsides do they avoid?


Yes, some sort of tiered access seems better than mass deletion.

However, I'm sure all these videos still exist somewhere in backup. I doubt they are really deleted.



