Very true, but there's a notable distinction: they can devote 100% of their expertise to moderating such nuances. It would be costly for a company like Tumblr (one that is not built around inappropriate material) to build out and maintain that expertise.
> The reason they won't is because they don't like to go against the grain of the currently advertiser-mandated vision of an exclusively family-friendly internet — where 'internet' here means the ad-supported part of it; i.e., all of the bigger commercial content silos.
Are you surprised this is the case? What brand wants to be associated with that sort of stuff?
> The outrage comes when people stumble upon photos of minors in the early pubescent or even prepubescent stage of development intended to titillate. That is, content that is fairly consistently classed as child pornography, and no apparent action is undertaken to purge that content.
There have been quite a few instances of minors being prosecuted for sending inappropriate pictures of themselves: https://www.thedailybeast.com/cheats/2010/03/21/is-sexting-c...
> For the odd case where an account is uploading content that looks like it might involve a minor nearing adulthood, a platform privately and confidentially asking for proof of identity and age is reasonable enough. It's a fair solution for, to name just one example, the odd flat-chested twenty-something exhibitionist of Asian descent.
I'm sure there was a more polite way to phrase that. Regardless, much of the internet consists of downloading and re-uploading media, so verifying the age of the person depicted in any given upload would remain difficult.