The internet has - and will always have - a number of assholes whose only joy is making others miserable in whatever way they can. So from that day on I've designed whatever I make with abuse in mind from day one. It still doesn't always work, but for the most part that seems to be the only reasonable way to put stuff together for public consumption.
It's frustrating, but it's reality.
It was a lesson Microsoft learnt the hard way more than a decade later with its AI Twitter bot, Tay.
Especially the "fight" between GGX Gang and The Engineer to combat them.
That's going to be their downfall, since its priority seems to be to connect people to the worst of the site as firmly as possible. A year or two ago I became curious about the flat earthers and watched 2 videos to try and figure out if there was any sort of deeper meaning to their superficially ridiculous claims (spoiler: there isn't). YT kept shoving flat earth videos at me for weeks like some sort of ontological crack dealer.
Whether by design or accident the flat earth nonsense has the potential effect of unmooring people from the very notion of an accepted reality; and thus habituating them to the idea that what they inherently know about the world cannot be trusted. It literally pulls the world out from under their feet.
That is, if you can get people to believe or even question that something as fundamental as the shape of the earth is a lie they've been told, then you can effectively "clean-sheet" them. They become blank-canvases upon which you can write the alternative reality of your choice. Not to mention this type of nonsense generates mistrust in the status-quo, as it raises the question among its victims: "who is this 'they' who lied to us about such fundamental and important matters?"
So, if you then claim to be against the status quo, then you have sympathizers who are now open to your message. It thus becomes much easier to write your story onto their now blank canvases and serve as their champion.
Religion has already done that.
But now, all I see recommended is Don Rickles. Literally the whole sidebar except 1 or 2 videos is Don Rickles even if I’m watching a tech video. Drives me nuts.
He's totally 100% serious. Not trolling; not joking.
He buys into a lot of conspiracy theories and he's a pretty paranoid person in general. He also thinks some conspiracy theories are actually real conspiracies.
If only we could get everyone to turn it off en masse.
Belief in a flat earth is a dog-whistle for fundamentalist Christians.
So yeah, they don't care about the fact that we can observe the Earth is round or any other proof, their ignorance is proof of their faith.
In fact, most of today's flat-earthers seem to be primarily driven by the arguments from flat-earth claims and nothing more: https://www.theguardian.com/global/2018/may/27/is-the-earth-...
A "Child mode" flag would be set system-wide when the device is configured by the parent to be used by a child. This should enforce a top-down policy that only allows installing child-safe apps or apps that propagate and enforce a similar policy. For example, Chrome would be installed but would only allow access to sites that respect the child-safe flag. YouTube would load but would only run videos and suggestions from a carefully curated set of publishers that respect the age. Etcetera.
Compared with previous approaches that failed due to lack of adoption (RSACi, ICRA), this system would provide a strong incentive for app and site owners to properly support the age flag; otherwise they are simply invisible in the children's market. A strong initial effort would be required from Google/Microsoft to bootstrap the initial whitelists.
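A minimal sketch of the whitelist-plus-propagated-flag idea, in Python. All names here (`CHILD_MODE_ENABLED`, `child_safe_policy`, the app IDs) are hypothetical; no real OS exposes this API — this only illustrates the policy check being proposed:

```python
# Hypothetical sketch of a system-wide "child mode" install policy.
# An app is installable either because the platform vendor vetted it
# (bootstrapped whitelist) or because it declares - and must therefore
# enforce - the same child-safe policy on its own content.

CHILD_MODE_ENABLED = True

# Whitelist bootstrapped by the platform vendor, as described above.
BOOTSTRAP_WHITELIST = {"org.pbskids.video", "com.example.kidsreader"}

def app_allowed(app_id: str, manifest: dict) -> bool:
    """Return True if the app may be installed under child mode."""
    if not CHILD_MODE_ENABLED:
        return True
    if app_id in BOOTSTRAP_WHITELIST:
        return True
    # Apps that declare the flag commit to enforcing it downstream
    # (e.g. a browser only loading sites that carry the same flag).
    return manifest.get("child_safe_policy", False)

print(app_allowed("com.example.browser", {"child_safe_policy": True}))  # True
print(app_allowed("com.example.randomgame", {}))                        # False
```

The incentive structure is the interesting part: any app that doesn't opt in simply never appears on a child-configured device.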
Goog just needs to give up and partner with PBS or something to make kids' videos a charity.
It’s actually kind of worse, because it lulls parents into thinking it’s safe. And it’s even harder to police since adults never use it and the UI is different than the YouTube parents expect.
Then they're blamed for allowing a media oligopoly to form, whilst ignoring small content producers. We're seeing this play out with the demonetization of popular video creators.
Long story short, with the public, you're damned if you do, and damned if you don't.
The problem is that Youtube wants to show you an endless stream of content, and they never ever want to show you an empty search result.
So they show you sketchy videos uploaded by bots with no quality control, and their stupid algorithms somehow manage to always put the fake stuff next to the real stuff.
And to make it clear: Nobody is requesting that Google should block questionable videos. They just shouldn't shove them in your face like they do now.
This isn’t going to be solved with better algorithms or machine learning. The bad actors also have computing resources. I think it can only be solved by human curation.
That experience was a non-zero factor in my transition into a technical career. Big companies will automate anything they can and being on the automation side rather than the automated seems like the better idea.
But that doesn't work. Until we have true human-level AI (and honestly maybe not even then), decent curation has to be done by humans rather than machines. That's why forums tend to be less 'toxic' than Twitter or Facebook: the moderators can view content, use their personal discretion while figuring out whether something breaks the rules, and then act accordingly. It's why (somewhat surprisingly) the likes of SMW Central and MFGG have better quality control standards than Steam, the App Store or other such services; the moderators physically play through every game submitted and determine whether it's good enough to be accepted. It's why Amazon has an issue with counterfeit products nowadays, and why Google search has spam and blackhat SEO.
And it's not just humans that are needed for good moderation or curation either. It's humans with knowledge/experience in the field in question, and with enough interest to go all in on their work to make the community better. Those 'moderation sweatshops' some social media services use (the ones that check for illegal content and whatnot by getting a bunch of low-wage workers to scan through content en masse) are as flawed an idea as the algorithms; they don't know what's acceptable beyond the very minimum.
But I guess that's the issue with many businesses and services and startups now. Everyone's trying to solve human problems with only code, and realising it (at least for the time being) just often doesn't work that well.
Isn't YouTube's whole schtick that they don't want to do any curation, but instead want to just turn the algorithms loose on the firehose of data that's being uploaded and recommend whatever the algorithm spits out? Pivoting to human curation from there sounds like a really difficult task for them.
It's either that or I won't use Youtube (or rather, my kids won't). So it's definitely in their interest to curate.
They could whitelist a large number of content providers, and just rely on their judgement for this curated platform.
By far the best idea I've ever heard was the creation of a .kids TLD that would be essentially what you described. I believe there was a proposal for that that went nowhere.
Social mores are dramatically different in different places. What if all content were required to meet Saudi Arabia's regressive social mores? Or consider the US's own disconnect when it comes to nudity versus violence?
A moderated community can work, but it has to be much, much more granular, and the effort to achieve that kind of granularity is basically not feasible from an economic standpoint.
Commonsensemedia is a good example, they have a US-centric Christian bent that doesn't agree with my own, but for the most part their information about movies and TV for kids is useful to me when deciding what to show my kids.
So there seems to be a treasure trove of voluntarily produced data on content by various groups on one side, and a desire to automate applying those ratings to my kids in a default whitelist then allow-by-exception model.
If it were technologically possible, I'd even be thrilled to have a default whitelist created by the parents who are my Facebook contacts. We do that manually today. "Hey, have you seen The Incredibles? Is it too scary for my three year old?"
Anything that allowed parents to do that, and that worked in a widespread way, would be adopted immediately and would probably be worth a subscription fee.
Marble run videos, OK. Crazy Russian science guy, mostly OK, as long as it's not too heavy on the pyromania or alcohol. Thoughtful Minecraft videos, OK; annoying kids playing terrible games and freaking out about nothing, no thanks.
YouTube kids was totally worthless. Either you turn off search and only see featured videos which nobody wants to watch, or you enable search and basically everything is there, because it only blocks a few categories.
That was the day we learned about "trolls" on the internet.
I wish this were a better world and that this was unnecessary but in retrospect it was a sweet bonding moment we still laugh about years later.
Kids are resilient as long as you are there to help them make sense of it. And you have to. The internet and its trolls aren't going anywhere. In the end, goatse comes for us all...
Step 2. Install PBS Kids and Duck Duck Moose
Do not let any un-curated automagic near kids. It's not worth the headache!
But that's just a piece of property... just putting on YouTube and letting kids explore the Internet has potential to be far worse. I know the fucked up things I saw on the Internet as a kid, and I know it hasn't gotten any better since then. YouTube is not a safe space and it blows my mind that otherwise responsible adults treat it as such.
Again I'm not a parent so maybe I can't understand it through that lens, but from an outsider's view it's awful. There's just so many hours of misery on YouTube that even adults don't need to see.
I realize this is kind of a third rail at the moment, but hear me out. The fact is that parents got by for thousands of years without having electronic child-sitting devices. The time children waste on glowing screens could be more productively spent learning (remember books?), socializing with their peers, or enjoying healthy exercise. If a child absolutely has to have an hour a day in front of a screen, at least give them something worthwhile like Mr. Rogers reruns to watch. Who knows what long-term effects 3 hours of the same nursery rhyme a day will have on their mental health. At the very least I think extreme caution is warranted; it's easier to let the screens do the baby-sitting, but raising a kid has never been easy.
And then, this time with a little bit of judgement on the parents, I see kids at sports games watching Netflix, or I hear my sister talking about her kids watching YouTube with headphones in the car on a roadtrip, and I think, maybe the kids should be learning the value of the experience they're missing instead of disappearing into a fantasy world before boredom has any chance. Netflix and YouTube might be just a little bit too easy of a solution.
I would say for most it's more like "these parents have to keep their existing work", given how housing costs have increased far faster than lower-class and lower-middle-class wages. Being a stay-at-home parent isn't really a viable choice if you need two incomes to afford a place to live.
> I see kids at sports games watching Netflix, or I hear my sister talking about her kids watching YouTube with headphones in the car on a roadtrip
Would you react the same if it was a book instead?
Yes. It's an experience that is meant to be actually experienced, not ignored for fantasy. Sports is entertainment already, if you're distracting yourself from a distraction you paid to be at, you should have stayed at home.
Likewise on a road trip, there is so much other social activity you can do without putting children in a fantasy bubble. At least with a book, though, you can engage the kid or have them read aloud to you.
Not only that but a little boredom is good for children; it fosters creativity.
But the strangest thing is the reaction from friends who have kids the same age: that we're being prudes or technophobes. I shared a few of these articles with a friend whose 9 year old had a YouTube channel with a few cute videos but spends about 4-6 hours per day watching. They dismissed the author. Screen caps showed videos like the ones referenced in the right-hand recommended list, and they still said it wasn't a big deal.
Really weird reaction from people I know well.
Makes me wonder if I’m getting all Tipper Gore in my old age and this is the new form of heavy metal panic.
In general, I don't think so. There's a pretty clear value proposition in things like heavy metal, D&D, and similar Tipper Gore-esque hysterias. There's really not much value in these things - algorithmically generated random content that is designed to trigger the same response over and over again.
But maybe a little on the 9 year old. I think the biggest fear here is the impact it has on really young children, and kids that are likely to get sucked into this are probably closer to the <5-6 year old range. At 9 the kid should have a much better understanding of reality and how the world works, and should be less likely to get sucked into watching these for any great length of time. They're also less likely to come away from it with their view of reality being impacted - they can see and parse what actually goes on well enough to know this shit is super weird.
I think for older children it's something to be aware of, and if you see it impacting their behavior, then certainly, take steps to handle it. And I imagine that some amount of older kids are going to react negatively to it as well.
YouTube is bidirectional-- your kids can engage with their favorite YouTube stars, with or without your knowledge.
Remember that brands were so insidious in their marketing to children during Saturday morning cartoons in the 70s that the FCC mandated a clearer separation between what was a cartoon and what was a commercial. Cartoons are fine, advertising is fine, but blurring that line went too far, and parents had a hard time saying no once their kids were sold something they thought was a cartoon.
Different situation today, but same outcome. Parents have a hard time saying no to their kids, and then to justify their decision and make it seem okay, they shame you for not making the same choices they did, until eventually every kid is watching characters from My Little Pony kill each other and think it's a cartoon, and then the feds have to step in and regulate it.
My wife and I have been putting off having kids because the peer pressure and judgement we see from parents is just so damn ridiculous.
I suspect they're deep, deep down in a pit of denial because to admit that this is a problem would be an existential threat to them and their way of parenting.
Unless you want to keep your kids in totally isolated bubbles, they live in the world. If the rest of the neighborhood kids or classmates are watching certain things, then you can't keep your kids from exposure and social pressure etc.
Issues are on a continuum from totally private, isolated concerns to broader social ones. Social problems require social solutions.
But yes, impressionable young children shouldn't get unrestricted internet access. And also we can and should criticize anyone profiting off of systems that are causing harm.
And yeah, that means they should probably stop eating processed food with added sugar. Which means they should avoid most of the shelves in the supermarket.
It’s a serious problem which goes far beyond personal responsibility. Our whole industrial food system is designed to deliver as much teeth-rotting food [as a bonus, also causes diabetes and heart disease and obesity] to as many consumers as cheaply as possible, while convincing those consumers that there’s nothing wrong. Not only that, we subsidize it heavily using our collective tax money.
Sure, there are some which are easily identified by the title or an obviously unofficial animation style, but if all of them were like that then I suspect YouTube would have had an easier job training their AI to detect and remove such content.
And it's very easy to blame the parents, since it is their responsibility. I don't let my kids watch youtube at all.
For example, search for the strictly inaccurate phrase “copyright not intended” on YouTube and observe the type of result.
So it's not really worth getting too hung up on copyright law in the context of this discussion, as I could easily just replace "Peppa Pig" with "Kids Nursery Rhymes" and still make the same point.
There may be great content on YouTube, but that content could go anywhere. Is there maybe a market for a YouTube where a human reviews the content that gets posted and guarantees it is family-friendly?
I don't understand why some parents find electronic gizmos so essential for day to day activities.
But thanks for the patronising tone bud. Just because someone admits to playing YouTube to their son once in a while it doesn't mean they don't still spend nearly every waking moment at home interacting with them. Just like how feeding them fries occasionally is completely fine if their usual diet is healthy.
The reason I said go for a walk was that's exactly what I did last time we took the kids to get their haircut - asked them how long it would be, they said 15 minutes, so we went for a walk for 10 minutes
It's ironic that you make that confession after judging me for saying something similar, particularly when I don't tend to give my kids electronic devices on long journeys, instead favouring conversation (like you suggested I should be doing anyway). Which just goes to show that you don't have to understand how other families work; you just have to accept that it does work for them and not pass judgement like you did.
> The reason I said go for a walk was that's exactly what I did last time we took the kids to get their haircut - asked them how long it would be, they said 15 minutes, so we went for a walk for 10 minutes
Honestly, I genuinely find it quite rude when people do that, as you lose all sense of how many people are before you and when you'd be up next. I'd rather teach my kids to wait patiently than have them think it is OK to leave and rejoin a queue whenever they get bored. But as I said, that might be a cultural difference.
This is ridiculous, you can simply go ask how many people are ahead of you if you want to know your place in the queue, which is the proper way to do it.
Sometimes people do disappear when it is extremely busy, but when they come back (typically an hour or two later) they expect to join the back of the queue. Literally the only time I've seen anyone leave a queue and rejoin it where they left off was when they were there with their larger family and one of the kids urgently needed the toilet. And even then I've only seen that happen once in the decades (literally!) I've been frequenting barbers.
You can use all of the patronising terms you like to promote your preferred etiquette but if it's not shared with the rest of community then your method simply isn't the "proper way to do it".
The only difference here is you make the appointment in person.
It's really not a hard concept to grasp, but after reading your other comment about how you love to talk confidently about subjects you know nothing about, I'm starting to question if this is one of those subjects. Perhaps you don't even have kids at all and this is just another one of your attempts at trolling, which you also described enjoying.
In any case, the real crux of the matter isn't what you consider logical but rather what the generally agreed social etiquette is. So you can argue until you're blue in the face but if it's not how barbers currently work then that simply isn't how barbers currently work. Period.
We’ll just have to end it here.
It's relevant because this topic is about parenting and kids waiting in line for stuff like barbers.
> I've also never known a barber that didn’t take an appointment.
As I've repeatedly said, this might be a cultural thing. Most UK barbers in UK towns don't take appointments. Trendy ones in the big cities might well do, but in terms of your typical UK barbers shop you would just turn up and wait in line. So to that effect, I've also never known a barber that does take an appointment.
> We’ll just have to end it here.
For what it's worth, I did try to end this tangent several times already when I compromised with "it might be a cultural thing". Which, weirdly, is when you decided to rejoin the discussion. Go figure.
Sure, you could blame the parents' complacency or whatever, but where do you draw the line? What happens if someone causes a scene that scares my kid when we are at a restaurant? Or walking him to school? You can't predict the actions of every arsehole in communal spaces, so blaming this stuff on the parents is more than a little unfair; society bears some responsibility for its own actions as well.
But as I also said before, if these shock videos were all just obvious bad knock-offs whose nature a quick watch (when the kids aren't around) would reveal, then I suspect YouTube would have found it easier to train their AI. The problem is some of those videos get past parent moderation as well (such as the one I described above) and thus don't become apparent as shock videos until the damage has already been done.
Where it doesn't work is cases like your friend where a single unexpected event can cause a lot of harm. But most of these videos aren't that, they're just mildly "disturbing". If anything, they're a good warning that worse things might be coming.
And I don't believe no one should be blamed. I believe the blame lies heavily with the uploaders of those videos, because they're deliberately releasing content that they know isn't suitable for children in a way that is intended to lure children into watching it. Sure, there will be instances where the parents are also at fault for not overseeing what their kids watch. I'm not in any way trying to shift responsibility away from the parents. However, while I'm one of the biggest advocates for free speech and all that jazz, ultimately if you're creating content that you know is inappropriate for children then you should not be targeting children with said content. Period.
My kids got help to find kid-friendly things but were essentially free to do pretty much as they pleased online. They were given two tongue-in-cheek guidelines: No porn and no learning how to make bombs.
I think the key is to assume they are good kids who need some help finding what interests them, not bad kids who need a parent jail keeper preventing them from doing bad things.
Having said that, I was fortunate to be stationed in Germany when they were really little with access to only one American TV channel. I had lots of videos for them, not to control content but to give them entertainment. Still, content control was largely baked into the situation.
So I am somewhat self conscious that what worked back in the day may not work now and other disclaimers.
Edit: I will also add that when they were little, we had one computer in a public space. Individual computers in private spaces came much later.
I wouldn't. The real world has laws and rules, and people to enforce them. The internet (in all practical situations) is lawless and completely unpoliced.
That's not inherently a bad thing (I like having an open internet). But it's not something I'd ever expose a child to.
I'd easily let my own child wander the neighborhood unsupervised, long before I'll let my child wander the open internet unsupervised.
I too was unsupervised. As a child I used to troll forums, which is like the white collar equivalent of being a troubled youth who joins a gang. But it taught me better reading comprehension, and how to type faster, how to write in different tones of voice, how to imagine other people’s point of view, how to figure out what they care about and what makes them mad, and how to sound credible when I’m clueless.
When I grew out of trolling, these skills continued to serve me throughout my life.
Skills like this only develop when a child’s usage of the internet is at least equal parts consumption and production.
Not saying the cartoons are right or appropriate, but they're not child abuse unless actual children are molested while producing them (which seems impossible, considering it's all computer-generated graphics).
I can tell a child "I'll come take your family and cut them into tiny pieces", and that's clearly child abuse.
Or I can show Peppa Pig raping Elsa, and then bashing her brains out... Nobody would think that's normal to show children. And again, child abuse.
Abuse can be physical _or_ emotional.
EDIT: a few samples here https://www.reddit.com/r/ElsaGate/comments/6o6baf/what_is_el...
Maybe the number that are sick is small and not typical.
Child psychology is incredibly complex, and I am nowhere near an expert, but out of anecdotal experience, children are both terrified and fascinated by their fears. There is an almost obsessive tendency that comes from a minor trauma: for example, a child being startled by a vacuum might then cry every time he sees a vacuum, and ask to see the vacuum.
Note that I don’t think this absolves parents or anyone at YouTube. I think they are both morally obligated to protect children from videos such as this.
Of course it will. That's why there's a market for horror movies, but there's no real market for avowedly dull movies.
This works with many companies.
When children watch videos on YouTube, they keep picking the next video to watch. Children are driven by known characters, and an innate curiosity about their fears, so the next video they select may well be a traumatic one. YouTube's algorithm is then trained to prioritize abusive videos. Creators (yuck) then create more abusive videos to match the algorithm, and thereby get more views.
Technically, the system is working as intended. Given a list of videos, children would prefer to select one that has their favorite characters and deep seated fears. So more videos like that get produced.
Yes, children should not watch YouTube unsupervised, and people should not be allowed to upload videos that traumatize children. However, both of those things are happening, and in the middle of it, YouTube is effectively promoting child abuse.
A better analogy would be "Ford creates program for child abductors to better find vulnerable children."
What's missed in these discussions is the pure oddness of these videos. And I don't mean odd as in quirky and funny. I mean odd as in uncategorized - meaning that there's no conceivable way for some of these videos to emerge logically as a function of society. And the sheer effort put into them and endless variety seems to imply some sort of underground economy or industry behind them.
A lot of children's media is edgy and experimental in ways that adults deem inappropriate, and it's been that way for decades. Remember Ren and Stimpy? There's a lot in that that would make today's parents really uncomfortable. And nowadays, kids' movies (like Shrek) will usually have some inappropriate humor encoded in it to keep the adults happy. But it's all just entertainment and the themes are all familiar. You can kind of imagine how it played out. The writer of a kids' show had a slightly-edgier-than-usual idea for a gag and thought it would fly under the company's radar, and happened to be right that time.
Maybe I'm just naive about the state of video technology or the depravity of ordinary people, but I can't possibly imagine the type of mind that dreams this stuff up or the company that provides the resources and manpower for such a project. I get that they're done to farm clicks from 2-year-olds, but that doesn't explain the content of the videos. Stuff like Elsa becoming pregnant and receiving an abortion via Spiderman injecting her with a giant hypodermic needle, or Minnie Mouse blacking out after her friend spikes her cocktail with pills and waking up chained to a bed? The cruelty and inhumanity depicted in these videos using children's characters go way, way beyond the realm of goofy pranks and immature comedy. They indicate an intimate familiarity with the criminal world.
There is no risk at all in putting your kid in front of a Peppa Pig playlist on Netflix or another legal video platform.
But YouTube can't efficiently curate or filter content like Peppa Pig, because it's not legally uploaded, and the only appropriate action under the law would be removal.
And that's their main problem: they actually need illegal kids' cartoons to be posted, because it's what people who put kids in front of YouTube are seeking. And Alphabet has paying customers asking to place their ads in front of kids...
I understand that at the beginning it makes sense to make uploads free to gain market share, but now they've reached monopoly so that's no longer needed.
Charging something like $1 for uploads would allow them to pay human reviewers, would cut down on mislabeled content (since humans are reviewing) and inappropriate/advertiser-unfriendly content (since creators would be reluctant to pay if they know it'll most likely get rejected, and if they try anyway the human would catch it). $1 is also affordable enough not to hinder legitimate usage. Seems like a win to me.
I started by checking the ownership of the videos on YouTube (through Content ID). All recent videos are claimed by emails on a domain entonegroup.com. The domain itself yields nothing as it's registered with an extra privacy option.
However, the email leads to a lady who works for Entertainment One, which also advertises ownership of the content.
I also checked the earliest Peppa videos, and it seems they were produced by different individuals, which would suggest that this show, at least, was acquired by Entertainment One.
Because as far as I can gather using Wikipedia, Entertainment One is the company behind the _real_ Peppa Pig. If you're saying that there is also reason to believe that they are behind the _fake_ Peppa Pig, that would be much larger news, which (looking at the amount of interest in the story), it isn't.
Edit: Ah, mFixman suggests a plausible explanation:
> Entertainment One owns Peppa Pig. It's likely that Content ID thinks Elsagate is legitimate Peppa videos.
>  https://en.wikipedia.org/wiki/Entertainment_One
That makes more sense :)
But from the home page you're redirected to a URL without the trailing slash that doesn't work when copied and pasted.
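The trailing slash matters more than it looks: relative links resolve differently against the two forms, which is one common way a slash-stripping redirect breaks copied URLs. A quick illustration (hypothetical host and paths):

```python
from urllib.parse import urljoin

# With the trailing slash, a relative link resolves inside the directory.
with_slash = urljoin("http://example.com/videos/", "kids.html")
print(with_slash)  # http://example.com/videos/kids.html

# Without it, the last path segment gets replaced instead.
without_slash = urljoin("http://example.com/videos", "kids.html")
print(without_slash)  # http://example.com/kids.html
```

So if the server redirects `/videos/` to `/videos` but the page's relative links were written assuming the slashed form, every copied-and-pasted deep link can end up pointing at the wrong place.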
Edit: This isn't smugness. Do you think it's also smug to not run untrusted binaries? To not copy-and-paste from Stack Overflow into production? To say "thank you," or "yes, I've got it," when taking a blade from somebody else? To not run with scissors is my opinion, not because of smugness, but because I care about safety.
That said, parents that unleash their kids on YouTube make about as much sense as encouraging them to go play in a dumpster. It's all fun and games until the obscenity surfaces and the feedback loop begins.
When I first got a Kindle, I really loved reading random books for free (Amazon Prime books). But there's now so much noise that I can pretty much guarantee that if a book is available for free, it's worthless. My new rule: if I'm not willing to pay for a book, I'm not going to waste my time reading it.
That is so, so false. There are many wonderful, amazing books on Project Gutenberg and Librivox, and they're all absolutely free.
Sure, they're far outnumbered by garbage, but that's true even of books you have to pay for at other sites.
You have to know how to look. Find reviews and recommendations (those do exist for free, public domain books). Especially with older books, word eventually gets around which are the "classics", and those usually have a pretty good chance of being worth reading, possibly even of being great. With experience you get a sense for which kinds of books you tend to like so can have a good chance of spotting an interesting book just from reading a synopsis or sometimes even just a title.
I subscribe to the BookBub list, which sends daily e-mails about Amazon sales ($0.00 to $1.99) on e-books in your chosen categories.
Quality-wise, the price does not seem to matter.
I remember there was a third-party website of human-curated YouTube videos for kids some 4-5 years ago: Kideos.
It worked really well at the time (for my first kid), but the site seems to be broken now.
That's an extraordinarily naive statement and it's baffling that the author even questions the obvious intent and, worse, reaches the conclusion that intent isn't important here. He then goes on to say the problem is with the platforms, algorithms, etc. While these have their issues, I think he overweights them by a good bit, and they are secondary to this particular subject. That is, if the algorithms weren't being abused, then they would be just fine for their intended purpose. The problem here is that the content they are working on is problematic to say the least.
Specifically, someone is clearly and deliberately targeting children with disturbing videos that promote extreme violence, associated with themes and characters the children believe to be benevolent and loving. He then completely acknowledges this in the following nugget:
>Previously happy and well-adjusted children became frightened of the dark, prone to fits of crying, or displayed violent behaviour and talked about self-harm – all classic symptoms of abuse.
So, the harm is specific and clear. How can he then dismiss this as possible happenstance of secondary importance? We also live in a culture where people are being radicalized online and children are now being murdered in schools en masse, frequently by other (slightly older) children. Does the author not see a thread here?
Now, broaden the context further to consider the armies of troll-bots, etc. on our platforms that we know are being weaponized against us to incite anger, division, etc. There can be no other conclusion except that we are under a sustained attack to divide and destroy our society, and now children are being included on the target list.
This is not some randomly generated content to game platform algorithms for ad revenue that just happens to be disturbing. It is specifically effective in damaging children and is rooted in a very particular line of psychological attack.
Much of it illustrates how surprisingly difficult it is to get algorithms to replace human judgement, and how easy it is to think you've done it when you haven't. We look at instances of humans making poor judgements and tend to think, "an algorithm could improve on that." It turns out that human judgement does better than you'd expect, once you compare it to the task of automating that judgement in an algorithm. Self-driving cars, I am looking at you.
This is also a natural result of parents using video as a babysitter for kids too young to really have any judgment of their own. So we have parents who won't exercise judgment, kids who can't, and YouTube, which is expected to do so.
This goes naturally with the recent item about large corporations discriminating against pregnant women. If society isn't offering parents any extra time to take care of their kids, parents fall back on automated methods. We're finding downsides to such automation, but that doesn't stop it, given that society isn't offering any time for alternatives.
I'm all for keeping kids away from mature content, and I think parents should monitor their children online... but I thought kids were more resilient than this. I don't have kids, but it's hard for me to imagine them displaying violence and self-harm because of some disturbing video.
The incentives are there - disrupt one of the West's biggest platforms, cause social upheaval - and with just a bit of funding an effort like this could go a long way.
Is this option not being considered and investigated?
There is good evidence that a lot of this content is produced in China. My question is whether this is state-endorsed.
As a child, my parents vetted anything I watched that was not a broadcast children's TV show. Adult themes and violence were filtered until an age that was considered safe.
This was also a time when the TV schedule could be known in advance. There was no on-demand anything - pre-VCR. I can't imagine handing a child the loaded gun of the internet and letting them point it at their developing heads.
Therefore it is absolutely akin to your parents letting you watch broadcast children's TV shows; you're just trusting Google employees instead of TV network executives, and giving children more control over the order in which they watch content.
Ignoring that pressure sometimes means alienation on most social occasions, as your friends and siblings are only interested in "parent talk" that you cannot engage in.
Asking doesn't appear, per se, to be the explanation.
Are you saying that Youtube has spent their own resources to make it easier for BigCo, beyond that legal obligation?
People can exist without companies. Companies cannot exist without people. Companies should serve people, not the other way around.
Something has gone very wrong that it works this way. Given that people created these companies, people run these companies, people created the laws and court system used to justify this shit and on and on... Somewhere, we lost our way and forgot that all of these things should serve the common good for humanity. These systems became bastardized and we sit around lamenting that this is just how it is and we are too powerless to stop it.
We have become the slaves of things we invented to serve us and we loudly proclaim that this is the new natural order and there is nothing to be done about it.
> this is a fundamental problem in the world
Although I agree with your sentiment regarding the corruption of systems, I don't believe merely calling out the problem, especially reducing it to the fundamentals, is helpful.
> Somewhere, we lost our way and forgot that all of these things should serve the common good for humanity.
This implies that (at least modern, Western) society evolved from some kind of kinder, gentler, common-benefit civilization in the past. That's inconsistent with my understanding of history, wherein people behaved toward each other in many ways that are reprehensible.
> we sit around lamenting that this is just how it is and we are too powerless to stop it.
> we loudly proclaim that this is the new natural order and there is nothing to be done about it.
These are either straw-men or self-deprecation (since you were the one who used the shruggie originally).
If you want to discuss how best to change the situation, then I would suggest at least proposing what can be done about it. If you merely want to complain, then I would suggest a different forum.
Fair enough, perhaps "expansion" would have been a better word than "departure".
> But your stance seems fundamentally dismissive, so I think I'm done here.
To my mind, I'm trying to be responsive and to invite thought-provoking conversation. In this case, I hoped I could get you to share what you think could be done to improve the broken system, but I failed.
I'm a woman in an overwhelmingly male forum. It's a big problem for me to try to find a reply in the face of that.
You may not have any idea. And it doesn't matter. It could have been someone LGBTQ or a person of color or any number of other reasons for feeling they don't really belong and aren't really welcome. Most people won't tell you their reason for feeling fundamentally unwelcome, but a fairly high percentage of people will feel that way.
I'm not at my best today. Maybe under other circumstances I would have been more able to rise to the occasion. So it's also not intended to simply blame you either. I'm muddling through as best I can, as usual.
I realize that part of what I said could be construed that way, and I apologize.
I do try very hard not to be ad hominem, and to avoid talking about the commenter in favor of talking about the comment itself.
So, to be clear, I was making no statement about you not belonging here. Rather, I was saying (and only after actually asking for the kind of content/information I was hoping for) that this isn't the best forum for illustrating a problem without an accompanying suggested solution.
I do still hope to read your thoughts on making the system less corrupt.
The US was founded by a bunch of small business owners, basically. Social dynamics fundamentally work differently in smaller groups than in very large ones. That doesn't mean nothing bad ever happens in small groups. It means that part of the problem is that we have 7 billion people and a bunch of huge corporations.
I had a corporate job for a time. I've done freelance work in recent years. I did that while homeless and eventually got off the street.
I'm trying to establish a pilot program for helping others also learn to make money online while homeless. I'm also doing little websites for local artists in a low cost rural area with a bunch of small towns. I'm in the biggest city locally and it's under 20k people.
So I'm personally trying to foster microenterprise as part of the solution to what has gone wrong. We need more small time operators.
Another thing I do is participate as openly female on HN. I think that has more impact than is ever likely to be acknowledged by anyone.
I tried once to do an AMA about it on Reddit. I was politely told to go pick myself, that didn't measure up to their standards of significance for AMAs and directed me to a second rate AMA forum.
And yet in the past year, a lot of things I have been saying on HN for nearly 9 years are suddenly being parroted in various "movements" or whatever.
I don't expect to ever get any credit whatsoever for any of that. I'm old and tired and grumpy and impatient with people being dismissive of me and yadda. But I'm quite confident I'm making a difference.
There's other stuff I do. I wouldn't know how to explain it all and I'm not inclined to try in part because your earlier comment was incredibly dismissive and not just of me personally, and in part because I routinely get dismissed by people saying "No, you didn't actually do X. You are deluded."
It's probably dumb of me to say this much. But I'm human and part of the secret of my success is my willingness to flail about and look the fool in public.
Hopefully we are deep enough in the weeds at this point that there won't be too many eyes on it anyway.
I agree, and I've always preferred smaller companies to larger ones. Unfortunately, I didn't always know that about myself. There's likely more I don't know that I "should".
In tech, and especially on here, there's a strong bias for the VC-funded, growth-at-all-costs business model. That, of course, quickly leads to large organizations.
Fortunately, there are people who post here that support small businesses that stay small and share their experiences.
> And yet in the past year, a lot of things I have been saying on HN for nearly 9 years are suddenly being parroted in various "movements" or whatever.
I'd consider that as an encouraging sign, if not success. Someone is getting the message (if not directly from you)!
> There's other stuff I do. I wouldn't know how to explain it all
You shouldn't even have to. Just doing it is enough.
What I would hope you do on here is suggest what you believe other people need to do to help and, more importantly, the mechanism by which it will help.
> I routinely get dismissed by people saying "No, you didn't actually do X. You are deluded."
If someone posts an ad-hominem attack like that, I do urge you to flag it. Even a "shallow" dismissal, with no explanation or reasoning, like even just the first sentence of the above quote could be worthy of a flag. The moderators here are pretty good about keeping a lid on that sort of thing.
On the other hand, I also urge you not to take personally a disagreement, attack, or even a dismissal (with reasoning behind it) of your statement.
Not to mention there's also an infinite number of benign videos that could be created, and you need the automated system to correctly distinguish among all three categories most of the time; otherwise one group or another is going to get mad at you.
Why must YouTube Kids be app-only?
> James Bridle’s essay on disturbing YouTube content aimed at children went viral last year.
The first paragraph includes a link to an article from that time.
The point of the article is “has Google done anything about it since last year?”.
It’s a huge problem that Google is just blaming on “algorithms”. Meanwhile, little 2-year-old Johnny gets to see Peppa Pig killed in a variety of ways.
It’s not only the overt violence either. There’s some really weird sort of cult programming stuff that slips past the filter because they put it at the very end of an otherwise normal clip.
But yeah, just delete it. Plenty of quality apps for kids.
In all seriousness though, my daughters moved into the same room for about a month in response to the 'Killer Clown' (pennywise etc.) stories that were doing the rounds in their elementary school.
And just to emphasize how scary stories and TV can be, I'm only now getting over my deep unease whenever I think of the scarier parts of City of Death, and Sapphire and Steel.
Likewise BDSM. Some friends of mine invited me into forums about it for dating, and I found I could not distinguish what I saw from actual abuse — I instead need to rely on the fact that I knew some of the people and they told me they had enjoyed what was done to them.
It’s like The Dress, I suppose, only disturbing/fine instead of blue and black or white and gold.
Google clearly did something about the issue of how to create a filtered experience for children, by creating YouTube Kids. Most of the article is just pointing to "weirdness" in videos on YouTube proper, which should surprise no one.
The investigations that have been carried out so far seem to point to individual small film studios who produce the content because it seems to be an acceptable business model for them, and of course YT's own recommendation algorithms for being unable to distinguish between kid-okay and not-kid-okay content.
Which, if you had read the article, or the one that preceded it, or even used the app at all, you would know is absolute garbage and makes the problem worse. Google is not using human curators for YouTube Kids; they are trying to use automation, which is why all these things end up in an app for kids.