TikTok: All under-16s' accounts made private (bbc.com)
177 points by alexrustic 3 days ago | hide | past | favorite | 190 comments

Lots of comments here hating on TikTok, so I'll offer another perspective:

TikTok is instrumental for offering children and teens an "in-group" and finding their interests. If you're LGBTQ, you find really wholesome content. If you're depressed, you can find people being open about their depression. If you self-harm, you can find other people discussing and opening up about it. If you're in a toxic household, there's basically a whole support network for you. It's an exceptionally human way to interact with people online, and it's so much fun.

IMHO, TikTok has been the most wholesome social network I've ever used and participated in. Even as an adult (24yo), it's helped me explore my own identity and how I relate to the communities I'm a part of. Legitimately, TikTok has helped me feel comfortable opening up and being more "myself" in public.

Yeah, I don't buy it. I've made a few attempts to get into TikTok, because my wife and friends are into it, and I have not been able to find any positive or quality content. It's mostly low quality memes and rude or otherwise vulgar attention seeking. I understand the appeal, given my experience with other social networks, but I definitely don't buy the narrative that it's in any way "good".

I agree with you - it rewards attention seeking behavior and short attention spans.

That said, I understand what you mean by good quality - in that it isn't vitriolic political messaging, conspiracy theories, etc. It's mostly devoid of any true meaning - more like entertaining, sometimes hilarious content. I wouldn't say it's enlightening or provides much educational value. As I said earlier, it also probably reinforces short attention spans and attention-seeking dopamine feedback loops... likely long-term damage to brain development.

Just my opinion, no facts here.

It also creates this expectation that your life is a performance for other people or that you produce content for other people to digest to get your dopamine rewards. There's going to be a strange generation coming down the pipeline.

Isn’t all social media like this though?

yes to an extent, but there is a spectrum. I would count most forums under the umbrella of "social media", but for the most part I don't find them to have that performative aspect. platforms where your account is more strongly tied to your IRL identity and photos/videos are the main thing shared tend to be a lot more performative.

TikTok has exceedingly powerful filter-bubble effects. As a result, the nearly universally derided "straight TikTok" (which is where you start, and which is dominated by conventional social media influencers) is fundamentally different from other parts of the app.

My fyp is a combination of good-natured dancing, linguistics, crowd-sourced music, Hank Green, a few comedy creators who engage positively with mental health and feminist topics, and animals doing silly things - recently a lot of possums. That's interspersed with memes and jokes, but very few that are particularly rude.

I agree. My fyp also has Dad Green, older LGBT creators, young PhDs & postdocs explaining science questions, lots & lots of dogs, linguists, cooking, trees, lakes, geology, etc. Plenty of light-hearted dancing, but everybody seems to be pretty chill and glad to engage with their audience. It's pretty easy to get responses from creators, as compared to YouTube creators.

You have to scroll through for maybe ten minutes and interact with posts that you actually like. Their system very quickly figures out what to show you. The general population of tiktok videos is pretty bad.

I suppose that's true of all social media. I think the default videos on TT are pretty much like the trending section of YouTube.

I know I risk sounding offensive, but please accept that this is just an honest inquiry. Have you considered that you are causing that sort of content to hit your "for you page" by how you're engaging with it? I ask because if someone asked me for a description of what's mostly on TikTok based on what is in my feed, it would be woodworking, cooking, Arduino projects and 3D printing stuff, with a bit of stand-up and DnD. I didn't set any preferences to make it so; it just ended up that way through what I liked and the half a dozen people I follow.

I am in no way under the illusion that this is what most see. But I have to question whether people who complain about Charli D'Amelio and similar people being shown too much just aren't spending a lot of time watching the videos, commenting on them and liking them. If you scroll past, they stop popping up.

Overall, the content creators I see seem extremely good for these smaller niche communities.

As a plus, the most unbiased source of content for the BLM protests I found was TikTok, because it was just livestreams and videos from the protests, with little possibility to editorialize. Instead of anchors screaming "violent!" or sternly saying "peaceful!", you got streamers on the spot showing exactly how violent or peaceful certain situations were.

Yes. Use the search, find a few videos you like on topic X. Interact with them (like them, comment on them, follow the creators). That is most of the interaction needed to influence your feed.

Then, as you encounter stuff you like, click hashtags that look interesting, explore sounds to find similar videos, and check out a creator's other videos.

At the end of the day, it is an entertainment business; that pretty much sums it up. Hollywood isn't better when it comes to attention-seeking behavior.

Thought of this wholesomeness aspect today as well:

I watch a lot of strength training/weight lifting videos on TikTok, and I would say the vast majority of the comments are supportive, positive and/or constructive.

I appreciate that my comment is apples vs. oranges (because times are very different) - but I can still remember using MySpace, Facebook and Twitter for the first time. They all started pretty wholesome and pretty great before 'the real world' kind of seeped in.

Early Twitter (in particular) was a game changer for me, working from home for the second time in my life. I had people to talk to and bounce things off, and all of a sudden work wasn't so lonely (as it had been the first time I tried it). There was a genuine community feel without all the snark and unpleasantness.

I don't use TikTok, but I hope they'll be prescient enough to see where the others have struggled and keep on top of their content policies, communities and moderation.

I think one of the problems is when networks start mixing topics and people.

If it focuses on people, you can meet with real friends, and that's fine, because you get along.

If it focuses on topics, that is also fine: you meet people you don't know, but because you talk about a subject you all enjoy, you get along.

Problem starts when the two mix. If you enjoy astronomy for example, the other guy may be an asshole, but you probably don't even know and you get along fine, because all you do is talk astronomy. But if the network thinks that because you like the same thing, he must also be your friend, then your feed starts filling with assholish things.

I have seen this play out on Instagram a few years ago.

Stage 1: The person is relatable, the videos are simple and to the point.

Stage 2: The following has grown, and the person now posts more frequently; however, the quality/humor/personal touches are reduced. Less content about fitness, more about personal lifestyle.

Stage 3: Monetization. Workout PDFs, constant advertising of fitness products (protein shakes, weight belts, etc), lots of hollow 'motivation' posts. The audience keeps growing, but the community has disappeared.

Isn't the problem precisely what you are saying? If you fall into a recommendation hole, and they don't get moderation right, it might be the wrong kind of hole. For an impressionable audience, that might not be a good thing.

TikTok's moderation team is pretty on point. Far better than any social network I've ever used before.

I don't think the commenters who say they only get weird or bad content actually know how TikTok works, or how to use it. You can't just scroll your FYP and expect to get new content, you have to go out of your way to find it. FYP is just a feedback loop of videos you keep watching.

If you want to start getting more queer content, for example, start searching the #lgbt or similar tags and watching/liking/commenting. Your FYP changes really quickly.

> TikTok's moderation team is pretty on point.

Not the experience for my circle of friends.

> for example, start searching the #lgbt or similar tags and watching/liking/commenting.

What if some hate video for the community gets posted under the same/similar tag? I am not sure how it is now, but mid last year a friend saw something very offensive, with a lot of likes, on a tag related to his nationality.

It will get reported and deleted _very_ quickly.

I don't think it's very fair to claim people who don't think the content is any good 'just don't understand it'.

Plus, the conversation is about under-16s; I highly doubt what they're seeing is the same as what you see, so what your feed is like isn't really relevant.

What you described isn't necessarily positive. If you are depressed or self-harm, maybe continuing to watch those videos is bad for you.

This isn't unlike the question of moderation regarding Parler, for example. We need to be able to talk about all views, ideas and opinions, whether political or about mental health. Well, maybe sometimes it's both.

But you can’t bar talking about it because sometimes it’s bad. It’s tough.

Not saying you’re wrong at all. Just suggesting there’s a lot of nuance here and it’s difficult to manage. Your point is very correct and important.

The main point here, I guess, is that children can't necessarily judge that.

I don't think it's particularly helpful: based on what I've seen myself, a lot of these 'support networks' actually veer toward glorifying the issues.

You shouldn't base your social group on an echo chamber.

These are all great points.

I think glorification occurs because it's a much easier way to, at least temporarily, alleviate the struggle some issues can cause. It's an understandable response to pain, especially in young people confronted with complex internal problems.

Also, your point about children not being able to judge these things for themselves immediately made me think... It seems a lot of adults can't be relied on to do it either.

Totally agree on your final point! Though that's a discussion for another time...

The glorification can provide short term relief but creates countless more issues than it solves.

He specifically said there is a support network for him through TikTok. How can you tell him that continuing to watch those videos is bad for him? That is dismissive at best.

Just because he thinks it's doing good doesn't mean it's an effective way of coping, it could actually be making the problem even worse.

These issues should be dealt with using advice from professionals in combination with a support network.

That's such a weird way of framing it. We make decisions about whom we befriend and whom we engage with without needing someone else to decide for us. And, even more importantly: who would decide?

> If you are depressed or self-harm, maybe continuing to watch those videos is bad for you.

This may sound counter-intuitive, but please don't judge that sort of thing until you've gone through it.

There are many online communities that seem dangerous or destructive to people who haven't gone through the issues, but are actually incredibly helpful to those who have.

There really are so many trash self-help internet communities in the wild. These algorithms are amazing at grouping people who share an interest in re-experiencing their buried childhood insecurities, and at elevating the most active commiserators to build the community.

The communities that actually help each other gain the sense of security required to pause, self-reflect and unpack their issues don't seem to get very popular, because the click-optimizing algorithms don't give them a chance.

I feel like all the hate is aimed at the stereotype of TikTok, formed before people actually use it. If your view of TikTok is through the lens of media or the occasional repost, you'll think it's mostly teens dancing to catchy music - of which, to be fair, there is a ton on the platform.

But there's so much more content under the surface that doesn't get talked about. Travel TikTokers give me ideas of where to go in 2021, I stole an oatmeal recipe from a bodybuilder on there that I still make every morning, and I picked up some Lightroom ideas from a number of photography TikTokers. The rabbit holes are endless.

TikTok, I think, is in its "pre-mainstream" phase, where you get the best of user content before it's been muddied by promo deals, partnerships and big media. Sure, big media is there, but most of the content I see reminds me of YouTube in that magical time after it went mainstream with users but before the big brands and corporations arrived.

How is that different from any other social network, though? Even when I was a teen, there was no shortage of message boards and forums having these kinds of discussions. Sure, the video aspect wasn't there and, in general, people stayed anonymous, but the sense of support was still there.

Pockets of supportive communities feel so rare online that when you hit one, especially at a young age, it feels incredibly refreshing. Most of my early internet experiences were hard lessons, but once in a while I ran into a group of nice people who taught me a lot and were supportive when I was getting started. It shaped the way I think about online communities.

Personally, I've enjoyed TikTok. I find the videos people from different cultures around the world make about their own cultures very informative. And I'm amazed at the level of comedy available on the platform.

TikTok is good for that, although the FYP (For You Page) algorithm seems a bit opaque.

This has led to a load of interesting communities springing up like the Sea Shanties one and quite a few musical communities doing collaborations.

Mine is mostly jokes on tech or politics. Lots of fun. Absolutely enjoy it.

Edit nevermind



You should be ashamed of yourself for writing that.

there is nothing wrong with being LGBTQ+. To insinuate "corruption of Western children by China" is extremely bigoted in several dimensions. I'm disappointed to see this sort of thinking on this site of all places.

What is wrong with thinking that children should not be sexualized?

nothing, of course children shouldn't be sexualized - however, that is not what they were discussing. They were reinforcing the harmful & factitious association between being LGBTQ+ and "corruption" or whatever.

> of course children shouldn't be sexualized - however that is not what they were discussing

That should be part of the discussion, though. It's difficult to separate Queer philosophy from predatory pedophilia. It's part of the system of thought, back to the beginning.

Age becomes relative, and parenting is seen as an oppressive structure that is imposed on children.

what? what parts of queer theory deal with normalizing pedophilia?

any sane individual studying queer theory is still going to understand the relationship between consent & age, and how pedophilia is a mental illness and not a sexual orientation specifically because it violates the ability of younger individuals to give consent

Even if they don't believe in abolishing the age of consent laws like Michel Foucault did, they still believe that children actually can consent. Pat Califia makes this argument explicitly.

They believe that "normal" sexual ethic is imposed on children by oppressive power structures (enforced by parents). So they believe that they're liberating children by exposing them to queerness. They don't see it as peer pressure or coercion because they see sex as something similar to choosing between ice cream or cookies, which a child is capable of.

Gayle Rubin also defended NAMBLA quite clearly.

Many examples of this stuff.

>It's difficult to separate Queer philosophy from predatory pedophilia.

It really isn't.

I installed TikTok to see what it was all about. I used it every day for about 6 months. I saw loads of misinformation about the election, there were gruesome videos of people feeding live animals to their pet snakes and other animals. And then just loads of sexualized content. I wouldn't allow my kids to use this app in a million years. Also, I tried reporting a bunch of these videos, but they were never taken down.

It tells more about what kind of content you are liking and watching than anything else.

Mine is filled with skaters, mom-daughter dumb jokes, and silly lesbian/transgender shenanigans.

There are sometimes weird things, like a priest telling stupid jokes alone in his church, or that Korean guy doing ASMR of burning bread in the oven.

On Facebook I saw the deaths of participants in the events of the 6th, but that was in a journalist/news/media-analysis group.

I see a lot of good-vibes content on TikTok, but that's what I am looking for on the platform. No doubt I could soon end up on Indian train deaths if I wanted - just like that time on YouTube when I saw that out of nowhere, though I was looking for something totally unrelated to trains or India.

People say things like this in an attempt at shaming the original poster but one thing not considered is the location of the person using TikTok.

I live in a big city and most of the feed is highly sexualised, highly political, or highly prank-related.

My friend who lives in a small town about 100 miles away essentially has to send me fun TikTok videos, and no matter how many of those videos I like, the base diet of sexual, political, or prank-related videos is always there.

>> one thing not considered is the location of the person using TikTok.

I run into this daily. I'm in the military, so my internet connection at work is mingled with a thousand other military people. OK, it is accurate to say we are predominantly male and younger, but every time I see an ad, does it have to be an anime character? Some of us recently gathered to watch the DC riot footage on YouTube. The footage started with a non-skippable ad for some pervy anime game. Try holding a straight face in front of the general as a busty cartoon character in a negligee is bouncing up and down on a ten-foot screen.

I don't know, there's something reaaaaally dystopian and weird and fun but concerning about the chain of events that led to that situation.

I personally use uBlock to correct my lens on the world in real time.

> Try holding a straight face in front of the general as a busty cartoon character in a negligee is bouncing up and down on a ten-foot screen.

Classic! I love real-world tidbits like this. I would suggest in the future using youtube-dl (https://ytdl-org.github.io/youtube-dl/index.html) to download the video to your machine and avoid situations like this.

There are all sort of ways to do this, but they are difficult in the military context. We gathered in one room because youtube/facebook is blocked on our normal office machines. This was an unrestricted podium computer used for totally unclassified stuff like streaming live news feeds. Like most corporate networks, we aren't allowed to just install software, even simple browser extensions have to go through an approval process.

Can you talk a bit about the ambiance and the reactions in the room as the events unfolded? If you can, that is.

There was no reaction. This is normal. The unfiltered internet is full of inappropriately sexual ads.

I mean.. reactions to the events that took place at the Capitol.

I love how even the DoD won't pay for YouTube Premium.

> I live in a big city and most of the feed is highly sexualised, highly political, or highly prank-related.

More anecdata: I also live in a big city - London - and I don't see this. I joined TikTok just a few weeks ago and I was surprised at how overwhelmingly positive and fun it all is.

Not a TikTok user, but I know Instagram also fills your explore feed with content that your friends like. So even if you're tailoring your likes to appropriate posts, your friends may not be.

That may just be the initial period after a user has signed up and hasn't engaged with enough content for IG to show anything under "Discover". They may use friends' preferences as a placeholder.

I know for a fact that what's on my Discover feed (surrealist art, UX/UI stuff, Newfie dog videos) is unique to me.

> People say things like this in an attempt at shaming the original poster but one thing not considered is the location of the person using TikTok.

That's a good point. I was wondering about the initial seed of the feed.

It wasn't my intention to shame though.

Until tiktok has something like channels or theme/tag subscription it will remain a problem.

I don't think likes are actually a big signal; it's mostly based on watch time, in my experience.

Note that you can also hold on the video then click ‘not interested’.

I hadn't found this piece of UI yet, will try and do that too, thanks!

The content your parent comment refers to was (partly) what I saw when I installed it and scrolled through the default suggestions before it had the chance to learn anything about me. Sure there was a lot of harmless and just silly stuff, but the amount of animal abuse like https://twitter.com/noellesdesigns/status/122443946827220992... that scrolled by made me stop using the app.

The OP did say they used it daily for six months, though; that really ought to be enough interaction for the app to personalise to the user's choices.

I don't think it tells you anything about what the commenter likes. When I created a TikTok account it showed me nothing that interested me. I marked videos as "not interesting" for half an hour before giving up.

I got a few animal abuse videos in an hour or so of swiping. It shows you these random videos in case you are interested. A few of those were enough to make me uninstall. I tried again some months later, and after a while the recommendations got a little better, but there is still the occasional piece of awful content thrown in. In all cases I swiped past as quickly as possible.

Would you trust the algorithms to tailor most content appropriately for a kid?

I think OP's point is that there is plenty of questionable media on TikTok.

Your point about what they like is irrelevant to that point.

There is just as much questionable media on YouTube. The difference is TikTok's algorithm works a lot harder to show you more of the content it thinks you will spend more time viewing, even if you are not actively searching for it.

So if OP is seeing "loads" of sexualized content, it's because they spent enough time watching sexualized content before swiping for TikTok to think that's what they want to see, and ensure they see more of it.

> So if OP is seeing "loads" of sexualized content, it's because they spent enough time watching sexualized content before swiping for TikTok to think that's what they want to see, and ensure they see more of it.

Even if the picture you are painting of the previous poster is accurate, being horny once in a while is by no means justification for an endless stream of sexualized content.

That's not a neutral quality of the system, but a deliberate feedback loop establishing addictive behavior. We already know the shortcomings of the human condition and should ask for fewer footguns, not more.

I've heard people describe a situation in ML (it could be hill climbing) where an optimising algorithm gets stuck in a local optimum. At that point it will try various things to get unstuck, often involving a good dose of randomness. What's to say this didn't happen to op or that you really won't start seeing recommendations for Indian train death videos?
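That local-optimum story can be sketched in a few lines of Python (a toy illustration with a made-up objective function, not anything a real recommender runs):

```python
import math
import random

def hill_climb(f, x0, steps=300, step_size=0.5, jump_prob=0.05):
    """Greedy hill climbing; with probability jump_prob, take a random
    long-range jump instead of a small local step (the 'dose of randomness')."""
    best_x, best_val = x0, f(x0)
    for _ in range(steps):
        if random.random() < jump_prob:
            cand = random.uniform(-10, 10)            # random restart-style jump
        else:
            cand = best_x + random.uniform(-step_size, step_size)
        if f(cand) > best_val:                        # only ever move uphill
            best_x, best_val = cand, f(cand)
    return best_x, best_val

# A bumpy objective: small local peak near x=-2, global peak near x=3.
f = lambda x: math.exp(-(x - 3) ** 2) + 0.5 * math.exp(-(x + 2) ** 2)

random.seed(0)
stuck_x, stuck_val = hill_climb(f, x0=-2.0, jump_prob=0.0)   # no randomness
jump_x, jump_val = hill_climb(f, x0=-2.0, jump_prob=0.05)    # occasional jumps
```

With `jump_prob=0` the walker never leaves the small peak near -2 (value ≈ 0.5), because every path toward the global peak goes downhill first; with occasional jumps it has a chance of landing in the basin of the peak near 3 and climbing from there.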

> What's to say this didn't happen to op or that you really won't start seeing recommendations for Indian train death videos?

We can never be sure of anything. What if HN started pushing train death videos on the front page and the admins couldn't tweak the algo because a glitch in the stack prevented them from accessing the backend?

Mine is full of fitness instructors showing off their fitness challenges, dancers showing off their synchronized dance routines, and oddball humour. I had a moment where some BLM stuff trended, but on the whole it seemed to show both sides (albeit somewhat extreme versions of both).

If the content is harmful it's harmful...

> It tells more about what kind of content you are liking and watching than anything else.

This sounds very much like victim-blaming, I don't think it's appropriate to side with TikTok on this.

It's not victim blaming. It's widely known that the "magic" of TikTok is that it carefully scans your interactions with content as you consume it, and seamlessly adapts your stream to what it believes you'd like to see. Algorithmic suggestions taken up to 11, and generally as close to mind reading as we've come as a species.

It seamlessly adapts your stream to what it believes will increase “engagement“. That's a different thing to “what it believes you'd like to see”.

TikTok has seemed to present me with shock/outrage content in the name of engagement less than any other social media - far less.

I almost always see content that is only fun.

That has made it much better for long-term engagement and positive for my mental health, rather than other social media's short-term outrage engagement that is wholly negative.

That means the TikTok algorithm designers are half-way competent.

I intended this in contrast to your point above.

It could mean the algorithm designers have decided that optimising for fun is more effective at increasing engagement than optimising for shock value and conflict.

But, this is a different choice than other social media designers have made. It seems like there is some more wholesome motivation.

I don't think the other social media designers deliberately optimised for shock value and conflict. I think that was a side-effect of a naïve optimising algorithm: short term (i.e. until people just quit the social media platform entirely), conflict and doomscrolling are two of the most engaging behaviours.

Have you ever used TikTok?

I think there's a problem with this though: people aren't always in the mood for the same stuff. The easiest distinction here is 'inappropriate' content from the rest. Sometimes the user is interested in that content, but not every time they open the app. It might even be that most of the time they don't want to see that content.

I experience this with politics videos on YouTube. Sometimes I'm in the mood for that. One evening I'll watch a few politics videos. The next day, my recommendations are filled with politics. Over time my recommendations get back to normal. Then I'm struck with the politics bug again, but my recommendations don't give me an easy way to find more politics content. They're recommending the type of stuff I'd normally watch. The day after that I'm annoyed again about politics being all over my recommended lists.

It feels like I should have separate profiles based on the topics I'm interested in. The platforms don't seem to really support that, though. Even different users seem to influence each other's suggestions.

This may be true for people with short attention spans. Personally, sure, I may click on a clickbait article or watch an attractive lady, but it doesn't mean that I want those suggestions shown at all.

It's like putting a lot of chocolate with sugar in my apartment: the wise choice is to not even buy anything that contains sugar, as I can't stop myself from eating it if it's in front of me all the time.

Maybe reptilian mind reading. While we might intuitively be very attracted to sexual and gore content, it doesn't mean that's what we want.


What about when you see the content on first install?

Then it's a problem (for the sake of the argument I consider "the content" to be either illegal or borderline psycho).

And for the record I consider a lot of tiktok pranks to be borderline psycho (eg gaslighting your SO for giggles about divorce, abortion, etc.).

I use the app sometimes. Going through my feed, most videos are DnD or fantasy related, there is no sexualized content and haven't seen any misinformation about any elections. The app tailors the feed to your habits and inputs.

Weird, I guess it really does matter where you are geographically, as well as your "likes" bubble.

I did a similar experiment myself and found it to be a very delightful stream of content. While it did have some sexualised content for sure, it wasn't too much in frequency and degree, nothing more than you would see on a normal tv drama anyway.

And it was just so much fun - mostly teens doing amateur choreography, skits and pranks. I uninstalled it more because it was so effective at capturing my attention than anything else.

I still see teenagers going about trying to mimic one routine or another, and it always makes me smile as I can see them being creative rather than just consumers. Doing a high quality YouTube video would be too hard for them, but a quick 10 seconds video - they feel they can do it and try to participate, which is great in my book.

The app literally makes an algorithmic feed of content that you personally "like". I haven't seen anything like this at all. Mostly dancing and singing people, and comedic sketches.

No, it starts out by showing you popular, random or lightly profiled content ("you appear to be a male in his 40s, so let's start here"), and then just maximizes the content types that increase your screen time.

This has nothing to do with your preferences/what you like. It doesn't care if you feel disgust, as long as the app stays open.

Suggestion/auto feeds are designed to abuse human psychology to maximize time wasted, not interest.

If you like something, watch it all the way through, or comment on it, TikTok tries to show you more things like that. If you skip past content, or mark it as not interested, it tries to show you fewer things like that.

So if you are disgusted with something: Stop Watching It!

Yes this does optimise for time spent watching, but interest definitely plays into it.
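That watch-time feedback loop can be sketched as a toy recommender (invented category names and multipliers - nothing here reflects TikTok's actual system):

```python
import random

class ToyFeed:
    """Toy dwell-time recommender: the only signal is how long you watched.

    Note it cannot distinguish fascination from horrified staring -- anything
    that holds attention gets boosted, anything skipped gets suppressed.
    """
    def __init__(self, categories):
        self.weights = {c: 1.0 for c in categories}   # start uniform

    def next_video(self):
        cats = list(self.weights)
        return random.choices(cats, weights=[self.weights[c] for c in cats])[0]

    def record(self, category, seconds_watched, skipped=False):
        if skipped:
            self.weights[category] *= 0.8                       # show fewer
        else:
            self.weights[category] *= 1 + seconds_watched / 30  # show more

random.seed(1)
feed = ToyFeed(["dance", "cooking", "shock"])
# A user who lingers on cooking, half-watches dance, skips shock quickly:
for _ in range(50):
    cat = feed.next_video()
    if cat == "shock":
        feed.record(cat, 1, skipped=True)
    elif cat == "cooking":
        feed.record(cat, 25)
    else:
        feed.record(cat, 8)
```

Skipped categories only ever lose weight here, so the things you linger on steadily crowd out the things you flick past - the "scroll past and they stop popping up" behaviour several commenters describe, in miniature.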

People don't read news because they enjoy hearing about crime and corruption. They don't read about the Peter Madsen murder case because they enjoy the thought of chopped-up people.

Blaming the user for engaging by subconscious impulse in a machine designed with every psychological trick in the book to force you to do just that, to optimize screen time at any cost, is just wrong. Assigning interest, joy, or even conscious intent to that engagement is very, very wrong.

The only thing you can do is remove the app, but it won't take long before you run into the next abusive machine. And quite frankly, as with any other intentionally addictive interaction, it is not easy to realize the problem and remove it.

Which is a bit like telling a heroin addict to just Stop Doing Drugs!

At our core, we're still stupid monkeys. We cannot help ourselves on an individual level. The platforms must take this responsibility, there is no other way.

One thing that's clear is that these algorithms are here to stay forever. What's not clear yet is what we should do to cope with it. Culture emerges to help people deal with problems - so what cultural factors will emerge to protect us?

I'm starting to think the answer is a sort of "feed hygiene" that we will all have to learn to keep up with. I successfully made my Facebook feed a very calm place by unfollowing toxic friends, hiding posts that made me angry, and joining positive and productive groups.

Maybe we'll be expected to teach our kids how to do this. Maybe it will become part of mainstream internet culture that we know how to curate our own feeds through our activity.

In other words, it shows you more of what you like to watch, not more of what you'd like to like to watch.

No, it shows what you do watch. If you are paralyzed it tries to paralyze you more.

The algorithm does not know whether you're watching in horror or enjoying the video. It can only tell that you stopped scrolling for a few extra seconds.

It is nonsensical to impute anything about a person's character from what shows up in their feed, the result of black-box algorithms. This feels like the beginning of a witch-hunt.

That sounds like YouTube.

Well, that's just you. After some time TikTok actually learns what type of videos you like to watch and shows you more and more of those. You end up in a bubble. Their algorithm is pretty good at that.

When I check the phones of my kids or my wife, I can see totally different type of videos recommended to them.

This is not a counterargument. What if your kids end up in that bubble?

You’re comparing your wife’s experience with GP’s experience. I’d be curious to see what your feed would look like after some use.

And what if your kid ends up in that bubble? What if your kid likes one animal torture video and then the algorithm decides that they should see that on a daily basis?

Sadly your experience regarding reports and offensive content being completely ignored can be universally applied to other big platforms like twitter or youtube. I've seen some inappropriate YT content being bombarded with reports and it's still up because it generates a lot of clicks. Hypocrisy at its best.

It is actually quite easy to have YouTube delete content (doling out a strike in the process) that doesn't violate CG/ToS using false reports.

Yes last year I had a quick look at TikTok to see if it could be added to my main clients (Drinks Industry) social media portfolio.

Literally the first recommended video was a young woman in a school uniform - so that's a hard NO!

Some of these comments seem to be complaining that

a) snakes gotta eat, and b) children often wear school uniforms.

The reason you should want to keep children and alcohol apart is because alcohol is a drug, not because you think school-children are inherently sexualised.

Obviously a kids app is a bad place to sell liquor.

Why is a girl in a school uniform a hard no? Sorry, must have missed some key piece of information here.

Not something you can advertise alcoholic drinks alongside, at least in most countries.

Ahh, that makes more sense.

Maybe she accidentally

The stereotype of the Japanese schoolgirl might well be problematic - the sexual element was what I was hinting at.

> I wouldn't allow my kids to use this app in a million years.

And how would you ensure / enforce this? Honest question, because I'm in that situation right now.

(IANAP -- this is not parenting advice ;) I speak mostly as a newly-minted adult who still remembers what it was like to be a child, when my world was small, so small decisions felt very important, especially when they were made without my input)

Children are people too, just with less experience. Your goal really isn't just "ban TikTok", it's "raise my children to develop healthy habits on their own and to recognize when they are being manipulated."

So, even if you ultimately decide to enforce parental controls, I hope you will bring your children into the decision-making process. Have an honest dialogue with your children about your concerns, and develop a space where they are free to share their own thoughts and feelings without fear of judgement. It's important that your mind is not already made up before you sit down at the table, as they'll sense it immediately. Repeat this often, and make it easy for anyone (yourself included) to express that their perspective has changed. Set a good example (by e.g. not using social media yourself) and apologize when you are wrong, so that your children learn that it's safe to do the same.

Encourage them to pursue healthier alternatives. Allow them to experiment and learn from their mistakes.

Upvoted for solid parenting advice.

I pretty quickly realized that kids, even little ones like 3-year-olds, are far smarter than you think. They are basically little humans without much experience, and that means not-great judgement.

So as a parent it’s important to make sure they are exposed to things they understand and can process. Things that are age appropriate. If something isn’t appropriate, well, you try and explain why (as best as they can understand).

Thanks for your comment; I really like that you point out the "learn why".

I try to explain to my 7-year-old the _reason why_ for many things, whenever I can. The key is to not push it when she loses interest. If it's important, we just end up talking about it again at a later date. There's only so much input a child (or adult) can accept before the buffer is full and needs to be processed. It takes time to put into context.

Also accept that some things can't really be explained to everyone. A guy on YT with a channel on self-defense said, "if you are not a violent person, you'll never understand violence", and I think it has some bearing. Sure, I can certainly understand that, in the case of X, he was beaten as a child and therefore may see violence as a way to handle his feelings, but that doesn't explain "why did X beat up Y unprovoked last Wednesday". There's rarely a single causality that explains things like that.

Also, ads, YT videos, and social media are very often manipulative. E.g. the "YouTube face", excessive reaction videos, etc. (No, she has no access to social media.)

Exactly this! I was hoping to get more suggestions along these lines. Because the first thing I thought was that if I hard-ban something they'll just learn how to do it without me knowing, especially if it is something their peers are doing.

So, we need to "gracefully" ban this; that is, if it needs banning at all.

The alternative is to embrace it and show how to use it properly.

And then, there's this addictive nature to it - the app is designed and tuned to provide dopamine hits; how to fight that?

> Because the first thing I thought was that if I hard-ban something they'll just learn how to do it without me knowing, especially if it is something their peers are doing.

Yeah, exactly! Growing up in this age, it's impossible to avoid online interactions of some kind. So, it's better if children learn how to use technology / media / internet responsibly in a controlled environment. You're like the guard rails in a bowling alley :)

One approach might be to let them use TikTok or whatever app is popular, and just try to get them to learn to self-identify 1) how long they spend on the app, 2) how using the app made them feel, and 3) whether or not time on the app took time away from something else they might enjoy. Help them learn identify the positive aspects and the negative aspects of the platform, and only consider a full ban if you start seeing extremely problematic patterns.

If you have any personal self-improvement goals, it might also mean a lot to your kids if you make a habit of sharing your progress with them (in a way they can understand at their age).

Good luck :)

I use the Family Link feature on Android, so my kids have to ask for permission before they install a new app.

And then they will just use a friend's phone with no such policing in place, or use a publicly available client with no such controls.

When I have such discussions with friends and family, I tend to say that I'll rely more on making my kids understand what they are doing than on policing what they are using. Not saying that I won't do some policing on their devices, but understanding is what I'll mostly rely on.

Neither one works flawlessly in isolation. You can explain to a child why you don't want them doing something until you're blue in the face, but in all likelihood they're going to want to do it more. The backup is then to block the app on their phone.

Sure, they can still use their friends' phones, but you can only protect a child so much. You've made a good start by explaining the issue and banning the app, but you can't track them all day to make sure they're not doing it elsewhere.

Instead of focusing their attention by telling them how bad it is, give them something else to do and introduce them to other children raised with similar values.

You can do that all you like, you can give them all the hobbies they could ever need to be busy. But eventually one of their friends is going to share the app with them when they're playing together

> and then they will just use the phone of their friend with no such policing in place or use a publicly available client with no such controls in place

That means that they won't have that content pushed on them for hours before sleeping, or while you work and they are at home due to lockdown.

They will have access to it only while they are with that friend.

I deploy a novel virus that makes it impossible for them to get near a friend's phone.

Both iPhone and Android have parental protection that allows you to block apps (though on Android it might be too easy to circumvent)

On my home network, I run a transparent, MITM squid proxy with a whitelist for my 8 year old son. My intention is to gradually loosen the restriction as he gets older, in stages: first switch it to a blacklist, then lift all restrictions but continue to log, then remove the proxy altogether and allow him free, unmonitored internet access. I will change it at whatever times seem appropriate - I'm new to this, ofc, and I don't know when it will be.
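A minimal version of such a whitelist proxy might look like the sketch below. Everything in it — paths, ports, and the address range — is an assumption, and the HTTPS interception part additionally requires a Squid build with SSL support plus the proxy's CA certificate installed on the client devices:

```
# /etc/squid/squid.conf (sketch only; adapt to your network)
# The filtered subnet -- an assumed address range.
acl kids_net src 192.168.10.0/24
# Whitelist file: one domain per line, e.g. ".wikipedia.org"
acl allowed dstdomain "/etc/squid/whitelist.txt"

http_access allow kids_net allowed
http_access deny kids_net

# Transparent (intercept) ports; the router redirects traffic here.
http_port 3129 intercept
https_port 3130 intercept ssl-bump cert=/etc/squid/mitm-ca.pem
ssl_bump bump all
```

Loosening the restriction later, as described, is then largely a matter of swapping the `allow`/`deny` rules (whitelist to blacklist) and eventually removing the redirect while keeping `access.log` for monitoring.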

I have no intention of letting him have a mobile device any time soon, but if and when I feel it's appropriate/necessary, I have the option of giving him a "managed" device (i.e. like a corporate device) with always-on wireguard to my home network, routing traffic through the same proxy. I haven't tried setting up a device like this yet, but the necessary capabilities appear to be present in both iOS and Android. This will also allow me to control which apps can be installed on the device.

I am of course entirely open with him about this, including the technical aspects of how it works, and frequently discuss all of the many issues involved. My goal is not to hide reality from him or to instill some unreasonable fear of what's out there - quite the opposite. It's to try and help him arrive at a healthy relationship with the internet as an adult, something that most adults I know (including me) have so far failed to establish.

And yes, of course it's possible to circumvent all of this stuff (although quite a lot harder with what I have than with the vast majority of parental control solutions). And yes, I can only control the technologies he has access to that I manage. But you have to consider the "threat model" here. He doesn't rail against this restriction. He understands it. If he wants access to something he asks for it. If I say no, I explain why, and he accepts it. We'll see how that develops over time, but it's certainly not the case that "It's technically possible to circumvent it, ergo there's no point doing it".

There's a strangely defeatist attitude I see about this, often voiced alongside a false dichotomy: that what we need to do is teach our children responsibility instead of using technology to protect them. As I see it, both are needed - and the latter, while difficult, is possible. Unfortunately it currently requires skills that are far from universal. It would be much easier if people took the need for it seriously and developed better technologies for it.

There are people in this post saying, about dubious content on TikTok, that you "only see it if you like it". That's not good enough. Internet technologies lead you on in subtle ways. As an example, my son is massively into Lego. When he was 6 he discovered Lego videos on youtube and started watching them on our smart TV. After a while I realised that all of the models he was making were weapons, mostly guns. It reached a particularly bizarre moment when he handed me an (awesome, obviously!) lego butterfly knife he had made. I checked the videos he'd been watching and all of them were Lego weapon tutorials. Now, my reaction here isn't "omg weapons how horrible!". Not at all. It is, however, to note that through youtube's algorithms a general interest in Lego became laser-focussed on one, perhaps slightly dubious, genre of models.

Among other things, I want my son to understand this kind of subtle shaping/guiding influence that technology can have on its users, and until I feel he has developed sufficient awareness I want to be in a position to know and to intervene if I think he's being led by it in directions I don't approve of. At this stage in his life, I feel that is my responsibility and it would be wrong of me not to at least attempt to live up to it.

Don't mobile phones have decent parental controls?

Other kids' phones may not.

No. It is pretty crappy.

Why... why did you use it every day for 6 months?

I got bored of it on day 3

It's designed so you'll use it every day for 6 months.

Such a different experience. I keep telling everyone that TikTok is the most positive social media experience. All I see is people dancing and comedy sketches. My wife sees some more "mom" jokes, but still the same overall feel.

I've never seen anything like what you're describing. Amazing how the algorithm can make experiences completely different.

I only joined to follow some people from other platforms, so my recommendations were pretty much immediately affected, but stuff from my region has been creeping in lately. Just curious, do you get the feeling that those types of things might be of common interest where you are?

what on earth were you doing to get suggested that stuff? I don't trust TikTok as much as the next paranoid computer guy, but my wife uses it and it's tailored around her interests - almost uncannily.

This would indicate that you liked one or more videos containing misinformation about the election, gruesome videos of people feeding live animals to their pet snakes and other animals, and sexualized content.

The app is full of videos of what people want to see. It's not for you, I guess (or for me)


I don't know why you are being downvoted, we already know that the country of origin is manipulating content to deliver a certain messaging:


We already know it's a massive privacy risk, deemed 'far more abusive' than American social platforms, which are already bad enough:


A piece of software being from China makes it an instant no for me.

Because the equivalent app is also used in China, where it's wholly illogical that the government would allow an app dedicated to "corrupting the youth".

In fairness, TikTok and Douyin (mainland China’s version of the app) have no access to each other’s content. It’s totally possible that the CCP allows/mandates different managing styles for the different markets.

TikTok and Douyin are made by the same company, but it appears they don't share content across networks. You're comparing apples and oranges.

TikTok is 90% you get what you consume. More so than even facebook. If you consume right wing content you will get right wing content. You consume queer content you will get queer content.

This is sorta like complaining that you go to reddit every day and all you see is /r/the_donald and /r/gore. Well if you didn't subscribe (consume in tiktok's case) then you wouldn't see very much of it.

These social products are being marketed like tobacco cigarettes of old. Start by targeting kids with addictive and polluting products. Add the vaunted "network effect" and they'll find quitting ain't easy. Finally add "enhancements" to make the product palatable enough so that they can get'em young and keep them smoking.

Teenagers have always been dumb, naive and very curious to explore their incipient sexuality, the awful thing these apps do is to exploit and monetize these traits while broadcasting them to the whole world. So now instead of Susie briefly dancing a stripper routine for a couple of friends she is doing it for the whole world to see.

How is the age verified though? I doubt many people use their real birthdate

Exactly, it's really just a joke. If they were able to verify age then they would be holding lots of identifying information about children...

The fact of the matter is - the internet is an open place, and as such, not suitable for unsupervised access by children.

As well, kids have been taught to lie about their age, as many apps prohibit use by anyone 13 or younger.

I really can't think of anything worse for your attention span than watching TikTok. Even watching a TV show takes focus and investment by comparison.

You might enjoy some the epic multi-part storylines being told by some passionate young filmmakers on TikTok. I especially recommend the 12-part "Fate and Chance" series by @AmericanBaron. There is long form art on TikTok, it's just rare :)

...as if most TikTok users don't already lie about their age.

My son was told by a friend: Never enter your real age, this will block you from all the cool features.

Some of these comments feel like astroturfing, doesn’t feel like a regular HN thread

Should have been 18.

I think they chose 16 because some of their biggest stars are in the 16 to 18 age range (for better or for worse). For example Charli D'amelio, who has 100 million+ followers and has become a celebrity in her own right. I personally agree that it should be 18, and ditto for Instagram and YouTube etc, but I expect that is the business reasoning behind the decision.

Is 18 some magic age where people wisen up?

They probably should have made all accounts under 50 private.

I think there is some research that maturity kicks in around 25. Something to do with prefrontal cortex development.


Myelination of our brains. In super-simplified terms, it is over-clocking our brains part by part. It continues until our mid-to-late 20s.

>After infancy, the rate of myelination slowed during childhood and adolescence, but exhibited continued growth until the end of the third decade. Specifically, mean MFLD during later adulthood (defined as ≥28 y) was significantly greater than in adolescence and early adulthood (11–23 y; Welch t, P = 0.000059; Mann–Whitney U, P = 0.00016).


Our (dorsolateral) prefrontal cortex is one of the last parts to myelinate.

>The last areas to myelinate are the anterior cingulate cortex (F#43), the inferior temporal cortex (F#44) and the dorsolateral prefrontal cortex (F#45).


This is the area of the brain that handles things like planning and abstract reasoning (though it isn't the only area involved in such thinking).

It's the age where the majority of legal systems draw the line on considering a teenager to be an adult, so it isn't unreasonable.

This is the age where the last legal distinction is removed, there tend to be a lot of age related regulations that change over the years before that.

18 is pretty much only reasonable when it directly aligns with a legal requirement. Otherwise it is not the most reasonable choice, but the laziest one.

>Is 18 some magic age where people wisen up?

Yes, to the extent we accept it so much we made it into law. Though the law does seem to pick a few different age groups for other things based on a seemingly arbitrary standard. Lower for driving, even though bad driving could kill innocent bystanders. Higher for smoking, even though smoking only kills yourself. You can even sign up and join the military a year before you are able to vote for the politicians who control it.

Age of majority varies between different countries: 15 in Indonesia, 16 in Scotland and Vietnam, 17 in Tajikistan, 19 in many Canadian states, 20 in New Zealand and Japan (currently) and Thailand, 21 in Singapore.

To latch onto this comment, a lot of countries have slowly increased the age of maturity over time; in my country, the minimum drinking age has gone up to 18 (from 16 for light alcohol), while at the same time the age you can start learning to drive has gone down to 17 or 17.5 (you can practice in simulators at that point, IIRC). Just to name some examples.

Anyway, I'm pretty sure people are allowed to stay children for longer nowadays, instead of being told to marry and work in the coal mines in their early-to-mid teens.

More cynically / conspiratorial / on the other hand, people are kept infantilized as well as a means of control by various parties in whose interest it is to keep a population docile and dumb. Can't overthrow the government if you've got scantily clad teens to watch.

New Zealand is bringing it down from 20 to 18 at the end of the month. Most restrictions aren't to do with the age of being an adult though.

For example in Britain (age of majority 18, 16 in Scotland)

Sex: 16

Marriage without parental consent: 18, except Scotland (16)

School leaving: 16, but must remain in education outside of school (say official apprentice) until 18

Drinking: 18, 16 in public with a meal when bought by parent, either 0 or 5 at home

Driving: 16 moped + tractor, 17 cars, some vehicles are even higher (23 or 24 for powerful motorbikes for example, 21 for minibusses unless in the army)

Smoking: 18 to buy, 16 to smoke in E+W, 18 to smoke in Scotland

Gambling: 18 (lottery was 16 until recently)

Voting: 18, except Scotland, where it's 16 for local elections (16 for Wales coming in next year)

Fun fact: in my country the drinking ages are 0, 5, 16 and 18. 18 to buy, 16 to drink with a meal in public, 5 to drink in private and 0 if it's a medical emergency.

Canada does not have states.

I suspect Quebec might be the one that has different rules

For once Quebec is part of the majority in Canada with 18. There are only four provinces that set the age of majority at 19: BC, NS, NB, and NL.

Accounts of 16- to 18-year-olds are also made private by default, but a 16-18 year old can change it.

That would be catastrophic for TikTok from a business perspective, as they'd lose 30% of their content overnight. In my experience, most 16+ year olds are quite capable of handling themselves on social media.

How is it that TikTok and other services like this are allowed to grant accounts to minors without parental consent?

Lack of laws, and that literally nobody wants to have to legally verify their age to use a random website.

Other services like what?

Because it sounds like you're saying people under 18 should require parental consent before they can open a social media account, which is honestly ridiculous.

Different folks parent in different ways. Some consider it protecting their children against online bullying/unsafe content. From my perspective, the idea that I'd give my teenager a mobile device with carte-blanche connectivity sounds ridiculous. I wouldn't dare tell another parent to adopt my views.

> I wouldn't dare tell another parent to adopt my views.

The original comment literally was demanding that all parents adopt your views (i.e. that sites should not be "allowed" to let minors use them).

I once felt this way, until my 10 year old got a TikTok account. The problem is that my children do not have the ability under the law to enter into any kind of agreement with anyone. Only legal guardians do.

I’m very curious how successful Instagram Reels is ever since they began their hard push of it. It’s never in the news but from my perspective engagement seems to be doing pretty well, and has much tighter e-commerce integration compared to TikTok too.

Makes me wonder if TikTok is paying its biggest influencers to not post there.

In my experience, something like 90% of content on Instagram Reels is not originally created there. The recommendation engine (accessed via the search page) is almost exclusively bot accounts that steal and repost low-effort attention-grabbing content from other socials. Even the original content direct from creators usually has the TikTok watermark on it since it was originally created and posted there. In my experience as a casual Instagram surfer, it is wholly terrible compared to tiktok.

I know this can all be circumvented by changing the age connected to one's account, but I would genuinely be fairly concerned if this applies to existing accounts. I'll speak with my youngest brother and ask him if his account has been restricted. If so... there needs to be an override of some sort. Blanket bans based on age are not a smart way of solving the problems TikTok faces. I don't have a better solution, but I know this one makes me mildly uneasy.

For context, I'm a professional content creator whose main platform and audience is TikTok. I'm a legal adult now, but wasn't when I started. There are a lot of mature 15 year olds using TikTok responsibly and creating great content--they shouldn't need to lie about their age to continue doing that.

Is it me, or is it a very WRONG idea to allow kids to connect with random adults on the internet anyway..? And when you know the long-standing relationship between child pornography and the internet, it looks even crazier.

Big tech sees the writing on the wall and are trying their best to clean up their yards before the all seeing eye of Congress comes in, destroys them in the press, and smashes them into a dozen different pieces.

It's so odd seeing an HN comment talking about the US legislature being competent and effective.

Personally I have more faith in the courts but we'll see.

I would not call them "competent and effective", but they're a bull in a china shop to be sure.

Humans in the future will read about our struggles with social media and our inability to fix it, and they will think we were so dumb.

