TikTok is instrumental in offering children and teens an "in-group" and helping them find their interests. If you're LGBTQ, you find really wholesome content. If you're depressed, you can find people being open about their depression. If you self-harm, you can find other people discussing and opening up about it. If you're in a toxic household, there's basically a whole support network for you. It's an exceptionally human way to interact with people online, and it's so much fun.
IMHO, TikTok has been the most wholesome social network I've ever used and participated in. Even as an adult (24), it's helped me explore my own identity and how I relate to the communities I'm a part of. Legitimately, TikTok has helped me feel comfortable opening up and being more "myself" in public.
That said, I understand the term "good quality" - in that it isn't vitriolic political messaging, conspiracy theories, etc. It's mostly devoid of any true meaning - more like entertaining, sometimes hilarious content. I wouldn't say it's enlightening or provides much educational value. As I said earlier, it also probably reinforces short attention spans and attention-seeking dopamine feedback loops... likely long-term damage to brain development.
Just my opinion, no facts here.
My FYP is a combination of good-natured dancing, linguistics, crowd-sourced music, Hank Green, a few comedy creators who engage positively with mental health and feminist topics, and animals doing silly things - recently a lot of possums. That's interspersed with memes and jokes, but very few that are particularly rude.
I am in no way under the illusion that this is what most people see. But I have to question whether people who complain about Charli D'Amelio and similar creators being shown too much aren't just spending a lot of time watching the videos, commenting on and liking them. If you scroll past, they stop popping up.
Overall, the content creators I see seem extremely good for these smaller niche communities.
As a plus, the most unbiased source of content for the BLM protests I found was TikTok, because it was just livestreams and videos from the protests with little possibility to editorialize. Instead of anchors screaming "violent!" or sternly saying "peaceful!", you got streamers on the spot showing exactly how violent or peaceful certain situations were.
Then, as you encounter stuff you like, click hashtags that look interesting, explore sounds to find similar videos, and check out a creator's other videos.
I watch a lot of strength training/weight lifting videos on TikTok, and I would say the vast majority of the comments are supportive, positive and/or constructive.
Early Twitter (in particular) was a game changer for me, who was working from home for the second time in my life. I had people to talk to and bounce things off and all of a sudden work wasn't so lonely (as I had been the first time I tried). There was a genuine community feel without all the snark and unpleasantness.
I don't use TikTok, but I hope they'll be prescient enough to see where the others have struggled and keep on top of their content policies, communities and moderation.
If it focuses on people, you can meet real friends, and that's fine, because you get along.
If it focuses on topics, that's also fine: you meet people you don't know, but because you talk about a subject you all enjoy, you get along.
Problem starts when the two mix. If you enjoy astronomy for example, the other guy may be an asshole, but you probably don't even know and you get along fine, because all you do is talk astronomy. But if the network thinks that because you like the same thing, he must also be your friend, then your feed starts filling with assholish things.
Stage 1: The person is relatable, the videos are simple and to the point.
Stage 2: The following has grown, and the person now posts more frequently; however, the quality/humor/personal touches are reduced. Less content about fitness, more about personal lifestyle.
Stage 3: Monetization. Workout PDFs, constant advertising of fitness products (protein shakes, weight belts, etc), lots of hollow 'motivation' posts. The audience keeps growing, but the community has disappeared.
I don't think the commenters who say they only get weird or bad content actually know how TikTok works, or how to use it. You can't just scroll your FYP and expect to get new content, you have to go out of your way to find it. FYP is just a feedback loop of videos you keep watching.
If you want to start getting more queer content, for example, start searching the #lgbt or similar tags and watching/liking/commenting. Your FYP changes really quick.
Not the experience for my circle of friends.
> for example, start searching the #lgbt or similar tags and watching/liking/commenting.
What if some hate video for the community gets posted under the same or a similar tag? I am not sure how it is now, but mid-last year a friend saw something very offensive, with a lot of likes, on a tag related to his nationality.
Plus the conversation is about under 16s, I highly doubt what they're seeing is the same as you, so it's not really relevant what your feed is like.
But you can’t bar talking about it because sometimes it’s bad. It’s tough.
Not saying you’re wrong at all. Just suggesting there’s a lot of nuance here and it’s difficult to manage. Your point is very correct and important.
I don't think it's particularly helpful; based on what I've seen myself, a lot of these 'support networks' actually veer towards glorifying the issues.
You shouldn't base your social group on an echo chamber.
I think glorification occurs because it's a much easier way to, at least temporarily, alleviate the struggle some issues can cause. It's an understandable response to pain, especially in young people confronted with complex internal problems.
Also, your point about children not being able to judge these things for themselves immediately made me think... It seems a lot of adults can't be relied on to do it either.
The glorification can provide short term relief but creates countless more issues than it solves.
These issues should be dealt with using advice from professionals in combination with a support network.
This may sound counter-intuitive, but please don't judge that sort of thing until you've gone through it.
There are many online communities that seem dangerous or destructive to people who haven't gone through the issues, but are actually incredibly helpful to those who have.
The communities that actually help members gain the sense of security required to pause, self-reflect, and unpack their issues don't seem to get very popular, because the click-optimizing algorithms don't give them a chance.
But there's so much more content under the surface that doesn't get talked about. Travel TikTokers give me ideas of where to go in 2021; I stole an oatmeal recipe from a bodybuilder on there that I still make every morning; I took some Lightroom ideas from a number of photography TikTokers. The rabbit holes are endless.
TikTok, I think, is in its "pre-mainstream" phase, where you get the best of user content and it hasn't yet been muddied by promo deals, partnerships and big media. Sure, big media is there, but most of the content I see reminds me of YouTube in that magical time after it went mainstream with users but before the big brands and corporations arrived.
This has led to a load of interesting communities springing up like the Sea Shanties one and quite a few musical communities doing collaborations.
That should be part of the discussion, though. It's difficult to separate Queer philosophy from predatory pedophilia. It's part of the system of thought, back to the beginning.
Age becomes relative, and Parenting is seen as an oppressive structure that is imposed on children.
Any sane individual studying queer theory is still going to understand the relationship between consent and age, and that pedophilia is a mental illness and not a sexual orientation, specifically because it violates the ability of younger individuals to give consent.
They believe that "normal" sexual ethic is imposed on children by oppressive power structures (enforced by parents). So they believe that they're liberating children by exposing them to queerness. They don't see it as peer pressure or coercion because they see sex as something similar to choosing between ice cream or cookies, which a child is capable of.
Gayle Rubin also defended NAMBLA quite clearly.
Many examples of this stuff.
It really isn't.
Mine is filled with skaters, mom-daughter dumb jokes, and silly lesbian/transgender shenanigans.
There are sometimes weird things, like a priest telling stupid jokes alone in his church, or that Korean guy doing ASMR of burning bread in the oven.
On Facebook I saw the deaths of participants in the events of the 6th, but that was in a journalist/news/media analysis group.
I see a lot of good-vibes content on TikTok, but that's what I'm looking for on the platform. No doubt I could soon end up on Indian train deaths if I wanted - just like that time on YouTube when I saw that out of nowhere, even though I was looking for something totally unrelated to trains or India.
I live in a big city and most of the feed is highly sexualised, highly political, or highly prank-related.
My friend who lives in a small town about 100 miles away essentially has to send me fun TikTok videos, and no matter how many of those videos I like, the base diet of sexual, political, or prank-related videos is always there.
I run into this daily. I'm in the military, so my internet connection at work is mingled with a thousand other military people. OK, it is accurate to say we are predominantly male and younger, but every time I see an ad, does it have to be an anime character? Some of us recently gathered to watch the DC riot footage on YouTube. The footage started with a non-skippable ad for some pervy anime game. Try holding a straight face in front of the general as a busty cartoon character in a negligee bounces up and down on a ten-foot screen.
I personally use uBlock to correct my lens on the world in real time.
Classic! I love real-world tidbits like this. I would suggest in the future using youtube-dl (https://ytdl-org.github.io/youtube-dl/index.html) to download the video to your machine to avoid situations like this.
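For instance, something like the following (the video ID and output name here are placeholders, not the actual footage):

```shell
# Fetch a single mp4 for offline playback, named riot-footage.mp4;
# playing the local file avoids YouTube's pre-roll ads entirely.
youtube-dl -f mp4 -o 'riot-footage.%(ext)s' 'https://www.youtube.com/watch?v=VIDEO_ID'
```

The `-f mp4` flag asks for one self-contained file rather than separate audio/video streams, which keeps playback simple on a shared screen.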
More anecdata: I also live in a big city - London - and I don't see this. I joined TikTok just a few weeks ago and I was surprised at how overwhelmingly positive and fun it all is.
I know for a fact that what's on my Discover feed (surrealist art, UX/UI stuff, Newfie dog videos) is unique to me.
That's a good point. I was wondering about the initial seed of the feed.
It wasn't my intention to shame though.
Until TikTok has something like channels or theme/tag subscriptions, it will remain a problem.
Your point about what they like is irrelevant to that point.
So if OP is seeing "loads" of sexualized content, it's because they spent enough time watching sexualized content before swiping for TikTok to think that's what they want to see, and ensure they see more of it.
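The feedback loop being described can be illustrated with a toy model. To be clear, this is not TikTok's actual algorithm - the categories, weights, and update rule below are all invented for illustration - but it shows how "watch time in, more of the same out" compounds:

```python
import random

def update_weights(weights, category, watch_fraction, lr=0.5):
    """Shift category weights toward what the user actually watched.

    watch_fraction: how much of the video was watched (0.0 = instant
    swipe away, 1.0 = watched to the end). Anything above half-watched
    counts as positive engagement; swiping away counts as negative.
    """
    signal = watch_fraction - 0.5
    weights[category] = max(0.01, weights[category] + lr * signal)
    return weights

def next_video(weights, rng):
    """Sample the next video's category in proportion to current weights."""
    cats = list(weights)
    return rng.choices(cats, [weights[c] for c in cats])[0]

# A user who watches "pets" clips to the end and swipes past everything else.
rng = random.Random(0)
weights = {"pets": 1.0, "sexualised": 1.0, "pranks": 1.0}
for _ in range(50):
    cat = next_video(weights, rng)
    watched = 1.0 if cat == "pets" else 0.0
    update_weights(weights, cat, watched)
```

After a few dozen videos the "pets" weight dominates and the others collapse toward the floor - which is the point both sides here are making: the feed amplifies whatever you lingered on, whether or not you consciously wanted it.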
Even if the picture you are painting of the previous poster is accurate, being horny once in a while is by no means justification for an endless stream of sexualized content.
That's not a neutral quality of the system, but a deliberate feedback loop establishing addictive behavior. We already know the shortcomings of the human condition and should ask for fewer footguns, not more.
We can never be sure of anything. What if HN started pushing train death videos on the front page and the admins couldn't tweak the algo because a glitch in the stack prevented them from accessing the backend?
This sounds very much like victim-blaming, I don't think it's appropriate to side with TikTok on this.
I almost always see content that is only fun.
That has made it much better for long-term engagement and positive for my mental health, rather than other social media's short-term outrage engagement that is wholly negative.
It could mean the algorithm designers have decided that optimising for fun is more effective at increasing engagement than optimising for shock value and conflict.
But, this is a different choice than other social media designers have made. It seems like there is some more wholesome motivation.
I experience this with politics videos on YouTube. Sometimes I'm in the mood for that. One evening I'll watch a few politics videos. The next day, my recommendations are filled with politics. Over time my recommendations get back to normal. Then I'm struck with the politics bug again, but my recommendations don't give me an easy way to find more politics content. They're recommending the type of stuff I'd normally watch. The day after that I'm annoyed again about politics being all over my recommended lists.
It feels like I should have separate profiles based on the topics I'm interested in. The platforms don't seem to really support that, though. Even different users seem to influence each other's suggestions.
It's like putting a lot of chocolate with sugar in my apartment: the wise choice is to not even buy anything that contains sugar, as I can't stop myself from eating it if it's in front of me all the time.
And for the record I consider a lot of tiktok pranks to be borderline psycho (eg gaslighting your SO for giggles about divorce, abortion, etc.).
I did a similar experiment myself and found it to be a very delightful stream of content. While it did have some sexualised content for sure, it wasn't too much in frequency and degree, nothing more than you would see on a normal tv drama anyway.
And it was just so much fun - mostly teens doing amateur choreography, skits and pranks. I uninstalled it more because it was so effective at capturing my attention than anything else.
I still see teenagers going about trying to mimic one routine or another, and it always makes me smile, as I can see them being creative rather than just consumers. Making a high-quality YouTube video would be too hard for them, but a quick 10-second video - they feel they can do it and try to participate, which is great in my book.
This has nothing to do with your preferences/what you like. It doesn't care if you feel disgust, as long as the app stays open.
Suggestion/auto feeds are designed to abuse human psychology to maximize time wasted, not interest.
So if you are disgusted by something, stop watching it!
Yes this does optimise for time spent watching, but interest definitely plays into it.
Blaming the user for engaging by subconscious impulse in a machine designed with every psychological trick in the book to force you to do just that, to optimize screen time at any cost, is just wrong. Assigning interest, joy, or even conscious intent to that engagement is very, very wrong.
The only thing you can do is remove the app, but it won't take long before you run into the next abusive machine. And quite frankly, as with any other intentionally addictive interaction, it is not easy to realize the problem and remove it.
At our core, we're still stupid monkeys. We cannot help ourselves on an individual level. The platforms must take this responsibility, there is no other way.
I'm starting to think the answer is a sort of "feed hygiene" that we will all have to learn to keep up with. I successfully made my Facebook feed a very calm place by unfollowing toxic friends, hiding posts that made me angry, and joining positive and productive groups.
Maybe we'll be expected to teach our kids how to do this. Maybe it will become part of mainstream internet culture that we know how to curate our own feeds through our activity.
When I check the phones of my kids or my wife, I can see totally different type of videos recommended to them.
Literally the first recommended video was a young woman in a school uniform - so that's a hard NO!
a) snakes gotta eat, and,
b) children often wear school uniform.
The reason you should want to keep children and alcohol apart is because alcohol is a drug, not because you think school-children are inherently sexualised.
And how would you ensure / enforce this? Honest question, because I'm in that situation right now.
Children are people too, just with less experience. Your goal really isn't just "ban TikTok", it's "raise my children to develop healthy habits on their own and to recognize when they are being manipulated."
So, even if you ultimately decide to enforce parental controls, I hope you will bring your children into the decision-making process. Have an honest dialogue with your children about your concerns, and develop a space where they are free to share their own thoughts and feelings without fear of judgement. It's important that your mind is not already made up before you sit down at the table, as they'll sense it immediately. Repeat this often, and make it easy for anyone (yourself included) to express that their perspective has changed. Set a good example (by e.g. not using social media yourself) and apologize when you are wrong, so that your children learn that it's safe to do the same.
Encourage them to pursue healthier alternatives. Allow them to experiment and learn from their mistakes.
I pretty quickly realized that kids, even little ones like 3-year-olds, are far smarter than you think. They are basically little humans without much experience, and that means not-great judgement.
So as a parent it’s important to make sure they are exposed to things they understand and can process. Things that are age appropriate. If something isn’t appropriate, well, you try and explain why (as best as they can understand).
I try to explain to my 7-yo the _reason why_ for many things, whenever I can. The key is to not push it when she loses interest. If it's important, we just end up talking about it again at a later date. There's only so much input a child (or adult) can accept before the buffer is full and needs to be flushed. It takes time to put into context.
Also accept that some things can't really be explained to everyone. A guy on YouTube with a channel on self-defense said, "if you are not a violent person, you'll never understand violence", and I think it has some bearing. Sure, I can certainly understand that, in the case of X, he was beaten as a child and therefore may see violence as a way to handle his feelings, but that doesn't explain why X beat up Y unprovoked last Wednesday. There's rarely a single causality that explains things like that.
Also, ads, YouTube videos, and social media are very often manipulative - e.g. the "YouTube face", excessive reaction videos, etc. (No, she has no access to social media.)
So, we need to "gracefully" ban this; that is, if it needs banning at all.
The alternative is to embrace it and show how to use it properly.
And then, there's this addictive nature to it - the app is designed and tuned to provide dopamine hits; how to fight that?
Yeah, exactly! Growing up in this age, it's impossible to avoid online interactions of some kind. So, it's better if children learn how to use technology / media / internet responsibly in a controlled environment. You're like the guard rails in a bowling alley :)
One approach might be to let them use TikTok or whatever app is popular, and just try to get them to learn to self-identify 1) how long they spend on the app, 2) how using the app made them feel, and 3) whether or not time on the app took time away from something else they might enjoy. Help them learn to identify the positive aspects and the negative aspects of the platform, and only consider a full ban if you start seeing extremely problematic patterns.
If you have any personal self-improvement goals, it might also mean a lot to your kids if you make a habit of sharing your progress with them (in a way they can understand at their age).
Good luck :)
When I have such discussions with friends and family, I tend to say that I'll rely more on trying to make my kids understand what they are doing than on policing what they are using. I'm not saying I won't do some policing on their devices, but mostly I'll rely on understanding.
Sure, they can still use their friends' phones, but you can only protect a child so much. You've made a good start by explaining the issue and banning the app, but you can't track them all day to make sure they're not doing it elsewhere.
That means they won't have that content pushed on them for hours, either before sleeping or while you work and they are at home due to lockdown.
They will have access to it only while they are with that friend.
I have no intention of letting him have a mobile device any time soon, but if and when I feel it's appropriate/necessary, I have the option of giving him a "managed" device (i.e. like a corporate device) with always-on wireguard to my home network, routing traffic through the same proxy. I haven't tried setting up a device like this yet, but the necessary capabilities appear to be present in both iOS and Android. This will also allow me to control which apps can be installed on the device.
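A client config for a managed device of that sort might look roughly like the following. All keys, addresses, and the endpoint are placeholders, and the real setup depends entirely on the home network and proxy in question:

```ini
# /etc/wireguard/wg0.conf on the managed device (placeholder values)
[Interface]
PrivateKey = <device-private-key>
Address = 10.0.0.2/32
# Resolve DNS through the home network too, so filtering covers lookups.
DNS = 10.0.0.1

[Peer]
PublicKey = <home-server-public-key>
Endpoint = vpn.example.home:51820
# 0.0.0.0/0 routes *all* traffic through the tunnel ("always-on").
AllowedIPs = 0.0.0.0/0
PersistentKeepalive = 25
```

The `AllowedIPs = 0.0.0.0/0` line is what makes it always-on: every packet leaves via the home network, where the filtering proxy sits.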
I am of course entirely open with him about this, including the technical aspects of how it works, and frequently discuss all of the many issues involved. My goal is not to hide reality from him or to instill some unreasonable fear of what's out there - quite the opposite. It's to try and help him arrive at a healthy relationship with the internet as an adult, something that most adults I know (including me) have so far failed to establish.
And yes, of course it's possible to circumvent all of this stuff (although quite a lot harder with what I have than with the vast majority of parental control solutions). And yes, I can only control the technologies he has access to that I manage. But you have to consider the "threat model" here. He doesn't rail against this restriction. He understands it. If he wants access to something he asks for it. If I say no, I explain why, and he accepts it. We'll see how that develops over time, but it's certainly not the case that "It's technically possible to circumvent it, ergo there's no point doing it".
There's a strangely defeatist attitude I see about this, often voiced alongside a false dichotomy: that what we need to do is teach our children responsibility instead of using technology to protect them. As I see it, both are needed - and the latter, while difficult, is possible. Unfortunately it currently requires skills that are far from universal. It would be much easier if people took the need for it seriously and developed better technologies for it.
There are people on this post saying about dubious content on Tik Tok that you "only see it if you like it". That's not good enough. Internet technologies lead you on in subtle ways. As an example, my son is massively into Lego. When he was 6 he discovered Lego videos on youtube and started watching them on our smart TV. After a while I realised that all of the models he was making were weapons, mostly guns. It reached a particularly bizarre moment when he handed me an (awesome, obviously!) lego butterfly knife he had made. I checked the videos he'd been watching and all of them were Lego weapon tutorials. Now, my reaction here isn't "omg weapons how horrible!". Not at all. It is, however, to note that through youtube's algorithms a general interest in Lego became laser-focussed on one, perhaps slightly dubious, genre of models.
Among other things, I want my son to understand this kind of subtle shaping/guiding influence that technology can have on its users, and until I feel he has developed sufficient awareness I want to be in a position to know and to intervene if I think he's being led by it in directions I don't approve of. At this stage in his life, I feel that is my responsibility and it would be wrong of me not to at least attempt to live up to it.
I got bored of it on day 3
I've never seen anything like what you're describing. Amazing how the algorithm can make experiences completely different.
We already know it's a massive privacy risk, deemed 'far more abusive' than American social platforms, which are already bad enough:
A piece of software being from China makes it an instant no for me.
This is sorta like complaining that you go to reddit every day and all you see is /r/the_donald and /r/gore. Well if you didn't subscribe (consume in tiktok's case) then you wouldn't see very much of it.
The fact of the matter is - the internet is an open place, and as such, not suitable for unsupervised access by children.
They probably should have made all accounts under 50 private.
>After infancy, the rate of myelination slowed during childhood and adolescence, but exhibited continued growth until the end of the third decade. Specifically, mean MFLD during later adulthood (defined as ≥28 y) was significantly greater than in adolescence and early adulthood (11–23 y; Welch t, P = 0.000059; Mann–Whitney U, P = 0.00016).
Our (dorsolateral) prefrontal cortex is one of the last parts to myelinate.
>The last areas to myelinate are the anterior cingulate cortex (F#43), the inferior temporal cortex (F#44) and the dorsolateral prefrontal cortex (F#45).
This is the area of the brain that handles things like planning and abstract reasoning (though it isn't the only area involved in such thinking).
18 is pretty much only reasonable when it directly aligns with a legal requirement. Otherwise it is not the most reasonable, but the most lazy choice.
Yes, to the extent we accept it so much we made it into law. Though the law does seem to pick a few different age groups for other things based on a seemingly arbitrary standard. Lower for driving, even though bad driving could kill innocent bystanders. Higher for smoking, even though smoking only kills yourself. You can even sign up and join the military a year before you are able to vote for the politicians who control it.
Anyway, I'm pretty sure people are allowed to stay children for longer nowadays, instead of being told to marry and work in the coal mines in their early-to-mid teens.
More cynically/conspiratorially, on the other hand, people are kept infantilized as a means of control by various parties in whose interest it is to keep the population docile and dumb. Can't overthrow the government if you've got scantily clad teens to watch.
For example, in Britain (age of majority 18, 16 in Scotland):
Marriage without parental consent: 18, except Scotland (16)
School leaving: 16, but must remain in education outside of school (say an official apprenticeship) until 18
Drinking: 18; 16 in public with a meal when bought by a parent; either 0 or 5 at home
Driving: 16 moped + tractor, 17 cars, some vehicles even higher (23 or 24 for powerful motorbikes, for example; 21 for minibuses unless in the army)
Smoking: 18 to buy, 16 to smoke in E+W, 18 to smoke in Scotland
Gambling: 18 (lottery was 16 until recently)
Voting: 18, except Scotland where it's 16 for local elections (16 in Wales coming next year)
Because it sounds like you're saying people under 18 should require parental consent before they can open a social media account, which is honestly ridiculous.
The original comment literally was demanding that all parents adopt your views (i.e. that sites should not be "allowed" to let minors use them).
Makes me wonder if TikTok is paying its biggest influencers to not post there.
For context, I'm a professional content creator whose main platform and audience is TikTok. I'm a legal adult now, but wasn't when I started. There are a lot of mature 15 year olds using TikTok responsibly and creating great content--they shouldn't need to lie about their age to continue doing that.
Personally I have more faith in the courts but we'll see.