Hacker News

I think the underlying point is that platforms are tuning their algorithms toward manipulation: a new account is set on a predetermined path into a rabbit hole, where most of the content it sees can trap the user in certain mindsets.

This worries me a lot, because people generally believe they are choosing what they see, while in fact the platform can shape the worldviews of many people at once.

As an example: if you trap someone in a room and play violent videos for them all day, when you let them out they may believe they have to fight for survival.

Governments can foster political chaos by subjecting people to certain kinds of videos on these platforms over time; they can, for instance, influence particular regions to riot or to vote a certain way.

Behavioral psychologists are now as involved in algorithm development as engineers are, and that is a big problem that most of the world, especially government regulators, is totally unaware of.

I have a TikTok account, but I only use it in very small doses because I found the suggested content genuinely bad for my mental health. I hope more people realize this is happening, and see through the promise of sharing independent work on these platforms. They are not really geared toward discovery of new content; they have very specific agendas, with money-making foremost among them.



