There's been evidence, admittedly anecdotal AFAIK, that YouTube's recommendation algorithm can drift towards increasingly extremist content from quite innocuous starting points. For example:

* https://twitter.com/zeynep/status/973716995521748992
* https://twitter.com/Limerick1914/status/972837152940855296
This doesn't mean YouTube staff are purposefully designing the algorithm to trend towards extremism: it can fall out naturally from human psychology combined with ML algorithms that note which videos increase engagement, model user interest, and recommend whatever maximizes the user's time on site.
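To make the mechanism concrete, here's a toy simulation, nothing like YouTube's actual system: the one-dimensional "intensity" scores and the engagement model are entirely made-up assumptions, just to show how greedy engagement maximization alone can produce this kind of drift:

    # Toy model: each video gets a made-up 1-D "intensity" score, and the
    # (hypothetical) watch-time model assumes viewers engage most with
    # videos slightly more intense than the one they just watched.
    import random

    random.seed(0)
    videos = [random.uniform(0.0, 1.0) for _ in range(10_000)]

    def predicted_watch_time(current: float, candidate: float) -> float:
        # Engagement peaks just above the viewer's current intensity level.
        sweet_spot = min(current + 0.05, 1.0)
        return 1.0 - abs(candidate - sweet_spot)

    position = 0.1  # an innocuous starting point
    for step in range(20):
        candidates = random.sample(videos, 50)  # candidate pool per page load
        # Greedily recommend whichever candidate maximizes predicted watch
        # time; the viewer's position ratchets upward as a side effect.
        position = max(candidates, key=lambda v: predicted_watch_time(position, v))
        print(f"step {step:2d}: intensity {position:.2f}")

Nothing in this sketch has extremism as a goal, yet the greedy time-on-site objective walks the viewer from 0.1 to near 1.0 in twenty steps.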
I think this is worth looking into in a more methodical manner. You can dismiss it as cherry-picking, but it's something reasonable people can be concerned about. And these services and products should each be looked at separately, to determine which parts are good and which parts are bad.
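As a sketch of what "more methodical" could look like: repeatedly random-walk the recommendation graph from innocuous seed videos and measure how rated extremity changes with depth. get_recommendations() and classify_extremity() below are hypothetical placeholders for whatever collection method (e.g. the YouTube Data API or scraping) and rating procedure (human raters or a classifier) a real study would plug in:

    import random
    from statistics import mean

    def get_recommendations(video_id: str) -> list[str]:
        # Placeholder: return the recommended video ids for a given video.
        raise NotImplementedError("plug in a real collection method")

    def classify_extremity(video_id: str) -> float:
        # Placeholder: return an extremity rating in [0, 1] for a video.
        raise NotImplementedError("plug in a real rating procedure")

    def random_walk(seed_video: str, depth: int = 10) -> list[float]:
        # Follow one randomly chosen recommendation at each step,
        # recording the extremity rating along the way.
        scores, current = [], seed_video
        for _ in range(depth):
            current = random.choice(get_recommendations(current))
            scores.append(classify_extremity(current))
        return scores

    def audit(seeds: list[str], walks_per_seed: int = 100) -> list[float]:
        # Average extremity at each depth across many walks: a rising
        # curve would support the anecdotes, a flat one would undercut them.
        walks = [random_walk(s) for s in seeds for _ in range(walks_per_seed)]
        return [mean(w[d] for w in walks) for d in range(len(walks[0]))]

Averaging over many seeds and many walks is exactly what separates this from cherry-picking: one alarming recommendation chain proves little, but a systematic upward trend across thousands of walks would.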
You could write a pseudo-explanation even longer than the comment you are purporting to describe, or you could simply read the words that I wrote and not try to put words in my mouth.
Would the two of you ('jonathanyc and 'kimdcmason) please stop engaging uncharitably with each other? There might be common ground there somewhere, but you're not even trying to listen to each other. It doesn't matter at this point who's right or who's wrong or who started it or who's at fault. You're both contributing to continuing it. As one HN member to another, please help increase civility and substantive discourse.
I feel obligated to listen to and respect people who disagree with me on the issue of whether Google and Facebook are evil. But I can get and have gotten intelligent, reasoned perspectives from people who aren't spouting the names of logical fallacies as if they were hexes and caltrops.
As much as I feel obligated to engage with others constructively, I also feel obliged to discourage nonconstructive behavior and to make others aware of it. I take your point that this thread is going nowhere, though, and I will stop. I acknowledge I may have gone too far.