Continuing our work to improve recommendations on YouTube (googleblog.com)
143 points by minimaxir 28 days ago | 285 comments



My biggest problem with YT recommendations is that they dramatically overweight recent videos when generating the recommendations. I mean, I can spend a week doing nothing but watching (well, listening to) videos of classical music performances, and my recommendations will be full of classical music, exactly as you'd expect. Then I watch one random one-off thing, like "15 funny pitbull fails", and suddenly ALL of my recommendations are animal videos. Then I have to go manually edit my feed history and remove "15 funny pitbull fails" to get my recommendations back to normal.
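The pattern described above is what you'd expect from any recency-weighted scorer. As a purely illustrative sketch (no claim that this is YouTube's actual model), here is how an exponential recency weight lets one fresh off-topic watch outscore a whole week of on-topic history:

```python
# Illustrative sketch only -- not YouTube's actual model. It shows how an
# exponential recency weight can let a single fresh watch dominate topic
# scores built from a much longer history.
from collections import defaultdict

def topic_scores(history, decay=0.5):
    """history: list of (topic, age_in_days) pairs.
    Each watch contributes decay**age, so recent watches dominate."""
    scores = defaultdict(float)
    for topic, age in history:
        scores[topic] += decay ** age
    return dict(scores)

# A week of classical music, then one animal video today:
history = [("classical", d) for d in range(1, 8)] + [("animals", 0)]
scores = topic_scores(history)
```

With `decay=0.5`, the single fresh animal watch carries weight 1.0, while the seven older classical watches together sum to about 0.99, so the one-off wins.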


All the AI stuff is really still incredibly dumb. Same for Amazon: Buy a mixer and suddenly you will be haunted everywhere by ads offering more mixers.


> All the AI stuff is really still incredibly dumb. Same for Amazon: Buy a mixer and suddenly you will be haunted everywhere by ads offering more mixers.

Do we know it's actually dumb?

Like, given that you just bought a mixer, perhaps in an absolute sense, you're unlikely to buy a second mixer. But relative to everything else that you might buy, perhaps you are very likely to buy a second mixer!

There are huge financial incentives to get these ads right. I'm more inclined to guess that Amazon is doing something right than that they're leaving money on the table.


Just spitballing here, but if you just bought a mixer, your odds of buying another mixer are lower than your odds of buying a cookbook or some other related stuff -- which Amazon could totally figure out by mining the other stuff people bought at the same time as, or shortly after, a mixer.
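As a toy sketch of that co-purchase mining (basket data entirely made up), counting same-basket co-occurrences already surfaces complements instead of duplicates:

```python
# Toy sketch of the co-purchase mining described above: count items bought
# in the same basket as a mixer, then recommend the most frequent companions
# instead of another mixer. The basket data is made up for illustration.
from collections import Counter

def complements(baskets, anchor, top_n=2):
    counts = Counter()
    for basket in baskets:
        if anchor in basket:
            counts.update(item for item in basket if item != anchor)
    return [item for item, _ in counts.most_common(top_n)]

baskets = [
    {"mixer", "mixing bowls", "cookbook"},
    {"mixer", "cookbook"},
    {"mixer", "kitchen scale", "cookbook"},
    {"toaster", "bread knife"},
]
print(complements(baskets, "mixer"))  # cookbook ranks first: 3 co-purchases
```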


It's still not necessarily the better idea for them.

Some percentage of people who just bought a mixer will have gotten a product they aren't satisfied with, for whatever reason. That population has a much higher chance of buying another mixer than other people do. And a mixer is probably a far more expensive product than a cookbook, so even a lower probability of a sale can be worth more.

I have no idea if, in an absolute sense, there's more chance you'll buy a cookbook rather than another mixer, but it wouldn't surprise me either way - on the one hand, obviously most people don't need two mixers. On the other hand, plenty of people who buy a mixer will never buy a cookbook in their life, but literally 100% of people who bought a mixer have proven that they were at some recent point in the market for a mixer. I'm not sure which effect wins out here.


> There are huge financial incentives to get these ads right.

Not really. All you have to do is convince companies that machine-targeted ads are worth it. Apparently that is orders of magnitude easier than actually recommending anything relevant.

Which isn't hard to imagine either; there isn't much competition, after all. So the primary purpose of targeted ads is to function as an imaginary hammer you can use to squash new competition.


Alternatively, there are many matching and complementary products for a mixer (or any other product, for that matter).

Just bought a mixer? How about: a new set of mixing bowls, alternative whisks and hooks, perhaps a set of measuring cups, a kitchen scale?

Just bought a new smartphone? How about: an external battery, a screen protector (that fits the specific model), a fancy charging dock?

It really doesn't seem like it would take much intelligence to create these sets of complementary products and then use them for further ad targeting; you could even use AI and ML to create the sets in the first place, so you can fill out your buzzword quota.


You think that's dumb, but I bet the marketing data behind that says otherwise.

You think "I already have a mixer, idiot!" but in reality the chance that you are interested in buying a mixer just went from 1% to 2%. (Because you want another one, because you want to gift one, because you returned yours, etc.)
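A back-of-the-envelope calculation with entirely made-up numbers shows why this can be rational even when the absolute probability stays tiny:

```python
# Back-of-the-envelope check with made-up numbers: even if seeing a mixer ad
# only doubles your repeat-purchase chance from 1% to 2%, the expected
# revenue can still beat a cheaper complement with a better conversion rate.
def expected_revenue(conversion_rate, price):
    return conversion_rate * price

mixer = expected_revenue(0.02, 300.0)    # 2% chance at a $300 mixer -> $6.00
cookbook = expected_revenue(0.10, 25.0)  # 10% chance at a $25 cookbook -> $2.50
```

With these particular numbers the mixer ad still wins despite the lower probability, which is one plausible reason "show them another mixer" survives A/B testing.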


This has been discussed on HN before.

The last time I saw an Amazon employee weigh in, he said the ads are in fact not effective but the ad team isn't measuring it well. In particular, he claimed that he showed them that after adjusting for returns, the order-again rate for a particular expensive item was effectively zero, they asked him to repeat his study for another product, and he wandered off and found work to do for his own team.


My skepticism level remains high. This is billions of dollars, with easy access to tons of data.

I'm not sure if I'd put more stock in the government successfully hiding a faked moon landing, or in analysts at one of history's most successful companies not running the numbers.


I think the problem here is that "recommendations that maximize expected marketing revenue" and "recommendations that maximize user experience" are not the same. Most of us would prefer Google to present us the latter, but they have considerable financial incentives to present the former instead.


> "recommendations that maximize expected marketing revenue" and "recommendations that maximize user experience" are not the same

That, I think, hits the nail on the head.

I think it explains the Amazon case quite well. I'm not so sure about the YouTube idiocy.


The mixer probability maybe went from 1% to 2% but there must be 100s of other products they could show that have much higher probability. Maybe a mixing bowl or a cookbook?


I'll take that bet. Show us the marketing data.

So far it looks just as dumb to me.


Some other online store, but related :)

I once bought crab meat in a can from an online store (hard to get it any other way in Poland).

A banner for buying more canned crab meat kept haunting me for the next two years, in various internet locations :)


I guess that’s proof that consumers overpay for it.


> All the AI stuff is really still incredibly dumb.

and scarily brilliant at the same time. I was looking up NewTek some time ago, of Amiga Video Toaster fame. I googled the Wil Wheaton promo video (Wil worked for NewTek :o), Penn Jillette (who also promoted it), and finally Kiki Stockhammer (the spokesperson). Here is where things get interesting: Google spewed out the 1997 "How To Have Cybersex On The Internet" https://www.youtube.com/watch?v=PBDKEn-TeQg as a result for Kiki. Haha, why would it do that? Turns out this "instructional video" was made using the Video Toaster and features one of its iconic special-effect transitions

https://youtu.be/PBDKEn-TeQg?t=9s

here on Toaster demo reel https://www.youtube.com/watch?v=K1OVWfmynPw&feature=youtu.be...

so what does it have to do with Kiki Stockhammer? That transition is Kiki doing poses on a green screen. Google search somehow connected this information and was able to recognize Kiki Stockhammer's silhouette in a random YouTube meme video :o !

This isn't the first time I've gotten image search results that could only happen if Google was analyzing actual image content. Microchip part searches often return pictures of random boards featuring that particular IC despite there being no BOM list in the HTML; Google is OCRing images and recognizing part numbers. Same with faces, cars, etc.
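A hedged sketch of what such OCR post-processing might look like (the regex and the sample text are mine, not anything from Google's pipeline): once text is recovered from a board photo, picking out part-number-shaped strings is the easy half.

```python
# Hypothetical sketch of the kind of OCR post-processing described above:
# scan text recovered from a board photo for strings shaped like IC part
# numbers (a letter prefix, several digits, an optional suffix). The pattern
# and the sample text are illustrative only.
import re

# at least two letters, then at least three digits, then an optional suffix
PART_RE = re.compile(r"\b[A-Z]{2,8}\d{3,5}[A-Z0-9-]*\b")

ocr_text = "U3 ATMEGA328P-PU   U7 NE555   C12 100nF"
print(PART_RE.findall(ocr_text))  # ['ATMEGA328P-PU', 'NE555']
```

Component designators like `U3` and values like `100nF` fall through the pattern, which is the point: the hard part Google appears to have solved is the OCR itself, not the filtering.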


Read the YouTube recommendation paper (https://ai.google/research/pubs/pub45530) and it will become very clear why it recommends a ton of high-engagement clickbaity content based on a minimal set of recent watches.


Could you give a hint? I'm not seeing it.


This is literally mentioned in the blog post.

> More recently, people told us they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles.


Just checked my feed of recommended videos and it is still full of Tetris and excavator videos that I was into lately. It would be awesome to be surprised with some extraordinary lectures or documentaries similar to stuff I watched 5 years ago, or so.


I have the opposite problem. I'm only interested in new videos for the most part, and my recommendations are filled with videos from months ago. It would be much better if they would give us a flag or some date range options. Obviously this increases UI complexity, but maybe it doesn't need to be that prominent?


Things like this were a constant source of frustration at google. The philosophy there was to streamline the experience for showing ads to the masses rather than provide any features for power users. Feature requests like this were brushed off by labeling the requester "not the target user".

I guess we all know this by now, but google doesn't care about your being able to use its products in a reasonable way. In some sense, this makes sense since I bet the overlap of power users and people with ad blockers is pretty high.


> In some sense, this makes sense since I bet the overlap of power users and people with ad blockers is pretty high.

Maybe you should speak for yourself? I do not run ad blockers (much less anything targeting Google ads!) and the experience I get as a power user on Google services is no better than anyone else's - although, to be fair, it's not noticeably worse, either. (And mind you, the average ad supplier on the Internet is a hell of a lot shadier than Google, Facebook or any of the big "household name" tech companies!) I'm sure that plenty of other people here can confirm my POV.

Google's trouble engaging with power users as of late is entirely self-inflicted. There's quite simply zero doubt about that.


"Obviously this increases UI complexity"

It always annoys me that giving users options they can control is viewed as complex but search engines that return whatever they deem as important aren't viewed as complex.


I think when the GP says new, he meant newest entry in their history. His problem is that watching one unusual video immediately and dramatically biases the recommendation engine at the expense of videos that match his long term viewing habits.

What you’re after already exists, the “recently uploaded” section on the home page.


It's a combination of that and focusing on recommending videos in depth on whatever topic it thinks you're most interested in. I don't want to watch fifty videos about swords. Three is plenty for now. I want to watch a bunch of different videos and shallowly explore a lot of topics, and the YT recommendation system is extraordinarily incompetent at facilitating this exploration mode.


It's terrible. I have to completely log out when I watch the videos that put me to sleep a couple of times a week, because I don't want my suggestions filled with dry history lectures.


> I have to completely log out when I watch videos which put me to sleep a couple of times a week because I don't want my suggestions filled with dry history lectures.

Why not just pause your watch history (or, in case you forget, remove the items after the fact)?

https://support.google.com/youtube/answer/95725?hl=en


If you're on mobile, there's an incognito mode (at least on Android).


If you're on mobile there's alternative ways of watching Youtube videos, at least if you're on Android:

https://github.com/TeamNewPipe/NewPipe

No ads, no direct tracking [1], just videos.

[1] Google can still watch your device downloading videos of course...


NewPipe is great. I'm honestly kind of surprised that it's allowed on the play store.


One of the first things mentioned on that Github page is the following warning:

WARNING: PUTTING NEWPIPE OR ANY FORK OF IT INTO GOOGLE PLAYSTORE VIOLATES THEIR TERMS OF CONDITIONS.

Newpipe is available on F-Droid but that version is sometimes a bit old. The project maintainers suggest the following order of preference for getting access to the latest version:

In order to get this new version, you can:

- Build a debug APK yourself. This is the fastest way to get new features on your device, but is much more complicated, so we recommend using one of the other methods.

- Download the APK from releases and install it.

- Update via F-droid. This is the slowest method of getting updates, as F-Droid must recognize changes, build the APK itself, sign it, then push the update to users.


It's not, AFAICS. It's on F-Droid, though.


Tried it, annoying because it forgets I have YouTube Premium and starts showing me ads. Now I just turn history off forever.


That used to be the case for me, but for about a month now I've been seeing more older content (also, YT has removed the publish date from the right-hand recommendations).

If I want something new, to escape my bubble, I use other platforms like BitChute or PeerTube. Sometimes it's pretty scary, but it's getting better.


I actively avoid watching YouTube videos linked from Reddit or similar sites just to avoid this phenomenon.


Opening almost all YT links in a private/incognito window does WONDERS.


I do this. It has pluses and minuses. The plus is I'm not tracked; the minus is YouTube can't recommend. Sure, if it's a cat video then I don't want YouTube recommendations, but I watch almost all videos in incognito mode to avoid tracking, and I'm not sure which ones I want YouTube to know about and which I don't.

Maybe I'd just prefer to get links from friends


I tried FB video recently and was amazed by its predictions. It was pulling up compelling video from stuff I liked a decade ago that I was more than willing to view, it was kind of uncanny.


I fully understand that experience. I somehow always manage to sucker myself into free-falling into a rabbit-hole (ha, pun) of cute bunny videos.


What really bugs me is when a video literally has "Episode 18" in the name, but somehow the recommendation list on the right side fails to find the one with "Episode 19".

But if I want to go watch Episode 5 again or jump ahead to Episode 26, it's happy to help. Awesome.
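For illustration only, here is how little machinery a "next episode" heuristic would need; this is a hypothetical sketch, not YouTube's ranking logic:

```python
# Sketch of how little it would take to suggest "the next episode": find an
# episode number in the current title and look for a candidate title with
# that number plus one. Purely illustrative, not YouTube's ranking logic.
import re

EP_RE = re.compile(r"(?i)\bepisode\s+(\d+)\b")

def next_episode(current_title, candidate_titles):
    m = EP_RE.search(current_title)
    if not m:
        return None  # no episode number to work from
    target = int(m.group(1)) + 1
    for title in candidate_titles:
        c = EP_RE.search(title)
        if c and int(c.group(1)) == target:
            return title
    return None

titles = ["Sword Forging Episode 5", "Sword Forging Episode 19", "Cat Video"]
print(next_episode("Sword Forging Episode 18", titles))
```

Presumably the real system optimizes for engagement rather than continuity, which would explain why Episode 5 and Episode 26 rank above the obvious Episode 19.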


As I remember at least some series videos used to be recommended in order, but I haven't seen that in a while.

The behavior that really gets me now is having the top recommended videos on my YT homepage be ones I've watched... recently. I can watch a video from my notifications, and then have it stay at the top of my homepage until I hide it manually. Refreshing the page changes some videos, but rarely the top few.

Of course, hiding the video helps but I get tired of having to do it constantly. I think I'm resigned to my YT homepage being useless at this point.


I think this behavior is probably driven by people like me, who use YouTube mostly for music. I repeatedly listen to/watch performances on YouTube because it's the only streaming service I have. I frequently come to YouTube just to watch stuff I've already seen. I find the recommendations in that context to be uncannily good, and they keep me coming back to find new music and to see the cool performances I've already seen. Maybe YouTube has trouble differentiating these use cases?


When I'm programming or gaming I mostly listen to retrosynth and outrun.

For that YouTube is great, because its autoplay just seamlessly plays one track after another.

For everything else, its recommendations are hilariously bad.


It's possible your adblocker is preventing YouTube from tracking your watched video history. I know pi-hole does this by default unless you whitelist certain domains.


The user agent hits Google's random-string.googlevideo.com/videoplayback with referrer, IP, signature, and several other parameters encoding your individual user ID; Google knows every single time who requested a particular resource, and why and when.


Just because they have the data doesn't necessarily mean that they use it. They might rely on some javascript XHR to tell them when a video is really watched. Just guessing here.


Happens to me on non-adblocked mobile all the time; it is infuriatingly dumb. Google seems unable to build user-friendly UIs.

Don't get me started on discovery!


I think when you turn off tracking of what you watch, it warns you this will happen. It does this for me, I think, and didn't before I turned off watch history... but maybe I'm just imagining that. And maybe you haven't turned off watch history.


I wondered how many others experienced this. It really makes using YT demotivating.


What drives me insane is there's no way to mark videos as watched. You can hide them only with "not interested" even though you totally are.


When I mark a video as "Not interested" I get a link to "tell us why", where "I've already watched the video" is an option.


I wrote my own Tampermonkey userscript just for that occasion, to hide already-watched recommendations :/


And there is no way to find Episode 19.

You know the exact date of Episode 18, so you assume Episode 19 was published soon afterwards. So you go into the profile and click Videos.

You realize that the date is not even listed; it just says "x months ago" or "x years ago". And since this video was from two years ago, you have to sift through all the videos of the last two years. And you can't just Ctrl+F; you have to scroll to the bottom to load the next 30 or so videos.

The user interfaces for these kinds of sites are just incomprehensibly poor.


I wish they would fix their iPad app. I'm learning to play guitar with the JustinGuitar channel. Justin has created playlists that match his book, which I bought. But as soon as I play a video from that list, it seems to lose that context.

Plus, on the iOS app there's no way to search within a channel or playlist. If I want to find Justin's lessons on a Nirvana song I have to search "justin guitar nirvana" rather than just "nirvana" when I'm on that channel.


I strongly recommend downloading educational and otherwise valuable content. It solves so many problems, present and possible.


Have you tried deleting the iPad app and using the website?


The app is better than the website in other ways so I'm reluctant to do that. I actually called YouTube and talked to somebody about this. Whether or not that makes a difference, who knows?


That's even worse when you make videos, especially when people search for something episodic, like a let's play.

I'd much rather have YouTube recommend episode 1, or the first video of the playlist, but instead it gives people a random one in the middle. This makes viewers mad, and they press thumbs down. It also drags the "average viewed minutes" drastically down.

That's probably the source of most of my thumbs-downs. It also causes ridiculous stats, like episode 27 of 50 having 100 times the views of the first episode but no more view time.


That recommendation issue you point out is extremely annoying, since YouTube has no way to reverse the order of videos in a playlist. More often than not, for some reason, Let's Play and other chronological video authors create their playlists in reverse chronological order, latest first, so it's basically impossible to binge them unless you manually click the next episode yourself.


I make mine in chronological order!

It's actually just a playlist setting for the creator, but the default is reverse, and that makes the whole list useless. There are even Chrome and Firefox plugins to fix it.

I also link the playlist in each YouTube description, but unfortunately those links are not so easy to spot on app/TV/game consoles.

Also, in this case I think it's OK to youtube-dl the list and play the videos in order locally.


It's eternally appreciated when creators consider doing that!

Unfortunately I use Safari so it's not an option for me, sadly one of the very few downsides to using Safari is relatively weak extension support.


You've summed up my frustrations well.


What would be even better is if there were somewhere you could just specify which episode in a series a video is, rather than having YouTube try to guess based on titles.


Youtube creators already have the ability to create series inside playlists.

Example: The Slow Mo Guys channel has a "Planet Slow Mo" series: https://www.youtube.com/playlist?list=PLbIZ6k-SE9Shmj0Uvxtrv...


The problem is when you first find a video that's interesting you don't necessarily find the playlist that it belongs to. (or worse, you find it in someone else's playlist which includes the video along with other seemingly random videos that the playlist creator was interested in)


This is not available to... well, almost anyone. Seriously, try searching for it. Every permutation of your search terms will bring you back to one of three results that all heavily imply or outright say that it's available only to a select handful of channels.

It's not even subscriber-gated or verified-channel-gated; it's "someone at YouTube for some reason flicked the switch on for this channel"-gated.


Sometimes they're in a playlist, but sometimes I'm not watching via the playlist. Like if I know that episode 19 is the next one I need to watch, I go in search and find "Super cool series episode 19", and then I watch it.

I don't search for the playlist, then go to the playlist and scroll through to the episode I want, then pick it.

But then since I didn't go to the playlist, now YouTube can't recommend the incredibly obvious next video in the series.


there are times that "Episode 19" does in fact exist but even when I search for it, it finds NOTHING


I wouldn't be surprised if it had been A/B tested and rejected because it decreased engagement. I'd love to know if, and why.


We currently don't allow our son to watch anything on YouTube, simply because it's a cesspool of horrible videos for children. You're almost always one click away from some violent video where cartoon characters are getting their limbs chopped off, etc. And it's sickening that Google's done nothing about it.

Secondly, I can't watch any video without it taking over my recommendations. Alex Jones, for example: I still can't watch an Alex Jones video (I was curious about what rubbish he spews) without worrying that I'll get a bunch of delusional hyper-conservative drivel taking over my recommendations.

There should be a way to watch a video once and not have it take over your recommendations.

On the one hand, we find that we're making great strides in certain areas with ML. On the other hand, recommendation systems are still just so, so, so naive and bad.


What do you want them to do? I'm an adult who uses YouTube and I don't want sanitized kid-friendly recommendations just so that you can protect your son's eyes from cartoon violence. In fact, several of the creators I like have been pinched hard by changes to the All-Powerful Algorithm which seem to be deliberately de-emphasizing anything that could be remotely considered offensive, and it's not fair to them at all.


There should be a "parental controls" toggle.


YouTube Kids is a separate app. Does that not suffice?


No, YouTube Kids has the same problem. Content on it isn't really vetted in any way.


Have you used YouTube Kids recently? You can set it to "Approved Only Mode" which only allows kids to view human-vetted videos.


That's good to hear!


Employ human-curated collections, no algorithms. There is simply no other way to guarantee kid-friendly content in a space; otherwise people will game the algorithms and you get Elsagate.

I've talked about this before, but if you have kids, the only surefire solution is curating the videos yourself, downloading them, and then playing them offline. Anything else is a gamble, which may or may not be the route you want to take.

YT cracking down on videos also has a lot to do with skittish advertisers. I'm not sure a solution is possible other than utilizing Patreon, Liberapay, etc. Even if you don't make "controversial" content, that's what most people do these days for consistency, along with selling merchandise.


YouTube Kids is a separate app and it has an option to only allow human-vetted videos. From the support site:

"In this setting, your child will only be able to watch videos, channels and collections that you have hand-picked. Collections are videos and channels grouped by topics, such as science and music, picked by the YouTube Kids teams or our partners."


Sorry to be so cynical, but I do not trust youtube whatsoever. Do you trust random "partners" of youtube? I don't.

YT has proven themselves incapable time and time again when it comes to dealing with human support and services. I cannot read PR text from them and just believe it.


How about employing 10 people on this massive platform to curate some channels that people can easily subscribe to? They could literally be advertised as kid-friendly, not dumb, etc.


If you delete a video from your Youtube watch history, it doesn't get used for recommendations anymore.

https://support.google.com/youtube/answer/95725?hl=en


My anecdotal experience agrees. I cleared my watch history, and all my recommendations changed from tech-related to more pop-culture-oriented for a few weeks.


It's a lie. I've had YT watch history disabled for, hmm, 2 years now, and recommendations "work" just the same.


Disabling it != Removing a single item from the list.


" When you pause history, any videos that you watch while history is paused won't show in history and won’t be used to improve your recommendations."

is a lie


Unless you have history disabled entirely.

Why can't we opt out completely from "recommendations"?


>> There should be a way to watch a video once and not have it take over your recommendations.

Log out.


You can't log out of the YouTube Android app anymore. There's only an incognito mode.


Newpipe


Or if you want the official experience: https://vanced.app


Don't use the app


Even if you logout, your anonymous session has its own set of recommendations. No?


Disable cookies


You will get served "interests shared by users from that particular IP", with Google helpfully assuming it's a single household.


Yeah, those Tom & Jerry cartoons are HORRIBLE.


Thanks for sharing your personal opinion about videos you've never watched.


Nobody needs to watch an Alex Jones video to have an opinion on the man. He is a garbage person who spreads lies and propaganda. This isn't an opinion, this is Jones' own admission in court of what his program represents.


Yeah, and similarly nobody needed to bring him up in this thread at all. But don't think too hard about that.


It's worth noting that this blog post appeared less than 24 hours after BuzzFeed News posted an article explicitly highlighting the conspiracy theories in the recommended section: https://www.buzzfeednews.com/article/carolineodonovan/down-y...

I helped contribute to that article, feel free to ask any questions!


Have you investigated their rabbit hole for children? YT for Kids simply needs to die. They are evil.


Tangent: I'm expecting a kid, and the time will come, in a couple of years, when I'll want to show them kid videos. I fully intend to set this up as streaming from a library curated by my wife and me, populated with youtube-dl and through other means.

I know some people on HN have such setups; could you share recommendations for a hardware and software stack? I suspect a Raspberry Pi may not be enough (its network and USB share bandwidth), especially if we want to simultaneously stream movies for ourselves. But I also don't want to turn it into some $1k+ server build, the way they do on /r/plex and elsewhere. Could anyone recommend a "compromise" setup that would allow us to comfortably stream HD video to 2-3 devices simultaneously?


Whatever you do, keep the kids off youtube unless you want to deal with nightmares and other unpleasantness.

(I have a seven year old and a three year old)

In the end we just settled on Netflix and the national broadcasting company (NRK), but I do have a Synology NAS with some stuff that I can stream to mobiles/PS4 via DLNA or apps, which would do most of what you want.


If you're in the United States, 90% of the population can receive the 24/7 PBS Kids channel over the air for free.

If you're worried about the quality of content from YouTube, or the chances that someone gaming the system dumps questionable content into your stream, it's worth spending ten bucks on an antenna. It's also good for supplementing your child's YouTube viewing when YT starts showing the same stuff over and over.


Not in the US; unfortunately, I don't think Poland has a dedicated government channel for kids.

Still, I want to make it video-on-demand, but with content fully curated by us - to avoid age-inappropriate content, ads and recommendations. There's no way I'm going to let my kid touch raw YouTube in the next decade.


We do, tv-trwam.pl


Hahaha, no.


The PBS kids website has a lot on it as well, all ad-free.

Also PBS kids apps, ad-free as well.


It may not be much help, but I've found minidlna to be a reliable, simple, lightweight media server on my LAN. I expect it could run well on a low-power device. Any DLNA/UPNP device can stream from it.
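For reference, minidlna needs very little configuration. A minimal /etc/minidlna.conf sketch (the paths and friendly name are example values, not defaults):

```conf
# media_dir takes an optional type prefix: A=audio, V=video, P=pictures
media_dir=V,/srv/media/kids
media_dir=V,/srv/media/movies
friendly_name=Family Media
inotify=yes        # pick up newly added files automatically
port=8200          # HTTP port the DLNA clients connect to
```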


Thanks, I noted it down and will check it out.


I don't have much to add, other than I have had some good results with using the application "4k video downloader" to grab playlists on youtube. I'm not a fan of the generic name, but it does what it sets out to do and is fairly cheap. I'm sure everything can be done with youtube-dl, but I don't have a ton of experience with that.

I am able to get most of the videos downloaded and converted using that application. Sometimes I have to go and manually get a couple of videos that failed to download, but the app has served me well.


Using ytdl is easy: `youtube-dl $url` grabs the video at the highest quality. Playlists can be downloaded this way too.
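To sketch the playlist case (pure string-building; actually running the command requires youtube-dl to be installed, and the playlist ID below is a placeholder), numbering files by playlist index keeps local players in watch order:

```python
# Pure string-building sketch; running the resulting command requires
# youtube-dl to be installed. Prefixing each file with its playlist index
# makes alphabetical order equal watch order in any local player.
import shlex

def playlist_cmd(playlist_url):
    args = [
        "youtube-dl",
        "--yes-playlist",  # download the whole playlist, not just one video
        "-o", "%(playlist_index)03d - %(title)s.%(ext)s",
        playlist_url,
    ]
    return " ".join(shlex.quote(a) for a in args)

print(playlist_cmd("https://www.youtube.com/playlist?list=PLAYLIST_ID"))
```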


Agreed! What I have found is that most legitimate popular YT kids content (Blippi, Pinkfong, etc.) is also available repackaged and included in Amazon Prime. I've been using that for most kids content instead, as it's not just sponsored toy unboxing crap and I don't need to worry about weird auto-playing recommendations or commercials.


Elsagate videos would be a good next step; I suppose they wouldn't be covered by YT's announcement.


Have you used YouTube Kids in the last ~6 months? They made a lot of changes and enabled a mode where only human-vetted videos from topics/creators you select can play. I don't have kids but had assumed that problem was largely addressed.


No, it has not. An average person wouldn't have time to track the unknown changes thrown at them by unknown developers for their kids' viewing experience. Nobody should trust YT Kids, because its primary motivation is not kids' safety or education. When my daughter uses PBS Kids or Khan Academy Kids, I don't have to worry (relatively) about what they're throwing at her. YT, on the other hand, is a pure disaster waiting to happen to parents if they don't know (I was in that boat for some time).


I see the Sinicization of US technology & media continues apace, and we're now thoroughly into the "encourage virtue, chastise vice" phase.


So when did you guys first notice twitter screaming about it?

For me, at least right around the 2016 election...


How does one report videos to YouTube that violate their policies and actually get them to do something about it? It appears one can claim that the Holocaust isn't real, or that people don't get killed in mass shootings in the United States, all they want and keep their monetization / Superchats.


Unfortunately, that's all you can do. The real remedy is shining sunlight on the policy violations, which is why the journalism is so important.


What's the problem with videos like you mentioned?

Do you think all videos in which people are wrong about things should be taken down?

If somebody writes on their blog that they don't think the holocaust is real, should their blog get taken down, or is there something different about it being a video?

In my opinion, YouTube has gone much too far towards the side of taking down videos rather than leaving them up. We don't need to be calling for them to be even more strict with removing videos that people don't like. If you don't like it, you don't have to watch it.

If your goal in taking down conspiracy theory videos is to kill the conspiracy theory, I think that may actually be counterproductive. If the video stays up and everyone laughs about how stupid it is, nothing bad happens. If the video gets taken down then it just provides them an opportunity to shout about how they're trying to spread the truth but they're being silenced by the establishment.


The counterargument here is that "everyone" doesn't laugh about how stupid these videos are. Many people do, maybe most people, but some go, "Well, yeah, it's kinda crazy and stupid, but it makes you think." Fringe ideologues may upload videos to YouTube in part to hear themselves speak and to preach to the choir, but they also expand their audience and reach.

Do you think all videos in which people are wrong about things should be taken down?

Suppose Video A posits "your computer has a little gremlin inside of it and you should shove an oatmeal cookie into the CD slot at least once a month to keep the gremlin happy" and Video B posits "the Jewish-controlled mass media is conspiring to make us believe school shootings are a real thing and you should stockpile weapons for the coming race war." They are both technically "videos in which people are wrong about things," but isn't there a qualitative difference between those two things? Don't you think it's defensible for YouTube to take down Video B without taking down Video A?


I think that videos saying kids didn't die in school shootings should not be allowed to make money, especially when they try to get people riled up to harass the survivors and their parents.

Furthermore, I expect Google to apply their policies evenly, which they don't. If violations of Google's ToS are reported, they should be taken care of, don't you think?

It isn't like you can actually debate them on their channels either, as they delete comments that don't support them the vast majority of the time.


Youtube has made significant efforts to not show ads on content that is deemed unsafe, so those videos generally aren't making any money.

You seem to be against people putting out content that doesn't align with your views, which is something completely different.


> You seem to be against people putting out content that don't align with your views

Not GP, but I see this a lot as a retort to people who make the claims they did and I tend to call BS on it.

I strongly suspect there is an extremely broad library of "content that doesn't align with their views", including an extremely broad spectrum of political views, they would have zero issues with on YouTube.

Somehow, climate denialism, antivax content (and other health hazards), holocaust denial, general nazi shit etc are consistently not part of that.

Now GP can come in, correct me and tell me that they want everything they ever would disagree with off Youtube for good. But somehow, I suspect they won't.

So, tangential to your post: why is there such a consistently-easy-to-define line for content that a lot of people think should be kept off various platforms? And why the hell is it that, whenever I see someone defending such content, it's almost always someone who belongs in the category of people who believe in said content, rather than a staunch defender of free speech?

I ask this knowing full well there are many people I hold in very high esteem, who legitimately do defend extremely vile shit they disagree with, based on free speech principles. Most of those people I personally know work at the EFF and I've never heard them say much about deplatforming nazis. (could it be because free speech is a government thing, not a youtube thing …)


> why the hell is it that, whenever I see someone defending such content, it's almost always someone who belongs in the category of people who believe in said content

Since nobody here has given any indication that they believe in these ideas, it's probably because you simply assume that anyone defending it believes in it.

Possibly also because the concept of not wanting to take down objectionable videos is a "weird" idea, and people who are willing to take weird ideas seriously probably have lots of other weird ideas.


YouTube suggestions are worthless now. Used to be if you were watching "Jack cooks a 4 course meal part 2" the recommended videos would be "Jack cooks a 4 course meal part 3", Part 1, and some other videos by the same user. Now it's a host of other unrelated bullshit.


A few weeks back I got a weird fan biopic of Himmler created by some account with the SS logo in their picture recommended to me after watching a video on some obscure DNS features. I reported it but it's still there.

Google is beyond broken. Chasing those engagement numbers on a forever treadmill, just hoping they inch up a little more at any cost.


I still get the old-style suggestions from time to time. It seems like the functionality is still there to display videos related to the one you're watching, but half the time it displays the same mix of recommendations you get on the youtube homepage. I'm not sure what it is that triggers the related video recommendations instead of the "personalized" recommendations.

I went through a "This Old Tony" binge a couple months ago because I stumbled upon one of his videos and YouTube kept recommending more. But that seems to be the exception rather than the norm.


I remember that! That system was so sensible, it was great when I was following the lessons on ExcelIsFun/VBAisFun. The next video in the series would play seamlessly.

Now, thanks to the all powerful algorithm, I can watch a video about a rescued pitbull living with his new family, and have it followed up by "10 times Ben Shapiro owned the libs".


it changed from being “users who watched this video watched X next” to “You would like Y videos based on your watch history” (somewhat biasing the first 2 or 3 to be related to current video)


I like that it seems heavily based on the watch history because it gives me some control.

In general I find the selection on the homepage to be useful; however, it does take some work, as I regularly remove a lot of random things.

I also open random videos in a private window to keep my watch history more focused.

So it works fairly well for me, bringing up videos from channels I like that I haven't seen before... but it takes effort to get it to work for me. So it is arguably not very good for a typical user.


Good point, I hate how multi-part videos aren't automatically queued up. I also don't like getting videos queued that I've already seen.


I was curious why this is happening now. Someone mentioned it may be because I was using an adblocker or had privacy settings on google too high, I haven’t checked.

Currently I see absolutely zero related content next to the current video I’m watching, from the current user or the subject matter.


It does seem to be getting better. For a long while I had to make sure to clear my history of any conservative-leaning videos or I'd end up down a rabbit hole. Too many and all of a sudden it's bloody flat earth videos and other insane conspiracies. Made it very difficult to try and watch a balanced set of videos.


Personally, I find that kind of meta-information useful.

It's mildly annoying to not have better suggestions, but on the other hand, it gives you a perspective on what kind of people are fed what kind of info.

Kinda like the old email chains my mother used to send me. They allowed me to understand her fears, hopes, and gaps in understanding about things I never had the chance to talk about with her before.


Yep. I like to watch some videos of varying viewpoints and political leanings to avoid falling into an echo chamber.

But more than one or two conservative videos, especially about European current events, and you are sucked into the Nazi propaganda hole.

Another inescapable pit of content is electronic music production.

I was doing a bit of research on the Teenage Engineering pocket operators as a possible gift for my younger brother. Now I'm almost exclusively recommended music production videos.


The way the algorithm pushes steadily more radical content is really concerning. Try making a completely fresh browser profile and deliberately browsing around right-leaning or "anti-SJW" channels and see how long it takes before YouTube starts pushing videos about white genocide and the impending destruction of western civilization by the muslim hordes (not very long in my experience).

edit:

I just tried this again and not much has changed. Started by viewing Joe Rogan's interview with Elon Musk on a fresh profile. The first suggestion from there is a Jordan Peterson anti-feminist video, from there we get a Jordan Peterson anti-islam video, then "The Suicide of Europe" by PragerU, and from there the floodgates are opened to fear-mongering about "rapefugees". That's less than a dozen clicks on highly related videos to get from Elon Musk smoking weed to videos with an overtly racist agenda, on a profile that's never sought out that kind of content previously.


> The way the algorithm pushes steadily more radical content is really concerning. Try making a completely fresh browser profile and deliberately browsing around right-leaning or "anti-SJW" channels and see how long it takes before YouTube starts pushing videos about white genocide and the impending destruction of western civilization by the muslim hordes (not very long in my experience).

Seems like they're trying to profile the user for recommendations too early, or just generally suggesting based on too small of a sample of videos. The blank profile is like a knife balanced on edge, where the slightest push will tip it one way or the other. If they'd just hold off the personalized recommendations entirely until you had 100 or 200 videos, it might be better.
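The "hold off until the history is big enough" idea can be sketched in a few lines. This is purely illustrative (the threshold and function names are hypothetical, not anything YouTube actually does):

```python
# Hypothetical sketch: fall back to generic, non-personalized
# recommendations until the watch history carries enough signal.

MIN_HISTORY = 150  # somewhere in the 100-200 range suggested above

def recommend(watch_history, personalized, generic):
    """Return personalized picks only once the profile has enough data."""
    if len(watch_history) < MIN_HISTORY:
        return generic(watch_history)      # popularity/topic-level defaults
    return personalized(watch_history)     # full per-user model

# A brand-new profile gets generic results, not knife-edge guesses:
picks = recommend([], personalized=lambda h: ["niche pick"],
                  generic=lambda h: ["trending"])
# → ["trending"]
```

The point of the gate is that below the threshold the "knife balanced on edge" never gets a chance to tip: every new account sees the same neutral defaults until its own behavior dominates the signal.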


It sounds like they are overfitting the recommendations.

They really need to get the user's explicit preference, rather than trying to infer it from what videos they watch, as that data is tainted by the recommendation engine.

Also they have killed the annotation system that most video creators have used to manually recommend videos to their viewers, which is probably going to cause even more overfitting of recommendations.


Ok, I'll bite. What exactly did Google do wrong here?

I'd be persuaded if you could show that a random walk through the recommendations ended up in far-right opinion some disproportionate percentage of the time. What you did was a goal directed search. At every point you selected the most right leaning perspective, and in the end you got where you wanted to go.

The only way to stop this from happening is to ban objectionable opinions from the platform entirely, or alternatively to disconnect them from the graph of recommendations. The latter is what YouTube's "limited state" already does.


From the post you're responding to:

>The first suggestion was...

It doesn't sound like they were choosing the most right-leaning perspective, but rather the first on the list (the one that would play after your current video finishes if you have autoplay enabled).


Well, I tried to reproduce it with a private browsing window.

That Jordan Peterson video was in the #4 position in the sidebar from the Musk interview. However, it was the first video that did not include Joe Rogan. So far so good.

Thereafter, clicking the first recommended brought me to another Peterson video, then Gordon Ramsay, and then into an endless loop of Kitchen Nightmares clips.

Certainly, there's some stochastic element here, but I'm still not convinced.


Interestingly, YouTube actually recommends liberal content at a substantially higher rate than conservative content. Centrist content links to liberal content about three times more frequently than it links to conservative content. Is there evidence to corroborate the claim that YouTube is pushing people to extreme content? The data seems to indicate otherwise.

https://www.thepostmillennial.com/does-youtube-facilitate-ri...


Sure, if CNN is "liberal", and QAnon is "conservative".


I'm not sure I follow. The implication of your comment seems to be that extremist content is more conservative. On the whole I agree. Which is why the fact that YouTube steers people towards conservative content less than centrist or liberal content broadly contradicts the claim that YouTube is steering people to radical content.


The extremity doesn't seem to be the same in my experience. It could be because I lean slightly more that way, so it's able to get a better handle on my preferences, but I don't find I get suggested the insanity of the left when I watch more liberal videos the way I do when I watch conservative ones. It's a tricky thing to measure, though.


This doesn't match reality. I've watched lots of conservative videos and I've never seen anything about flat Earth or "insane conspiracy theories".

Honestly this comment thread seems like trying to build a narrative bridge between straightforward right-leaning content and bizarre conspiracies. The goal is to use censorship of bizarre conspiracies to justify censoring right-leaning content by conflating the two.

I'm very suspicious of YouTube's intent here. Given how vicious Google is towards conservatives inside their organization (see Damore) it's pretty obvious to suspect that they'll use their power to reduce the spread of right-wing ideas of all stripes. Put simply, after that display, nobody can trust them to be even-handed. Most of them are good people but they're totally dominated by the intransigent minority [0] of high-and-righteous recreational witch-hunters in their midst.

The first excuse will be that they're insane conspiracies. Truly crazy videos will be censored. But that's just the bait, setting the narrative for the switch, where the narrative becomes about "racism" and they start censoring the right-most 10%, 20%, 30% of the opinion spectrum. Criticism of Islam, support for Western culture, statements against anti-white bigotry, arguments for reduced immigration (even illegal immigration), arguments for equal legal treatment for men, arguments about biological differences between sexes or population groups, arguments about deep differences between religions will all be censored under this ever-expanding umbrella.

Even the notion that "conspiracy theories" are a right-wing thing is part of this - there are lots of left-wing "conspiracy theories" and always have been. Some openly racist and sexist ones are quite widespread right now.

[0] http://fooledbyrandomness.com/minority.pdf


This is exactly my experience. I've watched many videos from conservative content creators on YouTube to understand their perspectives, and by and large I found them to be reasonable and not dissimilar in their balance to the content I usually watch. I did not find conspiracy theories, flat out lies, or other extremist content either nestled within those videos or recommended to me in other videos. They may exist out there, but I haven't been exposed to them.

The Buzzfeed article (and others like it) is ultimately just a collection of anecdotes. It is not data that reflects my reality and I expect it does not reflect reality for many of YouTube's users. However, it provides a lot of confirmation bias for those who look at the existence of content/perspectives they disagree with and have an urge to want to erase that content wholesale. It feeds into what seems to be a crisis manufactured by some activists on Twitter and some news outlets (like Buzzfeed).

And of course, it dehumanizes the right and their experiences/perspectives by associating them with terms like 'radicalization', or 'conspiracy theory', or 'fake news', which are probably not only exceedingly rare but also present on the left.


> But that's just the bait, setting the narrative for the switch...

Are you sure you’re not watching too many conspiracy theory videos?


The strange thing about slippery slope is that it is both a logical fallacy and an effective means to accomplish political objectives (of course one does things incrementally)

Left-wing conspiracies are usually narrow in scope: the Republican party, the Koch brothers, George Soros, Dick Cheney, the oil industry. It's not really the same as an intra-national conspiracy that transcends party lines.


Conspiracy theories are, for the most part, a right-wing thing.

Unless you can give me examples of conspiracy theories, and their peddlers on the level of Alex Jones, for the far left.


The anti-vaxers are from all over the political spectrum I am sad to say.


The most notable anti-vaxxer is our President.


From my personal experience they've always tended to be further right, since the anti-vaccine fear often comes from mistrust of government as well as of scientists. Very similar to climate change deniers, in my opinion.

Though it's true that I see anti-GMO and even anti-nuclear sentiment across the spectrum, those differ a bit from conspiracy theories in that they stem from junk science rather than a belief in a hidden government or group wanting to personally change you.


My experience couldn't have been further from yours. Most of the anti-vaxxers I encountered, both IRL and online, were of the kind that is all about natural remedies and distrustful of GMOs and such (as opposed to those distrustful of government and believing in conspiracies; that kind I mostly encountered online only).

Just google for outbreaks in the US that are linked to anti-vaxxing and take a note of their locations. Most will start in heavily left-leaning areas. To clarify my point, I personally believe that anti-vaxxing is about evenly distributed between left and right wing, just for different reasons.

P.S. As I was trying to find an article about an outbreak in Seattle that I remembered from a few years ago, I realized there is another one happening right this moment https://www.seattletimes.com/seattle-news/an-anti-vaccinatio...


Ever looked into anti-vax? (There is some on the right, but a lot of it is on the left.) How about the impeach-Trump stuff? Organic, anti-GMO, alternative medicine. Plenty of left-wing nut jobs there. (Some of them have a right-wing component as well.)


GMO/Monsanto, Big Petrol & all their wars, Putin controls XYZ, "The Oligarchy", "The Patriarchy", "The Privileged", Plastic straws from Europe killing turtles in the Galapagos, "Systemic Racism", The Gender Gap, etc.


I'm gonna need you to explain how 'The Patriarchy', 'The Privileged', 'Systemic Racism', etc. are somehow conspiracy theories, considering they're more like social critiques of very real issues people face.

Unless you mean things you don't like are conspiracy theories in which case we might as well just call everything a conspiracy.


Being poor is a very real issue somebody can face. Attributing this state of affairs to the intentional actions of a coordinated group of people, without real evidence that this is the case, is a conspiracy theory.

It seems to me that 'The Patriarchy' / 'The Privileged' on the left are conceptually quite similar to 'The Deep State' or 'The Jews' on the right. And those are most often described as conspiracy theories, and rightly so.


If you think they're conceptually quite similar then I don't know what else to say except that you would just be objectively wrong.

The difference between a conspiracy theory and a social critique is the personal nature of it. People who believe that 'The Jews' or 'The Deep State' are responsible for everything believe that there are specific actors, people doing things explicitly to stop someone.

When people complain about the patriarchy or systemic racism, they're complaining about a large tangle of social contracts and norms that result in certain people, races, genders, etc. being disenfranchised. For example, it's a statistical fact that a black man smoking marijuana is not only more likely to be thrown in jail but also more likely to face a harsher sentence than a white man smoking marijuana. What else would you call this, other than systemic racism?

You can't point to a similar statistic, study or what not that shows 'The Deep State' is specifically targeting Donald Trump or that 'The Jews' are attempting to kill all white people. You can, however, point to specific actions and trends the FBI has taken in the past to disenfranchise black people, including actions taken against Martin Luther King Jr.

One side explicitly has a factual basis behind it. The other does not. That's what makes one a conspiracy theory, and the other something worthy of scrutiny.


> You can't point to a similar statistic, study or what not that shows 'The Deep State' is specifically targeting Donald Trump

It usually gets more personal at the extreme. People will show you stats showing that a disproportionate majority of federal employees support the Democratic party and infer that there is 'systemic bias'; some will point at specific cases and make leaders of a hidden conspiracy out of them. I've seen enough of 'Kill All Cops' to know that there are similar extremes on the left.


You seemingly avoid the core point here, which makes me think you're not actually here to argue in good faith.

Yes, there are going to be people who take anything to an extreme. That does not make 'Systemic Racism' a conspiracy theory, especially not on the level of things like the 'Deep State'. Arguing otherwise is, to me, the height of delusion.


If they could simply make the "Not interested" feature work that would be great.


Seems to work fine for me. You just have to click "not interested" on every video of the topic that you don't like and after 5 times or so it stops recommending that topic.

Let's say you watched a video called "Top 10 most amazing things you never knew until today" and now the youtube algo recommends you a bunch of clickbait. Just hit "not interested" on every single clickbait video you see and eventually it stops showing you those videos unless you watch more of them.


Or delete the video from your watch history. Your recommendations are built based on your watch history.


I would like a stronger option of blocking certain accounts that keep getting recommended.


I would like a SpamBayes-like detection and JS modification add-on.

I would flag a video title as clickbait, the algorithm would learn what I think, and then when it sees a similar title in the recommended list it would remove that div from the page.
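A rough sketch of that learning step, assuming a naive Bayes-style scorer over title words (all class and function names here are hypothetical; a real add-on would run this logic in a browser extension and hide elements whose titles score above a threshold):

```python
# Learn per-word clickbait odds from titles the user flags,
# then score new titles by summed log-likelihood ratio.
import math
from collections import Counter

class TitleFilter:
    def __init__(self):
        self.clickbait = Counter()  # word counts in flagged titles
        self.ok = Counter()         # word counts in accepted titles

    def flag(self, title, is_clickbait):
        bucket = self.clickbait if is_clickbait else self.ok
        bucket.update(title.lower().split())

    def score(self, title):
        # Add-one smoothing so unseen words don't blow up the ratio.
        n_cb = sum(self.clickbait.values()) + 1
        n_ok = sum(self.ok.values()) + 1
        s = 0.0
        for w in title.lower().split():
            p_cb = (self.clickbait[w] + 1) / n_cb
            p_ok = (self.ok[w] + 1) / n_ok
            s += math.log(p_cb / p_ok)
        return s  # > 0 means "looks like clickbait": hide that div

f = TitleFilter()
f.flag("top 10 shocking facts you won't believe", True)
f.flag("lecture 3 linear algebra basics", False)
```

With even this tiny training set, `f.score("10 shocking facts")` comes out positive and `f.score("linear algebra lecture")` negative, so the extension could remove the former from the recommendation sidebar and keep the latter.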


Yep, it feels like they completely ignore it, right up to the point where they just re-recommend the same videos.


You should make a video detailing why you have this theory and post it on youtube.


I pledge to rebut with a counter-theory, if you make this video detailing yours.


Long overdue. Maybe in a few months I can watch a Joe Rogan podcast video without tainting my YouTube recommendations for the next year with Ben Shapiro-style clickbait.


Youtube OBLITERATES elektor's recommendations with Logic & Reason!


On the other hand, I wouldn't know who Ben Shapiro is without having watched Rogan videos. It's important info to have, because it tells you what people are interested in, and what they believe in. You can't communicate efficiently without knowing that.


I don't know who Ben Shapiro is so I'm just using him as an example and maybe he is noteworthy, but if the system tells everyone to watch Ben Shapiro videos of course everyone is gonna know who he is. Maybe he wouldn't be 'required knowledge' if youtube didn't force it and reinforce it on everyone.

The current system doesn't tell you what people are interested in, it tells you what the system made the people pay interest to.


Well, I assume it recommends what other people that watched similar videos watched as well.

I doubt Ben Shapiro is 'required knowledge', nor that he became popular only through youtube. Those people usually have an IRL crowd. They have IRL impacts.

But I see your point.


> Well, I assume it recommends what other people that watched similar videos watched as well.

The problem is that it creates a feedback loop and amplifies whatever is slightly overrepresented.

I also noticed this on LinkedIn: on many high-profile accounts, the related section contains profiles of pretty girls with unrelated positions, just because statistically they get more clicks, I guess.
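The compounding is easy to see in a toy model (purely illustrative, not any real recommender): if the system surfaces only the currently more popular of two near-equal items, a small initial edge snowballs.

```python
# Toy model of a recommendation feedback loop: two items start nearly
# equal; each round the recommender surfaces only the more popular one,
# so it captures all the new clicks and its small edge compounds.

def simulate(pop_a=51.0, pop_b=49.0, rounds=50, clicks=10):
    for _ in range(rounds):
        if pop_a >= pop_b:
            pop_a += clicks   # winner-take-all exposure
        else:
            pop_b += clicks
    return pop_a, pop_b

a, b = simulate()
# Item A's share of attention grows from 51% to ~92% (551 vs 49),
# purely because it started 2 points ahead.
```

Real systems add exploration and decay to damp this, but any exposure rule that rewards current popularity has some version of the same dynamic.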


>The current system doesn't tell you what people are interested in, it tells you what the system made the people pay interest to.

I agree with you 100% here. But this problem can't be fixed. All that's about to happen is that Google will decide people should be interested in something Google finds more palatable. I don't know how to fix/escape this problem.


To take the position that people only believe stuff because it is fed to them is incredibly patronizing. Sure, the fundamental dynamic of social media (that immediate and direct feedback about what captures user attention is used to determine what users see) makes things more polarized, but people are polarized to begin with. Conservatives are conservatives because they have a fundamentally different outlook about the world, not because they just happened to fall down a specific click hole on the internet.


You're basically saying that well-done propaganda doesn't work, which is patently false.

Watching polarizing YouTube videos isn't going to flip a bit, but constant exposure to a single, constantly reinforced perspective is certainly going to influence the formation of your beliefs. That's exactly what YouTube does with its recommendation system.


No, that's not what I'm saying. I agree that propaganda and social media are problematic. But they can only be a problem if people pay attention. And nobody is forcing people to pay attention. Which gets to my point: people have pre-existing notions and inclinations before propaganda and social media can even have an effect.


I have not taken that position. To misrepresent the views of others to make your argument easier is very patronizing.

My post was about notoriety, not belief. Many things, trends, and personalities are created by the media, whether you believe what they say or not. Flat earth conspiracies would never have been repeated by NBA players if not for YouTube's recommendation algorithm.


Way before the internet, I remember people talking about how Marilyn Manson had surgery to be able to suck himself.

I also remember watching in disbelief as people said the sun was going around the earth.

The internet is just a reflection of human nature. Flat earth theory would have existed without it.


Maybe so, but it now enables global reach at the speed of light. You can't argue that the danger from a stick of dynamite is the same as the danger from a nuclear bomb. You might not care about the difference if you're standing next to it, but they are not the same thing.


Same! I touched a Rogan podcast and ever since I've been getting recommendations for a certain JP.


Yeah, and maybe some day I'll be able to do anything on the site whatsoever without it shoving CNN videos in my face.


However, who determines what is a conspiracy? The government might decide that Iraq has WMDs and that any evidence to the contrary is a conspiracy. While some measures need to be taken, who decides which? No one is truly impartial. The censorship (while required, given how much crap is on YouTube) might devolve into 1984 without complete transparency, which we all know Google isn't providing.


Honestly, every time I hear an argument along these lines, that any attempt at moderating content will inevitably devolve into 1984 style censorship because all terms are arbitrary and "what even is x?", it becomes less convincing.

Unless the US government bans all sites except google and turns Youtube into a direct propaganda tool and declares that publishing any unauthorized content is a crime, then the worst that can happen here is that Youtube's updated algorithm fails to serve the needs of its userbase, in which case it will probably be amended due to a drop in the site's popularity and engagement metrics.

Content that isn't recommended hasn't been censored, it's still discoverable as long as it exists on the platform, and free speech remains unaffected even when a single platform decides to alter its algorithm in a way that might slightly reduce the immediacy of certain kinds of content.

Not all slopes are slippery.


You say "a single platform" as though there is another viable platform.

Someone tried to set one up. It's called BitChute. All the payment processors blocked it.

The reality is that YouTube is not just "a single platform", it's THE platform. It's owned by a megacorporation with extreme political leanings. It's incredibly influential on votes and opinions globally. It's run with zero transparency, by a small unidentified unelected group of wealthy and powerful people.

You're just comfortable with this because you think they're on your side politically. Most people are comfortable with moderate levels of tyranny as long as they think it's their tribe that holds the levers of power.

But if Google was owned and run by evangelical Christians, and they were shifting their recommendations to discourage "immoral" videos of gay pride, pro-trans-rights arguments, sex worker rights, etc, you'd be incensed.


Why are you conflating legitimate conservative political speech, bullshit conspiracy theory and white supremacy?


Who determines when daytime is? is 7AM daytime? is 7AM in Alaska in the middle of winter daytime?

There is a discussion to be had about some topics at the edge...but 2PM everyone agrees is part of daytime; the flat earth 'conspiracy' should never be recommended to anybody.

And sadly, in the current climate, this also has to be said: This does not mean Google is picking sides. If you on your own reflections decide that for some strange reason the earth is flat, that is your right, and no one is infringing on it by not giving you a platform.


True, but the road to hell is paved with good intentions.


Why not? Who are you to decide what should and should not be recommended? Maybe I would enjoy watching some flat earth theory. Maybe watching it would help me understand it better so I can have a civilized conversation with a flat earther when I come across one. What about other conspiracies? What if I am researching conspiracy theories? Wouldn't flat earth theory be relevant to me?

'There is a discussion to be had about some topics at the edge...but 2PM everyone agrees is part of daytime; the flat earth 'conspiracy' should never be recommended to anybody.'

It should be recommended to anyone to whom it would be relevant. That's the whole idea behind a recommendation system.


NSA mass surveillance was considered conspiracy before PRISM.

Worse, people who said it was a conspiracy are now calling other things conspiracies, without any consideration for their past errors.


There seem to be people around who still think NSA mass surveillance is a conspiracy. Heck, don't the flat earthers think round earth is a conspiracy? One man's theory is another man's conspiracy.


There's some fun even around the origin of the term. In late 1963, JFK was assassinated. According to a source referenced on Wikipedia [1], the first mainstream reference to "conspiracy theory" was in the New York Times, which in 1964 published some 5 articles using the phrase. In 1967, the CIA had a psychological operations manual on discrediting conspiracy theories. [2] The reason I mention JFK is that the conspiracy theory tactics created by the CIA specifically related to the Warren Commission [3].

Ultimately I've always found the term silly. People can believe idiotic things, for sure, yet I think there is a spectrum where at one extreme you have people who believe no 'conspiracy theory' could ever be real, and at the other you have people who indulge every 'conspiracy theory' as probable. Both extremes are equally naive.

[1] - https://en.wikipedia.org/wiki/Conspiracy_theory#Etymology_an...

[2] - https://www.zerohedge.com/news/2015-02-23/1967-he-cia-create...

[3] - https://en.wikipedia.org/wiki/Warren_Commission



"don't the flat earther think"

Sure... some. But that's not the reason it's promoted. Paradoxically, many "fall for" its real utility by merely lumping it in with whatever they want to discredit; disinformation is rather effective when it attaches obvious BS to other things.


Say what you want about YouTube recommended political videos, but I have found some absolutely great music because it showed up in my recommended.

A small Canadian band showed up in my recommended and that first song captivated me. Then I listened to their other singles and two albums, and now I'm going to their show next month.

Also, I just found an artist from Belarus who makes fantastic synthpop. The weird thing is that all his videos use Cyrillic characters, so I don't know how Google even matched it. I am thankful though.


I also recommend going to https://www.youtube.com/feed/music . It's a landing page that used to be behind a "Music" button on the YouTube homepage, but has since been removed. Luckily the page still exists. It gives you a bunch of different playlists, including "Recommended" and "Latest Videos". I like the Latest Videos because it allows me to see what's "current" at a glance; even if I choose not to view any of them, I can see what's going on in the mainstream.


Also, I discovered awesome electro and house music, along with the Seven Nation Army remix by The Glitch Mob. That song is now in my main playlist.


I agree. I've found a host of new artists this way. I've taken to subscribing to a few choice channels that have live performances (mostly local radio stations). There are a few that do a thing where they have a band on and they do one original and one cover - sometimes I end up finding two new bands.


> an artist from Belarus that makes fantastic synthpop

Alright, I'm gonna need a link.


https://www.youtube.com/watch?v=ypknY6nIDe0

Check the comments. It looks like there were plenty of people who had the same experience as me.


Ah, I'm not sure that this recommendation is tied to the user's history of watched videos. Because I've also seen it in my ‘recommendations’―and though I tend to occasionally listen to a couple tracks of Gary Numan or a vaporwave mix, I don't think a 2018 release would follow from that dataset.


Well, I did watch a couple "yugowave" videos, so maybe that has something to do with it.


The Dead South ?

They seem to be oddly popular on youtube.


"Men I Trust" from Quebec. This is the song that started it for me: https://www.youtube.com/watch?v=9IZKcb3LndA


I love watching crackpot conspiracy videos on YouTube, especially flat earth & moon landing denier stuff. Beyond just the curiosity to dissect their weird contrived explanations, there's tons of interesting scientific & historical context to be learned from the counterpoint experiments and such. I hope it will still be generally available & in the sidebar if you're already watching such content.

Also, generally speaking, you really should block youtube cookies. The recommender doesn't seem to track your history then, and the entire experience is FAR better, simply recommending associations from that single video, instead of always trying to railroad you back into where you've historically spent more time.


Why do they recommend the same five videos of a series (like Barefoot Contessa) over and over when I've watched them already?

I don't want to say "not interested" because I don't know what that means... I want a feature where it's like "more like this, but not this one in particular."

It's surprising they've been under Google's control for a very long time and yet their main recommendation algorithm still leaves a lot to be desired.


If I remember correctly, the "not interested" UI has an option for "I've already seen the video"


That just makes it seem like their algorithm is powered by bad code or bad management, or both. They know what I've watched. They should know if a video is likely to be watched repeatedly (ie. classifying songs/music videos differently from other content).


Ok, but why? If there is one thing the algorithm should know it's which videos you've already watched!


YouTube recommendations are so terrible I consider any suggestion of a channel I don't already watch as a warning. "We see you watched an informative video on topic X, would you also like to watch DudeWhoHatesEverything rant incessantly about X, or BaityMcClickface's Top 10 Ys about X? Don't worry if you don't, we'll keep bringing them up every time you watch anything even tangentially related to X."


For some insane reason, no matter what video I watch, I almost always get recommendations for Penn & Teller. Doesn’t matter that I don’t care about that even a little bit, and haven’t watched one of their bits for years. Doesn’t matter that I was just watching a technical tutorial. Doesn’t matter that I was just listening to a music video. Doesn’t matter if I just played a Thomas and Friends video for my kid. I mean, wtf? I want a “why am I seeing this recommendation” button at this point, because my curiosity has gotten the better of me and I want to see how this magic trick is done. Oh, hey, wait a minute...


I too watched a couple of "Fool Us" videos and now YouTube believes I am training full-time to be a professional magician.


My expectations are very low, and I generally only browse my sub feed, but I have noticed that some recommendations are getting better.

Specifically, I notice more of:

* Subjects I haven't watched in a while, but went through a period of watching. This is especially useful if it's a recent update or revival of a particular topic.

* Content from channels that are actually similar to what I often watch, and not just clickbait stuff. Occasionally there's actual valuable discovery in the recs.

* Videos that I very much would normally watch from sub'd channels, but missed for whatever reason. The recs are actually helping remind me to watch things I want to watch.

It's still the minority of recommendations, but I have to agree that the system is actually improving.


They should introduce some features that allow better search, or a hierarchical directory for people who like to pick and choose. Perhaps allow a more descriptive presentation of the creators themselves. Not everything needs to be a flat feed.


Hm - I was hoping to read a little bit about the algorithms behind YouTube's recommendation engine but instead got a few paragraphs informing me that YouTube was going to implement (even more) arbitrary censorship and claim it was in the spirit of "protecting" us.


...I hate that feeling when I browse YouTube and want to watch a trashy video, but then have to stop myself because of the negative effects on the recommendation engine. (Yes, if I'm watching YouTube in a browser I have the option to open an anonymous tab.)


My YouTube suggestions list is filled with videos I just watched in the last few weeks or months, along with some new videos, but remarkably many already-watched ones. They are not labeled as such.

I suspect I blocked some cookie or changed some setting they didn't want me to change, and this behavior is deliberately calculated to be punitive. It's not very user friendly.

Or, the YouTube developers are just not very smart. Either way, it doesn't look good for them. This would be so easy to fix. If they don't know how, they are welcome to reach out to me and I can give them some ideas.

Yes, it did occur to me that some users might like this. But the way they have it set up is very confusing and suboptimal for my needs as a user. I will admit, though, it may be serving some business need that they have, like saving bandwidth by re-serving me old videos from my cache.


I have always had the idea that there should be a law requiring any recommended item anywhere on the internet to be accompanied by a reason why the item was recommended. YouTube already does this with certain videos, but I would like to see it expanded everywhere.


I don't want YouTube to recommend anything for me. I don't want Facebook to show me the news stories they think I should see. I don't want Google to show me the ads they think are more relevant to me. None of this improves my life in any way whatsoever, and in fact it is leading to the dumbing down of society. We have completely handed over individual taste, interest, and curiosity to algorithms that do nothing but optimize for engagement. I'm done. I want the old internet back: a place where you were presented with a variety of information, indexed by its content, from which you could pick and choose, and then discuss with other human beings.


The users of YouTube, Facebook, and Google generally like a feed of things recommended to them. I feel like if this were an article about too many people pissing in a pool, your comment is saying no one should want to swim. That's one way to not get urine on you, but plenty of people clearly want to swim and it doesn't help much to blame them.


Let me block channels from appearing in my recommendations. Obviously I don't like that channel's content, so I don't want to see it.


Should be "block this channel and very similar ones" IMHO.


I think the problem is they're using machine learning on something where it's hard to know how to reward the AI. Do you reward it for maximizing the minutes users watch, maximizing clicks, etc.? None of those, IMO, really measures whether users are satisfied with the recommendations; satisfaction, even if it means less watch time, is good for YouTube as a brand.
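The point above can be made concrete with a toy sketch (all numbers and titles are hypothetical; nothing here reflects YouTube's actual system): the same candidate videos rank completely differently depending on which signal the recommender is trained to maximize.

```python
candidates = [
    # (title, expected watch minutes, click probability, user rating 0-10)
    ("20-min clickbait compilation", 14.0, 0.30, 3.0),
    ("5-min tutorial you wanted",     4.5, 0.20, 9.0),
    ("1-hour lecture",               25.0, 0.05, 8.0),
]

def rank(items, key):
    """Return titles ordered best-first under the given objective."""
    return [title for title, *_ in sorted(items, key=key, reverse=True)]

by_watch_time = rank(candidates, key=lambda c: c[1])  # lecture wins
by_clicks     = rank(candidates, key=lambda c: c[2])  # clickbait wins
by_rating     = rank(candidates, key=lambda c: c[3])  # tutorial wins
```

Each objective picks a different "best" video, which is the crux of the reward-design problem.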

I just ignore them completely now except for when browsing music. They often don't even help when watching serial things (e.g. part 1, 2, 3), I'll often make the effort to look up a playlist instead. I have also always ignored the front page, so can't say anything about it.

I do wish youtube would let you pick areas of interest and whether you wanted to be recommended based on those areas. If a video fits in an excluded area, you would just get random recs, otherwise you would get recs from around that area. Preferably a mix of recent + old + really old videos that still have good ratings / watch time.

Also, they would have less to sort through if they allowed creators to "archive" a video (because it's outdated and some newer one has replaced it). This happens to me all the time. And as a user it would be nice if, for example, when a video's audio was bad and a fixed version was uploaded later, I was prompted to go to the new one. Similar thing with serial content: it would be nice if creators could establish that one video follows another or is related.


For a short time, you could actually hide a video in your subscriptions by just clicking one button (an "X" overlaid in the top-right). Best feature ever. Now it takes two clicks and mouse movement to hide.


It would be great if Youtube actually stopped recommending irrelevant videos that I do not care about (the recommended for you videos) and instead only showed videos in the sidebar that are relevant to what I am watching at that very moment. If I am watching a conspiracy video then I would prefer it if I got another conspiracy video as a suggestion rather than a random cooking video - and it actually goes the other way too! Every time that I go to watch a gaming or a programming video at least one of the suggestions is some lame political or conspiracy-related video.


Manual selection doesn't bode well.

I like to watch spiritual and metaphysical talks, Papaji, Eckhart Tolle, Conscious TV, Rupert Spira and the like.

I don't want someone to tell me what's "true" or what's "real" by referring me to the so-called "facts".

Sounds like a slippery slope to me...

There sure is a lot of drivel on YouTube; much of it, like content farms gaming search engines, is driven by ad revenue. I find this empty, mindless content far more distracting than "false claims".


I don't want someone to tell me what's "true" or what's "real" by refering me to the so called "facts". Sounds like a slippery slope to me...

It sounds at a first reading as though you’d feel everything since the Enlightenment has been a slippery slope. To be fair to YouTube though, if you want an environment devoid of “what’s ‘true’ or ‘what’s real’ by referring [you] to the so-called ‘facts,’” then YouTube seems like it should be your paradise. I watch mostly technical, science, and history videos, but apparently YouTube interprets that as a preference for ranting conspiracists and people who make Joe Rogan look like a gentleman and a scholar. I can only imagine the trash it throws up if you actually seek it out!

If you want to create a bubble of people saying things totally divorced from anything like reality, what more could you ask for than YouTube?


Garbage content in my opinion is mindless, soulless content created by content farms. Which in a way is what YouTube encourages through their advertising business. Same issue with low quality results in search engines due to all the content farms.

Any content that makes me question things is good in my opinion.

I had a period when I was into UFOs and whatnot.. and it led to an interesting realization about what I actually know. So in my view everything has its place. For example a video about flat earth may very well engage the viewer to wonder WHY the earth wouldn't be flat?

You can't force people to ask questions. Some people will get into the deep end and lose themselves. So be it. You can never force someone else to wake up out of their dreams. It's each individual's choice. At best it is only through the heart, and not rationality, that you can help someone see more clearly. Everyone believes in one thing or another, in order to feel safe.

And yes you are right. YouTube is absolutely perfect right now in some funny ways. Through the limitation of their own business model, they allow everyone to express themselves. And this is better for everybody.

I think they're going to stick to moderation politics and whatnot anyway, so who cares.


> To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.

Instead, why not just include videos that thoroughly debunk these ideas in the recommendation section of those videos?


I am only 24 years old and hearing 9/11 being referred to as a historic event makes me feel old.


In the last year, the videos recommended in the right margin have always been the same regardless of which video I was on, and they are the same videos at the top of my home page recommendations. Whatever happened to video-specific recommendations? That was what allowed me to discover new things related to what I was watching, not just some total aggregate of everything I've ever watched.


It would be much cheaper to use user feedback instead of "human evaluators" who represent a tiny segment of the audience. Just try adding some sort of rating system for recommendations, where 10 is "perfect recommendation" and 0 is "completely irrelevant", and adjust the algorithm accordingly. I think the current one uses something like network analysis that feeds its own recommendations back into itself.
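One simple way the suggested rating could feed back into the algorithm is as a blend with the model's own score. This is purely an illustrative sketch; the function name, the 0-1 model score, and the 0.5 weight are all made up, not anything YouTube documents.

```python
def adjusted_score(model_score, user_rating=None, weight=0.5):
    """Blend a recommender's score (0-1) with an optional explicit
    user rating (0-10). With no rating, fall back to the model alone."""
    if user_rating is None:
        return model_score
    return (1 - weight) * model_score + weight * (user_rating / 10)

# A video the model loves but the user rated 0 drops below one the
# model is lukewarm on but the user rated 10.
low  = adjusted_score(0.9, user_rating=0)    # 0.45
high = adjusted_score(0.2, user_rating=10)   # 0.60
```

In a real system the weight would itself be tuned, and sparse or adversarial ratings would need handling, but the basic idea of letting explicit feedback override pure engagement signals is this simple.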


I would love to see youtube add more videos to the recommended section. Right now I believe the limit is 18. When you load the homepage it shows you 12 tiles and you can click "show more" to display 6 more for a total of 18.


Yeah, this is the most annoying thing for me too. They have an endless stream of recommendations if you go to www.youtube.com/feed/recommended , so why not put it in the main page?

I can only assume that it's an intentional marketing thing to get people to watch other things.


I wish YouTube kept better tabs on what I've watched and didn't re-recommend it. I especially wish they wouldn't automatically add videos I've already seen to the autoplay list.


> We now pull in recommendations from a wider set of topics—on any given day, more than 200 million videos are recommended on the homepage alone.

What does the first part (wider set of topics) have to do with the second part (number of videos recommended)?

Anyway, YouTube recommendations are generally quite poor I find, though sometimes a few gems come along. I also made the mistake of watching YouTube on my Android TV box. Holy cow, an ad _every single minute_. And the same 4 ads, rotating.


Just guessing, but it could be that they are claiming that 200 million unique videos are recommended on a given day, with the number of unique videos being a proxy for how diverse they are. Perhaps before some of their recent changes, the number of unique videos was smaller. (Who can say for sure, though)


Improving recommendations? All I'm seeing is a complete deterioration of YouTube's recommendations since the glory days of the early 2010s.

How about you stop spamming me with CNN, SNL, and late-night shows (Kimmel, John Oliver, etc.)? I've yet to accidentally click on these links/channels in all my years using YouTube, and yet I still see them every day. Why?

I've pretty much given up on recommendations and just directly go to the channels I want to watch. There was a time ( early 2010s ) when youtube recommends was great and it was actually fun going down the youtube recommends rabbit hole. Now it's just corporate shilling. Just like google search and its incredible decline.

You guys aren't going to improve anything because you don't have competition.


YouTube keeps showing me videos I have already seen, while not sending notifications for videos on channels I'm subscribed to (yes, with the bell!). In order to find videos from people I am subscribed to, I have to go to their Twitter/Discord/Minds so that I don't miss out on at least one video per day, it's nuts.


honestly, i'd like to be the judge of whether a theory is crackpot or not. i'm not a kid, i don't need youtube to save me from "bad information".

hopefully the change is mild. i've discovered a ton of great stuff through youtube recommendations (especially music! youtube blows everything else away for music discovery).


>honestly, i'd like to be the judge of whether a theory is crackpot or not. i'm not a kid, i don't need youtube to save me from "bad information".

If your local library or city center decided to prohibit a local white supremacy group from putting up flyers on their billboards recommending Holocaust conspiracy theory materials, would you criticize them the same way? I'm curious about why people seem to have very different standards for public spaces and websites moderating themselves.


not a perfect analogy, because (1) billboard space is limited, and (2) this is restricting things that otherwise specifically would be recommended for me.

anyway, i, personally, would still give them (and everyone) the space on the billboard. let everyone talk, even the crazy and the hateful.

more generally, i don't think any institution should be privileged over the individual to decide which ideas they get to hear.

just one possible philosophical position, one choice on a spectrum of tradeoffs.


Recommendation systems are often too specific, full of recency bias. I use YouTube for classical music while I'm working, so I play several hours of video every day. My feed is full of similar videos that are all the same! YouTube can step it up by offering a wider variety of potential interesting videos.


I wish they would provide a way to turn off recommendations on the home page when you're logged in. Far too many click-baity channels show up, and it becomes cumbersome to wade through all of it to find the channels that I follow.


This is why I never look at the recommendations on YouTube. I just go straight to my Subscription feed; there is usually something worth watching there and if not, I find my entertainment elsewhere.
