Do we know it's actually dumb?
Like, given that you just bought a mixer, perhaps in an absolute sense, you're unlikely to buy a second mixer. But relative to everything else that you might buy, perhaps you are very likely to buy a second mixer!
There are huge financial incentives to get these ads right. I'm more inclined to guess that Amazon is doing something right than that they're leaving money on the table.
Some percentage of people who just bought a mixer will get a product they aren't satisfied with, for whatever reason. That population has a much higher chance of buying another mixer than other people. And mixers are probably far more expensive than most other products.
I have no idea if, in an absolute sense, there's more chance you'll buy a cookbook rather than another mixer, but it wouldn't surprise me either way - on the one hand, obviously most people don't need two mixers. On the other hand, plenty of people who buy a mixer will never buy a cookbook in their life, but literally 100% of people who bought a mixer have proven that they were at some recent point in the market for a mixer. I'm not sure which effect wins out here.
Not really. All you have to do is convince companies that machine targeted ads are worth it. Apparently that is orders of magnitudes easier than actually recommending anything relevant.
Which isn't hard to imagine either; there isn't much competition, after all. So the primary purpose of targeted ads is to function as an imaginary hammer you can use to squash new competition.
Just bought a mixer, how about: A new set of mixing bowls, alternative whisks and hooks, perhaps a set of measuring cups, a kitchen scale?
Just bought a new smartphone, how about: an external battery, screen protector (that fits the specific model), a fancy charging dock?
It really doesn't seem like it would take much intelligence to create these sets of complementary products and then use them for further ad targeting in the future. You could even use AI and ML to create the sets in the first place, so you can fill out your buzzword quota.
You think "I already have a mixer, idiot!" but in reality the chance that you are interested in buying a mixer just went from 1% to 2%. (Because you want another one, because you want to gift one, because you returned yours, etc.)
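To see why even a "dumb" repeat-mixer ad can be the rational choice, here's a back-of-the-envelope expected-value comparison. All probabilities and prices here are invented for illustration; the point is only that a low absolute purchase probability times a high price can still beat a higher probability on a cheap item:

```python
# Illustrative numbers only: expected revenue per ad impression
# is P(buy) * price. Both columns are assumptions, not real data.
candidates = {
    "another mixer": (0.02, 250.00),  # unlikely, but expensive
    "cookbook":      (0.05, 25.00),   # more likely, but cheap
    "mixing bowls":  (0.04, 40.00),
}

expected_value = {item: p * price for item, (p, price) in candidates.items()}
best = max(expected_value, key=expected_value.get)
print(best, round(expected_value[best], 2))  # → another mixer 5.0
```

Under these made-up numbers, the mixer ad wins despite a 2% purchase probability, which is consistent with the "1% to 2%" intuition above.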
The last time I saw an Amazon employee weigh in, he said the ads are in fact not effective but the ad team isn't measuring it well. In particular, he claimed that he showed them that after adjusting for returns, the order-again rate for a particular expensive item was effectively zero, they asked him to repeat his study for another product, and he wandered off and found work to do for his own team.
I'm not sure if I'd put more stock in the government successfully hiding a fake moon landing, or in analysts at one of history's most successful companies not running the numbers.
That, I think, hits the nail on the head.
I think it explains the Amazon case quite well. I'm not so sure about the YouTube idiocy.
So far it looks just as dumb to me.
I once bought crab meat in a can from an online store (hard to get it any other way in Poland).
A banner for buying more canned crab meat kept haunting me for the next two years, in various internet locations :)
and scarily brilliant at the same time. I was looking up Newtek some time ago, of Amiga Video Toaster fame. I googled a Wil Wheaton promo video (Wil worked for Newtek :o), Penn Jillette (who also promoted it), and finally Kiki Stockhammer (spokesperson). Here is where things get interesting. Google spewed out the 1997 "How To Have Cybersex On The Internet" https://www.youtube.com/watch?v=PBDKEn-TeQg as a result for Kiki, haha. Why would it do that? Turns out this "instructional video" was made using the Video Toaster and features one of its iconic special effect transitions
here on Toaster demo reel https://www.youtube.com/watch?v=K1OVWfmynPw&feature=youtu.be...
so what does it have to do with Kiki Stockhammer? This transition is Kiki doing poses on a green screen. Google search somehow connected this information together and was able to recognize Kiki Stockhammer's silhouette in a random YouTube meme video :o !
This isn't the first time I've gotten image search results that could only happen if Google was analyzing actual image content. Microchip part searches often return pictures of random boards featuring that particular IC despite no BOM list in the HTML; Google is OCRing images and recognizing part numbers. Same with faces, cars, etc.
> More recently, people told us they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles.
I guess we all know this by now, but Google doesn't care about you being able to use its products in a reasonable way. In some sense, this makes sense, since I bet the overlap of power users and people with ad blockers is pretty high.
Maybe you should speak for yourself? I do not run ad blockers (much less anything targeting Google ads!) and the experience I get as a power user on Google services is no better than anyone else's - although, to be fair, it's not noticeably worse, either. (And mind you, the average ad supplier on the Internet is a hell of a lot shadier than Google, Facebook or any of the big "household name" tech companies!) I'm sure that plenty of other people here can confirm my POV.
Google's trouble engaging with power users as of late is entirely self-inflicted. There's quite simply zero doubt about that.
It always annoys me that giving users options they can control is viewed as complex but search engines that return whatever they deem as important aren't viewed as complex.
What you’re after already exists, the “recently uploaded” section on the home page.
Why not just pause your watch history (or, in case you forget, remove the items after the fact)?
No ads, no direct tracking, just videos.
Google can still watch your device downloading videos, of course...
WARNING: PUTTING NEWPIPE OR ANY FORK OF IT INTO THE GOOGLE PLAY STORE VIOLATES THEIR TERMS AND CONDITIONS.
NewPipe is available on F-Droid, but that version is sometimes a bit old. The project maintainers suggest the following order of preference for getting the latest version:
- Build a debug APK yourself. This is the fastest way to get new features on your device, but is much more complicated, so we recommend using one of the other methods.
- Download the APK from releases and install it.
- Update via F-Droid. This is the slowest method of getting updates, as F-Droid must recognize changes, build the APK itself, sign it, and then push the update to users.
If I want something new, to escape my bubble, I use other platforms like BitChute or PeerTube. Sometimes it's pretty scary, but it's getting better.
Maybe I'd just prefer to get links from friends.
But if I want to go watch Episode 5 again or jump ahead to Episode 26, it's happy to help. Awesome.
The behavior that really gets me now is having the top recommended videos on my YT homepage be ones I've watched... recently. I can watch a video from my notifications, and then have it stay at the top of my homepage until I hide it manually. Refreshing the page changes some videos, but rarely the top few.
Of course, hiding the video helps but I get tired of having to do it constantly. I think I'm resigned to my YT homepage being useless at this point.
For that YouTube is great, because its autoplay just seamlessly plays one video after another.
For everything else, its recommendations are hilariously bad.
Don't get me started on discovery!
You know the exact date of Episode 18 so you assume Episode 19 was published soon afterwards. So you go into the profile and click videos.
You realize that the date is not even listed; it just says "x months ago" or "x years ago". And since this video was from two years ago, you have to sift through all the videos of the last two years. And you can't just Ctrl+F: you have to scroll to the bottom to load the next 30 or so videos.
The user interfaces for these kinds of sites are just incomprehensibly poor.
Plus, on the iOS app there's no way to search within a channel or playlist. If I want to find Justin's lessons on a Nirvana song I have to search "justin guitar nirvana" rather than just "nirvana" when I'm on that channel.
I'd much rather have youtube recommend episode 1 or the first video of the playlist to people but instead it gives them a random one in the middle. This causes the viewer to get mad and press thumbs down. This also makes the "average viewed minutes" go drastically down.
That's probably the source of most of my thumbs-downs. It also causes ridiculous stats, like episode 27 of 50 having 100 times the number of views of the first episode, but not more view time.
It's actually just a playlist setting for the creator, but the default is reverse order, and that makes the whole list useless. There are even Chrome and Firefox plugins to fix it.
I also link the playlist in each YouTube description, but unfortunately those links are not so easy to spot on app/TV/game consoles.
Also, in this case I think it's OK to youtube-dl the list and play them in order locally.
Unfortunately I use Safari so it's not an option for me, sadly one of the very few downsides to using Safari is relatively weak extension support.
Example: The Slow Mo Guys channel has a "Planet Slow Mo" series: https://www.youtube.com/playlist?list=PLbIZ6k-SE9Shmj0Uvxtrv...
It's not even gated by subscriber count or channel verification; it's gated on someone at YouTube, for some reason, having flicked the switch on for this particular channel.
I don't search for the playlist, then go to the playlist and scroll through to the episode I want, then pick it.
But then since I didn't go to the playlist, now YouTube can't recommend the incredibly obvious next video in the series.
Secondly, I can't watch any video without it taking over my recommendations. Alex Jones, for example: I still haven't watched an Alex Jones video (though I was curious what rubbish he spews), because I worry that I'll get a bunch of delusional hyper-conservative drivel taking over my recommendations.
There should be a way to watch a video once and not have it take over your recommendations.
On the one hand, we find that we're making great strides in certain areas with ML. On the other hand, recommendation systems are still just so, so, so naive and bad.
I’ve talked about this before, but if you have kids, the only surefire solution is curating the videos yourself and downloading them. Then playing them offline. Anything else is a gamble, which may or may not be the route you want to take.
YT cracking down on videos also has a lot to do with skittish advertisers. I'm not sure if a solution is possible other than utilizing Patreon, Liberapay, etc. Even if you don't make "controversial" content, that's what most people do these days for consistency, as well as merchandise.
"In this setting, your child will only be able to watch videos, channels and collections that you have hand-picked. Collections are videos and channels grouped by topics, such as science and music, picked by the YouTube Kids teams or our partners."
YT has proven themselves incapable time and time again when it comes to dealing with human support and services. I cannot read PR text from them and just believe it.
is a lie
Why can't we opt out completely from "recommendations"?
I helped contribute to that article, feel free to ask any questions!
I know some people on HN have such setups; could you share recommendations for hardware and software stack? I suspect a Raspberry Pi may not be enough (having network and USB sharing capacity), especially if we want to simultaneously stream movies for ourselves. But I also don't want to turn it into some $1k+ server build, the way they do on /r/plex and elsewhere. Could anyone recommend a "compromise" setup, that would allow to comfortably stream HD videos to 2-3 devices simultaneously?
(I have a seven year old and a three year old)
In the end we just settled on Netflix and the national broadcasting company (NRK), but I do have a Synology NAS with some stuff that I can stream to mobiles/PS4 via DLNA or apps, which would do most of what you want.
If you're worried about the quality of content from YouTube, or the chances that someone gaming the system dumps questionable content into your stream, it's worth spending ten bucks on an antenna. It's also good for supplementing your child's YouTube viewing when YT starts showing the same stuff over and over.
Still, I want to make it video-on-demand, but with content fully curated by us - to avoid age-inappropriate content, ads and recommendations. There's no way I'm going to let my kid touch raw YouTube in the next decade.
Also PBS kids apps, ad-free as well.
I am able to get most of the videos downloaded and converted using that application. Sometimes I have to go and manually get a couple of videos that failed to download, but the app has served me well.
For me, at least right around the 2016 election...
Do you think all videos in which people are wrong about things should be taken down?
If somebody writes on their blog that they don't think the holocaust is real, should their blog get taken down, or is there something different about it being a video?
In my opinion, YouTube has gone much too far towards the side of taking down videos rather than leaving them up. We don't need to be calling for them to be even more strict with removing videos that people don't like. If you don't like it, you don't have to watch it.
If your goal in taking down conspiracy theory videos is to kill the conspiracy theory, I think that may actually be counterproductive. If the video stays up and everyone laughs about how stupid it is, nothing bad happens. If the video gets taken down then it just provides them an opportunity to shout about how they're trying to spread the truth but they're being silenced by the establishment.
Do you think all videos in which people are wrong about things should be taken down??
Suppose Video A posits "your computer has a little gremlin inside of it and you should shove an oatmeal cookie into the CD slot at least once a month to keep the gremlin happy" and Video B posits "the Jewish-controlled mass media is conspiring to make us believe school shootings are a real thing and you should stockpile weapons for the coming race war." They are both technically "videos in which people are wrong about things," but isn't there a qualitative difference between those two things? Don't you think it's defensible for YouTube to take down Video B without taking down Video A?
Furthermore, I expect Google to apply their policies evenly, which they don't. If violations of the Google ToS are reported, they should be taken care of, don't you think?
It isn't like you can actually debate them on their channels either, as they delete comments that don't support them the vast majority of the time.
You seem to be against people putting out content that doesn't align with your views, which is something completely different.
Not GP, but I see this a lot as a retort to people who make the claims they did and I tend to call BS on it.
I strongly suspect there is an extremely broad library of "content that doesn't align with their views", including an extremely broad spectrum of political views, they would have zero issues with on YouTube.
Somehow, climate denialism, antivax content (and other health hazards), holocaust denial, general nazi shit etc are consistently not part of that.
Now GP can come in, correct me and tell me that they want everything they ever would disagree with off Youtube for good. But somehow, I suspect they won't.
So, tangential to your post, but tell me: why is there such a consistently-easy-to-define line for content that a lot of people think should be kept off various platforms, and why the hell is it that, whenever I see someone defending such content, it's almost always someone who believes in said content, rather than a staunch defender of free speech?
I ask this knowing full well there are many people I hold in very high esteem who legitimately do defend extremely vile shit they disagree with, based on free speech principles. Most of those people I personally know work at the EFF, and I've never heard them say much about deplatforming nazis. (Could it be because free speech is a government thing, not a YouTube thing?)
Since nobody here has given any indication that they believe in these ideas, it's probably because you simply assume that anyone defending it believes in it.
Possibly also because the concept of not wanting to take down objectionable videos is a "weird" idea, and people who are willing to take weird ideas seriously probably have lots of other weird ideas.
Google is beyond broken. Chasing those engagement numbers on a forever treadmill, just hoping they inch up a little more at any cost.
I went through a "This Old Tony" binge a couple of months ago because I stumbled upon one of his videos and YouTube kept recommending more. But that seems to be the exception rather than the norm.
Now, thanks to the all powerful algorithm, I can watch a video about a rescued pitbull living with his new family, and have it followed up by "10 times Ben Shapiro owned the libs".
In general I find the selection on the homepage to be useful; however, it does take some work, as I regularly remove a lot of random things.
I also open random videos in a private window to keep my watch history more focused.
So it works fairly well for me, bringing up videos from channels I like that I haven't seen before... but it takes effort to get it to work for me. So it is arguably not very good for a typical user.
Currently I see absolutely zero related content next to the current video I’m watching, from the current user or the subject matter.
It's mildly annoying to not have better suggestions, but on the other hand, it gives you a perspective on what kind of people are fed what kind of info.
Kinda like the old email chains my mother used to send me. They allowed me to understand her fears, hopes, and lack of understanding about things I never had the chance to talk about with her before.
But watch more than one or two conservative videos, especially about European current events, and you are sucked into the Nazi propaganda hole.
Another inescapable pit of content is electronic music production.
I was doing a bit of research on the Teenage Engineering Pocket Operators as a possible gift for my younger brother. Now I'm almost exclusively recommended music production videos.
I just tried this again and not much has changed. Started by viewing Joe Rogan's interview with Elon Musk on a fresh profile. The first suggestion from there is a Jordan Peterson anti-feminist video, from there we get a Jordan Peterson anti-islam video, then "The Suicide of Europe" by PragerU, and from there the floodgates are opened to fear-mongering about "rapefugees". That's less than a dozen clicks on highly related videos to get from Elon Musk smoking weed to videos with an overtly racist agenda, on a profile that's never sought out that kind of content previously.
Seems like they're trying to profile the user for recommendations too early, or just generally suggesting based on too small of a sample of videos. The blank profile is like a knife balanced on edge, where the slightest push will tip it one way or the other. If they'd just hold off the personalized recommendations entirely until you had 100 or 200 videos, it might be better.
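That "hold off personalization until the history is a meaningful sample" idea could be sketched like this. Everything here is hypothetical (the threshold, the function names, the model interface); it's just the cold-start gating logic made concrete:

```python
# Hypothetical sketch of the idea above: serve generic, popular
# recommendations until the watch history is large enough to be a
# meaningful sample, and only then switch to the personalized model.
MIN_HISTORY = 100  # e.g. the 100-200 videos suggested above

def recommend(watch_history, popular_videos, personalized_model):
    if len(watch_history) < MIN_HISTORY:
        # Cold start: don't infer taste from a handful of clicks,
        # where one Joe Rogan video can tip the whole profile.
        return popular_videos[:10]
    return personalized_model(watch_history)[:10]
```

The trade-off is obvious: you give up early personalization in exchange for not letting the first dozen clicks dominate the profile.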
They really need to get the user's explicit preference, rather than trying to infer it from what videos they watch, as that data is tainted by the recommendation engine.
Also they have killed the annotation system that most video creators have used to manually recommend videos to their viewers, which is probably going to cause even more overfitting of recommendations.
I'd be persuaded if you could show that a random walk through the recommendations ended up in far-right opinion some disproportionate percentage of the time. What you did was a goal directed search. At every point you selected the most right leaning perspective, and in the end you got where you wanted to go.
The only way to stop this from happening is to ban objectionable opinions from the platform entirely, or alternatively to disconnect them from the graph of recommendations. The latter is what YouTube's "limited state" already does.
>The first suggestion was...
It doesn't sound like they were choosing the most right-leaning perspective, but rather the first one on the list (the one that would play after the current video finishes if you have autoplay enabled).
That Jordan Peterson video was in the #4 position in the sidebar from the Musk interview. However, it was the first video that did not include Joe Rogan. So far so good.
Thereafter, clicking the first recommended brought me to another Peterson video, then Gordon Ramsay, and then into an endless loop of Kitchen Nightmares clips.
Certainly, there's some stochastic element here, but I'm still not convinced.
Honestly this comment thread seems like trying to build a narrative bridge between straightforward right-leaning content and bizarre conspiracies. The goal is to use censorship of bizarre conspiracies to justify censoring right-leaning content by conflating the two.
I'm very suspicious of YouTube's intent here. Given how vicious Google is towards conservatives inside their organization (see Damore), it's pretty obvious to suspect that they'll use their power to reduce the spread of right-wing ideas of all stripes. Put simply, after that display, nobody can trust them to be even-handed. Most of them are good people, but they're totally dominated by the intransigent minority of high-and-righteous recreational witch-hunters in their midst.
The first excuse will be that they're insane conspiracies. Truly crazy videos will be censored. But that's just the bait, setting the narrative for the switch, where the narrative becomes about "racism" and they start censoring the right-most 10%, 20%, 30% of the opinion spectrum. Criticism of Islam, support for Western culture, statements against anti-white bigotry, arguments for reduced immigration (even illegal immigration), arguments for equal legal treatment for men, arguments about biological differences between sexes or population groups, and arguments about deep differences between religions will all be censored under this ever-expanding umbrella.
Even the notion that "conspiracy theories" are a right-wing thing is part of this - there are lots of left-wing "conspiracy theories" and always have been. Some openly racist and sexist ones are quite widespread right now.
The Buzzfeed article (and others like it) is ultimately just a collection of anecdotes. It is not data that reflects my reality and I expect it does not reflect reality for many of YouTube's users. However, it provides a lot of confirmation bias for those who look at the existence of content/perspectives they disagree with and have an urge to want to erase that content wholesale. It feeds into what seems to be a crisis manufactured by some activists on Twitter and some news outlets (like Buzzfeed).
And of course, it dehumanizes the right and their experiences/perspectives by associating them with terms like 'radicalization', or 'conspiracy theory', or 'fake news', which are probably not only exceedingly rare but also present on the left.
Are you sure you’re not watching too many conspiracy theory videos?
Left-wing conspiracies are usually narrow in scope: the Republican party, the Koch brothers, George Soros, Dick Cheney, the oil industry. It's not really the same as an intra-national conspiracy that transcends party lines.
Unless you can give me examples of conspiracy theories and their peddlers on level of Alex Jones for the far left.
It's true that I see anti-GMO and even anti-nuclear sentiments across the spectrum, but those differ a bit from conspiracy theories in that they stem from junk science, rather than a belief in a hidden government or group wanting to personally change you.
Just google outbreaks in the US that are linked to anti-vaxxing and take note of their locations. Most start in heavily left-leaning areas. To clarify my point, I personally believe that anti-vaxxing is about evenly distributed between left and right wing, just for different reasons.
P.S. As I was trying to find an article about an outbreak in Seattle that I remembered from a few years ago, I realized there is another one happening right this moment https://www.seattletimes.com/seattle-news/an-anti-vaccinatio...
Unless you mean things you don't like are conspiracy theories in which case we might as well just call everything a conspiracy.
It seems to me that 'The Patriarchy' / 'The Privileged' on the left are conceptually quite similar to 'The Deep State' or 'The Jews' on the right. And those are most often described as conspiracy theories, and rightly so.
The difference between a conspiracy theory and a social critique is the personal nature of it. People who believe that 'The Jews' or 'The Deep State' are responsible for everything believe that there are specific actors, people doing things explicitly to stop someone.
When people complain about the patriarchy or systemic racism, they're complaining about a large tangle of social contracts and norms that result in certain people, races, genders, etc. being disenfranchised. For example, it's a statistical fact that a black man smoking marijuana is not only more likely to be thrown in jail but also more likely to face harsher sentences than a white man smoking marijuana. What else would you call this, other than systemic racism?
You can't point to a similar statistic, study or what not that shows 'The Deep State' is specifically targeting Donald Trump or that 'The Jews' are attempting to kill all white people. You can, however, point to specific actions and trends the FBI has taken in the past to disenfranchise black people, including actions taken against Martin Luther King Jr.
One side explicitly has a factual basis behind it. The other does not. That's what makes one a conspiracy theory, and the other something worthy of scrutiny.
It usually gets more personal at the extreme. People will show you stats that show a disproportionate majority of federal employees supporting the Democratic party and infer that there is 'systemic bias'; some will point at specific cases and make leaders of a hidden conspiracy out of them. I've seen enough of 'Kill All Cops' to know that there are similar extremes on the left.
Yes, there are going to be people who will take anything to an extreme. That does not make 'Systemic Racism' somehow a conspiracy theory, especially not on the level of things like the 'Deep State'. Arguing otherwise is, to me, the height of delusion.
Let's say you watched a video called "Top 10 most amazing things you never knew until today" and now the youtube algo recommends you a bunch of clickbait. Just hit "not interested" on every single clickbait video you see and eventually it stops showing you those videos unless you watch more of them.
I would flag a video title as clickbait, the algorithm would learn what I think, and then when it sees a similar title in the recommended list it would remove the div from the page.
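The learning half of that idea could be sketched as a naive word-overlap filter. Everything below is invented for illustration (a real version would live in a browser extension and do the DOM removal in JavaScript); this only shows "flag titles, then score new titles against the flagged vocabulary":

```python
# Naive sketch of a personal clickbait filter: remember the words from
# titles the user flags, then hide any new title that shares enough of
# that vocabulary. Threshold and tokenization are arbitrary choices.
from collections import Counter

flagged_words = Counter()

def flag_as_clickbait(title):
    flagged_words.update(title.lower().split())

def looks_like_clickbait(title, threshold=2):
    hits = sum(flagged_words[w] > 0 for w in title.lower().split())
    return hits >= threshold

flag_as_clickbait("Top 10 most amazing things you never knew until today")
print(looks_like_clickbait("10 amazing secrets you never knew"))  # → True
```

A real filter would want something smarter than raw word overlap (stop-word removal at minimum), but even this crude version captures the "learn from my flags" behavior the comment describes.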
The current system doesn't tell you what people are interested in; it tells you what the system made people pay attention to.
I doubt Ben Shapiro is 'required knowledge', or that he became popular only through YouTube. Those people usually have an IRL crowd. They have IRL impact.
But I see your point.
The problem is that it creates a feedback loop and amplifies whatever is slightly overrepresented.
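That amplification effect can be demonstrated with a toy rich-get-richer simulation (all of it invented for illustration; this is not YouTube's actual algorithm). Two equally good videos compete, the recommender favors whichever currently has more engagement, and being shown is what generates the next click:

```python
import random

random.seed(1)
# Toy model: two equally good videos; video 0 starts with one extra click.
clicks = [2, 1]

for _ in range(10_000):
    # The recommender favours the current leader superlinearly
    # (weight = clicks squared), and exposure produces the next click.
    w0, w1 = clicks[0] ** 2, clicks[1] ** 2
    shown = 0 if random.random() < w0 / (w0 + w1) else 1
    clicks[shown] += 1

print(clicks)  # one video ends up with nearly all of the clicks
```

With a superlinear reinforcement rule like this, a one-click head start on an otherwise identical video snowballs into near-monopoly, which is exactly the "amplifies whatever is slightly overrepresented" dynamic.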
I also noticed that on LinkedIn, on many high-profile accounts, the related section contains pretty girls' profiles with unrelated positions, just because statistically they get more clicks, I guess.
I agree with you 100% here. But this problem can't be fixed. All that's about to happen is Google is going to decide that people will be interested in something Google finds more palatable. I don't know how to fix/escape this problem.
Watching polarizing YouTube videos isn't going to flip a bit, but constant exposure to a single perspective, constantly reinforced, is certainly going to influence your formation of beliefs. That's exactly what YouTube does with its recommendation system.
My post was about notoriety, not belief. Many things, trends, and personalities are created by the media, whether you believe what they say or not. Flat earth conspiracies would never have been repeated by NBA players if not for YouTube's recommendation algorithm.
I also remember watching in disbelief as people said the sun was going around the earth.
The internet is just a reflection of human nature. Flat earth theory would have existed without it.
Unless the US government bans all sites except google and turns Youtube into a direct propaganda tool and declares that publishing any unauthorized content is a crime, then the worst that can happen here is that Youtube's updated algorithm fails to serve the needs of its userbase, in which case it will probably be amended due to a drop in the site's popularity and engagement metrics.
Content that isn't recommended hasn't been censored, it's still discoverable as long as it exists on the platform, and free speech remains unaffected even when a single platform decides to alter its algorithm in a way that might slightly reduce the immediacy of certain kinds of content.
Not all slopes are slippery.
Someone tried to set one up. It's called BitChute. All the payment processors blocked it.
The reality is that YouTube is not just "a single platform", it's THE platform. It's owned by a megacorporation with extreme political leanings. It's incredibly influential on votes and opinions globally. It's run with zero transparency, by a small unidentified unelected group of wealthy and powerful people.
You're just comfortable with this because you think they're on your side politically. Most people are comfortable with moderate levels of tyranny as long as they think it's their tribe that holds the levers of power.
But if Google was owned and run by evangelical Christians, and they were shifting their recommendations to discourage "immoral" videos of gay pride, pro-trans-rights arguments, sex worker rights, etc, you'd be incensed.
There is a discussion to be had about some topics at the edge...but 2PM everyone agrees is part of daytime; the flat earth 'conspiracy' should never be recommended to anybody.
And sadly, in the current climate, this also has to be said: This does not mean Google is picking sides. If you on your own reflections decide that for some strange reason the earth is flat, that is your right, and no one is infringing on it by not giving you a platform.
'There is a discussion to be had about some topics at the edge...but 2PM everyone agrees is part of daytime; the flat earth 'conspiracy' should never be recommended to anybody.'
It should be recommended to anyone to whom it would be relevant. That's the whole idea behind a recommendation system.
Worse, people who said it was a conspiracy are now calling other things conspiracies, without any consideration for their past errors.
Ultimately I've always found the term silly. People can believe idiotic things, for sure, yet I think there is a spectrum where at one end you have people who believe no 'conspiracy theory' could ever be real, and at the other, people who indulge every 'conspiracy theory' as probable. Both extremes are equally naive.
 - https://en.wikipedia.org/wiki/Conspiracy_theory#Etymology_an...
 - https://www.zerohedge.com/news/2015-02-23/1967-he-cia-create...
 - https://en.wikipedia.org/wiki/Warren_Commission
Sure... some. But that's not the reason it's promoted. Paradoxically, many "fall for" its real utility by merely lumping it in with whatever they want to discredit; disinformation is rather effective when it attaches obvious BS to other things.
A small Canadian band showed up in my recommended and that first song captivated me. Then I listened to their other singles and two albums, and now I'm going to their show next month.
Also, I just found an artist from Belarus who makes fantastic synthpop. The weird thing is that all his videos use Cyrillic characters, so I don't know how Google even matched it. I am thankful though.
Alright, I'm gonna need a link.
Check the comments. It looks like there were plenty of people who had the same experience as me.
They seem to be oddly popular on youtube.
Also, generally speaking, you really should block youtube cookies. The recommender doesn't seem to track your history then, and the entire experience is FAR better, simply recommending associations from that single video, instead of always trying to railroad you back into where you've historically spent more time.
I don't want to say "not interested" because I don't know what that means... I want a feature where it's like "more like this, but not this one in particular."
It's surprising that YouTube has been under Google's control for so long, yet its main recommendation algorithm still leaves a lot to be desired.
Specifically, I notice more of:
* Subjects I haven't watched in a while, but went through a period of watching. This is especially useful if it's a recent update or revival of a particular topic.
* Content from channels that are actually similar to what I often watch, and not just clickbait stuff. Occasionally there's actual valuable discovery in the recs.
* Videos that I very much would normally watch from sub'd channels, but missed for whatever reason. The recs are actually helping remind me to watch things I want to watch.
It's still the minority of recommendations, but I have to agree that the system is actually improving.
I suspect I blocked some cookie or changed some setting they didn't want me to change, and this behavior is deliberately calculated to be punitive. It's not very user friendly.
Or, the YouTube developers are just not very smart. Either way, it doesn't look good for them. This would be so easy to fix. If they don't know how, they are welcome to reach out to me and I can give them some ideas.
Yes, it did occur to me that some users might like this. But the way they have it set up is very confusing and suboptimal for my needs as a user. I will admit, though, it may be serving some business need of theirs, like saving bandwidth by re-serving me old videos from my cache.
I just ignore them completely now except for when browsing music. They often don't even help when watching serial things (e.g. part 1, 2, 3), I'll often make the effort to look up a playlist instead. I have also always ignored the front page, so can't say anything about it.
I do wish youtube would let you pick areas of interest and whether you wanted to be recommended based on those areas. If a video fits in an excluded area, you would just get random recs, otherwise you would get recs from around that area. Preferably a mix of recent + old + really old videos that still have good ratings / watch time.
Also they would have less to sort through, etc, if they allowed creators to "archive" a video (because its outdated and maybe some other newer one replaced it). This happens to me all the time. And as a user it would be nice if for example, when some video's audio was bad and then a newer fixed one uploaded, if I was prompted to go to the new one. Similar thing with serial content. It would be nice if creators could establish one video follows another or is related.
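The interest-area idea above can be sketched in a few lines. This is a toy illustration only, not how YouTube actually works: the `videos`, `interests`, and `excluded` structures are hypothetical, and a real system would score by watch time and ratings rather than shuffle.

```python
import random

def recommend(videos, interests, excluded, k=5):
    """Toy interest-aware recommender.

    Videos tagged with an excluded area are dropped entirely; among the
    rest, videos matching a chosen interest area are preferred, and the
    remainder of the list is padded with random picks (the "otherwise
    you would just get random recs" behavior described above).
    """
    pool = [v for v in videos if v["area"] not in excluded]
    matched = [v for v in pool if v["area"] in interests]
    rest = [v for v in pool if v["area"] not in interests]
    random.shuffle(rest)
    return (matched + rest)[:k]
```

A fuller version might weight `matched` by a mix of recent, old, and really old videos with good ratings, as the comment suggests, but the exclusion-then-preference structure is the core of it.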
I like to watch spiritual and metaphysical talks, Papaji, Eckhart Tolle, Conscious TV, Rupert Spira and the like.
I don't want someone to tell me what's "true" or what's "real" by referring me to the so-called "facts".
Sounds like a slippery slope to me...
There sure is a lot of drivel on YouTube.. much of it like content farms playing search engines, is driven by the ad revenue. I find this empty / mindless content far more distracting than "false claims".
It sounds at a first reading as though you’d feel everything since the Enlightenment has been a slippery slope. To be fair to YouTube though, if you want an environment devoid of “what’s ‘true’ or ‘what’s real’ by referring [you] to the so-called ‘facts,’” then YouTube seems like it should be your paradise. I watch mostly technical, science, and history videos, but apparently YouTube interprets that as a preference for ranting conspiracists and people who make Joe Rogan look like a gentleman and a scholar. I can only imagine the trash it throws up if you actually seek it out!
If you want to create a bubble of people saying things totally divorced from anything like reality, what more could you ask for than YouTube?
Any content that makes me question things is good in my opinion.
I had a period when I was into UFOs and whatnot.. and it led to an interesting realization about what I actually know. So in my view everything has its place. For example a video about flat earth may very well engage the viewer to wonder WHY the earth wouldn't be flat?
You can't force people to ask questions. Some people will get into the deep end and lose themselves. So be it. You can never force someone else to wake up out of their dreams. It's each individual's choice. At best it is only through the heart, and not rationality, that you can help someone see more clearly. Everyone believes in one thing or another, in order to feel safe.
And yes you are right. YouTube is absolutely perfect right now in some funny ways. Through the limitation of their own business model, they allow everyone to express themselves. And this is better for everybody.
I think they're going to stick to moderation politics and whatnot anyway, so who cares.
Instead, why not just include videos that thoroughly debunk these ideas in the recommendation section of these videos?
I can only assume that it's an intentional marketing thing to get people to watch other things.
What does the first part (wider set of topics) have to do with the second part (number of videos recommended)?
Anyway, YouTube recommendations are generally quite poor I find, though sometimes a few gems come along. I also made the mistake of watching YouTube on my Android TV box. Holy cow, an ad _every single minute_. And the same 4 ads, rotating.
How about stop spamming me with CNN, SNL, and late-night shows (Kimmel, John Oliver, etc.)? I've yet to accidentally click on these links/channels in all my years using YouTube, and yet I still see them every day. Why?
I've pretty much given up on recommendations and just directly go to the channels I want to watch. There was a time ( early 2010s ) when youtube recommends was great and it was actually fun going down the youtube recommends rabbit hole. Now it's just corporate shilling. Just like google search and its incredible decline.
You guys aren't going to improve anything because you don't have competition.
hopefully the change is mild. i've discovered a ton of great stuff through youtube recommendations (especially music! youtube blows everything else away for music discovery).
If your local library or city center decided to prohibit a local white supremacy group from putting up flyers on their billboards recommending Holocaust conspiracy theory materials, would you criticize them the same way? I'm curious about why people seem to have very different standards for public spaces and websites moderating themselves.
anyway, i, personally, would still give them (and every1) the space on the billboard. let every1 talk, even the crazy and the hateful.
more generally, i dont think any institution should be privileged over the individual to decide which ideas they get to hear.
just one possible philosophical position, one choice on a spectrum of tradeoffs.