Talk about fanning the flames. When I first started reading, I assumed they had created a bunch of dummy accounts that would like various posts based on political viewpoint, and then shown the actual feeds Facebook pushed to those accounts. That is not at all what is happening here.
Generate a few accounts, consistently train the algorithm to see each one as Red or Blue (via likes, profile tweaks, and posts from specific sources like Mother Jones vs. Instapundit/Drudge), and then watch how the feeds start to diverge.
Reminds me of the Like-Everything experiment Wired tried a while back. Much more interesting to see how it slowly moved toward what Facebook thought was the local maximum. (http://www.wired.com/2014/08/i-liked-everything-i-saw-on-fac...)
Unfortunately, without knowledge of Facebook's algorithms, it might be a bit difficult. For example, I'm sure a decent chunk depends on your social network and what they are posting and that's much harder to simulate.
EDIT: I still find this illuminating. I am subscribed to two feeds, one Far Left and one Far Right, and it's always fascinating to look at them and see how little they overlap. They pull from completely different sources and focus on and nitpick the craziest things about each other. Even when they cover the exact same story, they contextualize it completely differently with insinuating comments and bylines.
So, it's true that it's not a purely organic sample of real Facebook feeds, and as a result it doesn't get to the heart of the criticism that Facebook generates echo chambers. However, it is a good exercise in showing how this might look and how jarring it can be to compare them.
This is the key line:
> These aren't intended to resemble actual individual news feeds.
The WSJ specifically filtered each feed to include only conservative or liberal sources. Facebook could observe a full equal-time rule in each user's feed, constantly exposing users to a variety of viewpoints, and this so-called study would have given the exact same results.
Especially with the older generation. Anecdotally, I have seen my mother form very extreme opinions on things simply because "she saw an article on Facebook" and didn't fact-check anything in the article. It was impossible for me to convince her that the article was wrong.
You didn't have to read more than the synopsis to see that the article was thoroughly refuted.
The study had roughly zero effect on the beliefs of the person who shared the article. In effect, they already believed something (hot dogs are going to kill their baby), they found an article that agreed with them, and there was no changing their minds thereafter.
It may just be that I engage with a subset of people who are prone to be like that, but a large portion of my Facebook feed is similar. They just believe there's some puppeteer pulling the strings on everything. We can cure cancer, but 'they' don't want to. We can create infinite free solar energy, but 'they' don't want us to have it. "They" orchestrated the entrance into wars, "they're" hiding aliens, etc. Some of it is cliché conspiracy theory, but some of it leans towards "we're in the matrix" levels of conspiracy.
I come to HN to keep grounded. I love it here because of rational discussion & debate among people who seek factual truths.
There's something called the Backfire Effect, wherein presenting facts and evidence actually reinforces people's positions and can make them believe even more strongly in the very thing you're disproving.
I mean, your example is pretty extreme; I'm not going to say that was rational. But in general the average person doesn't have enough scientific knowledge to dispute a scientific study. In fact, it's actually quite easy to make a case for almost anything by cherry-picking studies.
So people have to rely on taking the consensus of others in their social group, or authorities they trust. And it means they don't actually have an opinion themselves, so you can't argue against them. They are just trusting the opinions of someone else.
This actually isn't a bad heuristic in general, but can lead to crazy beliefs like that. I think we all do this to some degree. I believe in global warming, but I couldn't possibly dispute any arguments against it that you could give me. I'm not going to change my mind though.
I parsed the PDF and did a scrape of the feeds in this repo: https://github.com/dannguyen/facebook-trending-rss-fetcher
(but didn't do the work of extracting items from each individual XML)
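For anyone who wants to finish that last step, here is a minimal sketch of item extraction using only Python's standard library. It assumes the scraped files are ordinary RSS 2.0 documents with `<item>` elements; the repo's actual dumps may use different tags, so treat the tag names as assumptions.

```python
import xml.etree.ElementTree as ET

def extract_items(xml_text):
    """Pull title/link/pubDate out of each <item> in an RSS 2.0 document.

    Assumes the standard <rss><channel><item> layout; adjust the tag
    names if the scraped XML differs.
    """
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        items.append({
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
            "pubDate": item.findtext("pubDate", default=""),
        })
    return items

# A tiny stand-in document, since the real dumps live in the repo above.
sample = """<rss version="2.0"><channel>
  <item><title>Example story</title><link>http://example.com/a</link>
        <pubDate>Wed, 18 May 2016 00:00:00 GMT</pubDate></item>
</channel></rss>"""

print(extract_items(sample))
```

From there it's a short hop to dumping the items into a CSV and comparing the source distributions of the two feeds.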
I know that FB shows me articles that its feed algorithm determines to be aligned with things I've clicked on before (so that I'll click again), but it's pretty shocking how my clicks can affect what sort of articles I see.
It's almost as if every click digs me further and further into a specific bucket that shapes what I see and that I can't really climb out of.
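That self-reinforcing loop can be caricatured with a tiny expected-value model. Every number and the update rule here are invented purely for illustration; nothing about Facebook's actual system is assumed. The point is only that when each click nudges the feed toward the clicked topic, even a mild user preference compounds into a near-total bucket.

```python
def simulate_feed(bias=0.5, rounds=300, pref=0.8, lr=0.05):
    """Toy model of a click-trained feed (NOT Facebook's algorithm).

    `bias` is the share of topic-A stories the feed shows; the user
    clicks A-stories with probability `pref` and B-stories with
    probability 1 - `pref`. Each expected click pulls the feed's mix
    toward the clicked topic at learning rate `lr`.
    """
    for _ in range(rounds):
        clicks_a = bias * pref              # expected clicks on shown A-stories
        clicks_b = (1 - bias) * (1 - pref)  # expected clicks on shown B-stories
        # clicks on A pull bias toward 1, clicks on B pull it toward 0
        bias += lr * (clicks_a * (1 - bias) - clicks_b * bias)
    return bias

print(round(simulate_feed(), 3))  # starts at 0.5, ends near 1.0
```

With a perfectly indifferent user (`pref=0.5`) the feed stays balanced forever; with any asymmetry, the mix drifts monotonically toward one bucket and it only gets harder to climb out.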
As in life, it's important to actively seek out information you disagree with. That's how you learn and broaden your horizons. It's the only way.
Most of the stuff out of my "filter bubble" is just incredibly low quality garbage, whether it agrees or disagrees with what I think.
I can't see how actively going out of my way to read articles about how Kim Kardashian is an interesting intellectual adds much to my horizons.
There's only so much information we can consume. There seems to be this myth that looking now and then at the opposite viewpoint from what you believe frees you from the constraints of bounded rationality. It doesn't, and I haven't seen very conclusive evidence that it helps in any meaningful way.
The thing is... it's entirely probable that most of the people who agree with you do so because of low-quality garbage. So you're very much comparing yourself to a large group of other people and concluding that you read things above their level - not very hard. You need to find people on your level on "the other side" and figure out what they're paying attention to.
> So you're very much comparing yourself to a large group of other people and concluding that you read things above their level - not very hard.
Not sure I understand. I'm not comparing myself to anybody. I'm simply stating that I can't find a reason to go out of my way to read garbage.
Your last point is absolutely correct, and it probably summarizes why I believe the whole "filter bubble" thing is irrelevant or at least vastly overstated: the criteria that I use when deciding whether to read or not read something are entirely orthogonal to what "side" it comes from.
Your fallacy is assuming that everything outside your bubble is garbage.
First, whether something enters my filter bubble or not is not the same thing as the criteria I use to look for new things, or to decide whether something stays within my filter bubble (say, a website I would add to or remove from my RSS feeds). There's a passive/active dichotomy here.
Second, I don't think I've claimed there was nothing outside my bubble that wasn't garbage. Now, I do assume that 99% of the stuff out of it is, because that's the nature of the SNR on the internet. Note that all of that is of course entirely subjective, and I'm only talking about the value I derive from it personally.
Certainly, there are things that would be interesting that at a given time live outside of my filter bubble. But actively looking for them is not necessarily the best way to find them, as I might have to expend a lot of effort going through lots of junk before I get there, and other sources within my bubble might percolate that information faster, with less effort from me (say a link post on SlateStarCodex will give me a bunch of links to stuff I would have never found by myself).
Finally, I never said my bubble consists of all good information available, precisely because as I stated, all that information could not fit in anybody's bubble due to the limited nature of our attention or available time. So that's just a straw man.
The fact is that there's a game being played between what's inside my filter bubble and what's outside, and that game is not zero sum: most likely, actively trying to add more information from other sources will decrease the utility I get from it. It could be a temporary/local minimum (going through a bunch of garbage to find some hidden gem), but it could also be a more stable, lower utility state.
And the only way I can find to do that consistently is to actively seek sources of information on specific subjects from the "opposing" point of view. The articles my filter bubble sends me are generally going to be emotionally fueled rubbish, because it takes time and effort and a very good understanding of the subject matter to take apart an argument otherwise, and for most people, their time on the Internet is time during which they don't want to be expending mental energy.
If you only Like Left, then you keep drifting. But (I assume) they will always include some stuff to the Right of where they think you are as an option. It won't be Far Right, but it will be in that direction.
Obviously, this is a lot of speculation about how Facebook does its feed optimization, but I expect they provide opportunities to get out of your local maximum.
The fact that when I Google "python" it gives me exclusively results about programming and nothing about snakes is a feature, not a bug.
DDG doesn't do this, and when you use it to query Google search, it doesn't let Google do this either.
Although (this coming from a huge Bernie supporter), it seems like both sides are in agreement that they should be attacking Bernie Sanders.
As an aside, the format of "And now you won't believe THIS happened" as a headline/tagline with almost no additional info to get more clicks is absolutely infuriating. I've always made it a point to not click on headlines/links written in that format.
It seems the pro-Hillary left is fairly well aligned with the anti-Bernie right.
I have no idea if this updates to display different articles throughout the day; it might have just been my small sample.
Also, there's a lot of internecine quarrelling in the blue feed about Clinton and Sanders which might be an interesting angle to examine further.
I think that only showing people news that agrees with their way of thinking leads to the dangerous situation where you end up with a positive feedback cycle of groups self-confirming their own beliefs, and the other side — "them" — is viewed as a deranged bunch that is incapable of rational thought. While increasing groupthink may lead to higher advertising revenues, it also leads to interpersonal polarization and higher levels of animosity.
There was an interesting NY Times article that came out recently (http://www.nytimes.com/2016/05/08/opinion/sunday/a-confessio...) about how liberals and libertarians are far overrepresented in academia. While that in itself didn't surprise me, the comments on the article did. Here are a few of the NYT picks:
> "It's not that conservatives aren't bright; it's that, for the most part, they are narrow-minded and are sure they have the right answers."
> "Conservatives are entitled to their views, but they also must understand that young adults pay thousands of dollars to obtain a college education in order to learn facts, not fiction."
> "Is intolerance necessarily a bad thing?"
> "No scientist holds groundless beliefs to prove some kind of subservience to a deity."
What is interesting is that I frequently hear conservatives making the same comments — just swap out a few of the words:
> "It's not that liberals aren't bright; it's that, for the most part, they are narrow-minded and are sure they have the right answers."
> "Liberals are entitled to their views, but they also must understand that young adults pay thousands of dollars to obtain a college education in order to learn facts, not fiction."
> "How could any intelligent person possibly believe the universe came from nothing?"
What I generally find is that the further left or right someone leans, the more similar their personality becomes to their right/left-wing counterpart. In my experience, the "extreme" people are much more similar to each other than they are to the people in the middle of the political spectrum. I would argue that — born into a different family — many of the zealots would hold their opposing view just as strongly.
One oddity that I haven't yet found an explanation for is that conservatives seem to be less prevalent online than liberals. I'm not quite sure why this is the case.
Your filter bubble, probably. Telling whether that's true in an absolute sense would be difficult.
(Bear in mind that I believe it is not possible to "not have" a filter bubble, so no offense is intended in my first sentence; everyone has a bubble, the only question is the nature of it, not whether it exists, and it is not sensible to want to not be in one, only to ask how you might change it.)
HN is the most neutral place I can think of. I think it might just hide it well, as politics aren't a large topic here. I'm just glad to see that even political conversations here are largely rooted in facts.
There is an old maxim, something like "all organizations that are not designed expressly to be conservative get more and more left-leaning over time."
I see his ideas spreading everywhere, directly and in derivative form.
I can seriously see him being studied in academia in the 22nd century. Him and Satoshi and a handful of other radicals in our midst who should not be named. The Net is like a flower unfolding, and the people who intellectually influence its beginning are sure to have some of their number become rockstars much later on.
I live in Texas and I can definitely tell you that my internet experience is definitely not like that. In my neck of the internet, there are far more conservatives.
Demographics. Rural areas with worse internet access skew conservative. Older folks who can't understand the internet skew conservative.
Evidently, ideas that are commonly held today, like democracy, social welfare, or multinational corporations not explicitly backed by the mothership, would have been outrageous daydreams, utter folly to believe in, back in the 16th century.
The problem with centrist positions is that they seem so reasonable because so many of them are reasonable. That makes us too comfortable.
I wonder what metrics you might use to measure that? The number of posters on the largest 100 "message boards" that slant one direction vs. the other? Or is there an independent measure of political leaning, so you could track how many clicks they make on Yahoo? The number of YouTube views for certain videos? Facebook likes of certain stuff? Something else?
Expressing conservative views tends to be a social liability among the groups that tend to be active internet users.