I didn't think about the collateral damage it could cause to others, namely bombarding your friends' feeds. This is also interesting because, well, you can destroy Facebook in this manner. If enough people are peeing in the pool, people are going to get out.
Maybe I should write a Greasemonkey script to accomplish just that. Not that I want to destroy Facebook, but if I wanted to destroy my data this would be the way, because Facebook doesn't give me that option.
Also does everyone else think it's creepy when your friends stop using facebook and old "likes" pop up on your feed to make it look like that user hasn't left?
This kind of bloody-minded civil disobedience is something that I very much like.
Many years ago (20? 25?) the UK supermarket Safeway started a customer loyalty card. My flatmates and I applied for one in the name of "Ivor Trotts". That card was only ever used to buy loo paper.
So a browser-plugin to allow me to auto-click all the social buttons might be fun; especially if I can do that on a throwaway social media account. Or maybe it's better if I do it on my main accounts?
Also, another commenter mentioned that Facebook does actually get around to deleting things, albeit slowly.
As for imposing my dislike on others, that is a fair argument, but there is also the other side of the coin to consider: imposing your like of Facebook on others. People who want to remain off Facebook get pulled into it via tags. I'm not saying you are doing the tagging, but opting out isn't really an option in a network as vast and pervasive as Facebook.
"Maybe that was true in the past, but today when you delete your data it is gone. Trust me, I wrote it myself. The law enforcement guidelines that have been circulating recently corroborate this."
"If you don't think you'll use Facebook again, you can request to have your account permanently deleted. Please keep in mind that you won't be able to reactivate your account or retrieve anything you've added. Before you do this, you may want to download a copy of your info from Facebook. Then, if you'd like your account permanently deleted with no option for recovery, log into your account and let us know."
Somehow it still suggests friends from the old account. I can't figure out how it could fetch this data without actually restoring it from the previous account.
A friend of mine died last year. Her FB account is still active. From time to time I get old "likes" of hers. It is a bit creepy.
You can. At least, this seemed to work effectively for me:
> In theory, deleting your account immediately removes all Facebook data related to you. In reality it's more complicated, taking about a month.
> Even data tagged for deletion can still take 90 days to be deleted.
> "It typically takes about one month to delete an account, but some information may remain in backup copies and logs for up to 90 days." — Facebook Data Use Policy.
> Allegations of complicity with US National Security Agency surveillance suggest that your data may never truly be deleted.
the last of which links to the Glenn Greenwald PRISM article:
The article says the NSA has "direct access" to Facebook systems, but doesn't mention Facebook deletion policy. If the NSA has a copy of your data, we can't really expect Facebook to be responsible for deleting it.
By my reading of the page, deleting your Facebook account functionally deletes your account immediately (that is, you no longer appear on Facebook) and after the two-week deactivation period, if you don't reactivate your account, your data will be deleted within 90 days.
Of course, I have no idea whether any of this is true.
1) Obfuscate the private view by liking everything. This buries the signal in noise and surely makes it hard(er) for "them" to characterize you. The longer you do this, the more effective the strategy will be, I would think.
2) Delete from public view.
If you want to have any real effect, you'd need to provide a fake but plausible signal, not noise.
It would be interesting to see if any of the Facebook privacy lawsuits get Facebook to actually delete that data when they're asked to.
- create a new email address with a long random password
- change your FB account to use this new email
- destroy your copy of the email password
- change your FB account to have a long random password and don't keep a copy of it
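The first and last steps above hinge on a password so long and random that, once discarded, it can't be guessed or remembered. A minimal sketch in Python (the change itself you'd still make by hand in Facebook's settings; there's no official API for this):

```python
import secrets
import string

def throwaway_password(length=48):
    """Generate a long random password using a cryptographic RNG."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Use it once to change the account credentials, then let it
# go out of scope -- never write it down or store it anywhere.
print(throwaway_password())
```

The `secrets` module is the right tool here rather than `random`, since the whole point is that the value be unrecoverable.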
A few months ago Facebook announced it would rank news higher; maybe that's why Huffington Post items were ranked so high.
Interesting experiment. Tweaking Facebook's algorithm is an ongoing process.
You'd have to slowly blend it in to your normal activities so that the abnormal behavior would be harder to classify.
Abusive ex-husband posts pictures of himself making threatening remarks to his wife, along with other potentially incriminating material. If he were able to delete it, that evidence would be lost. My only thought on how to preserve privacy rights is to have the "deleted"-not-deleted user files sent to a legal-compliance department and out of Facebook's user database (where, I have no doubt, the "deleted"-not-deleted users' files are being used in analysis).
>Also, as I went to bed, I remember thinking “Ah, crap. I have to like something about Gaza,” as I hit the Like button on a post with a pro-Israel message.
>By the next morning, the items in my News Feed had moved very, very far to the right.
>Rachel Maddow, Raw Story, Mother Jones, Daily Kos and all sort of other leftie stuff was interspersed with items that are so far to the right I’m nearly afraid to like them for fear of ending up on some sort of watch list.
>While I expected that what I saw might change, what I never expected was the impact my behavior would have on my friends’ feeds.
This article captures so much modern anxiety in a nutshell: the pervasive surveillance society, and our behaviour being shaped by algorithms.
What this really highlights, to me, is the extent to which Facebook exerts editorial control over the news that you're subjected to. This has all sorts of other effects on how media dollars are spent and as a result the shape of discourse - I'm immediately reminded of http://www.slate.com/blogs/moneybox/2014/05/22/facebook_s_mi...
This is not to say that there haven't _always_ been pernicious incentives at work; but before, you could at least question those incentives and motivations instead of shrugging and pointing to an unexplainable, mysteriously biased support vector machine (et al.) pulling the strings.
So yeah, put "survive a dystopian dilemma" on the list of things that are now well and truly DONE.
If you like everything a recommendation algorithm feeds you, of course you are going to get shit recommendations. Your like data is no longer correlated with your actual preferences at all. It's an interesting experiment, but people are projecting their own fears and opinions onto the result.
Bear in mind that "extremism" is the expected result of any reasonable algorithm that you abuse like this. If you like X, the algorithm will focus around X. If you like everything the algorithm suggests, you'll end up liking a "bigger target", and it will start suggesting stuff yet further away from your original starting point, until eventually you've "liked" everything. This is deliberately engineering a scenario in which Facebook will feed you a set of things that all seem "extreme" on some metric or other, be it "political", "pop culture insipid", or anything else, because they are all "extremely" far away from your starting point.
You are also engineering a scenario in which you are receiving "everything", so you will end up experiencing this proportional to raw volume within the segments of the interest graph that you've transitively explored. We can't eliminate the possibility that he's missing large chunks of the graph in his transitive exploration. A massive anime fan, or somebody else big into niches and staying away from politics, who starts doing this may end up with a very different experience. (In fact I'd suspect it's decent odds that the reason the author ended up with "extreme politics" in the first place is that he probably had a pre-existing seed set of political likes, which then "fuzzed" into the full set of political views while never reaching niches like anime or figurine collection.)
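The "transitive exploration" point can be made concrete with a toy model (the graph and the pages in it are entirely made up; Facebook's real recommender is unknown): if the recommender suggests pages related to anything you've liked, and you like every suggestion, your liked set grows to the whole connected component around your seed, while unconnected niches are never reached.

```python
from collections import deque

# Hypothetical interest graph: each page points at "related" pages.
GRAPH = {
    "local news": ["national politics"],
    "national politics": ["far-left page", "far-right page", "local news"],
    "far-left page": ["national politics"],
    "far-right page": ["national politics"],
    "anime": ["figurines"],
    "figurines": ["anime"],
}

def like_everything(seed):
    """Like every suggestion: a breadth-first walk over the graph."""
    liked, queue = {seed}, deque([seed])
    while queue:
        page = queue.popleft()
        for suggestion in GRAPH.get(page, []):
            if suggestion not in liked:
                liked.add(suggestion)
                queue.append(suggestion)
    return liked

# A mild politics seed "fuzzes" out to the whole political spectrum...
print(sorted(like_everything("local news")))
# ...but never reaches the anime niche, and vice versa.
print(sorted(like_everything("anime")))
```

In this toy version, the "extremes" the author saw are just the far nodes of the component his seed happened to sit in.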
As a story it's interesting, but I'm not sure it reveals stuff about society so much as it reveals a very fuzzy picture of the author.
Facebook is supposed to be this big brother that knows everything you like, knows your whole circle of friends and acquaintances, maybe even your inner thoughts. But in reality, it will forget everything if you have one day out of the ordinary.
Very quickly, the monetization model that Facebook uses basically takes over the social aspect of the website. If you like the social aspect, chances are Twitter or even Reddit will give you a much better experience. It's a bit as if Google decided that you like ads so much you only want to see those and not the search results anymore. That's certainly something that Facebook investors should look at.
Lowbrow celeb culture trash and political extremism was selling newspapers for decades before the internet.
If there's anything interesting here, it's the proportion of content that political extremist organizations share to win "likes" from relatively sane people which actually isn't political extremism, or even vaguely related to their agenda.
The expected outcome of liking everything the algorithm suggests indiscriminately should be more moderation, not more extremism. If (as an oversimplified example) you like colors, so the algorithm asks whether you like blue, green, yellow or red and you select them all, one would not expect it to eventually lead you to a community that believes green is the best color and all others should be destroyed.
Your phrasing suggests that you misunderstood my point, but does not prove it. My point is not that he started with extreme politics (or politics that might be viewed as extreme by most people since of course $MY_VIEWS are always perfectly sensible and obviously correct, it's everybody else who is extreme), my point is that he started with politics at all.
"The expected outcome of liking everything the algorithm suggests indiscriminately should be more moderation, not more extremism."
Uh, no, that makes no sense. You're engaging in an operation that is deliberately blurring Facebook's impression of you. Open your favorite image editor. Cover an image with pure white. Put a blob of black in the middle. Start using the "blur" tool. It doesn't shrink. And that's not perfect, either... the center of the blob will remain the darkest, but in this system, Facebook keeps presenting you with more "extreme" views, and you keep telling it you like it. Of course it pulls you away from the "center"... you keep telling it to. And this is where the raw volume point comes in... the extremes are louder, so if you blur like that you'll see a lot more of it.
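The blur analogy checks out numerically. A quick sketch (a toy 1-D model in pure Python, nothing to do with Facebook's actual code): start with all "interest mass" on one topic and repeatedly apply a box blur; the spread of the profile only ever grows.

```python
def box_blur(profile):
    """One pass of a 1-D box blur: each cell averaged with its neighbors."""
    n = len(profile)
    return [
        (profile[max(i - 1, 0)] + profile[i] + profile[min(i + 1, n - 1)]) / 3
        for i in range(n)
    ]

def spread(profile):
    """Variance of the profile: how far its mass sits from its mean."""
    total = sum(profile)
    mean = sum(i * p for i, p in enumerate(profile)) / total
    return sum((i - mean) ** 2 * p for i, p in enumerate(profile)) / total

# All interest concentrated on a single topic (index 25 of 51).
profile = [0.0] * 51
profile[25] = 1.0

for step in range(10):
    profile = box_blur(profile)
    # The spread grows every single step: blurring never shrinks the blob.
    print(round(spread(profile), 2))
```

Each blur pass adds a fixed amount of variance, which is exactly the "it doesn't shrink" behavior described above.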
Similarly, purely out of chance you might be given the opportunity to like the "destroy non-green" community before the "destroy non-blue" community.
Maybe what we can take from the fact that the writer ended up being shown/liking both left and right views is that the algorithm doesn't give a very strong weighting to negative correlation; i.e. even though people who like left-wing pages (presumably) don't like as many right-wing pages, Facebook still gives users the opportunity to like both, and possibly will not bias toward one side of apparently conflicting interests when choosing what to present in the news feed.
Facebook has the advantage over a supermarket that they can sell you extreme political material without pissing off their other customers. What you end up with is probably what supermarket tabloid shelves would look like if people only saw what was tailored to them.
Actually, that does not sound surprising at all.
By the end of day one, I noticed that on mobile, my feed was almost completely devoid of human content. I was only presented with the chance to like stories from various websites, and various other ads. Yet on the desktop—while it’s still mostly branded content—I continue to see things from my friends. On that little bitty screen, where real-estate is so valuable, Facebook’s robots decided that the way to keep my attention is by hiding the people and only showing me the stuff that other machines have pumped out. Weird.
Maybe it's more likely that it's a reflection of those who have the power and influence to game the Facebook algorithm; i.e. those who are most heavily invested in the need/right to influence social perception. The right wing is currently invested in this business with much to lose. Why is it such a surprise?
These modifications might also cause me to spend less time on my news feed because all the garbage is filtered out, so Facebook might at some point decide that it hurts them and disable these modification options, but then I'll probably just stop using Facebook's news feed altogether. I mostly use FB for groups and chats anyway. Just like with YouTube when they started showing what friends liked: I unfriended everyone and kept subscriptions.
I wish there were more discussion venues where the quality of your participation was judged by its value to the discussion, not by whether or not you supported what the other person said.
People seek social groups that make them feel good and avoid ones that make them feel insecure. They especially avoid places where they are made to feel stupid. So they tend to prefer places where their beliefs are confirmed and maybe they can get away with calling someone else stupid.
What you are fighting is as natural as water flowing downhill: animals seeking comfort. You have to overwhelm gravity itself to turn the flow uphill.
There is no lack of good discussion venues. There are only too few people who prefer them. How can any such venue attract the masses out of their comfortable, partisan clubhouses?
A great, very recent example of this was reddit making /r/TwoXChromosomes, a subreddit dedicated to women's issues, a default sub (meaning everyone, even non-users, is automatically subscribed). The stereotype demographic for reddit is young white males. I have no way of actually verifying these demographics; I'm not sure they do either.
Yesterday, a comic about casual sexual harassment by men was posted, and although there was some valuable discussion, most of it was pretty awful to read (the further down you scroll, the worse it gets). Although we'd like to believe that engaging in debate / conversation / dialogue generally causes us to empathize with and understand the other side, the opposite tends to happen, especially in large, anonymous communities. And this is in a venue where the quality of participation IS (supposedly) based on the value of the discussion.
Although I think you're right that most websites (Hacker News included) suffer from only having up and / or down votes. It's possible that a more complex system might be more useful. Maybe in addition to the like / dislike paradigm, it might be useful to add quality / not quality functionality.
Historically you lived in a place which - through economics and upbringing - was full of people who were a lot like you. You worked in a job where your colleagues were - through reasons of education and geography - a lot like you. Where you found people disagreeable generally you'd avoid them if you had the chance. You chose the newspaper you liked and the news show you felt most comfortable with on TV or radio.
We've always put ourselves in information bubbles, this isn't anything to do with the internet, the internet is merely recreating what we already had. The only difference is that with the internet it's a larger wasted opportunity because there is so much more information out there you might subject yourself to.
As it stands, the title makes it sound as if someone tried Facebook for two days and was happy with what they saw. (I know this is the original title, but that doesn't make it correct.)
Most people rarely "like" branded or advertising content on Facebook. So Facebook has tuned their algorithms to be very sensitive to the few data inputs it receives per person.
Now, here is a person flooding it with input. I am not surprised that the result is crazy deluge of branded content. It's like taking a carefully balanced lever and pushing real hard on one end. The other end is going to move a lot.
> I kept thinking Facebook would rate-limit me, but instead it grew increasingly ravenous.
Why would they build in a rate limit? The vast, vast majority of the time, the level of input is very low. To build in a rate-limiting system would be pointless gold-plating.
The rarity of this sort of flood is obvious from the fact that a human being at Facebook noticed the level of activity and contacted the author. That is an incredibly high bar at Facebook, which prides itself on automating everything.
It's been 3 weeks so far and it's amazing how after the first day Facebook started sending me two emails per day, along the lines of "you are missing important posts from your friends" and "you have 8 pending notifications". It instantly reminded me of Zynga's tactics for trying to suck you into a game and into paying for content.
I made ONE exception: my close friend went missing for 24 hours, having left his wallet and phone at home; his parents freaked out, so I reached out to all his friends on Facebook. This tells me I only need one feature Facebook has to offer: access to my own social network. Somehow we conditioned ourselves to share only Facebook information, not phone numbers or emails or any other means of communication. If Facebook suddenly went away I would lose 3/4 of my contacts. That's a scary thought.
If one builds it properly, Facebook is a good replacement for email's Address Book. Beyond that, it is not much more.
I am not removing myself from Facebook, nor even going to delete anything, because ultimately it all boils down to how we manage ourselves with discipline.
Of course, there's no such thing as the vanilla user. You'll still be targeted on generic features (location, browser, OS, etc). A vanilla Linux user in SF is different than a vanilla Windows XP user in London.
Sometimes it was fun to set up a joke in a back and forth comment thread with myself though.
I still log in as my alter ego every now and then to see what my profile looks like from someone else's perspective.
Or in this case like/post.
Full disclosure - I closed my account 2 years ago. Did not miss it even for 10 seconds.
Don't worry, they'll tell you.