Hacker News
I ‘Liked’ Everything I Saw on Facebook for Two Days (wired.com)
304 points by sethbannon on Aug 11, 2014 | 115 comments



I was thinking of a way to destroy my facebook account, since you can't actually delete it, and this is pretty much it: like everything. I guess you could also check in everywhere at once, all the time.

I didn't think about the collateral damage it could cause to others, which is to bombard your friends' feeds. This is also interesting because you could destroy facebook in this manner. If enough people are peeing in the pool, people are going to get out.

Maybe I should write a greasemonkey script to accomplish just that. Not that I want to destroy facebook, but if I wanted to destroy my data this would be the way, because facebook doesn't give me that option.

Also does everyone else think it's creepy when your friends stop using facebook and old "likes" pop up on your feed to make it look like that user hasn't left?


> I was thinking of a way to destroy my facebook account because you can't actually delete it, this is pretty much it, like everything.

This kind of bloody-minded civil obedience is something that I very much like.

Many years ago (20? 25?) the UK supermarket Safeway started a customer loyalty card. My flatmates and I applied for one in the name of "Ivor Trotts". That card was only ever used to buy loo paper.

So a browser-plugin to allow me to auto-click all the social buttons might be fun; especially if I can do that on a throwaway social media account. Or maybe it's better if I do it on my main accounts?


I like Facebook. It helps me keep up with friends in distant places. If you find it to be pernicious and intrusive, don't use it (or use it differently). Don't impose your dislike on others. "Peeing in the pool" is a great metaphor in that regard.


I agree that it can help people keep in touch with each other, and I never considered the network effects of doing what I was thinking, which is actually why I'm glad I didn't get around to doing it.

Also, another commenter mentioned that facebook does actually get around to deleting things, albeit slowly.

As for imposing my dislike on others, that is a fair argument, but there is also the other side of the coin to consider: imposing your like of facebook on others. People who want to remain off facebook get pulled into it via tags. I'm not saying you are doing the tagging, but opting out isn't really an option in a network as vast and pervasive as facebook.


Another thing to do with store loyalty cards is to always look yours up by phone number - many people in my extended family live nearby and use the same phone number, so the card has so many purchases on it that it becomes useless for targeting.


This is very simple and should like all the loaded like buttons: http://pastebin.com/CUhS9vq4. It also unlikes whatever was already liked, though, since fb doesn't seem to change the class.
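For what it's worth, the core loop can be sketched in a few lines of plain JavaScript. Everything below is an assumption: the selector and the `aria-pressed` check are stand-ins for whatever Facebook's markup actually uses to mark an already-liked button (the class names change constantly), and `likeAllLoaded` is a made-up name.

```javascript
// Hypothetical sketch: click every loaded like button that is not
// already pressed, so existing likes don't get toggled off.
// The selector and the aria-pressed attribute are assumptions,
// not Facebook's actual markup.
function likeAllLoaded(root) {
  const buttons = root.querySelectorAll('button[aria-label="Like"]');
  let clicked = 0;
  for (const btn of buttons) {
    if (btn.getAttribute('aria-pressed') !== 'true') {
      btn.click();
      clicked++;
    }
  }
  return clicked; // how many new likes this pass added
}
```

From a greasemonkey script you'd just call `likeAllLoaded(document)` on an interval.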


That's not going to work. Facebook will block you from liking on account of "going way too fast". Calling a sleep() function after every like might help...


You can actually use this method on LinkedIn to endorse all of your friends, and because the page keeps offering more results, you can set an interval and endorse indefinitely. My LinkedIn friends all hate me now because they've been endorsed for random things.


So I guess the script would need to include some kind of gradually escalating random delay between "likes".
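A minimal sketch of that idea, with a guessed base delay, growth factor, and jitter (Facebook's real rate limits are undocumented, so all three constants are assumptions):

```javascript
// Compute a jittered, gradually escalating delay (in ms) to wait
// before the next automated "like". All constants are guesses.
function likeDelay(likesSoFar, baseMs = 2000, growth = 1.05, jitterMs = 1500) {
  const escalated = baseMs * Math.pow(growth, likesSoFar); // grows 5% per like
  const jitter = Math.random() * jitterMs;                 // look less robotic
  return Math.round(escalated + jitter);
}
```

At 5% growth the delay roughly doubles every 14 likes, so in practice you'd want to cap it somewhere.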


I made this and it seems to work pretty well. http://pastebin.com/bfhw3R7H


>I was thinking of a way to destroy my facebook account because you can't actually delete it

"Maybe that was true in the past, but today when you delete your data it is gone. Trust me, I wrote it myself. The law enforcement guidelines that have been circulating recently corroborate this."[0]

"If you don't think you'll use Facebook again, you can request to have your account permanently deleted. Please keep in mind that you won't be able to reactivate your account or retrieve anything you've added. Before you do this, you may want to download a copy of your info from Facebook. Then, if you'd like your account permanently deleted with no option for recovery, log into your account and let us know."[1]

[0] https://news.ycombinator.com/item?id=3320240

[1] https://www.facebook.com/help/224562897555674

https://www.facebook.com/help/delete_account


I guess not being able to erase data on FB's servers is now just an urban myth.


I've erased my facebook account this way, and created a new one from scratch with the same name.

Somehow it still suggests friends from the old account. I can't figure out how it could fetch this data without actually restoring it from the previous account.


If the people that you are no longer friends with have your contact information (email, phone, etc.) and sync their contacts to Facebook, and you have the same details associated with your new account, then a relationship can be inferred and Facebook could show these people as suggested friends.


You deleted only your account, not your friends' accounts; they have uploaded phonebooks, email directories, and so on of their own, which refer back to you.


> Also does everyone else think it's creepy when your friends stop using facebook and old "likes" pop up on your feed to make it look like that user hasn't left?

A friend of mine died last year. Her FB account is still active. From time to time I get old "likes" of hers. It is a bit creepy.


> because you can't actually delete it

You can. At least, this seemed to work effectively for me:

http://deletefacebook.com/


It's hidden from public view. They hold on to all of your data, which is what the page you linked says.


That's not what the page or the Facebook Data Usage Policy says. Quoting from the page:

> In theory, deleting your account immediately removes all Facebook data related to you. In reality it's more complicated, taking about a month.

> Even data tagged for deletion can still take 90 days to be deleted.

> "‘It typically takes about one month to delete an account, but some information may remain in backup copies and logs for up to 90 days.’" — Facebook Data Use Policy.

> Allegations of complicity with US National Security Agency surveillance suggest that your data may never truly be deleted.

the last of which links to the Glenn Greenwald PRISM article:

http://www.theguardian.com/world/2013/jun/06/us-tech-giants-...

The article says the NSA has "direct access" to Facebook systems, but doesn't mention Facebook deletion policy. If the NSA has a copy of your data, we can't really expect Facebook to be responsible for deleting it.

By my reading of the page, deleting your Facebook account functionally deletes your account immediately (that is, you no longer appear on Facebook) and, after the two-week deactivation period, if you don't reactivate your account, your data will be deleted within 90 days.

Of course, whether any of this is true, I have no idea.


So the steps would be:

1) Obfuscate the private view by liking everything. This drowns the signal in noise and surely makes it hard(er) for "them" to characterize you. The longer you do this, the more effective the strategy will be, I would think.

2) Delete from public view.


Note how Facebook noticed what he was doing and got a PR person to talk to him. They have algorithms to detect abnormal behavior like that and flag it.

If you want to have any real effect, you'd need to provide a fake but plausible signal, not noise.


I'm sure they have a history of all your actions. They could easily revert your profile to a time before you went to town liking irrelevant stuff.


Step 0: Do not join Facebook.


> They hold on to all of your data, which the page you linked says.

It'll be interesting to see if any of the Facebook privacy lawsuits force Facebook to delete that data when they're asked to.


Hey, append-only databases are awesome, but hard to delete from!


When I deleted my account I first unfriended everyone, unliked everything I had liked, deleted all my pictures and posts and changed my name and personal details to fake information. Then I waited a couple of days and went through the official Facebook deletion process. I don't know if any of this worked but it made me feel better and like I did more than just stop logging in or rely on Facebook to stop using my account information once I "deleted" my account.


And to cap it all off:

- create a new email address with a long random password

- change your FB account to use this new email

- destroy your copy of the email password

- change your FB account to have a long random password and don't keep a copy of it


Or you could simply request a permanent deletion: http://deletefacebook.com


That's what worries me about Facebook too: being able to destroy someone else's experience in this way. Of course you can hide posts from users, but Facebook's algorithm should be able to intervene when things get "out of control".

A few months ago Facebook said it would rank news higher; maybe that's why Huffington Post items were ranked so high.

Interesting experiment. Tweaking Facebook's algorithm is an ongoing process.


Isn't it trivial to block a user's stuff from your feed, though?


Yeah that sounds like creepy facebook. Blowing content tumbleweeds through its digital ghost town. This explains recent likes I've had on stuff I posted 3-4 years ago.


of course, it wouldn't be hard for a facebook engineer to write graph queries that would ignore such obvious noise, thereby preserving the original information they had on you.

You'd have to slowly blend it into your normal activities so that the abnormal behavior would be harder to classify.


I'm guessing there are legal requirements that prevent them from truly deleting your content.

Abusive ex-husband posts pictures of himself making threatening remarks to his wife, or other potentially incriminating documents. If he were able to delete them, that evidence would be lost. My only thought on how to preserve privacy rights is to have the "deleted"-not-deleted user files sent to a legal-compliance department and out of facebook's user database (where I have no doubt they're using the "deleted"-not-deleted users' files in analysis).


In the US, at least, I believe there is no such legal requirement unless the party doing the deleting is aware of pending legal action relevant to the documents being deleted. I'm not a lawyer though and may be wrong.


Accurate. Many corporations exploit this with "document retention" policies, which are more often about document destruction.


Agreed, it's not Facebook's job to retain evidence for all their users' lawsuits.


From TFA:

>Also, as I went to bed, I remember thinking “Ah, crap. I have to like something about Gaza,” as I hit the Like button on a post with a pro-Israel message.

>By the next morning, the items in my News Feed had moved very, very far to the right.

[...]

>Rachel Maddow, Raw Story, Mother Jones, Daily Kos and all sort of other leftie stuff was interspersed with items that are so far to the right I’m nearly afraid to like them for fear of ending up on some sort of watch list.

[...]

>While I expected that what I saw might change, what I never expected was the impact my behavior would have on my friends’ feeds.

This article has so much modern anxiety in a nutshell: the pervasive surveillance society, and having our behaviour shaped by algorithms.

What this really highlights, to me, is the extent to which Facebook exerts editorial control over the news that you're subjected to. This has all sorts of other effects on how media dollars are spent and as a result the shape of discourse - I'm immediately reminded of http://www.slate.com/blogs/moneybox/2014/05/22/facebook_s_mi...

This is not to say that there haven't _always_ been pernicious incentives at work; but before, you could at least question those incentives and motivations instead of shrugging and pointing at an unexplainable, mysteriously biased support vector machine (et al.) pulling the strings.


Think about it. This is an article, with its own 'share on Facebook' icon, entirely about a single person's interaction with a mechanized society, highly intricate and well and truly operating under the constraints of its totalitarian designers. It's not like we vote for Facebook features; that privilege is reserved for those with money to spend, making money, and it is an exclusive club indeed. Particularly given that those who attend to "Facebook habits", i.e. members of that church, are not in control. At all.

So yeah, count "survive a dystopian dilemma" on the list of things that are now well and truly DONE.


Good god, this reads like a parody. There is no totalitarian conspiracy to manipulate your facebook feed. You are not in a dystopia. Having users vote for features of a popular website is unreasonable.

If you like everything a recommendation algorithm feeds you, of course you are going to get shit recommendations. Your like data is no longer correlated with your actual preferences at all. It's an interesting experiment, but people are projecting their own fears and opinions onto the result.


So if you like everything on Facebook, the result is political extremism and the most insipid pop culture trash. What frightens me is what if this has nothing to do with Facebook, but is instead about human nature? Does this also describe what happens to a person's life when they stop caring, when they are no longer motivated to put forth the effort to be discerning, when they give up trying to make anything but the easiest and most immediate choices?


"So if you like everything on Facebook, the result is political extremism and the most insipid pop culture trash."

Bear in mind that "extremism" is the expected result of any reasonable algorithm that you abuse like this. If you like X, the algorithm will focus around X. If you like everything the algorithm suggests, you'll end up liking a "bigger target", and it will start suggesting stuff yet further away from your original starting point, until eventually you've "liked" everything. This is deliberately engineering a scenario in which Facebook will feed one a set of things that all seem "extreme" on some metric or other, be it "political", "pop culture insipid", or anything else, because they are all "extremely" far away from your starting point.

You are also engineering a scenario in which you are receiving "everything", so you will end up experiencing this proportional to raw volume within the segments of the interest graph that you've transitively explored. We can't eliminate the possibility that he's missing large chunks of the graph in his transitive exploration. A massive anime fan or somebody else big into niches and staying away from politics who starts doing this may end up with a very different experience. (In fact I'd suspect that it's decent odds that why the author ended up with "extreme politics" in the first place is that he probably had a pre-existing politics seed set of likes, which then "fuzzed" into the full set of political views while not getting into niches like anime or figurine collection.)

As a story it's interesting, but I'm not sure it reveals stuff about society so much as it reveals a very fuzzy picture of the author.
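That blurring can be made concrete with a toy model (entirely made up; nothing like Facebook's real ranker): treat the profile as a weight per topic and nudge it toward whatever gets liked. Like everything indiscriminately and the profile converges to the raw content-volume mix, no matter where it started.

```javascript
// Toy model: content appears in proportion to `volume`, and liking it
// pulls the profile toward it. Liking everything converges the profile
// to the volume mix (loud topics win), not to the user's real tastes.
const volume = { politics: 0.6, celeb: 0.3, anime: 0.1 }; // assumed content mix

function likeEverything(profile, rounds, rate = 0.5) {
  const p = { ...profile };
  for (let i = 0; i < rounds; i++) {
    for (const topic of Object.keys(volume)) {
      p[topic] += rate * (volume[topic] - p[topic]); // nudge toward what's shown
    }
  }
  return p;
}
```

Start the hypothetical anime fan at `{ politics: 0.05, celeb: 0.05, anime: 0.9 }` and after 20 rounds of liking everything the profile is essentially the volume mix: politics dominates.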


What's rather surprising, though, is the speed of it. Sure, if you like everything you will eventually hit the limit. However, it happened in one day!

Facebook is supposed to be this big brother that knows everything you like, knows your whole circle of friends and acquaintances, maybe even your inner thoughts. But in reality, it will forget everything if you have one day out of the ordinary.

The monetization model that Facebook uses is quickly taking over the social aspect of the website. If you're on it for the social side, chances are Twitter or even Reddit will give you a much better experience. It's a bit like if Google decided that you like ads so much you only want to see them, and not the search results, anymore. That's certainly something that Facebook investors should look at.


It's less to do with how the underlying algorithm reflects the author's and their network's pre-existing preferences, and more to do with the entities that buy vast amounts of "promoted content" (political moderates don't buy "likes", except maybe when running for office) and with what garners clickthroughs. Surprisingly, outside HN, "Key Advances in the OpenGL Ecosystem", "A Computational Introduction to Number Theory and Algebra", or even "Conway's Proof Of The Free Will Theorem" isn't really clickbait, but "What Celebrity X Did Next Will Make You Laugh, Cry and Orgasm all at the same time" does very well indeed.

Lowbrow celeb culture trash and political extremism was selling newspapers for decades before the internet.

If there's anything interesting it's the proportion of content political extremist organizations are sharing to win "likes" from relatively sane people that actually isn't political extremism, or even vaguely related to their agenda.


Your idea that it was due to his pre-existing politics could be tested if others wanted to repeat the experiment, but the fact that he ended up with both extreme left and extreme right political news seems to be disproof enough.

The expected outcome of liking everything the algorithm suggests indiscriminately should be more moderation, not more extremism. If (as an oversimplified example) you like colors, so the algorithm asks if you like blue, green, yellow, or red and you select them all, one would not expect it to eventually lead to a community that believes green is the best color and all others should be destroyed.


"Your idea that it was due to his pre-existing politics could be tested if others wanted to repeat the experiment,"

Your phrasing suggests that you misunderstood my point, but does not prove it. My point is not that he started with extreme politics (or politics that might be viewed as extreme by most people since of course $MY_VIEWS are always perfectly sensible and obviously correct, it's everybody else who is extreme), my point is that he started with politics at all.

"The expected outcome of liking everything the algorithm suggests indiscriminately should be more moderation, not more extremism."

Uh, no, that makes no sense. You're engaging in an operation that is deliberately blurring Facebook's impression of you. Open your favorite image editor. Cover an image with pure white. Put a blob of black in the middle. Start using the "blur" tool. It doesn't shrink. And that's not perfect, either... the center of the blob will remain the darkest, but in this system, Facebook keeps presenting you with more "extreme" views, and you keep telling it you like it. Of course it pulls you away from the "center"... you keep telling it to. And this is where the raw volume point comes in... the extremes are louder, so if you blur like that you'll see a lot more of it.


Sure, if time wasn't a factor. However, if you were to like green first, the algorithm might show you this "destroy non-green" community before giving you the opportunity to like other colours.

Similarly, purely out of chance you might be given the opportunity to like the "destroy non-green" community before the "destroy non-blue" community.

Maybe what we can take from the fact that the writer ended up being shown/liking both left and right views is that the algorithm doesn't give a very strong weighting to negative correlation; i.e. even though people who like left-wing pages (presumably) don't like as many right-wing pages, facebook still gives users the opportunity to like both, and possibly won't bias toward one side of apparently conflicting interests when choosing what to present in the news feed.


I don't think it's telling us what most people would rather hear - I think it's more likely that it shows us exactly who is trying to yell loudest.


The kind of person who clicks "Like" on these stupid things is kind of like the person who buys the Enquirer magazine from a supermarket tabloid shelf. They're going to click "Like" on a bunch of other stupid things, and visit the stupid links.

Facebook has the advantage over a supermarket that they can sell you extreme political material without pissing off their other customers. What you end up with is probably what supermarket tabloid shelves would look like if people only saw what was tailored to them.


OBAMA AND BATBOY IN CAHOOTS TO SELL KIM KARDASHIAN TO ILLEGAL IMMIGRANTS!


But working backward, if we see somebody who is into political extremism, it could hint at the behavior that led them there. What if the members of radical hate groups are not aggressive idealists, but people whose spirits have been crushed, who have low self-confidence, or who are afraid to go against those that yell the loudest?

Actually, that does not sound surprising at all.


In this case, it does have to do with Facebook. To quote the article:

" By the end of day one, I noticed that on mobile, my feed was almost completely devoid of human content. I was only presented with the chance to like stories from various websites, and various other ads. Yet on the desktop—while it’s still mostly branded content—I continue to see things from my friends. On that little bitty screen, where real-estate is so valuable, Facebook’s robots decided that the way to keep my attention is by hiding the people and only showing me the stuff that other machines have pumped out. Weird. "


The comments people put on those kinds of posts scare me more than the posts themselves! I think you can tell a lot about what's wrong with our society, and the root of a lot of our problems, from internet comments.


>What frightens me is what if this has nothing to do with Facebook, but is instead about human nature?

Maybe it's more likely that it's a reflection of those who have the power and influence to game the Facebook algorithm; i.e. those who are most heavily invested in the need/right to influence social perception. The right wing is currently invested in this business with much to lose. Why is it such a surprise?


I did sort of the opposite. I would click on "don't want to see this" for any link to an external site. It took a few days, but now pretty much all I see is updates from friends.


I do the same. Changed all settings I possibly could and unfollowed (while still remaining friends) a few people, and my news feed is now basically exactly what I want to see. It still hides some news events, like when I found out that someone passed their driving exam months after everyone else did, but that's the algorithm's fault and it's a lesson that you can't expect all friends to see everything you post.

These modifications might also cause me to spend less time on my news feed because all the garbage is filtered out, so Facebook might at some point decide that it hurts them and disable these modification options, but then I'll probably just stop using Facebook's news feed altogether. I mostly use FB for groups and chats anyway. Just like with Youtube when they started showing what friends liked, I unfriended everyone and kept subscriptions.


Confirmation-bias bubbles like this kind of social media feed are atrophying the debate muscles of our society. When we don't actively participate in two-sided discussions, we lose empathy and the sense of how few things are really black and white.

I wish there were more discussion venues where your participation was judged by its value to the discussion, not by whether or not you supported what the other person said.


You only say that because you keep reading the news stories about confirmation bias.


You won't get anywhere by pinning the blame on the medium. Freedom of association allows birds of a feather to flock together and make sport of the other side.

People seek social groups that make them feel good and avoid ones that make them feel insecure. They especially avoid places where they are made to feel stupid. So they tend to prefer places where their beliefs are confirmed and maybe they can get away with calling someone else stupid.

What you are fighting is as natural as water flowing downhill: animals seeking comfort. You have to overwhelm gravity itself to turn the flow uphill.

There is no lack of good discussion venues. There are only too few people who prefer them. How can any such venue attract the masses out of their comfortable, partisan clubhouses?


The alternative to confirmation bias bubbles is, especially for social media, showing people stuff they might not like or agree with. The problem is that this isn't great for business, and it generally gets people up in arms.

A great, very recent example of this was reddit making /r/TwoXChromosomes, a subreddit dedicated to women's issues, a default sub (meaning everyone, even non-users, is automatically subscribed). The stereotypical demographic for reddit is young white males. I have no way of actually verifying these demographics--I'm not sure they do either.

Yesterday, a comic about the casual sexual harassment committed by men was posted[1], and although there was some valuable discussion, most of it was pretty awful to read (the further down you scroll, the worse it gets). Although we'd like to believe that engaging in debate / conversation / dialogue generally causes us to have empathy for and understand the other side, the opposite tends to happen, especially in large, anonymous communities. And this is in a venue where the quality of participation IS (supposedly) based on the value of the discussion.

Although I think you're right that most websites (Hacker News included) suffer from only having up and / or down votes. A more complex system might be more useful: maybe in addition to the like / dislike paradigm, it would help to add a quality / not-quality axis.

[1] http://www.reddit.com/r/TwoXChromosomes/comments/2d85wt/next...


Nonsense. If anything, the nature of our civilizations has caused the atrophying of "debate muscles", and technology has allowed us, more than at any other time in history, to actively research more viewpoints and narratives than ever before. The information/medium is there; the problem is the people.


Agreed. Anything framed as a 'debate' has only ever been two sides arguing their point whilst studiously ignoring any potential validity in the other side's opinion. Confirmation bias is an issue, but not because of social media. It's because people want validation of their existing opinions, not complex worldviews.


Haven't confirmation bias and information bubbles always been the way though?

Historically you lived in a place which - through economics and upbringing - was full of people who were a lot like you. You worked in a job where your colleagues were - through reasons of education and geography - a lot like you. Where you found people disagreeable, you'd generally avoid them if you had the chance. You chose the newspaper you liked and the news show you felt most comfortable with on TV or radio.

We've always put ourselves in information bubbles; this isn't anything to do with the internet, which is merely recreating what we already had. The only difference is that with the internet it's a larger wasted opportunity, because there is so much more information out there you might subject yourself to.


Could someone please put 'Liked' in the HN title in quotes so it makes grammatical sense?

As it stands, the title makes it sounds as if someone tried Facebook for two days and was happy with what they saw. (I know this is the original title, but that doesn't make it correct.)


Ok.


There's going to be a lot of Facebook bashing in this thread, but I'm going to posit that this response from the Facebook software is not unexpected or even wrong, because the author is purposefully playing against the assumptions built into Facebook algorithms.

Most people rarely "like" branded or advertising content on Facebook. So Facebook has tuned their algorithms to be very sensitive to the few data inputs it receives per person.

Now, here is a person flooding it with input. I am not surprised that the result is a crazy deluge of branded content. It's like taking a carefully balanced lever and pushing real hard on one end. The other end is going to move a lot.

> I kept thinking Facebook would rate-limit me, but instead it grew increasingly ravenous.

Why would they build in a rate limit? The vast, vast majority of the time, the level of input is very low. To build in a rate-limiting system would be pointless gold-plating.

The rarity of this sort of flood is obvious from the fact that a human being at Facebook noticed the level of activity and contacted the author. That is an incredibly high bar at Facebook, which prides itself on automating everything.


i have yet to meet anyone who wants the things they've 'liked' to show up in their friends feeds. it's a purely user-hostile feature that facebook insists upon because it drives their engagement numbers up.


It would be purely user-hostile if all Facebook users felt the same way your sample of users does, but that's not necessarily true.


fair point, "purely" is an exaggeration, but it is at least telling that not a single one of my friends actually wants this behaviour.


User hostile? If displaying liked content to friends increases engagement, what's wrong with that? What would a feed that wasn't optimized for engagement look like? Facebook would already be a dinosaur in that case. People go to Facebook to see the most relevant and engaging content, and what your friends like seems like a great source for it!


it's user-hostile insofar as no one wants their likes showing up in their friends' feeds. from a consumption point of view it makes little difference; you see content and interact with it. note that there is absolutely no way to opt out of this feature.


Nice. This is the post I needed to read, as I am on a one-week fast from Facebook (in fact, I am fasting from all kinds of email/group and facebook interactions). It is hard work, but much needed. I tend to believe that, as humans in this day and age, we are losing our fundamental ability to delay gratification.


Same here. I decided to remove myself from FB altogether (inspired by 99 days of freedom).

It's been 3 weeks so far[1] and it's amazing how, after the first day, Facebook started sending me two emails per day, along the lines of "you are missing important posts from your friends" and "you have 8 pending notifications". It instantly reminded me of Zynga's tactics to suck you into playing a game and paying for content.

[1] I made ONE exception: my close friend was missing for 24 hrs, having left his wallet and phone at home; his parents freaked out, so I reached out to all his friends on Facebook. This tells me I only need one feature Facebook has to offer: access to my own social network. Somehow we conditioned ourselves to only share Facebook information, but not phone numbers or emails or any other means of communication. If Facebook suddenly went away I would lose 3/4 of my contacts. That's a scary thought.


You are correct. I have been saying this for a while: Facebook has fundamentally replaced the key ingredient of email, that is, the ability of people to reach out to each other easily.

If one builds it properly, Facebook is a good replacement for email's Address Book. Beyond that, it is not much more.

I am not removing myself, nor even going to delete my Facebook account, because ultimately it all boils down to how we manage ourselves with discipline.


HN sucks more time than FB for me :/


If it makes you feel any better, the time you spend on HN is infinitely more productive. :)


0/0 is "undefined" rather than "infinity" (:


It's a far more symbiotic relationship tho...


Just a very tangential point to the post, but the author mentioned an FB employee trying to connect him with the PR department. What exactly does FB have in mind in doing that?


One of the jobs of the PR department is to brief reporters on the details of the company's activities.


I'd love to be able to hit the reset button on my FB profile, undo all the profiling they've done on me and start again. Same thing for Google. What does it look like to be a vanilla user?


Well, with respect to Google search, you can just use an incognito window.


Only up to the limits of Google being able to track you anyway. There's IP address, user agent, and any other HTTP header information your browser leaks.


A VPN + incognito should be enough. Not to truly anonymize you, but probably enough to disable the personalization layers. It doesn't make economic sense to waste engineering hours and computing time trying to show ads to the 0.0001% of paranoid users, who are likely blocking them anyway, even if the data is there.

Of course, there's no such thing as the vanilla user. You'll still be targeted on generic features (location, browser, OS, etc.). A vanilla Linux user in SF is different from a vanilla Windows XP user in London.


The EFF has a service called Panopticlick[1] that gives you a sense of how unique your browser looks to the sites you visit. My result: "Within our dataset of several million visitors, only one in 886,262 browsers have the same fingerprint as yours."

[1] https://panopticlick.eff.org


Well, if the rate holds for Google, it means there are about 1,320 users that look just like you.
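The arithmetic behind that figure can be checked: working backwards, 1,320 lookalikes at a rate of one in 886,262 implies an assumed user base of roughly 1.17 billion (the user-base figure is an inference from the comment, not stated in the thread).

```python
# Back-of-envelope check: how many users share a fingerprint that
# occurs once per 886,262 browsers, across a ~1.17 billion user base?
assumed_user_base = 1_170_000_000   # inferred assumption, not a quoted figure
fingerprint_rate = 886_262          # "one in 886,262 browsers" from Panopticlick

lookalikes = assumed_user_base // fingerprint_rate
print(lookalikes)  # 1320
```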


While it can be an issue, browser fingerprinting isn't as powerful as the EFF claims. Any change to your user agent (browser updates), plugin list, time zone, or screen size makes Panopticlick think you are a different user.
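The instability described above is easy to illustrate. Below is a toy sketch, not Panopticlick's actual algorithm (which measures per-attribute entropy rather than hashing): a fingerprint is derived from a set of browser attributes, so changing any single one, such as the browser version after an update, yields an entirely different identity.

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Toy fingerprint: hash a canonical serialization of browser attributes."""
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

base = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/31.0",
    "timezone": "UTC-7",
    "screen": "1920x1080",
    "plugins": "Flash 14.0;QuickTime",
}
before = fingerprint(base)

# A routine browser update bumps the version string in the user agent...
after = fingerprint({**base, "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/32.0"})

# ...and a naive fingerprinter now sees a "different user".
print(before != after)  # True
```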


Correlated with a second or third tracking method, such as cookies sent to analytics, ad, and FB sites, wouldn't it be super sticky?


I don't get why people recommend VPNs so much. I never bothered to look for a free one because I didn't think it'd have decent speed and all ports open, but doesn't Tor basically do the job?


Free ones probably suck, but even a very cheap (<$50/year) VPN will be much faster than Tor.


It still feels like paying twice, once for an internet connection and again for a VPN, but yeah, it might come in handy. I still think I'd go for a VPS instead and, if I needed a VPN, just tunnel it through, but that's just me.


Well, I used "VPN" as a catch-all for using a remote gateway. I actually use a VPS as well (with sshuttle).


You can turn off "private results" in your google search preferences. https://www.google.com/preferences


I turned off "private results" a long time ago and my results are still customized...


Now that you mention it, it's obvious that mine are as well. I guess I shouldn't be surprised.


Could there be a browser plugin to like and share everything as you wander through the web? That would be an interesting way to demonstrate tracking technologies.




This made me think about creating a second profile on Facebook, adding myself as a friend, and liking whatever I post on my original, just to find out what "advertising value and quality" I create. Anyone got any ideas on how to automate this?


I tried this (not automated). It was nice to have someone actually read and like my posts. However, I quickly found I was pretty boring, got tired of myself, and stopped doing it after a while.

Sometimes it was fun to set up a joke in a back and forth comment thread with myself though.

I still log in as my alter ego every now and then to see what my profile looks like from someone else's perspective.


I'd be surprised if they didn't flag your second account as spam and ignore it completely.


I don't care about that; it'd be interesting to find out purely out of personal curiosity.


The only winning move is not to play.

Or in this case like/post.


I never like anything except my friends' stuff, anywhere. Of course Facebook gleans things from that as well. I would pay for a site where all you could do is interact with friends' posts/photos/movies, with zero commercial features, but we all know FB would kill such a thing in its infancy.


That's been tried before. App.net was that way. The downside is it would have to be a paid service, because how else would you fund something like that?


It's possible to unlike everything you've ever liked, isn't it? I'll try that one day.


I'm sure their database doesn't just store a series of checkmarks, but rather a series of events: "User liked", "User unliked", etc.
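A minimal sketch of that idea (an assumed event-log design, not Facebook's actual schema): the visible state is derived by replaying like/unlike events, so "unliking everything" empties the derived state while the full history stays in the log.

```python
from collections import defaultdict

# Hypothetical append-only event log of like/unlike actions.
events = [
    ("alice", "like",   "page:wired"),
    ("alice", "like",   "page:hn"),
    ("alice", "unlike", "page:wired"),
]

def current_likes(log):
    """Derive each user's current likes by replaying the event log."""
    liked = defaultdict(set)
    for user, action, target in log:
        if action == "like":
            liked[user].add(target)
        elif action == "unlike":
            liked[user].discard(target)
    return liked

print(current_likes(events)["alice"])  # {'page:hn'} -- yet all 3 events remain stored
```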


Could this be some kind of metaphor? =D It starts as a simple but very humanist piece, about how everybody should like everybody, but it ends with the realization that after he had liked everything, he was left with nothing he really liked. The same could be true of our relationships with people: if you don't stand by your principles, you could end up surrounded by people you don't appreciate.


So, that's what the bots see in Facebook. Doesn't that mean that bots are consuming most of the ads?


This article has reminded me of how astoundingly ahead of his time Andy Warhol was.


Ever since I read The Circle, stuff like this has seemed to ring true.


Interesting read. I wonder what would happen if they added a dislike button.


I think it would not be used for much other than disliking a brand's status update. It could create bad feelings in your Facebook community if you disliked someone's status update. That's probably why they've never added a dislike button; it could sow flame wars ("Hey, why did you dislike my update?"), kind of like on HN when someone gets a bunch of downvotes.


Somebody still uses Facebook, film at 11.


I'm so put off by Facebook that I'm even put off by articles describing how one is put off by Facebook.

Full disclosure: I closed my account two years ago. I haven't missed it even for 10 seconds.


How do you know if someone deleted their Facebook account?

Don't worry, they'll tell you.



