Companies such as Facebook undoubtedly draw great power from their "social graph"; however, human interactions are subtle, complex, multifaceted, and often contradictory.
Nothing good can come of applying dumb algorithms (that is, any algorithm) to a sufficiently rich "social graph"; it will always lead to situations like these, which are, at best, embarrassing and, at worst, dangerous for whoever is in the system.
I don't think it's a matter of doing things the right way or fixing them: trying to put human relationships in a computer system will never reproduce the human experience (though you may make marketers very happy). The best you can do is capture certain very focused subsets of it in an interesting way.
Zuckerberg's views on privacy and openness are laughable, and to be completely expected from someone of his background and world experience. If you think about who the average Facebook employee is, you'd realize that you probably wouldn't want them to be in charge of designing a system meant to model the intricacies of human interaction.
> Nothing good can come out of applying dumb algorithms (that is, any algorithm) to a sufficiently rich "social graph"; it will always lead to situations like that which are, at best, embarrassing, and at worst, dangerous, for whoever is in the system.
What does this even mean? The "dumb algorithm" here will (a) take a user's query and (b) search that user's friends and public profiles for hits on that query. What's next: shouting about the robots taking over? Lamenting how we're all losing our humanity and how things were better back in the old days?
I'm really disappointed this is the top comment on HN's top post. Users identified by Graph Search have public accounts and have willingly entered personal information into Facebook. We go through this song-and-dance every year: FB updates security policies, everyone is up in arms, then -- gasp! -- everyone updates their security settings. Facebook has a clear transaction with users: build a profile, get sold as eyeballs to advertisers. If you don't like it, then quit.
"With great power comes great responsibility," indeed. People have received numerous warnings about Facebook and privacy, and yet they've chosen to share personal information with everyone they know. At this point the user is responsible for choosing to participate or drop out. Meanwhile, every third HN post includes a commenter, wagging their finger, reminding us in a nasally, know-it-all voice that we're not the customer if we're not paying.
Is Graph Search really that shocking to you--to anyone on HN? Why the hell is anyone up in arms about this in 2013, given what we know about social networks?
If you fail to see the danger of a tool that can search a network for such things as "Islamic men interested in other men, living in Tehran", and then display their place of work or other contact details (which they may have willingly, yet unwittingly, entered), in the hands of bullies, bigots, or oppressive regimes, then I feel you are as limited in your world experience as the grandparent poster accuses Zuckerberg of being.
Basically, it's one of two things:
1.) such incredible naiveté that, frankly, you require supervision
2.) such hideous cynicism, so deeply felt, that it represents, to my sensibilities, a form of evil
This thing is dangerous. The people to whom it poses danger are the ones least likely even to comprehend the danger posed to them.
I cannot see how you could defend such a thing.
BTW: decrying the state of HN when you feel personally ill at ease with the general feeling of the community is poor form and a very strong indicator of butthurt.
Here's a thought: If you live in a country where people are summarily executed for being gay, don't put "I'm gay" on a public website with your name on it.
YES people should be allowed to be gay in Tehran, and YES Facebook should help them with that - and they do: By not requiring you to enter your sexuality.
I don't live in Tehran, so I am privileged here - but if I put "I did tax fraud, I win!" on my public Facebook profile, and the tax authorities decided to investigate me, anyone suggesting that I didn't bring that upon myself frankly requires supervision. Even if I did it on my closed Facebook profile and a "friend" decided to report me, it's still not Facebook's fault.
> It strikes me that you are of the belief that cyberspace ought to mimic life in meatspace directly.
If it's a problem that Facebook uncovers someone who's gay in Tehran, then that is only because that person's cyberspace identity mimics his meatspace one. If the "gay person in Tehran" profile doesn't actually link to a physical person in Tehran, then there's no added danger to anyone.
An "Islamic gay man living in Tehran" in 1997 creates a web page and populates it with exactly the same personal information you are describing. Sometime later Google comes of age and indexes his site for the world to easily discover.
Do you feel that Google and Facebook are acting differently in this scenario, and if so, why?
I do. Primarily because Facebook caters to users with low technical ability and includes baked-in privacy settings that are notoriously ambiguous and difficult to understand.
Facebook actively promotes an anti-privacy styled online presence.
Those with the skills to build a website in '97, along with the requisite metatags to allow for the type of indexing you describe, can reasonably be expected to understand what they are doing.
Not so with Facebook.
Adding content to a webpage is a pro-active move. Vanilla Facebook with no security tweaks or other expert knowledge (by my understanding) leaves users wide open through no action on the part of that user.
Did anyone opt-in to being included in Graph Search results? If they knew what it was, would anyone do this?
It's an interesting question that you pose, and I think there is definitely more to it, but the answer above is off the top of my head. I will mull on it some more and add to it should I come up with anything further.
I built a website in or around '97 on GeoCities. You talk about metatags being required for indexing, but I think you forget that the web was young once. Google had a voracious appetite to index, and they still do.
The barrier to entry for someone to build a website in '97 wasn't much higher than it is now. Things just look prettier these days.
Sorry, I should have added a final point that I made in another post: "Interested in" is not interpreted by everybody as "sexually attracted to". I'd be willing to bet that in Iranian culture it's quite common to say you are interested in being friends with men, and it is not viewed as homosexual.
Bullies, zealots, and police are prone to misinterpreting words in the way which is most beneficial to THEIR ends, not yours. The kids denied entry into the US because they said they were going to "destroy" $CITY were a prime example of this, where "destroy" was local slang for "get drunk and party in $CITY".
Similarly, anyone looking to oppress gays will say, "He likes men. Why didn't he say he likes women? He must be gay." You can't argue what you really meant in a kangaroo court, or when a mob of angry villagers are throwing stones: it's too late by then.
"Is Graph Search really that shocking to you--to anyone on HN?"
Are serial killers who actually kill people really shocking to people? Well, the answer to that question depends a lot on how much you let its framing determine your thinking, right?
Sure, I too, personally, don't share anything personally identifying on FB (in violation of FB's and G+'s TOS, BTW). But all those naive people: they walked through the door, they didn't see the samurai behind the door, and pow. What do they expect? They won't get to play in Seven Samurai now. Why should we samurai give a heck about those morons? Well, there are some reasons, starting with the fact that we have personal relations with non-samurai.
> Users identified by Graph Search have public accounts and have willingly entered personal information into Facebook.
This is quite misleading. Before Graph Search, much of that capability was limited to Facebook Inc. and affiliated governments. This really opens the floodgates.
Were they warned? If you follow Wired and Hacker News, yes. The general population, though, was not warned nearly enough. They teach you how to cross a road in elementary school; nobody explains to you how to use privacy controls.
The whole liking-and-sharing ecosystem is promoted by social media itself. While you have the skills to be critical of privacy implications, many more people have only a vague idea, or none whatsoever. Facebook is de facto seen as a way of communicating with your social circle; that's why normal people join it. There is a (misguided) expectation of the same kind of privacy that you have in a clique of friends.
It is obvious that you either didn't read the search queries on the linked page, or you don't understand the impact that some of these queries can have on peoples' lives.
People are stupid, so even when warned, they will still act stupid. Despite this, it is still the responsibility of whoever is able to protect them, even against themselves.
I also agree with the argument others here have made that Facebook, to some, seems like a social requirement. It is like having a phone: you don't NEED one, like we need air, but it makes social life harder not to have one. That of course doesn't mean that you need to put everything online, but that is another matter. What is happening with Facebook again and again, though, is that they change the rules. 'Liking' used to be quite harmless; now, suddenly, my old 'harmless' likes can get someone killed or arrested, as is demonstrated in the demo searches. Not good...
We keep smoking when we know it is unhealthy, we keep drinking and driving, we vote for politicians based on one-liners and the amount of media attention they get, we believe that evolution is 'just a theory', we get into huge debt because of loans we cannot pay, and we keep posting things on Facebook despite knowing better.
We are stupid in the sense that we do very stupid things even though in many cases we know, or should know, better. It is the task of 'us' (people who know about internet privacy) to protect the people around us from being stupid with Facebook, just as I would like to be protected by others in the things I am stupid at.
"Assuming you're socially allowed to quit Facebook."
I don't even know what that means. How is it that all the rest of us managed to quit Facebook and continue to have relatively vibrant social lives? Is it the case that WhatsApp/Viber/SMS/email are no longer considered effective means of communication for getting together?
You might well lose touch with a geographically diverse crowd of people. FB and the like keep people involved in the minutiae of each other's lives and very much help keep real friendships alive.
Of course FB is trying to destroy this as much as possible by controlling what you see of what people post. Trying to decide what's important to me is another facet of the 'dumb' algorithms the OP complains about.
I know there have been a lot of issues about information leaking out because of various security and privacy defaults being bad, but that's not what I meant.
There seems to be some algorithm deciding what gets put into your stream by weighting posts according to whom you most frequently interact with and how "important" the news is, meaning that your FB experience turns more and more into an echo chamber, and posts by people who don't post often seem to get lost.
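The effect described above takes very little machinery to produce. A minimal sketch of such a weighting, where all field names, weights, and data are my own invention rather than anything Facebook has disclosed:

```python
# Hypothetical feed ranking: score each post by how often the viewer
# interacts with its author, plus a crude popularity signal.
# The weights and fields are invented for illustration only.

def rank_feed(posts, interaction_counts, top_n=10):
    def score(post):
        # Posts from frequently-contacted authors dominate the feed.
        affinity = interaction_counts.get(post["author"], 0)
        return affinity * 2.0 + post["likes"] * 0.1
    return sorted(posts, key=score, reverse=True)[:top_n]

posts = [
    {"author": "close_friend", "likes": 3, "text": "lunch pics"},
    {"author": "rare_poster", "likes": 1, "text": "big life news"},
]
interactions = {"close_friend": 50, "rare_poster": 1}

feed = rank_feed(posts, interactions)
# The rare poster's big news sinks below the close friend's lunch
# pics: the lost-post / echo-chamber effect described above.
```

Any ranking that leans this heavily on past interaction will, by construction, keep showing you more of whoever you already engage with.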
I have never used Facebook, but I do feel 'out of the loop', especially since the baby arrived: I don't hear about things 'through the grapevine' and subsequently seldom find myself with many social options on the days I have free.
This is good for my startup but bad for my self esteem.
Hey there, 34, married, 3 kids here. You have two good options in this situation. One is to get your wife on FB so she can keep you in the loop. The other is to have one more kid and then you won't have to worry about what to do with your free time because you won't have any ever again. If you're starting up, I'd recommend option A.
There are, of course, people whose circumstances mean that Facebook is a (or the) major source of human interaction in their lives. I really wish those people, and the rest of us, could move to something less ethically troubling.
> "Nothing good can come out of applying dumb algorithms (that is, any algorithm) to a sufficiently rich "social graph""
These aren't algorithms. This is a search engine - what you put into it is up to you. It can be anywhere from the silliness seen in the link, to "friends who live in London" before you take a trip.
This is just handing people another tool - the uses, and misuses, of said tool are entirely on the users.
> " If you think about who the average Facebook employee is, you'd realize that you probably wouldn't want them to be in charge of designing a system meant to model the intricacies of human interaction."
A snarky and mean-spirited stereotyping of all nerds as socially inept! How clever.
>A snarky and mean-spirited stereotyping of all nerds as socially inept! How clever.
Not at all. I'm just willing to bet that the average Facebook employee has never had to hide his sexuality/political/religious beliefs for fear of execution/incarceration/abuse to his family/etc. Yet a non-negligible part of Facebook's user base are people precisely in this situation.
That being said, the very fact that you call what the link demonstrates "silliness" speaks for itself.
> I'm just willing to bet that the average Facebook employee has never had to hide his sexuality/political/religious beliefs for fear of execution/incarceration/abuse to his family/etc. Yet a non-negligible part of Facebook's user base are people precisely in this situation.
I think by 'silliness' he's referring to "Girls who live nearby who are single and like Getting drunk!" Fair call in my opinion.
I made this point in another post: the take-away from this link shouldn't be how bad it was of Facebook to not realise that making available powerful search tools for already-public data might put some people in an unfortunate spotlight. The real take-away is that we shouldn't live in a world where people should be scared to be openly gay in Iran, or openly a Falun Gong member in China. Those are the things we should focus on and try and change. And we should appreciate, rather than chastise, the tools which make us realise that there are things we need to improve in the world.
All the stuff he's gone through—death threats, imprisonment, torture—has been for the sole "crime" of apostasy.
This graph search is going to enable sadists to lower their bar for finding victims from "people who openly and loudly proclaim their beliefs" to "people who accidentally clicked 'like' or forgot to fix their privacy settings." I have some difficulty deflecting the blame for Walid's treatment to simply being born on the wrong side of a fence. He may have known better, or perhaps he is on a crusade for religious freedom, but it's going to get a lot harder when we start talking about people with no interest in being martyrs being tortured and killed for this great cause of openness and improving the world.
I'm not sure what you mean by "one-way process." I think there are certainly situations where bad things happen yet nobody is responsible and nothing needs to be done differently. I do think randomness plays a big role in my arriving at that conclusion, and I don't see a very big role for randomness in the potential abuses of graph search.
Often the best first step to stop something evil is to remove the tools that make evil's job easier. In this case Facebook is facilitating persecution by making better and better tools to filter data on a large number of people simultaneously. If you want persecution to stop you can't just automatically absolve Facebook of responsibility for the consequences of the design choices they make. Sure people should be careful about what they post online but Facebook and the companies advertising on it offer incentives to get people to do it. Past 'likes' will now be easily accessible forever. It isn't only what you've put on your profile, inferences can be drawn based on the information about your friends. If all your friends are openly gay but you don't want others to know you're gay it can now quickly be deduced with a search.
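The friend-based deduction described above needs nothing sophisticated. A naive sketch, where the attribute names, data, and threshold are all invented for illustration, is enough to show the mechanism:

```python
# Naive attribute inference from a user's friends: if a majority of
# friends publicly declare an attribute, flag the user as likely
# sharing it. All names, data, and the threshold are invented.

def infer_attribute(user, friends_of, public_attr, threshold=0.5):
    friends = friends_of.get(user, [])
    if not friends:
        return False
    declared = sum(1 for f in friends if public_attr.get(f, False))
    return declared / len(friends) > threshold

friends_of = {"alice": ["bob", "carol", "dave"]}
openly_x = {"bob": True, "carol": True, "dave": False}

# Alice never stated the attribute herself, yet two of her three
# friends did, so the naive inference flags her anyway.
flagged = infer_attribute("alice", friends_of, openly_x)  # True
```

The point is that the inference runs entirely on other people's disclosures, so guarding your own profile is not enough to escape it.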
>And we should appreciate, rather than chastise, the tools which make us realise that there are things we need to improve in the world.
What? Of course we all want the world to be a better place, but Facebook has no right to put people's lives in danger to try and force the issue. If they're so worried about making the world a better place they should do it themselves, not just open the flood gates on their users.
Bought this series on DVD recently, only watched a few episodes but it really is great. His presenting style leaves a little to be desired but this is a man of extraordinary intelligence and breadth of knowledge.
If anyone is wondering about the subject matter, it charts the ascent of man not in terms of biological evolution but cultural evolution, which seems to me an oft-overlooked facet of how we came to be.
Well, it's fantastic, but what makes it germane here is that the author/narrator switched from being a mathematician/physicist to a biologist mid-career because he saw no other way than to stop working on nuclear physics without causing more harm. Several other physicists of that era did the same.
We are to some extent liable for the ill perpetrated by other people using our technology. The extent is debatable, but it must be greater than "not at all."
| This is just handing people another tool - the uses,
| and misuses, of said tool are entirely on the users.
The same applies to building hand-held nuclear weapons and handing them out on a street corner.
The same could be said of convincing people to put CCTVs in their houses and then hooking them all up Chatroulette-style.
That doesn't mean that it's a good idea.
| A snarky and mean-spirited stereotyping of all
| nerds as socially inept! How clever.
I see it more as Mark Zuckerberg having lived a rather sheltered life. E.g., his view that people showing different 'faces' to different people are being disingenuous is laughable. Many people show selected parts of themselves to one peer group while showing other parts to different peer groups.
That, and I assumed the 'average Facebook employee' part was assuming that they were all 20-somethings from (on average) middle-class or above backgrounds (i.e. possibly sheltered and lacking in life experience).
Maybe it's because I am also a "20-something", but I do not see how Zuckerberg/Facebook employees being "sheltered" (I don't know why you asserted that, unless you went to high school with him) has anything to do with the social graph.
1.) The fact that they had this information is not surprising at all. Again, maybe it's because I am a 20-something, but I think it's pretty obvious that if you had terabytes of data you would want to search it.
2.) While many people do show different parts of themselves to certain peer groups, the data Facebook has was posted to Facebook. Facebook did not install a CCTV in anyone's home and log their guilty pleasures. This gets parroted a lot, but if you don't want someone to find out you love Lifetime originals, don't post it on Facebook. It won't end up in the graph, and you can continue playing your identity game.
| Facebook did not install a CCTV in anyones home and log
| their guilty pleasures. This gets parroted a lot, but if
| you don't want someone to find out you love Lifetime
| originals, don't post it on facebook.
You're either misreading my post or being disingenuous. I said:
| The same could be said of convincing people to put CCTVs
| in their houses
I could easily just say, "If you don't want people to see what you do, then don't allow a CCTV into your house." You're acting like CCTVs are by definition involuntary.
People put most things into Facebook because they don't understand the real implications of it. They say, "I like Lifetime originals," because they want their friends to know that, or because they view Facebook's profile questions like a survey. Most of these people are techno-illiterate (including the newer generations which are just more adept at using/consuming tech than their parents).
This is a tool that lets you search data, some of which was collected under a false (or dishonest) expectation, set by Facebook, that the data would not be searchable in this manner. That is a violation of trust. I say this as a huge fan and user of Facebook's privacy features.
In most cases, I am arguing on the other side, telling people how misinformed they are when they complain about a lack of privacy features. But this feature even has me blindsided, and I am someone who uses almost all of Facebook's privacy features very carefully (I have everything from posts visible to the public to posts visible to only selected people, for example).
I fully expect a new set of privacy controls from facebook specifically around this feature.
I think socialization under the umbrella of a facebook type environment will eventually create predictable and malleable identities which are inherently the consequence of a mix of various societal pressures which are in turn constricted through the lens of the facebook environment.
Combine that with facebook's transparent use of corporate cohabitation with people's relationships, and I think you quickly have a society which is much more determined and artificial.
I think "socialisation under the umbrella of a facebook type environment" was inevitable one way or another. The advantages of an efficient system for socialising with a centralised online persona system are just too huge to forgo. I'd be interested to hear what you consider might have been alternatives, but personally, I think the economics lead inexorably to a free-to-play ad-supported system such as Facebook. Show me the day when you can convince a couple hundred million people to sign up for a paid social network product and I'll eat my hat. Show me the day that an open-source community can compete with a for-profit business in terms of attracting the best talent willing and able to slave away for hours to build great products, and I'll eat my hat.
I don't think it means the end of individuality as we know it. We've had ad-supported newspapers for hundreds of years now and people seemed to manage okay.
I think you can create services which achieve the same function as facebook while making a dispersion of self and novelty apparent to its users. I'm not arguing monetization, but the means and consequences of exploiting users responsibly.
Yet again, I struggle to make sense of what exactly you're trying to say. You're using a lot of big words, but not necessarily making your point clear. What does "making a dispersion of self and novelty apparent" mean?
Wow. When Graph Searches were announced, I thought "meh, I'm tired of FB...". I never thought to explore what Graph Searches actually meant. And I would never have done so or will do so, but I'm very glad that someone has investigated the system.
>it will always lead to situations like that which are [bad]
I wonder if there isn't some definition analogous to "Turing complete" for social graphs? i.e. with a sufficiently powerful API, any question can be answered. Just as Turing-completeness leads to viruses, worms, etc., might "XYZ-complete" social networks lead directly and predictably towards A, B, and C bad outcomes?
>Zuckerberg's views on privacy and openness are laughable
Perhaps. I'm not versed in them well enough to know. But I wonder if he isn't a bit Bill-Gates-ian in his perspective of pushing the border between X and Y quite hard to see what bends and what breaks. For Microsoft, it was Code-Data: while there were certainly people saying "don't mix Code and Data!", people happily used their Excel spreadsheets, and we suffered mightily from the resulting viruses. Just as "viruses" were the unintended consequence of mixing Code and Data, are "outings" the unintended consequence of mixing Social and Search?
Zuckerberg thinks people complain about privacy because they're old-fashioned and scared of the future. He thinks in the future we're all going to be extremely open, share shit all the time, aaaand basically the whole world is going to be one big chat room where people don't really hide things any more.
Personally I think he's probably right. It has shades of that old totalitarian argument, 'why should you be worried about privacy if you don't have anything to hide?' but hopefully it'll be a little more... forgiving than that.
It's worth noting that it's in Zuckerberg's financial interest should such a No Secrets reality come to pass.
Which should give one pause about whether he's giving an honest, objective assessment of the available information. One would expect quite a bit of confirmation bias when billions are on the line.
As to whether there will be a time without secrets: I don't see how that's even possible. Humans are social and tribal but they're also political. Politics is all about the tactful (mis)application of lies to swing tribal balance.
There will always be something people want to keep 'private'. Always something that the tribes disagree on and judge one another over. Regardless of whether those feelings and beliefs are substantiated in objective reality.
>He thinks in the future we're all going to be extremely open, share shit all the time, aaaand basically the whole world is going to be one big chat room where people don't really hide things any more.
He claimed to believe that after he had made everyone's private account information public again for about the fifth time. I'm not sure if he really believes something so ridiculous and nonsensical, or if he simply decided to claim a motto that would explain why he kept revealing everyone's secrets, other than the obvious one: it makes him more advertising money to do so.
> I wonder if there isn't some definition analogous to "Turing complete" for social graphs? i.e. with a sufficiently powerful API, any question can be answered. Just as Turing complete-ness leads to viruses, worms, etc, might "XYZ complete" social networks lead directly and predictably towards A, B and C bad outcomes?
This graph search is just a fancy database interface; it's basically "SELECT name, age, etc FROM users WHERE location = 'Tehran' AND looksfor = 'men'" (massively over-simplified).
So no, it cannot answer _any_ question, because it's limited by _what_ and _how_ you ask, _and_ by the data. You can't ask it something it doesn't already know.
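The point above can be made concrete with a toy model: Graph Search behaves like a filter over records the users themselves supplied. The field names and data here are invented for illustration:

```python
# Toy model of a graph-search query as a filter over user-supplied
# records. All field names and data are invented.

users = [
    {"name": "A", "location": "Tehran", "interested_in": "men"},
    {"name": "B", "location": "London", "interested_in": "women"},
]

def graph_search(records, **criteria):
    # Return only the records matching every given field exactly.
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]

hits = graph_search(users, location="Tehran", interested_in="men")
# Only user "A" matches. The search can't surface anything a user
# never entered, but it makes whatever WAS entered trivially findable.
```

Which is both sides of the argument in this thread: nothing new is revealed, yet the cost of finding it drops to a single query.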
I hate to flip the script on social networking, but I think google is doing a pretty good job with focusing on exploration and compartmentalization of friends and social groups in +. Maybe something like graph search is their eventual intention, but I really hope their vision is less...disastrous.
I don't think he's saying that at all. The insanity is trying to construct a platform that manifests a person's entire social life. I think there's a harsh combination of conflating business interests and disinterested, contrived people sorting that is potentially very, very bad.
I'm not sure I follow you. Right now, Facebook knows roughly this much about me:
1) All of the people I have legitimated as Friends.
2) Screeds and screeds of data about the quality of all my connections with those friends (have I hidden them from my news feed? how much do I interact with their posts? how much do I interact with them? talk to them? share with them? stalk them? how much time do I spend in interactions with them? how do the patterns of my interactions change with them (mouse-click patterns, scroll patterns, typing patterns)? do I use different words (different emotions, more formal/informal language, different function-word patterns)?)
3) People I've stalked but haven't friended
4) Brands, companies, causes, ideas, things from history, all sorts of random shit I've liked (along with similar data to 2))
5) What sorts of things I click on, how I leave the site and where I go.
And certainly more.
Etc., etc., etc. Facebook didn't manifest my entire social life. They just built a platform which allowed me to socialise orders of magnitude more efficiently than before, a platform of great benefit to me, and one into which I've poured vast quantities of data (and the amount, quality, and variety of that data is only set to increase as we open up more input modalities, e.g. eye-tracking, gesture-tracking, and biometrics). The data is all there. All Facebook is doing is cobbling together some of the most basic tools one might use to make sense of a subset of that data.
My little brother and sister had it from 11 and 12 respectively. Yes, it's quite interesting to think about.
Anyway, what's your point? By the way, you're discussing this with someone who believes that in the future we will record and utilise orders of magnitude more data about our lives and our planet than we do now, enough to make your eyes water; furthermore, that we are inexorably fated to do so; and furthermore, that it's a really, really good trend, one which is going to play out in highly unpredictable and surprising ways.
So you're a fan of data, that's fair enough. Who isn't a fan of data? The issue is, who owns and controls that data? You, the true owner and source of the data, or some big advertising-driven tech giant with its own crappy Hollywood movie?
Not only that, but in order to make money from advertising, they link all this wonderful data (your data) with advertiser interests. Much of this linking is done under the radar, via rules you are not allowed to see.
It's very important to distinguish the benefits of shared data from ownership and control of the data, and from the rules by which it is mined and accessed.
BTW, a lot of Facebook data is "self-expression", and if given the choice in real time, many people would elect NOT to have their contribution go into the tech giant's mainstream database. Going into "settings" and messing around with broad privacy options for content types and particular people is an absolute joke in terms of UX and human experience when it comes to expressing personal views or communicating with friends.
"Privacy settings" do not come naturally in communication. Facebook knows this, and knows people will not bother, or will become lazy, with privacy.
Bottom line is, Facebook is an inappropriate platform for the collected data that is your life.
A platform which is transparent, open, and shared by its nature. A platform for which the first and foremost priority is to be trustworthy, with no conflict of interest between being all of the above and being economically feasible. This would probably require a government-funded international push rather than a single corporation with commercial interests.
The big problem in the future will be that, as the line between "real" and "online" life diminishes, the party which holds the data will become an authority akin to the governments we have today. When or why should we trust an authority? How can we be sure that the authority is trustworthy? Politics is hard enough, and we all know we can't trust politicians; how on earth are we going to trust a party with commercial interests to manage our social lives?
You seem very optimistic about things like Facebook collecting data and interlinking people's social lives via their platform using the collected data. I for one find this very scary, to such an extent that I'd rather cripple my social life and not use Facebook than trust a commercial authority with data collected about me that I have no control over.
I provide the data. I see the data. I own the data. I control the data. What is so hard about it? Honestly.
What zxcdw said. :-) Also, my previous post has a typo; it should read "IMHO FB is not currently an appropriate platform"... damn, what a bad typo.
The thing is, Facebook could be better by allowing members to invite outside data to flow into their feeds from chosen sources, and providing more freedom with data-exchange in general. Enforced segregation is costly.
As suggested by another commenter, that's a false dichotomy. I'd argue there's lots of room for innovation and refinement here.
Manual or algorithmic identification of family members is one option, but I think there's something more to be gained here by understanding the behavior just a bit more abstractly... Shopping moods? Modes? Targets? Tasks? Something, possibly – but I'm only guessing here. The real power could come from actually doing user research and prototyping.
I'd love the chance to work on such a project, actually – I had the chance to work with data from a study on the online shopping habits of mothers a few years ago, and there are lots of interesting angles to explore, IMHO.
Honestly, I have to be nervous logging into my personal account with other people standing around. The "Recently Viewed" and "Recommended for You" sections are full of things I'd really rather my family members and most of my friends not see.