Russian photographer matches random people with social network photos (rbth.com)
259 points by davesailer on April 13, 2016 | 129 comments



The first thing that struck me (after I got over the whole creepy/horror/revulsion thing) was how much more fun people seemed in their social media lives.

Rather than seeing this as an exposition of "people on social media are so much more bland than they appear", I found myself feeling "people on the subway are so much more beautiful and interesting in their 'real life'".

I feel like that might be backwards from how it gets presented elsewhere.

But if you look, for example, at the blonde woman snarling a piece of cake https://birdinflight.com/ru/vdohnovenie/fotoproect/06042016-... (second last), I thought that was charming and funny... But you'd never know that under the grey fluorescent lights of the subway car.

It's really interesting to be confronted with the realization that every single person filling up space on the subway around you has a rich and lustrous story that brought them to that place and time.

Still, super creepy.


I noticed that many of these people photographed were staring at their own phones!


> It's really interesting to be confronted with the realization that every single person filling up space on the subway around you has a rich and lustrous story that brought them to that place and time.

The Dictionary of Obscure Sorrows[1] names this feeling "sonder": the realization that each random passerby is living a life as vivid and complex as your own—populated with their own ambitions, friends, routines, worries and inherited craziness.

1. it's a made-up word, not a formal English one. http://www.dictionaryofobscuresorrows.com/post/23536922667/s...


Lovely! But quite the opposite of a sorrow, I felt it was a somewhat beautiful realization.

I'm sure if I was having a low day I could take it as a sort of "we're insignificant specks on a floating rock" but I was having a great day so tra-la-la :)


> it's a made-up word

most are


In a word: empathy.

Nothing beats the visceral explanation of a demo, though.


You are right, I saw it that way as well. Someone who looks bland may actually have a very fun and colorful life. That's an important lesson here.


Don't ignore the advantage of having been taken by a professional photographer, who maybe even specializes in portraits of people, with a high-quality lens attached to a high-quality body.


I don't see why this can't be done in real time. Point your phone towards a person in a cafe and immediately figure out her name, age, maybe relationship status, social class, augmented reality style.

5 minutes later, accidentally bump into her... and start a conversation, armed with lots of private info about the person.

Or, even more basic, thieves picking their targets: is this woman's purse likely to contain a lot of money and jewelry?

I guess you could even write a tool which picks all the faces from a photo, and detects who's probably richest, most popular, most followed, etc.
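
For the face-picking step, a rough sketch only takes a few lines with off-the-shelf OpenCV; the social-network lookup and the "who's richest/most followed" ranking are the hard (and creepy) part and aren't shown here. The file name and detector parameters are purely illustrative:

    import cv2

    # Detect and crop every face in a photo; matching against profiles would happen elsewhere.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    img = cv2.imread("cafe.jpg")                      # hypothetical input photo
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for i, (x, y, w, h) in enumerate(faces):
        cv2.imwrite("face_%d.png" % i, img[y:y+h, x:x+w])   # one crop per detected face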

But this is unavoidable, right? We just have to accept it as a fact of life and adapt to it...


How long before anti-facial recognition makeup, hair, and accessories becomes fashionable? https://cvdazzle.com


Holy moly, that's a very curious side effect of technology on culture. Crazy cyberpunk look necessary to avoid computerized detection. I can see this being the norm or strictly banned in the future.


My bet is on such looks being banned.

What do you have to hide? /s


Banks in the US already have signs on the door prohibiting hats and sunglasses.


Or how long before everybody starts wearing burqas? Islamic women are ahead of their time :)


Or before burqas become illegal, like in some European countries, where men were wearing burqas to sneak into women-only areas.


Source?



I'm reminded of the comic book/graphic novel The Private Eye[1].

[1] http://panelsyndicate.com/comics/tpeye


That doesn't seem like it would work at all. The makeup and hair styles suggested would make you stand out even more! Sure, 1990s-style facial recognition won't work as reliably, but surely today's systems would flag you right away!


Note this is designed to fool facial detection, not recognition. Detection generally uses much simpler algorithms because in most cases it needs to be orders of magnitude faster.

I was working on a project with OpenCV a few years ago. After a few embarrassing incidents I had to add a kludge whereby if it detected a small face directly above a big face, it was to discard the big one.

It turns out that the torsos of portly women with large breasts make ideal face candidates.
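
If anyone's curious, the kludge was roughly this (reconstructed from memory; the thresholds are illustrative, not the actual values I used):

    def drop_torso_faces(faces):
        # faces: list of (x, y, w, h) boxes from cv2.detectMultiScale.
        # If a clearly smaller face sits directly above a bigger "face",
        # the big one is probably a torso, so throw it away.
        kept = []
        for (x, y, w, h) in faces:
            has_small_face_above = any(
                h2 < 0.6 * h                                   # noticeably smaller
                and y2 + h2 <= y                               # entirely above this box
                and abs((x2 + w2 / 2) - (x + w / 2)) < w / 2   # roughly centred over it
                for (x2, y2, w2, h2) in faces)
            if not has_small_face_above:
                kept.append((x, y, w, h))
        return kept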


"Flag you" as what? It doesn't need to completely "fool" the system & make it believe there is no face. It simply has to get the system to believe that "makeup style A" and "makeup style B" applied to the same face are, in fact, two different faces. This is a pretty low bar, and one which I suspect will be achievable for quite some time.


I haven't tested it myself but according to CVdazzle it works. So the next question is how long before facial recognition technology overcomes efforts to confound it. Or how long until legislators get involved.

Yes the looks are ridiculous, but most fashion starts out that way. People often look for ways to stand out and being "retro" could enhance the cool factor.

What does someone care if a mugger's facial recognition software "flags" them? And who exactly is being flagged? And if a security surveillance system flags you what is the consequence? Who does that consequence apply to?


>And if a security surveillance system flags you what is the consequence? Who does that consequence apply to?

http://www.bbc.com/news/technology-35111363

> "What you can do now is link your face-recognition system to Facewatch and it will pull down the watch list that's relevant for your premises and your group," he says. "Then, if someone walks into your premises, it will send you an alert."

So if you have a criminal record, or your face somehow ends up in that database, you could find yourself followed around shops and asked (or told) to leave. These sorts of massive watch lists aren't feasible with merely human recognizers.


It's just a matter of time. With implications we can barely imagine today.

Think about it in reverse. Think about every photo in existence anywhere that has you in it, or has your stuff in it. Imagine all of that being tagged and cataloged as being tied to you. A selfie your friend took in your house with your kitchen in the background.

Somewhere, somehow, the data is crunched, algorithms are leveraged, and these things are figured out. A lot of this isn't even a particularly hard problem: your social graph is already known, your address is known, people share their locations all the time, the people you hang out with are known; it's a simple problem of deduction to figure out all this info.

And then there's an amazing degree of data that can be extracted. Imagine a database containing a list of everything you are known to have in your house, and when, and the changes to that. Imagine tracking your whereabouts and who knows what else by the presence, or absence, of yourself or your accouterments or your car or what-have-you in various locations in the backgrounds of other images.

We really don't have a good sense for the implications of this level of data becoming available to anyone and everyone.


Take a look at this video by Jack Vale.

https://www.youtube.com/watch?v=IqeOnfQmZPY

He stalks people on social websites and then strikes up conversations with them.


If you are interested in exploring a world where something like what you suggest is a generally-accepted part of life, check out the book Super Sad True Love Story. http://supersadtruelovestory.com/

This novel is set in a world that takes current social and sharing trends and extrapolates out a dozen years or so. Depending on your thinking, it could be seen as an extreme or a reasonable guess of what might happen in the future.


I was just going to suggest this. Totally Super Sad.


This describes almost exactly the world of the video game Watch Dogs. Didn't necessarily use a photo though, just a connection to the other person's phone.

In the world of the video game, people did adapt to it. However, there was a faction that actively tried to destroy that system.


This is something that caused Google/Eric Schmidt to say they were going to hold back on facial recognition back in 2011.

http://www.huffingtonpost.com/2011/06/01/facial-recognition-...


> But this is unavoidable, right? We just have to accept it as a fact of life and adapt to it...

For a woman with a lot of diamonds in her purse, I would suggest wearing a burqa; it would prevent most biometrics from being spotted by the cam. On a more serious note, what methods did he really use?


> But this is unavoidable, right?

I don't think so, if you're happy to remain off social networks.


I mostly stay off of social networks, with the exception of HN, but other people put pictures of me on them anyway. I say don't put that on FB and people just laugh because they think I'm kidding, but what can you do?


Doesn't Facebook already construct "ghost" profiles of people who aren't on FB but are mentioned/tagged in others' content? So when you eventually sign up they already know stuff about you, in which case even abstaining is not enough!


Exactly, it's frightening how everyone is assuming they have to change their public persona in order to protect their online version!


You think Zuck doesn't have this on his phone?


Or, just refuse to be one of the sheep?

Personally, I see nothing wrong with one internet picture for all these social sites. I wear huge women's sunglasses in the one picture I put out there. It's probably against the TOS, but who really cares? There's always another website.

That said, I've never had anyone pick me out by my looks.

I need to put in time writing, or talking. It's a pain, but I've never liked knowing anyone in the world can download my image.

Actually, I don't like anyone taking my picture. My only school pictures were from kindergarten through 8th grade. I wasn't present for picture day in high school. College did get my mug.

My hope is it will become socially frowned upon to take anyone's picture without their consent.

Whenever I see that shutterbug at social functions, I just wish they would put down the camera, or only take it out at the end of the hootenanny.


Wearing sunglasses makes us difficult to recognize for humans, because we tend to focus on the eyes, and recognize people by their eyes more than by other parts of their face (relatively speaking). However, these algorithms have no such biases. It's more difficult because there's less information to work with, sure, but they can identify you just as easily by the shape of your nose and mouth and the distances between reference points on the lower half of your face as by the top half. You might as well be covering your mouth and chin and nothing else, for all it matters.
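
To make that concrete, here's a hedged sketch of matching on nothing but lower-face geometry, using the off-the-shelf face_recognition library; the choice of landmarks and the threshold are made up, and it assumes exactly one face per photo:

    import itertools, math
    import face_recognition

    def lower_face_signature(path):
        # Normalized pairwise distances between lower-face landmarks only
        # (nose, lips, chin) -- i.e. what's left when the eyes are covered.
        image = face_recognition.load_image_file(path)
        landmarks = face_recognition.face_landmarks(image)[0]   # first (only) face
        points = landmarks["nose_tip"] + landmarks["top_lip"] + landmarks["chin"]
        dists = [math.dist(a, b) for a, b in itertools.combinations(points, 2)]
        scale = max(dists)                                      # normalize for image scale
        return [d / scale for d in dists]

    # a = lower_face_signature("street.jpg")
    # b = lower_face_signature("profile.jpg")
    # same_person = sum((x - y) ** 2 for x, y in zip(a, b)) < 0.5   # threshold made up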


> just refuse to be one of the sheep

careful with that edge, you might cut yourself


The artist (or the article) suggests that the pictures he took on the metro are the 'real' person, whereas the pictures on the social site are the 'ideal' person.

I'm not so sure. I think it's very possible that the person in the social network photographs is actually far closer to the person they really are than the photo of them riding alone on the subway.


Further, the artist is clearly curating what he shows and what he doesn't and likely has his own message.


To be fair, he's pitching this as an art project, not a piece of social media or privacy research. Of _course_ he's curating the output to maximise his art's impact.

(I really like this sort of "subversive art", that has deeply important educational aspects showing people how the world really is in interesting ways...)


I thought exactly the same - and the photos chosen for the article don't really back up the point either. The pairs look like the same "person" to me, not one real and one fake.

Anyway I don't know anybody who busts out their truest expression of self to ride the subway.


I wonder...

Sometimes I suspect my early morning public transport "if you don't turn that stupid ringtone down I'm gonna stab you in the face" self is closer to my "truest expression" than my "look at me having fun being the life of this party" social media fake-self...


OTOH my tram "oh I'm so sleepy I'm afraid I'll miss my stop" face is further from my "true self" than the Facebook photos of me making stupid shit with duct tape or standing in front of a radio telescope dish, or showing off a Lisp book.

So I think it cuts both ways.


> The pairs look like the same "person" to me, not one real and one fake.

I seem to be alone in this, but... they don't even look like the same person to me. Do we have any confirmation that the pairs show the same person? About 10 million Russian men look like the guys in the second pair of photos.


I think they just picked the two most striking examples for that article. Look at the rest here: https://birdinflight.com/ru/vdohnovenie/fotoproect/06042016-...

Most of them are clearly the same person.


I dunno. Some of them clearly are because you can see they have moles in the same place and stuff like that, but a bunch leave me dubious. For example, that woman with the dyed white hair: the hair looks much frizzier/curlier in one than the other. And some I wonder how the face algorithm could possibly have matched them up - the guy in the parka vs the soldier don't even have the same facial hair, never mind how the parka fuzz obscures most of his head. (And you can't see more than a quarter of the next woman's face!) I wonder how much of this matching was manual.


Did it have to match those exact photos? I think he took a photo (or a few), used that (or those) to find the social media account(s). Then he could use whatever photos of that person he wanted to match up with a photo he had taken.


Oh yeah I think it was clearly manual. He uses this app to find their VKontakte account, and then he picks a good picture from their page. It's an art project, not a study.


Ah, thanks for this. Yes, I agree, this page has examples that really are the same person. I find those more striking.


I have a friend from the Czech Republic who looks just like that guy. I was wondering the same.


> I'm not so sure. I think it's very possible that the person in the social network photographs is actually far closer to the person they really are than the photo of them riding alone on the subway.

That's how I see it too, but that may be tempered by my personality. When I'm on public transit or in a crowd, I do my best not to stand out (which is difficult being 6'4" and 240lbs) but when I'm around people I know, I will smile and have fun. When I go to a concert alone, I'm also a bit closed off and tend to dress blandly, but if I go with a friend or a group I'm more lively and interactive, and dressed to suit the music. It's all about being comfortable with those around you, at least for me.

So yes, on the subway I'm the dull guy in the jeans and t-shirt, and on social media I'm almost always with friends in a photo so I'm smiling and having fun.

Nothing out of the ordinary in these photos if you ask me.


>> whereas the pictures on the social site are the 'ideal' person.

Maybe it's that the picture on the web was chosen. Someone put thought into that; they said "oh hey, I look good here". It's kind of like an ideal, or at least an above-average view of themselves.


Of course my profile pic is a chosen ideal of myself. But you know I chose it, and it depicts how I want to be seen. This says much more about me than me on a Monday morning or after a long day of work, bored, alone, with the worst lighting conditions (blueish light from my mobile screen from below and neon lights from too far above).


Just like in other social situations, people idealise themselves at work or when out with friends rather than presenting an evenly balanced image of their inner self.

It's more surprising when people "overshare" and detail intimate information about themselves IME.

Who looks through photos to choose which to preserve and doesn't pick the ones in which they look "better", doubly so when sharing those photos in public?


I don't think we are very good barometers for what kind of people we are. This is why having a confidant/BFF is so important.


Art can make you think or feel, and this art certainly makes me imagine, regardless of the artist's intended message.


I keep trying to figure out why this feels so wrong and creepy. You're in public; you can't stop some random person from taking your picture. You have a public profile; you post pictures of yourself there, in public. But somehow this feels... wrong. Why is that?

Something about the surprise of it? It feels like a surprise, it feels like something that shouldn't be so easy.


Innocuousness is not a transitive property of shared data. Any individual act of sharing may be OK, but if everybody does the same, then it's easy for much more insidious things to happen.

You can't generalize from the particular to the whole because large differences in quantity add up to qualitative differences, and machine processing makes laborious correlations and connections easy to tease out.

There are similar problems with automated number plate recognition, public CCTV, non-cash transactions, etc. They all add up to massive potential privacy violations despite being fairly innocent in isolation.

Give someone your phone number, and they can't do much more than phone you. Give them a phone book, and they can phone someone given a name and address. Give them a phone book's worth of data in CSV, and they can deanonymize phone call records. Give them a list of phone numbers that have e.g. called a suicide helpline, and now they know the names and addresses of people who are likely to kill themselves. Without machine processing (i.e. with a book rather than CSV), this would be difficult; without people sharing the relatively innocuous detail of their phone number, it wouldn't be possible (leaving social engineering aside).
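
The "phone book as CSV" step really is this trivial once the data is machine-readable; a sketch, with file and column names made up for illustration:

    import csv

    # Join a list of numbers that called a sensitive hotline against a
    # machine-readable directory -- instant names and addresses.
    directory = {}
    with open("phone_book.csv", newline="") as f:
        for row in csv.DictReader(f):                 # columns: number, name, address
            directory[row["number"]] = (row["name"], row["address"])

    with open("hotline_callers.csv", newline="") as f:
        for row in csv.DictReader(f):                 # column: number
            if row["number"] in directory:
                print(row["number"], *directory[row["number"]])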


Your example suggests a trend - there's much more information hidden in data than it seems on the surface. Many people don't understand it, or don't want to accept it. But it's the nature of the world, and I see no way around it except banning all computation. Because our natural capabilities, aided by data-crunching machines, are enough to make those "massive privacy violations".

So if we want to live with computing, I think we need to learn to live with the capabilities of computing and the nature of the causal structure of the physical universe.


I think it feels wrong because for all of human history a person could take for granted a degree of anonymity while among strangers, and more and more that seems to be no longer the case.

For virtually all of history, if someone recognized you, chances are you recognized them. People were either within some extended social circle or they were not, and this social circle was face-to-face. Now, we have something of an arms race going on with respect to anonymity, or the lack thereof. Maybe someone will scan me and identify me. What's my recourse? Either I go about in disguise, or I scan everyone around me in order to equalize things.

How could all this not feel strange or wrong? Based on the norms of 10,000 years of human society, it's either a-social or downright anti-social.


> I think it feels wrong because for all of human history a person could take for granted a degree of anonymity

I think for most of human history, humans knew the majority of the other humans within travelling distance and people were wary of strangers because there was no reciprocal trust relationship.


What if this is actually hypersocial instead?

Given enough information, everyone could approach a cold reader's peak performance at a glance, with the added biography. Sort of creepy - mostly because you can use such information to manipulate people.

But like any knowledge, it can be used to break the ice, strike up interesting conversations and broaden one's circle of acquaintances. Or it could be used to build new tribalism.

The most problematic part is no longer being able to cut loose and erase the past. This used to be much easier - you just moved far enough away and almost nobody knew what you did.


I'm more surprised that people are surprised. You provide images to the public with your name and face connected to even more personal data.

Why are you surprised that this data can then be used by a third, or even a fourth party?


"Provide images to the public" isn't what people do; they provide images to a limited audience, either unwillingly (you can't walk around outside without creating some kind of image, it is not a willed action) or with people they know or expect to know them (online). It's technology which has converted this limited audience to a universal concept of "public".


> "Provide images to the public" isn't what people do;

It isn't what they think they do, but it very much is what they do.


In many cases it is a willed action though. The images in this article were quite probably uploaded by the people themselves. I don't know exactly how privacy settings work on VKontakte, but from what I've seen most images seem publicly searchable.

Now tell me how uploading things to such a database is not a willed action.

I agree with you that there is the possibility of someone else taking such images of you and everyone around you, through surveillance cameras or whatever, but for something of the scale shown in this article you'd still need quite an effort to create such a database without the help from the people themselves, especially if you want to link it to other personal information.


People don't "upload things to a database", though, that's the point. They don't go to their photo app and think, "self, I'm going to upload this image to a global searchable database now". Rather, they think: "Steve and Mary would have liked this restaurant dish", or "I'm going to try and make my friends jealous of the great holiday I'm having".

Because people don't own their own clouds, or have privatized social networks, they are forced to upload things into databases that others control. Whoever controls the privacy defaults typically has the most control over where the data flows thereafter. And it's all out of the user's hands; they weren't given a choice, not a real one, because the incentives for the system lead you to accept the lack of control, or spend your life neurotically trying to keep a lid on everything, or live as a technological recluse. There's no other way to live today.


They're not surprised by that. They are surprised when one of the rules that has held true for as long as cities have existed, that you can be semi-anonymous in a crowd, is no longer true.


In my opinion you still can. What makes this possible? You need a database linking images to names, as well as the face-recognition data.

People are freely giving away the images for the database linking it all to further personal data. If they did not do this, creating such a database would still be a massive surveillance effort, albeit probably still possible if someone really wanted to do it and had the sufficient resources.

I say it's still possible to stay anonymous if you want to.


Various businesses have a "take pictures of customers" policy. Medical offices are one example, due to insurance fraud they say. They typically don't ask. They simply tell the customer that they are about to take a picture and it is over before they know it.

When I spot such a camera I stand to the side so they have to ask me to move in front of the camera. Which is when I politely refuse to participate. They don't like that very much. Last time, the person made a "well you might not be able to refuse next time" remark. He may be correct, or not, we'll see.

These are cases where the images should never be uploaded to a public database or shared with any other parties. However, given the increasing use of SaaS and cloud providers, exposure to some third-parties is highly likely. Any consumer information hitting the cloud is at high risk of abuse.

In addition to those scenarios you have cameras behind ATMs, cameras embedded within grocery store self-checkout lines, etc and so forth. It has already become difficult to control your image and, in some cases, prevent that image from being combined with other personally identifiable information.


> "take pictures of customers" policy.

Even gym memberships require this in some places. Soon one will need to provide a 3D facial scan just to buy groceries.


A photo ID is legally required (via form I-9) to work.


https://commons.wikimedia.org/wiki/Commons:Country_specific_...

And in most places it still wouldn't be socially acceptable even if it may be legal.


> I keep trying to figure out why this feels so wrong and creepy.

And I'm trying to figure out why this apparently feels so wrong and creepy to so many commenters here. I don't feel any of that. As far as I'm concerned, it feels... totally normal. Maybe I'm wired wrong or something, but I really don't see anything weird or creepy about learning from publicly available datasets (AKA observing your surroundings?). Nor do I see anything wrong in leveraging computational infrastructure to learn more from those datasets.

The way I see it - a hypothetical "creep" who takes a photo of someone and matches it with their public social media profile is fine. Where I draw the line is them attempting to put that into action through stalking, harassment, etc. because it hurts the other party. Trying to break into data sources that are not meant for public is of course a violation of personal privacy, which is wrong. Also, I consider a person with obsessive interest in another to have a problem that probably requires medical attention.

Are people so afraid today that the fact that another person notices them makes them uneasy?


I think that posting the pictures online like that is actually what's creepy. The matching almost has nothing to do with it. For instance, if he just had the subway photos, or he just had the profile pictures, it would give me a creepy feeling. It feels like an invasion of privacy. Here we even have laws that say you cannot publish someone's picture without their consent. Russia does too.

https://commons.wikimedia.org/wiki/Commons:Country_specific_...

Additionally he quite likely doesn't have permission to post the profile pictures due to copyright.


Ah yes, I agree. Posting someone's photo without permission feels totally wrong to me too.


There's a... presumption that this is difficult and that all these disparate services and fragments of identity are distinct and separated. This stunt demonstrates that this is not true. It violates some basic assumptions.


Yes, it gets easier and easier to correlate enough metadata and real data to find out many interesting things about many a person.


It makes me glad about my zero-photos-of-me-on-the-Internet policy.

Also, the neural net may return a certain percent of false positives and there may be no particular way for this guy to tell. The impact is small here and would be larger in other circumstances.


How's that working out for you?

I have a significant number of friends who look at me like I'm speaking in Klingon when I tell them I'd rather not be tagged in their Facebook photos.


It works out pretty smoothly really.

Occasionally a family member wants to post something but there's seldom a problem.

Most of my friends don't post anyone's photo on Facebook even though they use it a lot.


Information asymmetry. They know you, you don't know them.


It's creepy because if someone took a picture of you, combed through enough records to attach your name to it, and then browsed through all your online presence, you'd be creeped out. This just automates the process. If they started knocking on your neighbors doors asking about you, you'd be worried too. It's weird to pry into complete strangers' lives uninvited.


> But some how this feels... wrong, why is that?

It is not voluntary.


This is just a taste... Now imagine what it must feel like to be a celebrity and have this be an everyday occurrence: hounded by paparazzi.


For one thing, such a scenario would typically involve the uploading of said picture or facial data or faceprint data to a third party. You have no idea how that third-party will use and/or share the information.

Some people will not have a public profile picture but they too could be subjected to it.


Because you don't want this: https://www.aclu.org/ordering-pizza


FindFace is a very interesting piece of technology. It's legit (the authors, N-Tech.Lab, have successfully competed against Google and other strong opponents in the MegaFace benchmark). It's not real-time, but fast enough (0.5-1.0 minutes required to recognize faces and provide their profile links). Their business model is freemium (the first 10 or 30 searches per month are free), and they're trying to make it a matching service: you can narrow a search by age and relationship status, and then you can like photos of people you find to allow searching for "your type" of partners.


Their website was pretty unhelpful. Is there a signup page somewhere? I want to try it out.


Here you go: http://findface.ru/

Looks as though it's only available in Russian and only works for VKontakte (the social network).

Warning: It hijacks your back button.


thank you


This is one of the cool features people have discussed for the past 20 years for wearable computers. I was hoping it would come to fruition with Google Glass until Google got all weird about facial recognition on Glass (and subsequently Glass as a whole).

On one side it is creepy; on the other, it would be nice to have your digital assistant suitably remind you of the name of the person who is walking towards you and recognizes you, but whom you cannot remember. A quick summary of who they are, and perhaps of your last interaction, whispered in your ear could be handy. Or even a couple of relevant recent headlines about their company, etc.

Of course, the creepy side is pointing at a random stranger on the street and instantly having their digital dossier overlaid on their face.


I personally would be pretty disgusted if someone used it on me. If someone doesn't remember my name or their last interaction with me, maybe it's for the best. The last thing I need is people faking nice and friendly after they just scanned me without my consent.


But this could be useful for someone with Alzheimer's, or maybe a teacher who needs to remember his pupils.


Or normal people who, like me, can't connect names to faces without at least several interactions with a person in a short interval of time.


So instead of working on your memory or coming up with a mnemonic or just asking their name again (everyone forgets a name), you think it'd be less intrusive to scan someone without their permission?


Of course. Why would I need permission to analyze the same image that hits my eyes and to consult a digital database I maintain (or public one I have access to)? Going this way I'd end up rejecting bicycles and notepads because they're not "natural" enough.

Personally, I believe that your tools are extensions of your body.


I'm sorry but your social faculties are way out of whack if you think it's a better option to scan someone than to simply ask their name. Do you not understand that I may feel uncomfortable if you've just scanned me and then use that data to try and strike up a conversation? Talk about making me feel like a product or mark. Bicycles and notepads do not actively analyze and reduce people nor do they enable a surveillance state. Don't make such specious extrapolations.

Tools are an extension of your body, yes, but you don't go using your hands to feel up someone because you needed more texture data. You don't stare at someone for a long time because that's rude and intrusive. Same goes for using technology like this.


> Do you not understand that I may feel uncomfortable if you've just scanned me and then use that data to try and strike up a conversation? Talk about making me feel like a product or mark.

Do you feel uncomfortable about the fact that most of your friends don't even know your birthday or phone number offhand, and have both noted down somewhere (be it in a phone or indirectly via the profile info of your Facebook account)? Is it weird for you that some may even note down things you like so that they can check them out when looking for a good gift for you? Do you feel "like a product or mark" when a conference badge you wear displays your occupation or interests (sometimes used as an icebreaker)?

(Also, you shifted from just scanning to using the results to "try and strike up a conversation"; those are two different things.)

> Tools are an extension of your body, yes, but you don't go using your hands to feel up someone because you needed more texture data.

By scanning I'm not touching you. That would be a violation, yes. But why should I let you be a master of my eyes, and the (hypothetical, at the moment) tools I use to augment them?

People will get used to it, like they do with everything we've invented since the dawn of time. Social conventions will adjust. Just because my eye implant can tell me your name doesn't mean I get to stop you on the street and start talking to you, any more than it is acceptable behaviour today.


The difference is that for my friends I have granted them explicit privileges beforehand to have access to that kind of data. It's different when a stranger is scanning me, mining me for data. And actually yes I do feel like a product or mark at conferences when people come up to me and just chat me up because of the company I work for (you've heard of it) - they almost always want quid-pro-quo type interactions or otherwise don't care about me per se, but just the quantification of me that exists on the badge. It's one of the reasons I don't enjoy industry conferences that much or when I do go, I prefer not to wear a badge.

And honestly, scanning from a distance is still incredibly invasive.

You can be the "master of your eyes" but don't get offended if I or others think you're a creep. You may not be touching me but you're definitely noticeably staring. The rejection Google Glass actually gives me some faith in the decency of people, but you're probably right, society at large will adjust because the profit motive and law enforcement capabilities for this sort of technology are too great for them not to become entrenched by media and government interests. Personally I find it inhumane and repulsive, but I'm old fashioned like that. I prefer more organic human interactions, nuance, getting to actually know someone beyond a readout on a screen. I'm a private person in an increasingly public circus of society.


Another corollary of this is that if you take a good image of someone else, you can build a model of it (sometimes automatically), and you can map it on to your face and appear as them digitally. I've built tools like this that even let you do it live in the browser using this library https://github.com/auduno/clmtrackr/


Reminds me of "Stranger Visions" from Heather Dewey-Hagborg. She created 3D faces by using DNA found in public places: http://deweyhagborg.com/projects/stranger-visions



I've seen this happen in real time with augmented reality goggles. It had a bunch of the Facebook profiles in advance though, so it was a much smaller search space.


I don't believe these claims.

These photos are matched too well, and I suspect he did it manually. Some of the train photos are so bad, there's no way they got automatically recognized.

Not to mention the resources you would need just to get the data on 280 million users, let alone train a neural net on it, and make it work with a 70% success rate.

I think it's a viral way to get app installs. The app reviews are really bad btw, it doesn't work.


I agree that this seems very far-fetched. From what I know about facial recognition (admittedly very little) for an algorithm to be able to recognize these profiles in such different conditions as displayed in the article seems very unlikely.

Incidentally, is there anyone who could ELI5 how picture data like this is treated in order to run these algorithms on it?


I am not sure this works or that success rates like claimed are possible:

> In 70 percent of cases, he was quite easily able to identify the people he had photographed.

VKontakte has 280m+ accounts and is the second most popular website in Russia (https://en.wikipedia.org/wiki/VK_%28social_networking%29); the Russian population is ~144m. I can't find any decent estimate of how many Russians use VK, but it seems to be at least as dominant in Russia as Facebook is in the USA, which is to say, ~71% (http://www.pewinternet.org/fact-sheets/social-networking-fac...). So from non-VK usage alone, he should be able to identify no more than 70-80 percent, since you can't identify those who aren't on VK in the first place.

Facial recognition is a notoriously difficult machine learning task, and doing facial recognition on the VK population of St Petersburg (>249688 * 0.7 = >174781) will be challenging even with extremely good facial recognition. I don't know what this 'FindFace' application is, but I'll be charitable and assume it matches the 2015 state of the art, "Facenet: A unified embedding for face recognition and clustering" (http://arxiv.org/abs/1503.03832), which AFAIK has a best-case binary classification accuracy of 99.63% (https://en.wikipedia.org/wiki/Accuracy_and_precision#In_bina...):

0.9963 = (true positives + true negatives) / (true positives + true negatives + false positives + false negatives)

If each person = 1 profile photo, then to classify a photo of someone there would be 1 true positive, 174781-1 true negatives, and an unknown number of false negatives.

0.9963 = (1 + (174781-1)) / (1 + (174781-1) + x + (174781-1))
0.9963 = 174781 / (349561 + x)
0.9963 * 349561 * (1/174781) = x
1.9925943 = x

There would be ~2 false negatives for every true positive. So there would be a <1/3 chance the photographed person is not on VK at all, and when one searched for them when they are in fact on VK, they would only show up <1/3 of the time (the other two-thirds their subway photograph would not 'match' when tested in the dataset, and it gets much worse if the face detection has one of the lower state-of-the-art accuracies on the other datasets). So on net you would expect to turn up a person <0.7 * 0.3 = <0.21 of the time. (It gets worse if we take into account the false positives as well, but that particular paper doesn't report precision.)
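
For anyone who wants to poke at the numbers, here's the back-of-the-envelope in runnable form. All inputs are the assumptions above, and the false-match line treats the pairwise error as if it all surfaced as spurious matches, which the paper doesn't actually report, so take it as a rough bound rather than a measurement of FindFace:

    n_candidates = 174781          # estimated VK users of St Petersburg (from above)
    pair_accuracy = 0.9963         # FaceNet's best-case LFW verification accuracy

    # Even a ~0.37% pairwise error rate, applied against every candidate,
    # leaves hundreds of expected spurious matches per query:
    print((1 - pair_accuracy) * n_candidates)     # ~647

    # And the headline 70% can't all be real hits, since ~20-30% of subjects
    # may not be on VK at all and false negatives eat into the rest:
    p_on_vk = 0.7
    p_found_if_on_vk = 0.3         # the rough per-search hit rate argued above
    print(p_on_vk * p_found_if_on_vk)             # ~0.21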

There is no mention of any attempt to validate the findings, so I think for many of his results, he may have merely found a similar-looking person (possibly in some cases, look-alikes http://www.nytimes.com/2014/08/26/science/looking-at-twin-pe... https://en.wikipedia.org/wiki/Look-alike ). Look-alikes are far from identical twins, so it's not necessarily surprising that

> Often, there was a striking difference between a person’s real look and the image they projected on their social network profile: a shy and grim young man might appear to be the life and soul of the party and a lover of extreme sports on his VKontakte page.

This could be due to care in social presentation. Or it could just be that they're not the same person in the first place. (The photos are not that persuasive. I'm sure there's more than one grim young man with a widow's peak in St Petersburg, or young woman with curly black hair.)


I appreciate your skepticism. And I mean that in the most positive way possible. Seems like most HN visitors are skeptical about almost everything except extraordinary claims regarding AI prowess.

But really, you're being way too generous. 99.63% accuracy was reported on a relatively constrained dataset[1] when being asked a yes-no question: "are these two faces the same"? In our case, however, we're speaking about a search algorithm that tries to match something in a huge database. The accuracy will be way, way worse.

[1] http://vis-www.cs.umass.edu/lfw/


Yep, and many of the photos presented on the page don't even look like the same person.


Completely missing the point. A "social media site" is a place where we put (mostly) public imagery. By definition this is curated and, as the photographer says, is a self-selected view of ourselves.

But the point being missed is that OF COURSE people don't look the same "in day to day life", because those social media pics are just a moment of one day. If the photographer happened to meet the person moments after the social media picture was taken, then... he would have concluded that "people present themselves truly as they are in reality"...

My point, the point ignored in the article, is that Social Media is a self-reflective, self-curated snapshot. Not a continuous, objective representation.


Maybe it's not a popular look over there but I couldn't help but notice there were no photos of guys with a full beard, ball cap and aviator style sunglasses...


We are not far off AR-type scenarios where someone in public is overlaid with info about them and their interests. Keen on that girl? It instantly tells you she likes a particular band and movie genre, giving you quick access to small talk cues. Or that she never mentions a boyfriend online.

Or that the guy about to get off the subway earlier thanked his friend for giving him that money he owed him.


Minority Report's personal advertising is getting closer and closer: https://www.youtube.com/watch?v=7bXJ_obaiYQ

And they won't even need retinal scanners. Facial recognition will probably be good enough.


I found it interesting that, without exception, in the photos on the artist's website the people on the subway are actively using their smartphones and/or have headphones on. The transformation of the human race into cyborgs, of sorts, over the last two decades is uncanny.


So, you can no longer assume being anonymous when walking down the street. If your random photo is matched with your profile, you can be basically fully identified anywhere. With all those cameras everywhere, you could be fully tracked.


You could never assume that. The only thing that gives you anonymity on a public street is the fact that you're not interesting. Since most people walk mostly the same paths all the time, it's pretty much guaranteed that if you suddenly started to do something weird, at least some people on the street would recognize you and could - in principle - figure out who you are.

Also consider that your local grocery store clerk probably knows your eating and drinking habits better than you know them yourself.


Given that there are public images with your face in them and name connected to them, that can be searched by a neural net... Yes.


No need for the images to be public. A police state could match images taken by street CCTV to e.g. the drivers license database and track the citizenry in real time.

You could easily reduce the search space to match a face taken by a CCTV camera by using cell phone location data to limit it to people in the general area...
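
A sketch of how cheap that narrowing step is once you have the data. The function name, the candidate structure, and the threshold are all hypothetical; the face encodings could come from any off-the-shelf embedding model:

    import numpy as np

    def identify(probe_encoding, candidates, nearby_cell_ids, threshold=0.6):
        # candidates: list of dicts like
        #   {"name": ..., "encoding": np.ndarray, "last_cell": tower_id}
        # First shrink the search space with phone-location data, then face-match.
        nearby = [c for c in candidates if c["last_cell"] in nearby_cell_ids]
        best = None
        for c in nearby:
            dist = np.linalg.norm(c["encoding"] - probe_encoding)
            if dist < threshold and (best is None or dist < best[1]):
                best = (c["name"], dist)
        return best      # (name, distance) of the closest nearby face, or None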


A police state has always been able to do this. Of course it gets easier by automatic face recognition and the possibility of sharing databases over long distances etc.

The difference from before I guess, is that it's less in your face, and can thereby happen without people noticing as easily.


>A police state has always been able to do this.

Not for everybody all the time, though...


I registered http://wikiface.org/ some time ago to create an open facial recognition database as a proof-of-concept project.


It's interesting how almost everyone he photographed completely ignored him by either closing their eyes, diving into their cellphone, or just pretending to be interested in something else.


And friends bash me for always using a caricature avatar of myself on my social media accounts! I'll have to run it, and see if it matches a photo. It's a greyscale line drawing sketch.


This is very much a study of bad vs good lighting.


If this becomes more common, then dating websites like Badoo are doomed :)


If drones have facial recognition software, they'll become the ultimate killing machines.


And this is exactly why, 11 years ago, I set up Famipix (www.famipix.com).


Link is to a "private photo sharing" website.

I'm curious, naskwo: it says in the tour "We don’t synchronise any content to any external sites or social networks." Does this imply you don't have off-site backups?

Second, for the birthdays calendar do you do anything like attempting to recognise photos of birthdays and log them, or use image meta-data to do the same?



