Let artificial intelligence guess your attractiveness and age (ethz.ch)
139 points by mkuhn on Jan 5, 2016 | 129 comments

When I put in a picture of myself with the contrast turned up, it was rated "godlike"; the original picture was rated "hot"; and with the contrast turned down some, it was rated "ok." I'm an unhealthy, almost pasty shade of white with a slightly bulbous nose and a classic fivehead.
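(For anyone curious what that tweak actually does: a minimal linear contrast stretch on grayscale pixel values can be sketched as below. The function name and the midpoint-128 formula are illustrative assumptions, not whatever tool the commenter used.)

```python
def adjust_contrast(pixels, factor):
    """Linearly scale grayscale pixel values around the midpoint (128).

    factor > 1 increases contrast, factor < 1 decreases it.
    Results are clamped to the valid 0-255 range.
    """
    return [max(0, min(255, round((p - 128) * factor + 128))) for p in pixels]

# Turning the contrast up: dark pixels get darker, light pixels lighter.
print(adjust_contrast([100, 128, 200], 1.5))  # [86, 128, 236]
```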

Now, this is a problem for bigger reasons:

An older picture of Denzel Washington gets an ok: http://imgur.com/Li0gZqH

A recent picture of Howard Stern gets a hot: http://imgur.com/L8hxoVK

Obviously this is just a toy and your algorithm is pretty inexact, but... you need to fix it, or at least note in giant letters that it only works for white people right now and that you're working on making the algorithm more universal. Because it only (kinda, sorta?) works for white people right now. If you claim something is universal in your headline and only note its specificity in the fine print, you're lying. If you build an algorithm that calls most people who aren't white ugly, you need to think about the buzz-to-backlash factor of demoing it.

It's really not a good look, and you've got a week at most before you get called a "Nazi Dating App" on Twitter and your potential VCs get spooked and pull out. I don't think it's intentional on your part, but literally no one cares what your intentions are when there's an opportunity to create moral-indignation clickbait. Just a friendly word of warning!

If people are not mature enough to understand that a machine learning algorithm cannot be racist, shouldn't you go educate them instead of telling people not to offend these idiots?

Secondly, what makes you think that every race is universally equally attractive? Studies show that people find people of their own race more attractive. If whites are rated more attractive (whether by the training set or by user-base ratings), doesn't that merely reflect the composition of their user base?

If an algorithm is trained on people who are racially biased ("studies show people find people of their own race more attractive"), the algorithm is racially biased - which may or may not be interpreted as, believe it or not, racist.

You have very poor reading comprehension skills, or else you appended your comment to the wrong post. Almost no part of what you're saying corresponds to anything that I said.

PS: Giving some solid advice to public-facing startups looking for funding. Definitely take the autistic high ground on every issue. Investors don't care about bad PR; they only care about abstract principles of truth as you understand them.

He was reading between the lines. A little too much, of course, but people get pretty race-sensitive when you throw around looks comparisons of black/white/whatever people.


Well, there goes my last ounce of self esteem.

If anyone needs me I'll be sitting in the corner with one of those criminal hacker ski masks while I work on open source stuff.

Well, if it makes you feel better, see what the algo thinks is hot…



I've been feeding it various pictures of fursuits at different angles to see if I could force it to treat one as human, but so far I've been unsuccessful.

I've tried to get it to tell me how godlike my cat is, but it just won't.

Is that ironic? 'Cause I think polar bears are so hot.


I got a http://iobound.com/pareidoloop/ output image categorized as "Hot"...

Get rid of the headphones and try again

Of course I tried a bunch of pictures of myself, just as everybody at HN is doing, ha ha!

I didn't get above the nice level.

The first picture of my wife immediately got the godlike level. Unfair advantage!

"Before performing any experiments we removed underage people, anyone over 37 and bi- and homosexual users as these comprise only a small minority of the dataset."

That's already biasing it of course!

"Interestingly, 44.81% of the male ratings are positive, while only 8.58% of the female ratings are positive. Due to this strong bias of ratings by women, we only predict the ratings of men."

So, the algorithm learns to rate me as a heterosexual man?

From: http://arxiv.org/pdf/1510.07867v1.pdf

Yes, this was the data we worked with for attractiveness modeling. The algorithm learns from millions of ratings from males to females and from females to males, all heterosexuals, no one underage or over 37 years old, and mostly people from the Zurich area of Switzerland. For age and gender prediction we use more, and more diverse, data.

I am Radu Timofte, one of the authors.

What did you mean about the bias of the rating by women?

Is it possible that quite a few men rate every woman high irrespective of looks?

Or are women just more picky? Why is it called biased?

The ratings are "like" (positive) and "dislike" (negative). If men have ratings close to uniform distribution (44.81% are "like" ratings, 55.19% are "dislike"), women with only 8.58% "like" ratings are clearly unbalanced, or if you want the women are "very picky". We call this a strong bias in the ratings of women while for men there is a small(er) bias and this only to point out the difference from the uniform distribution (50%-50%). We can not tell and we are not the ones to judge if the men are voting randomly or are less picky or if the women are right or more picky, since the attractiveness is subjective and we do not have an ultimate ground truth to compare with. We can not say if the attractiveness should follow a particular distribution, we work empirically. In our study, the ratings of men tend to agree more and correlate more with our visual representation based on the face image (looks). We removed from our data the extreme cases such as users with less than 10 ratings or with too many ratings.

At the time that picture was taken, I didn't own speakers.

Now, I look even worse and don't own a working webcam.

It's a comb-over you insensitive bastard!

Feel better about yourself: http://imgur.com/uJVnQ9S

Jesus christ.

On a more serious note, training set probably didn't include data from Asia.

Edit: Just scrolled down and saw your question (and the answer you received). :C

It's only relevant if you are interested in dating a robot...

Hi, I'm Rasmus and worked on the algorithms behind faces.ethz.ch. If you have any technical questions, let me know! Sorry for having some stability issues, we got much more traffic than expected and are currently working hard to fix everything!

Hi Rasmus,

Cool work! I'm curious regarding the training dataset. What is the distribution of faces by race/age? Also regarding the raters, what is their distribution? (Race/age/cultural background)?

It's widely known that attractiveness is heavily dependent on cultural upbringing.

Signed, Butthurt dev whose best pic only rated an "OK".

EDIT: You also rated Yoona (a Korean pop star) as just "Nice": http://imgur.com/uJVnQ9S. I guess that makes me feel better about my "OK". I'd stay out of Korea if I were you---I hear their fans aren't very forgiving.

Hi fatjokes,

I am Radu, one of the authors.

As mentioned in our article (http://arxiv.org/abs/1510.07867v1), the training dataset for attractiveness is from Blinq. Face images of underage people and people over 37 years old were discarded. All people in our training dataset are heterosexual and mainly from the Zurich area of Switzerland. Therefore, our model of attractiveness fits the cultural bias of Zurich, Switzerland.

We consider faces.ethz.ch a little fun tool. I hope that the fans of Yoona will understand :) With more training data from outside Switzerland, our algorithms will better fit their expectations.

For age and gender we used much more, and more diverse, data, so those predictions are more reliable for the majority of ethnic backgrounds.

From a data privacy point of view I would very much appreciate if you added a prominent "delete upload" button to the result page.

"We do not save the uploaded image." from the take-this-seriously? popup.

Yes a "delete" button would nice.

We do not store any photo on the server, so no worries!

Well, you could still add a delete button to make people feel better (I'm only half joking)

And as a potential user I'm only half-insulted.

But seriously, what purpose would it serve over a promise in TOS? A normal user will trust both (except the button would be bullshitting them). A privacy-obsessed one will trust neither.

You could make a delete button which merely opens a popup to explain that they do not save it in the first place. That way if they're looking for a delete button, they'll find it, but it will give them the right information.

Could this site be used to optimize your dating profile photo (if you have one)? I'm probably "Hmmm" on most photos but possibly "Ok" or even "Nice" on a few. What does the algorithm want?

Yes, it is a foreseen application.

Your website is offline, do you have a paper about your technology?

It is back up right now, and they list the research.

Hi Rasmus, you described your paper on age estimation on the site. Do you also have something written on the hotness scale? :-)

Edit: Sorry, missed it! http://arxiv.org/pdf/1510.07867v1.pdf

None of the images I am uploading are working. It is saying it cannot detect a face on any of them. Is this a masking of the stability error or some other issue?

It is trying to let you know that you are not quite human.

What does it analyze?

Picture or Extracted Face?

Startup idea #384826 - "Does this outfit make me look fat?" As A Service

I actually thought an interesting idea is an app that recommends you which dating site to use based on your face.

If you're good looking you can pretty much use whatever.

But for someone like myself who is ethnic and not visually attractive, my success rate is really low on certain sites and acceptable on certain apps.

For example, my performance on Match (Graphic I made: http://i.imgur.com/UZuSzD9.png) was pretty woeful in December. But I started using another app in the same week and had much higher success in getting responses relative to effort level.

I'm curious what worked for you, and if you have any similar data for other sites? Explaining cultural differences between dating sites/apps is something that I've tried to do multiple times, but I've never had any data to support my hypothesis.

So far I've only used Match and CoffeeMeetsBagel.

I've noticed that CMB now only matches me with Asian women, despite the fact that I'm open to dating any ethnicity.

I'll post my CMB stats another time, but unfortunately I can't read ethnicity/height preferences the same way I can with Match. It's a shame, because it's really important data for helping me determine whether or not I should even pursue someone.

But I've bookmarked your name so that I can contact you when I gain a greater sample size for CMB and Eharmony (which I'll test soon)

When I was aggressively dating, I found that my success was prone to incredible fluctuations that I could never pin down. Keeping the site constant, anyway. I do suffer somewhat for being a certain flavor of queer that excludes a lot of heteronormative people.

I've often wondered how I'm affected. I'm half-Korean, so I look Asian, but I was born and raised in America and now live close to Georgia Tech. I wonder how many people assume I'm foreign.

Have it always say yes and then throw up an ad for Fitbit. We're going to be rich.

"Free week at 24 Hour Fitness!" coupons!

This would be good if you can turn this into a mirror IoT device. (°ロ°) Just imagine how many people would buy this if Kim Kardashian got involved

i want to invest.


Obligatory..."what's new?" Especially now that the site's down.

I mean, is it different from Project Oxford, the Microsoft API that's been around for a while and is still quite amazing?


I actually tried it out early this morning, to compare it with a stock install of OpenCV 3. It got the faces correct, and the ages very well too.

Here are its guesses for the Star Wars TFA poster: http://imgur.com/XT7RmX6

Of course, perhaps users have trained it...particularly ones sympathetic to Carrie Fisher. Though I'd argue that they would've also corrected Boyega's face.

Oh dear - just put my own picture into that and it guessed I was 34 and my fiance 38...we're 25 and 26.

Edit - tried a different pic, it guessed my fiance was 51.

Well c'mon, have you really double checked? ;)

Lighting obviously has an effect, but I was pretty surprised at how it got TFA's Han Solo down decently well. In the poster, he looks more in his 50s.

Lighting obviously plays a part. I wonder if race does as well? To use the common stereotype, does the algorithm make a guess if you're Asian, then guess downwards?

Actually age isn't new at all. NIST ran an age estimation test before Microsoft's API. I worked on Cognitec's algorithm.


I think it's just that there is so little of Fisher's face visible. I couldn't even tell who that was until I read your full comment.

the flag for "Adult Content" reminds me of anecdotes about mechanical turk workers standing in to keep uploads child friendly, now mixed with Age Identification this gets a new perspective. Is this in active use somewhere?

Oxford is cool indeed.

See also "What a Deep Neural Network thinks about your #selfie" at http://karpathy.github.io/2015/10/25/selfie/ . To summarise the "What makes a good #selfie" section:

- Be female.

- Face should occupy about 1/3 of the image.

- Cut off your forehead.

- Show your long hair.

- Oversaturate the face.

- Put a filter on it.

- Add a border.

In a mirror in the bathroom or pointing down at you from above your head at an awkward angle, and duckface lots of duckface.

The first processing step consists of (human) face detection. We use standard OpenCV for our faces.ethz.ch demonstrator. A failure at this step is likely to propagate into an unreliable or wrong attractiveness prediction. For attractiveness, age, and gender prediction we start from a cropped image assumed to contain a (roughly aligned) face as found by the detector.
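The two-stage pipeline described above can be sketched with stand-in components (the detector and predictor here are toy assumptions, not the ETH code; per the comment, the real demonstrator uses an OpenCV face detector for the first stage):

```python
def predict_attributes(image, detect_face, predict):
    """Two-stage pipeline: detect a face, crop it, then predict on the crop.

    If detection fails (returns None), nothing sensible can be predicted,
    so the failure propagates -- matching the "cannot detect a face" error.
    """
    box = detect_face(image)  # the real system would run OpenCV here
    if box is None:
        return None
    x, y, w, h = box
    crop = [row[x:x + w] for row in image[y:y + h]]
    return predict(crop)

# Toy stand-ins for the real detector and predictor:
image = [[0] * 8 for _ in range(8)]
always_find = lambda img: (2, 2, 4, 4)  # pretend a 4x4 face was found at (2, 2)
never_find = lambda img: None           # pretend no face was found
crop_size = lambda crop: (len(crop), len(crop[0]))

print(predict_attributes(image, always_find, crop_size))  # (4, 4)
print(predict_attributes(image, never_find, crop_size))   # None
```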

I hope this helps explain the aforementioned result.

Well, is it wrong...?

31 cat years is like 3 or 4 human years so it's probably close.

The real question is how old and hot is that popcorn?

I'm sure their algo writers are looking for new work now lol

So this is the next more computationally intense evolution of "Hot or Not"?

Crowdsourcing is out and AI is in...

Today you use the crowd to train an AI...

I'd still rely on crowds directly rather than AI. I don't think AI is as accurate. You can post your photos on OkCupid and a crowd of people will rate them and help you select the best. Of course, you pay for this service by rating pictures of strangers.

I have a heart shaped face, a slight hawk nose, green eyes, golden hair, freckles on my cheeks, I have light skin, my forehead is an average length, my eyebrows are medium sized, and my teeth are straight

Very brave of you to put the founder photos on the front page...

You've got to admire their honesty for not putting an image matching algorithm in to automatically say the founders are hot! https://goo.gl/photos/jv82LHNQKxrt1Ce88

I wonder if they were included in the training set...

Does anyone else find it funny that the photo of Jesus didn't rank as Godlike?

Can't have been what the painter was going for...

A photo of Jesus! Is it a daguerreotype?

Well, it was way off on my age, but it correctly gendered me as female. Which I find quite impressive, as I am a transgender woman, I've only been on hormones for 4 months, and most humans aren't correctly gendering me yet.

More than anything, I'm curious to know what features it was that registered me as female. Was it as simple as the long hair, or some complicated subtle mix of many small details?

I've been on hormones for a little over two years now. Submitted several pictures from the last few months: it's consistently gendering me female (yay!) and a decade younger than I actually am (yay!), but it's saying that I'm ice cold "Hmm" (aw...).

What direction was it off on your age? I'm 31 (30 in the older pictures I sent), and it said I was 19-22 in all the pictures I tried.

Guessing gender is at worst 50% chance of getting it right. And long hair is one of the big features these algorithms learn.

How do you know it isn't guessing sex instead of gender (and hence it "sexed" you wrongly)?

> most humans aren't even correctly gendering in me yet

This made me laugh really hard. What a positive twist on the fact that the algorithm is clearly a WIP.

This would make a great psychology experiment. Use an algorithm to detect someone's age, then randomly assign them an attractiveness score and see how their behavior differs based on the result. How does the random attractiveness result affect how likely they are to share their score? To retake the test? Does this vary based on the user's age?

Hi sethbannon,

I am Radu, one of the authors.

We thought about similar experiments; however, psychology is not our expertise. If you check our paper on hotness/attractiveness (http://www.vision.ee.ethz.ch/~timofter/publications/Rothe-ar...), I am sure you'll find some interesting results on how people in different age groups rate, a paradox, and more. And yes, there are many interesting experiments to do and questions to answer.

As someone who switches genders I find that it guesses my trans flavor right fairly often - maybe 75% of the time. It always gets my cis flavor right.

Also, for some reason, in the pictures it rates least attractive it tends to identify me as much younger (across both genders) - more than a decade younger than I am in the picture - and it'll rate them "Hmm..."

As for the highest-rated pictures... I can't figure out what it's doing, though one where someone else did my makeup (and it was perfect) was among the two it rated stunning. I was surprised that the ones where my phone's "Beauty face" kicked in (which removes most wrinkles and skin flaws) didn't seem to rate any higher... though makeup did make a difference.

A fun little toy.

Edit: Oh, and other than occasionally docking me a decade as mentioned, it was pretty accurate on age (±3 years, generally). Which I find interesting, as I'm frequently told I look younger than I am.

31-year-old trans MTF here... uploaded my most recent picture, and it said I was 20, female, and ice cold "Hmm...".

Uploaded a picture from about a month ago... same, except it said I was 19.

Well, I'm flattered it thinks I can't drink, and I'm glad I pass. Too bad it doesn't think I'm hot, but I've always preferred to go for cute over hot anyway.

I'm gonna dig through my photos and see how consistent this is... (edit: a couple more, 21-22, female, and still ice cold)

As an example (both me, and I'm in my early 30s): 24, "Hmm...": http://imgur.com/V1AMDfd 27, "Stunning": http://imgur.com/bmmJ2dg

I'm actually a year younger in the second picture than the first. Maybe it's the glasses?

I think the angle might also help. Your jawline is much less prominent in the second one, so that might be why it's "Stunning" rather than "Hmm...".

Here are the ones I used: http://imgur.com/a/yFwN4 -- the first one was taken today, the rest are all from the last couple of months.

Any chance of seeing a photo of you in guymode that it recognises as being in guymode?

In its defence, you do look very different in those photos. I wouldn't recognise you as the same person. That said, I'm surprised that it ranked you so poorly in the first photo.


27/Hot: http://imgur.com/mHpvd5j 36/Nice: http://imgur.com/a2IKkRD 30/Ok: http://imgur.com/Vicvbq3

First is a year and half old, the other two are within the last couple months.

I don't know how much better this thing is than doing: IF detect_face THEN random_guess()

That said, I think I'd agree with its ordering of those photos. shrug

It says I'm either stunning or hot in the 4 photos I tried. Is that real? I don't really get ogled by women or anything like that.


Yeah this site is bogus. So many inconsistent ratings.

Humans are inconsistent as well. We also use attributes beyond looks.

Has anyone tried a picture with large amounts of cash in the background?

I'm "hmmm" and 55, apparently. "Hmmm" I'll agree with, but I'm 32!

Is this a test for my attractiveness or for the AI?

Oh hey, I worked on something similar two years ago: http://www.FaceMyAge.com (note, the age estimator has been taken offline - because, 2 years ago).

A lot of the issues our estimator (just an age estimator) ran into were the standard face recognition problems: occlusion, lighting, and (the obvious) bogus images.

Anyone involved: what dataset was the attractiveness scale built from? The Labeled Faces in the Wild dataset (http://vis-www.cs.umass.edu/lfw/)?

Hi tsumnia,

I am Radu, one of the authors.

After 2 years we face almost the same issues, though we probably cope with them differently. Note that our solutions are fully described in the two papers mentioned on our faces.ethz.ch page. For attractiveness we used data from Blinq.

Our apparent-age estimation solution is the winner of the latest LAP challenge (ICCV 2015).

And why, exactly, am I supposed to care about how attractive an AI thinks I am?

It said I am 36 and I am 37. I am impressed (and Hot(tm))! However, it guessed a coworker was 34/Ummm and she is in her 50s. I am conflicted about telling her the results.

A side question: Is there a service/project that can identify features of a person by feeding it various photos of that person? Examples:

- body type

- piercing

- tattoos

- eyeglasses

- colored hair

- etc.

This fails for Asian faces. I tested it on a picture of Bing Bing (a Chinese actress) that is absolutely stunning. She got a "Nice" rating. This is a toy.

Check out how SenseTime did a similar feature.

They said it themselves - "Our algorithm is trained on the pictures of the BLINQ community that is mainly based in Switzerland. In other parts of the world the perception might be very different."
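That caveat is the standard out-of-distribution problem: a model can only interpolate between the examples it has seen. A deliberately tiny 1-nearest-neighbour sketch (the one-dimensional "feature" values and labels are invented) shows the failure mode:

```python
def nearest_neighbor(train):
    """1-nearest-neighbour 'model': returns the label of the closest
    training point, so it can never answer outside what it has seen."""
    def predict(x):
        _, label = min(train, key=lambda p: abs(p[0] - x))
        return label
    return predict

# Invented training set covering only a narrow slice of feature space,
# standing in for the mostly-Swiss Blinq training photos and ratings.
train = [(0.40, "hot"), (0.50, "hot"), (0.60, "nice")]
model = nearest_neighbor(train)

print(model(0.48))  # hot  -- in distribution, sensible answer
print(model(0.95))  # nice -- out of distribution, snapped to nearest known face
```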

A service that upvotes average to below average white people and downvotes average to above average people of color. Is someone anticipating a decline in the international value of Whiteness?

A lot of the sample photos look like they have had filters applied. One of the things Karpathy found was that convnets are bad at images with filters.

Anyway I only got "connection error".

Did we break it? I can't upload pictures anymore.

Plus: The algorithm thought I was 24 years old. I'm actually 32.

Minus: I got the lowest rating possible. Haha, terribly depressing feedback before a date.

Don't feel too bad -- I ran a photo of Brad Pitt through it and he merely got a "Nice". Granted, I tried a second one and he got "Godlike". I wonder how sensitive the results are to general lighting in the photo.

Experimented more - it appears to be strongly influenced by facial expression. If I have a straight face, I'm hmm or OK. If I scowl, I'm nice. If I grin like an axe-wielding maniac, I'm stunning.

Then again, that bears out in reality. People who smile are perceived as more attractive.

I think ethnicity does play some role too (which maybe works with the lighting hypothesis?).

I put pictures of really attractive Asian guys (specifically men who honestly have a lot of diehard female fans who are interested in them) and at best they got "Nice"

This is actually a very interesting effect - it exposes opinions people are generally not comfortable expressing, for fear of being labeled racist or something.

Sadly, Asian guys are considered much less attractive than our female counterparts in Switzerland.

On a more positive note, now I know that if a Swiss girl likes me, she's probably not superficial. lol

I ran Lichtenstein's M-Maybe through it, and it returned stunning, and 26, which is probably about right, even if it's a copy of a comic panel.

I'm 41, it thinks I am 33! YAY!

It was only off by 1 year on my age, and apparently my own opinion is the same as its opinion, ha ha, which isn't very good, ha ha.

Reminds me of those state fair booths where the guy tries to guess your age. http://blog.syracuse.com/cny/2013/08/your_age_weight_and_bir...

I put in the Mona Lisa and she got the second-lowest rating. I mean, this is next door to a random generator.

It possibly associates the painting texture with poor skin health.

To be honest, this thing seems most attuned to Eurocentric features.

Well that made me feel good. I'm way older than they suggested.

Wow! It missed my age by only 1 year (on the good side :P)!

I think this thing is all over the place.

http://pasteall.org/pic/show.php?id=97312 Off by almost a decade on age.

It's not a lot more precise than randomly guessing.

Check this album out: http://imgur.com/a/1a1tn

Seems racist and sexist towards men.

Please check our research papers linked on the webpage.

Our data consists only of normal (natural-looking) face images in the wild (from IMDB, Wiki, and/or BLINQ user profiles). On such data we get very good apparent-age prediction (better than the human reference) and also very good gender and attractiveness prediction.

The attractiveness is highly subjective and its perception varies from one culture/region to another. We used data from Switzerland.

Our solutions are far from being perfect and the guessed results should not be taken too seriously.

We are considering updating our models to explicitly deal with distortions and non-human face content.

popup says: Sorry, we're currently under heavy load and can't handle your hotness. Please try again later.

I didn't know I was a girl until now!

So this was getting slammed by Reddit, and you decided to post it here as well? These poor developers.

looks fun

I uploaded a photo of my (male) personal trainer. It got it exactly right. Age 26 and "stunning."

It no worky?

Finally, it's good to see we are using that AI research for the advancement of mankind. If anything this will help procreation?
