Hacker News
Today I was unable to take a picture in a photo kiosk (twitter.com/drpragyaagarwal)
30 points by louis-paul on Dec 23, 2019 | hide | past | favorite | 25 comments


People tend to think of this as a horrible, discriminatory decision made by some horrible people, but most likely it is a very cheap machine developed by an intern who had three days to make something work...


That’s much worse. The number of people out there who will do this deliberately is much smaller than the number of people who will do it accidentally, but the harms are the same. We need to avoid carelessly applying technology to areas where carelessness can harm people severely.


Trivially fixable even in that fairly inexcusable case: add an override button.

"by overriding you accept that this photo may ultimately be rejected for passport use."


There is an option to override, it’s just not visible in the picture.

The booth asks the user to “touch an option”, but only one option is visible. The option to retake the photo has an “attempts remaining” counter, and the company logo sits in the middle of the photo preview (the preview is between the two options, which puts the logo at the centre of the screen).


If that's the case, then the decision to assign it to an intern and give them 3 days, and then not to test it with a diverse set of subjects was somebody's fault and failure.

If they were not an intern (or even if they were, frankly), the engineer who built it also had a responsibility to think beyond their own demographic; and if the timeline was too short to develop a properly robust system, to at least point that out to management.


As a side point, there is also a cost question: would anyone pay $100 for a picture from such a machine?

This is sad, but that's how you get fast food too


The better question is: why replace a simple photo booth that has worked flawlessly for decades, for a few quarters, with a machine that requires a massive investment to work properly and does the same thing? It reminds me very much of "smart toasters" and other gadgets that are extremely expensive, arguably don't work nearly as well, and aren't nearly as reliable as the old appliances they are supposed to replace.


Passport photos have been issued for over a century without needing this crap. Yes, it meant more rejections, and it meant inspectors allowing photos that had minor rule violations, but convenience isn’t the paramount goal of everything.


This is why we need regulation and can’t leave this kind of stuff up to the market.

If it costs too much to train a machine on more than just a set of white faces, so that it is not discriminatory, then that machine should not be allowed to exist. The only alternative is that if you happen to be part of the privileged majority, you can get a cheap picture from an automated booth, whereas everyone else has no choice but to pay for a more expensive alternative. This amplifies any differences that are already ingrained in society (and that probably led to the homogeneous training set to begin with).


It doesn’t matter how it’s rationalized, it wouldn’t have shipped with a more observant set of people.

Nobody consciously thinks infrared faucets and hand dryers discriminate, and it doesn't matter how "hard" contrast on darker skin is for a machine: the case where their sensors start failing should never have been considered an edge case at all, and the product should never have shipped. Nobody should be in a position of playing devil's advocate, dismissing minority experiences with technology by explaining why companies are tone-deaf. It just shouldn't have shipped.


No they don't.

This is without a doubt a product of badly designed software.

That badly designed software is making life difficult for people based on assumptions about race and gender. It's a problem which is worth fixing and deserves attention.


Sounds like Hanlon's razor.


I saw a great talk at a conference about this by a woman from the University of Florida. I am sorry I don't remember her name or the conference. But this is a classic symptom of lack of diversity of developers (or test subjects). It's like the sinks that don't activate with dark hands (one of her examples). It's another reason we need to focus on developing underrepresented talent.


Why does a photo kiosk enforce passport photo requirements?


Because it is a photo kiosk for taking passport photos.


That doesn't clarify much. Is this some kind of official, government-sanctioned kiosk, whose approval of a photo carries some sort of final authority, and it doesn't just pass the photo on to the passport office?

Because otherwise, it would work just fine without this enforcement functionality.


Submitting a bad photo with your passport application can really suck. It delays things a lot and potentially even costs you a submission fee.

The point of passport photo services is to mimic (or try to) the official government photo reviewers' criteria. That's the value they provide: learning that you will be rejected quickly rather than slowly.


Merely informing you of a bad photo would provide the same value. No need for enforcement.


Probably easier to send the photo off to mechanical Turk if it’s not a perfect match.


i guess it's trendy to assume the worst, but my optimistic take is that "the machine is broken, use a different machine". maybe because encountering broken machines in public use is pretty much a daily experience for me.

the company that produces the software or the box will either notice the issue and fix it, or not, and be punished by the market.


Well, she does look smiling to me in the picture she posted...


Unlike the title this is for a passport photo.

You can't smile in a passport photo.

If you compare her lips to recommended passport photos, she is closer to smiling.

If you look at the 2nd happy looking woman here, you can see her lips are not actually in a smile as much as the twitter pic - https://www.immigration.govt.nz/new-zealand-visas/apply-for-...

The Twitter pic has a dour expression, but the AI isn't fooled; it looks at the lips.

Humans might be fooled and let the photo through though.

We'd need more evidence. I'm not sure about the "open mouth" part; it might go away if she stopped smiling, so it could be a follow-on assumption.
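The "it looks at lips" claim above is speculation, but a kiosk could plausibly use a crude lip-geometry heuristic on top of a face detector. Here is a toy sketch in Python; the landmark coordinates and the threshold are hypothetical, invented for illustration, and real systems use learned classifiers rather than hand-tuned rules like this.

```python
def looks_like_smile(left_corner, right_corner, top_lip, bottom_lip,
                     ratio_threshold=0.25):
    """Crude smile heuristic from four (x, y) mouth landmarks.

    A wide, shallow mouth with raised corners is treated as a smile.
    Coordinates follow image convention: y grows downward.
    """
    width = right_corner[0] - left_corner[0]
    height = bottom_lip[1] - top_lip[1]
    if width <= 0:
        return False  # degenerate landmarks, refuse to guess

    # corners higher on screen (smaller y) than the lip midline
    corner_y = (left_corner[1] + right_corner[1]) / 2
    center_y = (top_lip[1] + bottom_lip[1]) / 2
    corners_raised = corner_y < center_y

    return corners_raised and (height / width) < ratio_threshold


# hypothetical landmarks: a wide, shallow "smiling" mouth...
print(looks_like_smile((0, 8), (50, 8), (25, 6), (25, 14)))   # True
# ...and a narrower, taller "neutral" mouth
print(looks_like_smile((0, 10), (40, 10), (20, 5), (20, 18)))  # False
```

With landmarks from a real detector, tuning that threshold is exactly where demographic bias can creep in: a ratio calibrated on one population's lip shapes will misfire on faces it never saw, which is consistent with the experience described in the thread.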


We don’t need more evidence. The person in the picture is not smiling and their mouth is not open.

The picture meets every criterion listed on the website you linked. It's totally fine.


>We don’t need more evidence. The person in the picture is not smiling and their mouth is not open.

Well, she looks 100% like she is smiling to me. Her mouth is indeed not open, though, so I'll give you that.

Even if that's her facial structure, and she was consciously not smiling, she does appear to be smiling.


Or their face looks different from faces that you are used to seeing (since all people are different) and the person is in fact not smiling.



