
Based on the write-up, Samsung has lower-quality iris recognition than an undergrad could write in a few hours. I say that having done so.

Most obviously, the system should not tolerate a constant-size pupil, ever. The pupil has micro-dilations around twice per second, and your system is really terrible if it doesn't verify that the diameter is changing.
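As a sketch of the liveness check being described, assuming the camera can deliver a short burst of per-frame pupil-diameter measurements (the threshold is an illustrative guess, not a calibrated value):

```python
def is_live_pupil(diameters_mm, min_variation_mm=0.05):
    """Accept the sample only if the pupil diameter fluctuates across frames.

    diameters_mm: per-frame pupil diameters from a 1-2 second capture.
    A printed photo yields a constant diameter and is rejected; a live
    eye exhibits hippus, small oscillations at roughly 2 Hz.
    """
    return max(diameters_mm) - min(diameters_mm) >= min_variation_mm

# A printout: constant diameter, rejected.
print(is_live_pupil([4.00] * 10))                      # False

# A live eye: small fluctuations, accepted.
print(is_live_pupil([4.00, 4.03, 3.97, 4.05, 3.98]))   # True
```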

Also, multi-spectral capture is a pretty good test, though I don't know enough about the capabilities of the S8 camera to know if that's feasible (it shouldn't be that hard). Capturing the patterns of the iris at 500, 800, and 1200 nm results in three templates that are quite different from one another.
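A sketch of how multi-spectral matching raises the bar: require the captured iris code to match the enrolled one at every wavelength. The codes and wavelengths here are illustrative; the 0.32 Hamming-distance threshold is the commonly cited value for Daugman-style iris codes.

```python
def hamming_fraction(a: bytes, b: bytes) -> float:
    """Fraction of differing bits between two equal-length iris codes."""
    diff = sum(bin(x ^ y).count("1") for x, y in zip(a, b))
    return diff / (len(a) * 8)

def multispectral_match(enrolled, captured, threshold=0.32):
    """Match only if every band (e.g. 500, 800, 1200 nm) agrees.

    enrolled/captured map wavelength -> iris-code bytes. A flat printout
    can't reproduce three spectrally distinct patterns at once, so a
    spoof that fools one band still fails overall.
    """
    return all(hamming_fraction(enrolled[wl], captured[wl]) <= threshold
               for wl in enrolled)

enrolled = {500: b"\xaa" * 4, 800: b"\x55" * 4, 1200: b"\xf0" * 4}
print(multispectral_match(enrolled, dict(enrolled)))   # True

spoof = {**enrolled, 1200: b"\x0f" * 4}                # wrong in one band
print(multispectral_match(enrolled, spoof))            # False
```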

CCC was able to do this for about the cost of an S8. I would say this is one of the rare situations where defeating the attack would have been even cheaper. It's that simple a programming exercise.

Samsung always seems to me as if they race to match any iPhone feature–but never more than skin-deep.

So when the iPhone gets a fingerprint sensor that saves only a hash of the actual data in a special enclave of a custom chip, Samsung responds with an iris scanner that saves an image of the iris as a world-readable jpeg in your home directory.

Thus, their marketing material can claim feature-parity (or even exceed Apple). But it never seems like they actually care.

It's not like Apple doesn't run into similar problems (not sure if the fingerprint sensor has been defeated–it's a bad idea for 5th amendment reasons in any case). But at least they do the minimum in trying.

>It's not like Apple doesn't run into similar problems (not sure if the fingerprint sensor has been defeated–it's a bad idea for 5th amendment reasons in any case). But at least they do the minimum in trying.

In 2013 a CCC member broke TouchID access within a few hours of the iPhone's release. All that was needed was a photograph of the fingerprint on a glass surface. https://www.ccc.de/en/updates/2013/ccc-breaks-apple-touchid

The same method worked on the iPhone 6, as Apple hadn't changed a thing. Biometry is fundamentally broken.

> All that was needed was a photograph of the fingerprint on a glass surface.

And wood glue! Looks like that method proved unreliable, so they expanded it:

"To create the mold, the mask is then used to expose the fingerprint structure on photo-sensitive PCB material. The PCB material is then developed, etched and cleaned. After this process, the mold is ready. A thin coat of graphite spray is applied to ensure an improved capacitive response. This also makes it easier to remove the fake fingerprint. Finally a thin film of white wood glue is smeared into the mold. After the glue cures the new fake fingerprint is ready for use."

Yes. They explained that all the innovative magic of Apple's fingerprint sensor was better image resolution, so all they had to do was improve the resolution on their end too. I imagine the body changes everywhere over time, so this resolution game has a hard limit. A fingerprint is the worst choice of biometric data, as people leave them everywhere...

So what's better, then? Tongueprint? I'm down to lick my phone to turn it on.

Tongue prints work fine on typical phone fingerprint readers, so it is an option.

I tried my nose tip and tongue; neither worked (Huawei). Toes work (obviously).

People have reported that pets' paws work.

And you know this because you tried it? LOL :)

Actually, my father-in-law did, which was pretty funny because he is typically quite reserved.

Iris scan is probably still our best bet, we just need better hardware and software.

How is the iris scanner on Microsoft phones in comparison?

People don't leave perfect moldable prints everywhere.

AFAIK CCC never had a proof of concept of real-world usage of this. They needed access to the original finger.

That isn't to say it's not possible but it is a pretty major asterisk.

Didn't some group pull a politician's finger print off his dinner glass and then include it as an insert with all the issues of some magazine?

Can't find the reference right now, but somebody's gotta remember this, it was when the UK was considering using biometric data as IDs.

That was CCC in 2008. To underscore the inherent problems of biometric authentication, they pulled Wolfgang Schäuble's fingerprints off a dinner glass and published them in their magazine. That issue also included a ready-to-use replica. Schäuble was Germany's interior minister at the time and a strong proponent of biometric data in passports and increased surveillance.

https://www.ccc.de/updates/2008/schaubles-finger (in German)

They did it later with a fingerprint left on the device screen itself, if I remember correctly.

If I remember rightly, that was a print applied to the scanner glass; it was much less successful and required absolutely perfect print transfer.

Which, IMO, aren't much different conditions, TBH (i.e. not representative of the real world).

Read the original CCC article or watch the video! They got the print off the phone itself. The only restriction is needing a high-resolution scanner or camera to capture it.

They also did it with a simple press photo of a politician.

There's no need to get a perfect copy.

> Biometry is fundamentally broken.

"fingerprints are usernames, not passwords" is the standard advice I've read.

> In 2013 a CCC member broke TouchID access within a few hours of the iPhone's release.

It's actually the same guy as with the S8, aka Starbug.


They also managed to get the fingerprint of a politician via a high resolution picture taken at a speech. (Also a phone PIN via eye reflection captured by the front camera.)

> It's actually the same guy as with the S8, aka Starbug.

Yes, hopefully he will give another talk at 34C3. Very entertaining guy too!

No need to wait for 34c3, he's giving a 30 minute talk at Gulaschprogrammiernacht in Karlsruhe on Thursday: https://entropia.de/GPN17:hacking_galaxy_S8_iris_recognition

As I have come to understand it, biometrics is a good identifier (telling who you are) but a lousy authenticator (telling that you are allowed).

The use of biometrics on mobile devices somewhat mixes the two, with the assumption that if the user was authenticated within a certain time frame (via a PIN or some other knowledge-based check), a simple identity check is enough to extend that authentication.
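That time-bounded policy can be modeled in a few lines (a toy sketch, not any vendor's documented behavior; the 48-hour default is an assumption):

```python
class BiometricGate:
    """Biometric unlock only extends a recent knowledge-based login."""

    def __init__(self, max_age_s=48 * 3600):
        self.max_age_s = max_age_s
        self.last_pin_auth = None          # time of last PIN entry

    def pin_auth(self, now):
        """A successful PIN entry (re)starts the window."""
        self.last_pin_auth = now

    def biometric_allowed(self, now):
        """Iris/fingerprint may identify the user only inside the window."""
        return (self.last_pin_auth is not None
                and now - self.last_pin_auth <= self.max_age_s)

gate = BiometricGate(max_age_s=10)
print(gate.biometric_allowed(now=0))   # False: no PIN entered yet
gate.pin_auth(now=0)
print(gate.biometric_allowed(now=5))   # True: within the window
print(gate.biometric_allowed(now=11))  # False: window expired, PIN required
```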

> authenticator

Authorization is the counterpart to Authentication: authentication proves who you are (with passwords/tokens/biometrics aka something you know/have/are), authorization controls what you can do (with permissions/ACLs/roles/etc.)

To put it another way, the bouncer at a club checks your photo ID to see that you match it (authentication via something you are), then uses it to see if you can enter (authorization by checking your birthdate against a cut-off/name against a guestlist).
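The bouncer analogy maps cleanly onto code; the names and checks below are purely illustrative:

```python
GUEST_LIST = {"alice", "bob"}   # hypothetical club policy
MIN_AGE = 21

def authenticate(id_face, real_face):
    """Authentication: the person matches the photo ID (something you are)."""
    return id_face == real_face

def authorize(name, age):
    """Authorization: given who you are, may you enter?"""
    return name in GUEST_LIST and age >= MIN_AGE

def admit(photo_id, person):
    """The bouncer runs both checks, in order."""
    if not authenticate(photo_id["face"], person["face"]):
        return False                      # borrowed or fake ID
    return authorize(photo_id["name"], photo_id["age"])

alice_id = {"name": "alice", "face": "face-a", "age": 25}
print(admit(alice_id, {"face": "face-a"}))   # True: authenticated and authorized
print(admit(alice_id, {"face": "face-b"}))   # False: fails authentication
```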

> Samsung always seems to me as if they race to match any iPhone feature–but never more than skin-deep

Firstly, any biometric technique can be beaten. Against a known, committed foe, it is almost impossible to defend with surety. And for that matter, who couldn't obtain the PIN code of any other user given a short amount of time and focused attention? The notion that "if someone takes an IR high resolution, close photo of your iris they can defeat your security" is asinine, given that the same people could obtain your PIN in a million and one ways.

These mechanisms are to induce users to use some security, and the primary defense is against lost or stolen phones, making it convenient enough that it isn't disabled.

Secondly, how did this somehow turn into yet another Royal Apple spiel? Aside from the easy beatability of the Apple fingerprint sensor, why wouldn't you compare the fingerprint sensor on a Samsung?

> ...if someone takes an IR high resolution, close photo of your iris they can defeat your security...

There are commercial security products that regularly perform "IR high resolution" iris scans from several meters away and require no cooperation from the target. Stanley CSS sold one that sat on top of a doorway over five years ago, their product literature says that you need to look at it - but having demoed it myself, I can say that is not true.

Have any real security researchers tried to analyze and break this one?

Until CCC approves your CSS I will consider it insecure.

You've misunderstood my point, which is that distant iris data collection from non-cooperative subjects is so easy now that vendors are selling solutions pretty cheaply. So using it for security, without additional checks in place, is like walking around with your PIN written on your forehead.

Samsung always half-asses their new features. They are so desperate to be first to market or to make a splash they rarely take the time to make it work well. It's my biggest frustration with Samsung, they have the resources and market position to take the long view and give their teams more time to polish, but never do.

And this is what makes it so difficult to choose a high-end phone. I have been shouting about how my Pixel is a far better device than any other phone I've ever used, including the S8.

On paper it looks awful, but everything this phone does works 100% of the time quickly and without stuttering or failing.

I'm on my second Samsung. First was a very annoying pre-capacitive touch that never worked well. This current one is an old Galaxy S5. It is better, but it likes to restart periodically (fortunately Android handles that well). Battery life is crap and it gets sluggish easily.

Before I had a Motorola. It was much better. I will not buy another Samsung.

I like my Galaxy S6. Bike shed!

I'd say the Pixel is the best Android phone for sure.

But that should be expected, as it's from Google

I wouldn't say expected, they have made some poor performing phones in the past (for various definitions of "made").

But my point was more that on paper it doesn't look like much. It doesn't have waterproofing, it's not "best in class" in anything except the camera, it runs software which has less "on paper" features than other brands, it doesn't have an SD card or removable battery, etc...

But when you actually go to use it, it's a night and day difference between it and other devices.

> CCC was able to do this for about the cost of an S8.

I think you misunderstood this. The cost was buying the S8.

You only need a laser printer, a decent camera and a contact lens.

That is what he meant: doing it better is so easy that the S8 costs more than the hours of programming labor required.

What programming labor? They just printed a photo and put a lens on it.

I think that tbihl and bebna are trying to say that it would cost Samsung less to defeat the attack by programming one of the listed countermeasures than it cost the CCC folks to buy one of these phones. (I don't agree.)

That is exactly what I meant. Like you're probably thinking, my estimate IS an amateurish one, taking no account of business admin/healthcare/etc. But I remember implementing this (as a small part of a project) at a time when I couldn't be bothered to put more than a few hours into any project. At any rate, I can't explain their failure to implement these checks, because they're SO trivial.

You're agreeing.

Yeah, well, in that sense, the cost of the S8 was also higher than the hours the researchers spent scrubbing toilets to develop this exploit.

"You only need a laser printer, a decent camera and a contact lens."

No, considering that the picture doesn't have to be that good, you only need a picture grabbed from somewhere on the Internet (Facebook, Instagram, or wherever), the means to get it printed cheaply (most likely a public printing service), and the contact lens. Oh, and the contact lens doesn't have to be new/hygienic. The total cost boils down to pennies.

Edit: 'micro-oscillation' was a term I lazily invented to describe a phenomenon with which I am only passingly familiar. It is actually called 'hippus' or 'pupillary athetosis'.

Have you read George Orwell's "Politics and the English Language"? If not you should give it a read, it's short. This comment reminded me of his criticism related to Latin usage.

I have now! Hopefully you're accusing me only of saying 'micro-oscillations' when I ought to have said "shrinks and swells slightly at regular intervals."

This is remarkable, even though it's obvious to any of their phone users (myself included) that Samsung's software development is shoddy at best, despite its excellent hardware. Which is a huge pity, but may well have something to do with the culture of corporate deference and rigid hierarchy. http://www.asianews.network/content/feature-samsung-debacle-...

Let's not pretend that people here in extremely permissive environments don't build hugely buggy things that wouldn't survive a similarly superficial test.

Why would that only affect their software, but not their hardware?

While I agree that Samsung is at its core a hardware company and software engineers are still treated like second-class citizens, you can't just expect them to compete with Google or Microsoft overnight, IMO. I'm disinclined to believe that their military-like corporate hierarchy is to blame.

I don't know: perhaps the company's structures work well for its core expertise, hardware, but software development, which perhaps came later and sits lower down the hierarchy, doesn't work so well in that context?

Well, "a few hours" is quite the understatement if you're talking about starting from scratch.

Also, the cellphone image resolution is far too low to recognize dilations. Looking at the video, I'm surprised that it works as well as it does, to be honest.

You could also add eye-tracking to have the user follow a pattern and add complexity to faking it.
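A challenge-response sketch of that idea: show a moving dot and verify the estimated gaze trail actually follows it. Gaze estimation itself is assumed, and the tolerance is a placeholder:

```python
import math

def gaze_follows_target(target_path, gaze_path, max_mean_err=0.15):
    """Liveness via challenge-response: the user is asked to follow a
    moving on-screen dot. A static printed iris produces no matching
    gaze trail. Points are normalized (x, y) screen coordinates."""
    errors = [math.dist(t, g) for t, g in zip(target_path, gaze_path)]
    return sum(errors) / len(errors) <= max_mean_err

path = [(0.0, 0.0), (0.5, 0.5), (1.0, 1.0)]
print(gaze_follows_target(path, path))              # True: gaze tracks the dot
print(gaze_follows_target(path, [(0.0, 0.0)] * 3))  # False: gaze never moves
```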

> It's that simple a programming exercise.

Then it's also fixable in software, no hardware update needed, right?

At least to make it better even if it isn't good yet.

Do you have a source about these micro-dilations? I googled a little bit but couldn't find enlightening results in my superficial search.

The closest I could find was this:


It seems that there are reasonably large changes due to thinking about something, but not due to just looking, unless your camera can detect 0.01 mm changes in diameter; maybe tbihl's setup could, but that seems unreasonable for a smartphone camera.

I would guess tbihl has never tried to get iris recognition working on a cost-constrained consumer device at a huge scale. They might not think it is so trivial then.

It's true that I've only tried iris recognition in dedicated setups. Sadly, I can't speak to whether this test is within the abilities of a standard smartphone camera without any macro lens.

Saccades are not what I was referring to. 'Micro-dilations' was my attempt to describe the phenomenon. After some searching, I believe it's called hippus (maybe specifically 'physiological' hippus, to distinguish it from the pathological kind) or 'pupillary athetosis'.

A similar concept that I remember from university is saccades (minimal, involuntary eye movements).

Yes, but those are well-known and visible to the naked eye. I've never heard of these 'micro-dilations', and Google didn't show anything.

'Hippus', specifically 'physiological hippus', or 'pupillary athetosis', seem to be the correct terms for what I was describing as 'micro-dilations'. Saccades, as I understand them, are rotational shifts of focus by 1-2 degrees, rather than repeated contraction/relaxation of the pupil.

I think you're thinking of microsaccades.

Saccades in general are just rapid, ballistic eye movements: they don't have to be small and they're often voluntary. You often make saccades without explicitly thinking about it, but you can countermand these and hold your eyes still when you want to.

I think if they tracked an event that occurs at only 2 Hz, unlocking would take too long. People would complain that it's slow.

> Most obviously, the system should not tolerate a constant-size pupil, ever. The pupil has micro-dilations around twice per second, and your system is really terrible if it doesn't verify that the diameter is changing.

What if one of the subject's eyes is a glass eye? What if they're wearing colored contact lenses? Wouldn't both of those situations complicate that?

If someone's wearing a colored contact lens I would expect iris recognition to not work. That's like trying to use a fingerprint reader with gloves on.

Glass eye I'm not sure about either. That seems along the lines of using a fingerprint reader on a prosthetic hand. Sure, you could mold a fingerprint onto one and scan it, but with a sophisticated fingerprint sensor it's going to look like a fake finger. Do we deliberately make everyone's fingerprint scanner weak to allow someone without fingers to use it?

Maybe the answer is to introduce a "weak mode" option where most users could have the scanner verify "yes, this is a real eyeball," and if someone with a glass eye still wanted to use iris scanning in a way that can be copied by a photo, they have the choice to disable the security measures.

If someone with a glass eye is using a smartphone, then one would suspect they are seeing with their other, real eye. That eye can then be scanned as normal.

You'd think so, but iPhones are incredibly useful to blind people.


The first time I saw a blind person using an iPhone on the subway (years ago, I think it was an iPhone 4), I was completely blown away at how much he could do with it.

Personally I think apple should allow you to turn the backlight for the screen off entirely for their use. Perfect over the shoulder privacy and better battery life to boot!

It's called "Screen Curtain" and it's been around for a long time.

I suppose Samsung tuned the system to avoid false negatives, which might be difficult across all the lighting conditions that come up in real-world use. It's a device for the mass market and the average user, after all.

Might be that they did a bad job with the iris recognition, but why not give them the benefit of the doubt and consider that they were aware of the trade-offs involved?

Knowingly and willingly giving users a false sense of security? How is that not worse in every way?

If they couldn't manage to get false negatives down to a sensible level without compromising security in such a blatant way, there are two courses of action: live with it, or don't release it.

Or release it and make money with your cool feature that competitors don't have.

Spot on.

This is a marketing gimmick, not a security feature.

Just as with fingerprint readers[1] the target audience is people who otherwise wouldn't lock their phone at all.

[1] https://arstechnica.com/apple/2013/09/chaos-computer-club-ha...

How is this any different than the swipe to unlock pattern? If you go to a starbucks and sit down for 20 minutes in a higher seat, you can get the passwords to probably 2-5 phones depending on everyone's security and how crowded it is (I inadvertently memorized 3+ coworkers' unlock patterns because they're so easy to see).

If someone has physical access to the user they can probably get into most people's phones (that's just saying people aren't careful enough though).

That could be the case, after all it's not a Diebold ATM.

Silicon imagers don't work very well past 1 µm or so. To go out to 1.2 µm, you're using another semiconductor and that gets expensive rather quickly.

You could also flash a bright LED and measure the pupil response. It's possible that there's something unique in the response too.
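Sketching that check: compare the baseline pupil diameter to what the camera sees right after the flash; a live pupil constricts, while a printout or glass eye stays flat. The threshold and frame layout are illustrative.

```python
def light_reflex_live(diameters_mm, flash_frame, min_constriction_mm=0.3):
    """Liveness via the pupillary light reflex.

    diameters_mm: per-frame pupil diameters; the LED fires at index
    flash_frame. A live pupil constricts noticeably in the post-flash
    frames; a static spoof shows no reaction.
    """
    baseline = sum(diameters_mm[:flash_frame]) / flash_frame
    post_flash_min = min(diameters_mm[flash_frame:])
    return baseline - post_flash_min >= min_constriction_mm

live = [4.0, 4.0, 4.0, 3.9, 3.4, 3.2, 3.3]     # constricts after the flash
fake = [4.0] * 7                               # no reaction at all
print(light_reflex_live(live, flash_frame=3))  # True
print(light_reflex_live(fake, flash_frame=3))  # False
```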

At least a decent software update should be pretty easy to push out to fix the issue.
