I figure either (A) they're trying to carve out a niche of hardcore consumers who do care, or (B) they're trying to play a long game, hoping that broad sentiment shifts towards valuing electronic privacy. If it's the former case, I think they're doing fine; these kinds of whitepapers will reach most of those who care, and periodic news articles ("Terrorist iPhone unable to be unlocked!") will reach the rest.
If it's the latter, I think it's a pretty big risk given the scale of their re-education task (the pool of users willing to sacrifice personal privacy for other benefits, i.e. Google and Facebook's bread and butter) and the potential pushback they'll receive/have been receiving from governmental sources.
What does HN think? Is this a viable business differentiator for them, long term? Or will they have to shift to the 'dark side' of personalized data and services to remain competitive in the future?
Honestly, I can't possibly trust Apple to always do the right thing. A cornerstone of digital privacy and security is being in control of your systems and data, which Apple takes away from its customers. When I buy and use Apple products, I have to have faith that Apple will do nothing wrong with my data and won't send me a malicious update that compromises my system. It is all closed source, and you only get occasional whitepapers explaining the technology in abstract terms and stating their intentions. There is no independent person or group that can vouch for Apple's integrity in building the systems the way they claim to.
That said, given the current status quo, I'd rather warily trust Apple with some of my personal data - GPS location, browsing history, Notes, etc. - than Google or Facebook, given that the latter have a business motive to sell me out.
So when my friends or relatives ask me what phone/computer I think they should buy, I give them my opinionated recommendation: iPhone + GNU/Linux for the tech savvy, and iPhone + Mac for the non-tech-savvy. I also give them a mini five-minute lecture about:
* Not allowing location access to all the apps they download - why would a calculator app need your location "even while not using the app"?
* Paying for their email service - I use Fastmail.
* Enabling tracking prevention in their browser to block tracking cookies.
* Using a separate browser for Facebook, Amazon and others, while using Firefox only for personal/sensitive browsing needs.
My two cents: half closed is pretty much the same thing as fully closed from a trust perspective, and apple’s incentives are much less in conflict with user privacy than google’s.
I highly recommend an ad-blocking VPN so that the rest of the apps hitting Google's ad network are blocked as well.
A privacy-oriented VPN provider's explicit goal is to give you some privacy, and you pay them for it.
Sure, a VPN can turn on you, but then they bite the hand that feeds them. Also, it would contradict what they explicitly say, completely unlike the situation with Google.
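For what it's worth, the DNS-level filtering these ad-blocking VPNs perform is conceptually simple. A minimal Python sketch, with an invented blocklist (the domains and function names here are illustrative, not taken from any actual product):

```python
# Sketch of DNS-level ad blocking: resolve queries normally unless the
# name (or any parent domain) appears on a blocklist.
BLOCKLIST = {"doubleclick.net", "googlesyndication.com"}  # illustrative entries

def is_blocked(hostname):
    """Check the hostname and every parent domain against the blocklist."""
    parts = hostname.lower().rstrip(".").split(".")
    return any(".".join(parts[i:]) in BLOCKLIST for i in range(len(parts)))

def resolve(hostname):
    """Blocked names get a null answer instead of a real DNS record."""
    return "0.0.0.0" if is_blocked(hostname) else "pass-through"
```

Real blockers (Pi-hole, AdGuard, and the like) work on the same principle, just with much larger curated lists.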
A Debian developer might also try to push a backdoored package to your machine. Does that make trusting Debian the same as trusting Apple? It does not: Debian's structure is transparent, and the developers carry individual responsibility not just towards the Debian project but towards everybody who can see their work, i.e. the whole world.
The theoretical possibility that both could be equally bad is irrelevant when it ignores the other significant differences.
I use Adblock VPN on iOS - it also has a DNS proxy.
They have a build on the web; I compiled my own off of their source, though, so can't vouch for that.
Is there any other option?
It can leave a connection open, which prevents many cellular radios from dropping into a lower power state.
Really depends on how optimized the cell modem is and how often keepalive packets are sent by the VPN. Honestly, I haven't seen much of a battery hit with IPsec VPN on the iPhone SE. However, my older Android phones used to get shredded on OpenVPN.
EDIT: If anyone's asking, I use the algo.sh script to deploy ipsec vpn on digitalocean (has option to enable adblocking too) https://github.com/trailofbits/algo.
It's designed not to use persistent connections; keepalives may have to be enabled for NAT traversal, though (to allow incoming packets).
How is Apple entirely closed?
Apple even stripped most iOS-related code from their macOS open source drops. E.g. they only recently started to release XNU source with the code for iOS.
But as far as Android....
You can use Firefox as your default browser on Android but are limited to a bastardized version on iOS. You can run an OpenStreetMap app as your default map on Android, but you have no choice on iOS. You can build your own apps and run them indefinitely on Android, but you have to pay a $99 yearly fee (on top of the Mac tax) or rebuild every 7 days on iOS.
I don't recommend iOS to anybody. People who think it is more secure don't understand defense in depth, and people who think it is more private don't understand that Pixel and Android One builds actually collect less information by default (before opting in on any of the dialogs).
If that's not it, report it. Every app I use works with microG, and any minor issues I had earlier were being resolved pretty quickly.
500 million is the number of devices which potentially had access to an app store containing apps that had malware. It's not the number actually infected. I mean, come on.
From the article: "XcodeGhost potentially affects more than 500 million iOS users, primarily because messaging app WeChat is very popular in China and the Asia-Pacific region." After that article was published, Angry Birds 2 was also discovered to be infected.
Did you notice how I compared to the Google and Amazon app stores? Those are the devices HN readers would buy (those Chinese-app-store phones are not for sale in the US), and they have vastly more users than the iTunes App Store, yet their total infected devices don't come anywhere close to the toxic hellstew that is the App Store.
> Compare to Google and Amazon, which run static and dynamic analysis of apps uploaded to their stores and allow third party security research on their stores, enabling both earlier detection of malware and faster takedown of all apps that share the same malware.
You're assuming that Apple doesn't do this?
https://www.google.com/amp/s/www.macrumors.com/2015/09/20/xc... shows 500 million infected from 50 apps.
> You're assuming that Apple doesn't do this?
No need to assume. https://researchcenter.paloaltonetworks.com/2015/09/more-det... not only shows that there were thousands more apps affected by XcodeGhost than originally reported (and thus more infected users than the approximately 500 million estimated in the earlier link from the original 50 apps), but also that Apple was still taking down affected apps days later, waiting for third parties to report them. This despite the fact that XcodeGhost is a single malware that can be detected with a binary grep. That Apple didn't have the infrastructure to deal with even that demonstrates how woefully inadequate their app management infrastructure is for dealing with malware.
Where does the link say this? As far as I know Apple has all the information needed to make this decision themselves. As you said, a binary grep, coupled with many of Apple's static/dynamic analysis tools should be enough to find this issue.
"Starting September 18, Apple began to remove some iOS apps infected by XcodeGhost from its App Store.... As of this writing, on Monday, September 21, we notice that there are still some previously known infected iOS apps available in App Store."
> As you said, a binary grep, coupled with many of Apple's static/dynamic analysis tools should be enough to find this issue.
As I said, it would be that simple if Apple had set up the basic infrastructure for it. Since it had not, XcodeGhost remained on the App Store long after it was initially discovered, allowing researchers to find thousands more affected apps. Compare to Google's Play Store, which not only performs static analysis but also crash analysis, battery usage analysis, and dynamic analysis by running the apps in cloud VMs (something Amazon did at launch).
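For context, the "binary grep" being discussed is about as simple as malware scanning gets. A rough Python sketch, assuming a known byte-string indicator (the signature below is one widely reported XcodeGhost command-and-control domain, used here purely as an illustration, not as a complete IOC list):

```python
# Signature-based scan over a tree of app binaries: read each file as
# bytes and check it for known malware indicator strings.
import os

SIGNATURES = [b"init.icloud-analysis.com"]  # illustrative indicator

def scan_file(path):
    """Return the list of signatures found in a single binary."""
    with open(path, "rb") as f:
        data = f.read()
    return [sig for sig in SIGNATURES if sig in data]

def scan_tree(root):
    """Walk a directory of binaries and map each infected path to its hits."""
    hits = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            found = scan_file(path)
            if found:
                hits[path] = found
    return hits
```

The point being made in the thread is that once an indicator like this is public, flagging every affected app in a store is mechanical work.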
To see proprietary and incredibly locked-down devices such as iPhones advocated for so strongly seems so weird. As if openness and security are at odds or something.
> … or rebuild every 7 days on iOS.
By tossing in a flippant remark, you undermine the legitimacy of the other arguments. I don’t think a reasonable person would conclude that you actually believe the above statement to be true. This type of rhetoric, in which reality is knowingly distorted to make a point, may be less effective than you consider it to be.
For a user who doesn't have the foggiest idea about technology, to whom the whole thing is "magic," why isn't this rational? Do you closely examine the safety of chemicals in all of your cleaning products, household items, clothes, your car, and the devices in your home? For most people, probably no more than in a cursory sense, as it's not normally one's area of expertise.
We have to "outsource" many aspects of daily life to experts.
Paying for their email service - I use Fastmail.
You've effectively outsourced your email security to Fastmail. How is that any different?
"Do you closely examine the safety of chemicals.."
No, but they need to list the ingredients, get it certified by authorities, and labs can freely test for the ingredients without the company's input. Software is just not the same as chemicals.
True, trusting Fastmail is no different from trusting Apple. I didn't mean to say that Fastmail was magically more secure, just that they have no incentive to sell me out. I could have used any company for this; I'm just trying not to put all my eggs in one basket. It is certainly a better choice than Google, which creepily builds a profile of me to make money.
Everyday users experience massive troubles with security issues, and incur painful losses, at the hands of criminals and black hats! I have no awareness of a major corporation “going rogue” and deliberately harming their users, ever.
The incentive structure is the opposite: companies generally try to help their customers.
That’s not to say they don’t fuck up, particularly when the incentive structure gets “skewed.”
To see that there is actually an inverse correlation between personal control of software and security, just observe the cryptocurrency space.
The losses suffered by users from bugs, security failures, and hacking are comically large.
English is not my first language, and I was ambiguous in what I wrote. By "Apple could go rogue", I meant that Apple could change their policies on user privacy and start making money off user data. True, that would be bad PR and a reversal of their current stance, but what if there is a leadership change and the new leaders think it is a goldmine waiting to be opened?
After all, we have seen Apple do the exact things they said they wouldn't: big iPhones, the iPad mini, giving up user privacy to the government in China...
"The incentive structure is the opposite: companies generally try to help their customers."
I think that's a bit too abstract. Companies try to help customers as long as it is in their business interest.
What do I wish would exist?
A hardware company in the class of Apple that sticks to hardware, and a company like Red Hat that builds a mobile Linux OS with a GNU app store, where users can buy apps directly from the developers and install them themselves. That way no one is in complete control. Phones are powerful enough these days to start using the features we have developed for laptops.
I see this as an opportunity that Mozilla has, and I'd submit an "idea" to them but I don't know how. Google got big by making ads suck less - they were the first to deliver ads that actually worked, and were personalized to boot, without being intrusive.
I wonder if there isn't a serious market for flipping the paradigm on its head. Marketers do need to deliver relevant messages to relevant persons... but I don't think they need to collect lots of personal data about you. What if someone trustworthy flipped the paradigm? Say you tell Firefox your interests and, besides being a browser, it becomes a sort of "local ad exchange" too - when you visit the New Yorker, it says "I can display here an ad about cars, or one about sleep pills, or one about cruises, etc.". Instead of profiling the user, send extensive ad profiles to the browser, and let the browser pick locally based on the user's profile and preferences.

It would be wasteful in that you potentially send a lot of ad metadata to a single browser... but, theoretically, all that metadata can be cached. And, I dunno, maybe it's not really that much metadata after all? It's a rough idea, but I feel it's one worth exploring that could provide genuine benefits both for (non-shady) marketers and for end users.
To add to that, "private mode" would actually do what people expect it to do - keep them anonymous on the web, and not leak sensitive/ embarrassing information about themselves.
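To make the local-ad-exchange idea concrete, here's a toy Python sketch of browser-side ad selection. Everything here - the profile format, the scoring, the ad metadata - is invented for illustration; it is not a proposed Mozilla API:

```python
# Hypothetical browser-side ad selection: the publisher ships candidate
# ad metadata with the page, and the browser scores it against an
# interest profile that never leaves the user's machine.

USER_PROFILE = {"cars": 0.9, "cruises": 0.4, "sleep_aids": 0.1}

def pick_ad(ad_slots, profile):
    """Score each candidate ad against the local profile; return the best."""
    def score(ad):
        return sum(profile.get(topic, 0.0) for topic in ad["topics"])
    return max(ad_slots, key=score)

# Metadata a publisher might send along with an article page
candidates = [
    {"id": "ad-cars-01", "topics": ["cars"]},
    {"id": "ad-pills-07", "topics": ["sleep_aids"]},
    {"id": "ad-cruise-03", "topics": ["cruises", "travel"]},
]
```

The marketer still reaches the right audience, but the only thing that ever crosses the wire is which ad was ultimately shown, not the profile that chose it.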
That is, design all hardware and software in-house, and use formal methods (theorem proving, program analysis and model checking) to verify everything. Furthermore, adopt a Qubes-like architecture where external untrusted applications can be run in an isolated way.
Good idea for the separate browsers too, I never considered that. I would just throw a vpn you trust (or roll your own) in that list, along with uBlock Origin and uMatrix.
They own the universe of mobile users who actually spend money on mobile.
I think they’ve established that they can virtually print money without the intrusive and risky data gathering that their competitors engage in. Fundamentally, they operate an honest business — you give them cash, they give you stuff.
They don't need to do the reeducation. Ask yourself, will there be more security breaches / issues in the future or less? Will there be more issues with governments and companies taking advantage of private users data or less? Will there be more demands for your data in the future (health, context, emotional signals) or less? I think the answers there are pretty obvious, and owning the "we care about our users, their privacy and security" frame is going to be an increasingly powerful one.
A recent example of this is Macbook Pro sales rising despite the outcry in nerd circles. Another example was when FCP X dropped and there were countless hot takes about Apple dropping their professional customers and all of them switching to Adobe and Avid.
And just for fun...
So I see no reason not to
The devices that my teams support on Android need 3-5 agents or other tools to meet compliance requirements.
As the richest organization in the world, Apple is, I would argue, uniquely positioned to act and further its own goals and agenda - especially long term. If they felt that privacy was a cause to fight for... who could really imagine what could be done with those deep pockets? However, you will never get there if you employ a purely economic lens. Economics is a mostly value-free toolset you need to fill with assumptions to get any results.
For what it’s worth I see a stronger path in shaping/making the future than speculating about the infinite possible alternatives that are out there. So I would rephrase the question to something like:
Is security/privacy a worthwhile goal for Apple to pursue – not only economically but in general?
This question is imho a lot more tractable and you can get somewhere with it. For example, the current market climate and possible re-education cost you mention are really important aspects to consider – but they are not the only ones... people are not just pawns but capable of reasoning => we are interested in what SHOULD happen.
I would say on the surface. The latest bug, which allowed you to log in as root with an empty password, could have been avoided with more testing (I would assume).
* The Snowden revelations and the many hacks in the news have impacted the average Joe.
* Many security experts expect more downfalls in the near future. For instance, cyberwarfare has only just begun.
* For example, in Germany, people are much more conscious about privacy.
* And think now about people who want to secure their cryptocurrency investments.
All in all, I think Apple is betting properly on security, and it will bring an economic reward.
Of course, it's different when your life (or the life of your loved ones) is on the line. But even though we won't quite be at the same point, the stakes are increasing for software. I'd imagine in 20 years we should be beyond hygiene.
I’m struggling a bit to think what this convenience and features might be. What sort of things were you thinking of?
- Amazon's Alexa: I am willing to put a microphone and/or camera into one or more areas of my home so that I can do things without pulling my phone out of my pocket.
- Google Gmail: I am willing to give you full access to the text of my emails so that I don't have to pay for an email service.
- Facebook: I am willing to give you a massive amount of information over what I do on a day-to-day basis in order to facilitate my social life.
Things along those lines.
How much money is Apple investing?
But again I'm probably a hardcore consumer who cares.
I really hope that there will be a change in mentality. It's crazy what Google keeps on you if you enable Google Now.
Their servers still operate in the US, and they don't have a stellar history of protecting their consumers' privacy (for example, not so long ago people realized that their iPhones were sending location data to Apple).
Apple's stance on privacy seems more motivated by PR than a genuine core value of the company.
This is why I switched away from Windows and Android. Open source phones don’t really exist, and out of the box usable desktop Linux laptops are rare, so most people in my boat land on Apple products.
Do you have a source, please?
As I remember it people realized that a lot of data was going to Apple's server (I will try to find the original articles again)
That wasn't to any Apple server, it was a local cache that apps couldn't get unless they used a root exploit; was safe on your backups if they were encrypted; and was discussed in the TOS (Ars's sensationalism about "without your consent" notwithstanding).
Also, who cares about core values beyond predicting future actions? I predict Apple will continue to invest in privacy.
One of the major techniques in this field is that in order to create a luxury good, you need to tell a story about a genius creator, creating something that nobody else can create, with unique methods only his company possesses.
So when the iPhone was new, this story was partially true (yes, the iPhone really was unique; no, Steve Jobs didn't create it with his bare hands). But once competitors created high-quality products with great design, Apple needed new stories.
So there was the unique manufacturing method for the metal body, and the glass. And their processor, which really was the best (but did users really notice?). And now we have privacy.
And marketing wise, the iPhone still remains a status symbol. So what they're doing is working. And i wouldn't bet against them.
As for them needing personalized data and services in the future? Well, they've got Google's apps for that (a good deal for Google too). And that way users can have their privacy cake and eat it too.
[UPDATE] This comment is getting a lot more attention than I expected it to. I've watched the point count on it go as high as 20 and as low as 0, with several cycles between 0 and 10 and back, so a lot of people are voting on it. So let me say a few additional things.
First, I concede that the way I phrased my position was inartful. I apologize for that.
Second, Apple is probably the best solution on the market in terms of security and privacy. My complaint is not about them per se, it's really about the state of the market. My choices are either to hand my data over to Google or Microsoft, or to hand over my control of what I can and cannot run on my system to Apple. Neither of those is a satisfactory option IMHO.
I really don't get accusations like this. Selling customer data would violate nearly everything about what Apple cares about and believes in. It makes no sense, as soon as it came to light it would completely wreck their business as their customers leave in droves, and their employees would also leave in droves as soon as they heard about it.
Not quite. It would violate everything Apple says they care about. But there's actually reason to suspect that Apple might change this policy. Once upon a time Apple said they cared about building computers that Just Worked, that were the most reliable and easiest to use in the industry, that had the most advanced hardware. That is no longer true. Apple's hardware is way behind the state of the art. Their applications are a horrible mess. Their OSs crash regularly. Security failures are common. Of course, they didn't get up one day and announce that they were abandoning their prior commitment to quality bwahahaha, they just did it. Quietly. Gradually. Probably unintentionally. So worrying about where Apple might go in the future is not just paranoia. Apple has jumped the shark before, they could do it again.
You say "blind", I say "evidence-based". Apple has been putting their money where their mouth is. If they stop, I'll re-evaluate.
Ultimately it has to come down to do you trust Apple. The evidence to date is that their behavior is trustworthy (e.g. they fought against unlocking an iPhone device despite being requested by authorities). However, there is nothing we can do to assert that trust into the future.
Apple does comply with government data requests, as they're legally required to, for data that they already have. What Apple had an issue with was adding a backdoor to products so that it would collect data that Apple currently didn't have to hand over to the government.
Also, there was no backdoor requested, but that is a separate issue. If Apple could install a backdoor, that would make their initial claim even more of a lie.
Are you sure that's exactly what they said? Like you said, this could be construed as a lie and open them up to legal liability.
> Also, there was no backdoor requested
I'm pretty sure that's what the whole FBI thing was about. Write a version of iOS that we can install on this phone to gain access to it without the password.
I supplied a direct quote.
> I'm pretty sure that's what the whole FBI thing was about. Write a version of iOS that we can install on this phone to gain access to it without the password.
That is incorrect. The FBI asked for Apple to install a build that would let them brute force the pin without wiping the device. The device and build would stay on Apple's premises. Again, this is beside the point that Apple is untrustworthy.
The data you're supposing they might sell to the Chinese in the future is mostly stuff they go out of their way not to collect in the first place.
What makes this attack plausible is that it's relatively easy to mount (for the Chinese government) with a potentially huge payout (by their quality metric). It's hard for me to imagine that they haven't tried, and once you get to that realization, it's only slightly harder to imagine that they haven't succeeded.
Every time they boast about privacy features, they increase the penalty for them doing something egregious with user data.
For most values of “me” as a customer, it’s in Apple’s interest to behave as they advertise. They can stray due to government or other action, but so can any party you trust.
Yes, but the difference is that other parties I trust are subject to external verification and audit. If my bank steals my money, it will soon be discovered, either by me the next time I get my statement and it doesn't match my records, or by the bank auditors the next time they audit the books and discover that they don't balance. By way of contrast, if Apple sneaks a back door into a macOS update, it could easily go unnoticed for years.
Some of us here have access to systems, data, or people with high-value information. Some folks are professionally involved in things where an external, unethical party could put a dollar value on access to the things or people the victim can reach.
Your kids’ teacher might have a spouse whose job directly influences a governor, mayor, union, or senator. Access to the wrong Twitter account or TV producer could influence POTUS.
> Some of us here have access to systems, data, or people with high value information. Some folks are involved in things professionally that an external, unethical party could put a dollar value on accessing things or people whom the victim could access.
I think that your reply directly contradicts the assumption contained in my question. It's pretty obvious that if I had such access, I'd adopt completely different security priorities.
I also think it's possible that Apple could sneak a back door into the OS in cooperation with the U.S. government while maintaining an overt posture of defending their users' privacy. Yes, this would be risky, but if Apple management put patriotism or national security above commerce (which is not a completely unreasonable thing for them to do), they might decide to take such a risk.
Apple has gone to great lengths to make both its devices and its policies secure. You are awfully critical, so I really want to understand: what is the expected behavior you'd like to see?
Open source the security libraries? But if you don't trust Apple not to backdoor then what's stopping them from putting the backdoor in the kernel? At some point we must trust something.
That would help a lot.
> But if you don't trust Apple not to backdoor then what's stopping them from putting the backdoor in the kernel?
At the moment, nothing. That's kind of my point.
> At some point we must trust something.
Yes, that's true, but the more I can poke around under the hood (or hire someone to do it for me) the warmer and fuzzier I feel.
In other words: I think that the odds are >0 that there is a systemic weakness in Apple's security, either accidental (like the Meltdown and Spectre issues almost certainly are) or deliberately introduced by a mole, or by Apple's management in service of some higher cause. I think that the odds are further >0 that this weakness could be discovered by a malicious party, and that it will end up leading to major problems, not all of which will necessarily be immediately obvious.
I think the odds are >0 that I will get a straight flush in my next hand at poker. Unfortunately, while true, this is not a valuable observation.
could you possibly phrase this with any more condescension, myopia, or overgeneralization?
Do you actually disagree with the substance of what I said, or are you just criticizing my style? Because if you agree with the substance then I would say that the style is justified.
I’ll try to help you say what you should have said: “Apple doesn’t have great security, their major competitors are far better because [reasons] and instead of Apple products you should be using [competitor].”
Since you're on a technical forum but talking about mass-market products, you should also clarify your audience: “if you’re a technical user, [competitor_1] is the best security you can get, while [competitor_2] beats them on usability and user experience. If you’re looking for the best of both worlds, [competitor_3] is your best bet.”
Because what you did is walk into a discussion, say “nah you’re all wrong and besides that you’re all idiots” and walked away. Back your shit up, man. If you don’t think Apple is the product to use because of their security practices, recommend an alternative. If an alternative doesn’t exist, then keep your mouth shut because some security is better than no security and saying “shit’s all fucked anyway” isn’t going to help anything.
Apple is better than the competition with regards to security and privacy. But that's a very low bar.
Sort of like how companies suddenly started caring about their mobile phones' package design after the iPhone was released with its sleek packaging.
Another example: apparently there is a distinction between "two-factor authentication" and "two-step authentication", the latter being a deprecated but still active system. Reading the docs for the older system, you'll soon discover differences in things such as account access and recovery that lead to an entirely different set of consequences and caveats for security. You'll find out that in certain scenarios you could permanently lose access to your iCloud account and iTunes purchases under "two-step authentication", but not under the newer "two-factor authentication". If a user confused the two while reading the Apple online support pages, it could have grave consequences.
Security is something that needs to be documented and marketed in clear terms. Adopting names this similar for two distinct implementations of a security mechanism - so similar that either name could plausibly describe either system - is incoherent with Apple's supposed model of user friendliness. It's what Microsoft does with its products, not Apple. Additionally, all facets of a security feature should be documented, and documented well. It is unacceptable that Apple does not warn users that 2FA can be bypassed in certain scenarios. I hope Apple does further focus on security, and on documenting it well.
This is intentional. Otherwise people who only have one device would be unable to wipe their device if it gets lost.
This is not a security/privacy issue–none of your information is leaked.
> It is unacceptable that Apple does not warn users that 2FA can be bypassed in certain scenarios. I hope Apple does further focus on security, and documenting it well.
Should every password field have a disclaimer that says it can be "bypassed" by someone who knows your password?
Some of the google/Android “features” and what they do with your data, make old school keyloggers look like a joke.
> "at the factory"
I suppose the secret key is erased at the factory; however, what if it isn't? Or is the secret key generated on-chip via a random number generator? If it were stored somewhere at the factory, it would be possible to link it to each iPhone. I'm not familiar with cryptography, so this may just be a misunderstanding on my part, and I'm not sure whether this would be a weakness in the Touch ID sensor.
That key is then used for establishing a session key in the field. The session key uses entropy from both the SEP TRNG and the TouchID TRNG.
Your threat means someone would need to 1. know the TouchID global key and 2. slurp off all the keywrapped blobs for later, at all iPhone manufacturing sites. This would then give them the ability to either decrypt fingerprints or feed 'fake' fingerprints to the SEP. Given that there are far easier ways of stealing fingerprints, that leaves feeding fake fingerprints to the SEP - and if you're in a position to do that, you might as well just feed the phone a fake finger.
I see that iCloud Keychain is still secure, but pretty much everything is fucked up, right?
Depends on your definition of "fucked up" in this case.
The change is that data for China-based users will need to reside on domestic (Chinese) servers. Any associated implication derived from that would be opinion-based.
The reasons, as the document outlines, are for added security. But, having recently wiped my iCloud Keychain by resetting Safari's privacy settings and inadvertently losing all my passwords, I was surprised to discover that I couldn't restore my passwords from my own backups. The upside is that a compromised iCloud password doesn't also leak all the keychain passwords.
If I understand this correctly, IF they're using Diffie-Hellman key exchange to generate the shared session key for every chip, doesn't this mean Apple also owns the session key for every single iDevice out there and can crack into them if they wanted to?
Does this mean the "security" only protects users from men-in-the-middle, but not from Apple (or NSA if they come after them)?
So an 'evil' Apple couldn't crack the session key unless it had been evil all along and had been storing the generated keywrapped blobs. In which case you're sorta screwed no matter which way you slice it.
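For anyone wondering why a plain Diffie-Hellman exchange wouldn't hand Apple the session key, here's a toy Python sketch with insecurely small parameters, purely for illustration - per the thread, the real SEP/Touch ID scheme uses a factory-provisioned shared key plus TRNG entropy on both sides, not vanilla DH:

```python
# Toy Diffie-Hellman exchange (small numbers, NOT secure parameters).
# The session key depends on ephemeral secrets generated fresh on each
# endpoint, so even the party that provisioned the long-term keys
# cannot derive it from the public values alone.
import secrets

P = 0xFFFFFFFB  # toy prime modulus; real DH uses ~2048-bit groups
G = 5           # generator

def keypair():
    priv = secrets.randbelow(P - 2) + 1   # ephemeral secret, never transmitted
    pub = pow(G, priv, P)                 # public value sent over the wire
    return priv, pub

# Each endpoint (say, the SEP and the sensor) contributes fresh entropy
sep_priv, sep_pub = keypair()
sensor_priv, sensor_pub = keypair()

# Both sides derive the same session key from the other's public value
sep_session = pow(sensor_pub, sep_priv, P)
sensor_session = pow(sep_pub, sensor_priv, P)
assert sep_session == sensor_session
# An eavesdropper sees only sep_pub and sensor_pub; without one of the
# ephemeral secrets it cannot compute the session key.
```

The attack the parent comment describes only works if the evil party recorded the wrapped key material at provisioning time, which is exactly the "evil all along" case.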
(First time using an Apple device, so I might be misunderstanding the 2FA situation, correct me if I'm wrong.)
You can directly transfer files to a Linux PC (via SFTP, WebDav, SMB) with iOS apps like Transmit, Coda, GoodReader, Briefcase and others.
Thanks for the app suggestions, will try them.
Given that Apple is not trustworthy, and that you need to be able to change and/or inspect a device to have a chance at security, this is a solid strike for a human-thought-free, insecure world.
I would love to know the likelihood of this in reality. For example, what about people who look like you? You don't tend to hang around with completely random people; it's often parents and siblings, who, unlike with fingerprints, may bear enough facial resemblance to trick it.