Regarding the list of devices, it should be noted those IDs are site-specific, so they can't really be used for cross-site tracking, and clearing cookies also resets them (at least in Firefox). But there are discussions pointing to a consensus around disabling the enumeration before permission is granted. As in other cases, the browsers will probably have to lie to avoid breaking sites.
From an anti-fingerprinting perspective you do much better to restrict what system fonts are available than to remove all of these valuable and common reactive features.
Unless that dependency is specified in a style sheet language, yes. I think that’s exactly how it should be: the web of 'apps' is fine to break this if it wants, the web of 'content' shouldn’t be concerned with extremely detailed matters of presentation.
> From an anti-fingerprinting perspective you do much better to restrict what system fonts are available
Agreed; I’m not saying there isn’t lower hanging fruit, I’m just commenting on the one that interested me.
I'm just saying that simply blocking or limiting that part of the DOM API is probably not a practical solution.
* Chrome on iOS started sending "webp" in its Accept header when it didn't support inline webp yet.
* Firefox would take CSPs applied to same-origin IFrames and additionally apply them to the parent document.
* Edge wouldn't accept data URLs for IFrames that were more than 4096 characters.
In all of these cases the bug was eventually fixed, but we needed to work around the bug in the meantime. UA parsing to say "if it's Edge < v76" or whatever was the best way to do this.
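As a sketch of the kind of version gate described above (the UA strings and the helper names are illustrative, not the commenter's actual code):

```javascript
// Hypothetical workaround gate: parse the Edge major version out of a
// user-agent string and apply a fix only below the version where the bug
// was fixed.
function edgeMajorVersion(ua) {
  // Legacy Edge uses an "Edge/<ver>" token, Chromium-based Edge uses "Edg/<ver>".
  const m = ua.match(/\bEdge?\/(\d+)/);
  return m ? parseInt(m[1], 10) : null;
}

function needsDataUrlIframeWorkaround(ua) {
  const major = edgeMajorVersion(ua);
  return major !== null && major < 76;
}

const legacyEdge = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.102 Safari/537.36 Edge/18.18362";
const chromiumEdge = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4023.0 Safari/537.36 Edg/81.0.396.0";
console.log(needsDataUrlIframeWorkaround(legacyEdge));   // true
console.log(needsDataUrlIframeWorkaround(chromiumEdge)); // false
```

The downside, of course, is that this is exactly the sort of UA sniffing that Client Hints is trying to replace.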
Note: This only matters for bots that exist to aid your website, like search engine crawlers being given instructions about which pages to ignore.
+ A new set of optional HTTP headers
+ A new JS API
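Roughly, the header exchange the proposal describes looks like this (header names are from the Client Hints drafts; the values are illustrative). The server opts in, and only then does the browser volunteer the details; the JS counterpart is `navigator.userAgentData` in Chromium-based browsers.

```http
# Server response opting in to specific hints:
Accept-CH: Sec-CH-UA, Sec-CH-UA-Platform, Sec-CH-UA-Full-Version

# Subsequent requests from the browser then carry only the requested hints:
Sec-CH-UA: "Chromium";v="81", "Microsoft Edge";v="81"
Sec-CH-UA-Platform: "Windows"
```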
chargingTime : Infinity
level : 0.77
Yes, that is indeed my phone's current battery level. And whether it's incognito Brave, Firefox Focus or Chrome, I am unique to the site.
Even Firefox Focus, which is meant to be good for privacy by throwing away the entire session each time, still leaks data, e.g. via the audio context.
The problem is that shitheads exploit every possibility to reduce the utility of society in favour of their own profit.
This is why we can't have nice things; as the aphorism goes.
Maybe programmers and designers should start thinking the other way around: what could be clever enough to be used for a feature, but CANNOT be misused for tracking?
That sounds like a client concern, not a server concern.
I mean, as a consumer of web content I do not want my power saving profile to be accessible or actively influenced by a third party.
Having said that, I do get that battery level has some potential as part of a very short-lived fingerprint, and I really don't see the need to expose this information to remote sites.
Anyway, if the site had to ask for the fonts, it would be viable for the browsers to impose some sane limitations on their API.
As soon as you introduce a programming environment you introduce the ability to do all sorts of things that aren't builtin.
It's generally not the browser saying "here's the list of my settings you asked for", but rather a program being executed that pokes around at everything it can interact with to squeeze out info.
So we're going to ask the profit-motivated bad-actors across the web to pretty-please exempt us from your tracking because we asked nicely?
And we need to hide this away in settings and have it default-off because otherwise how will they know we really meant it?
It's like it was designed by someone with an early 90s understanding of the web, and no comprehension whatsoever of just how awful marketing can be.
Tracking should never be opt-out in the first place, and it needs to be fought with technological and legal measures, not silly http request flags.
But frankly, all of this is just trying to negotiate with black hats. Any company actually acquiescing to rules for privacy will find themselves out-bid by companies who can claim they have better data by operating outside the law. These protocols are using the general public's privacy as sacrificial fuel to accomplish their impossible aims. It's a whole lot of wasted effort.
I wish the EFF would figure this out; it's impossible to legislate away a technological reality.
There are still companies that hide the opt-outs in places I can't find after a lot of poking around (Oath, I'm looking at you), and a lot who will make the 'no tracking' button on their pop-up small and greyed out, and then ask you to confirm it using weird language like a link saying "Leave" that actually takes you back to what you were trying to read.
From this point of view, better to remove the DNT header.
Brilliant and terrible.
That prompts me to reconsider my policy of blocking JS. Until now, I've been blocking only third-party JS. Now I am considering blocking all JS by default.
But of course most browser vendors don't have an incentive to resist fingerprinting. And if only one vendor does this then the anti-fingerprinting is likely to be less effective, as the pool of browsers is much smaller.
I think there are actually plenty of examples where the browser vendors are doing this.
Google gets so much free tracking from analytics, ads, and OpenID, I don't think they really need to resist privacy oriented browser changes.
They might resist some privacy changes, idk, and are probably trailing safari and firefox and brave etc
(Separately, your proposal would break tons of sites.)
Yet I'm uniquely identifiable. One culprit is screen size putting me at <0.01%.
Does that make defeating fingerprinting on mobile hopeless for the casual user?
Edit: more info. All JS is blocked, and I have privacy.resistFingerprinting enabled. The page doesn't detect my adblocker. Still, there are just too many things I can't change:
- hardware concurrency: 1.7%
- audio formats: 0.2%
- navigator properties: 0.2%
- audio data: 0.1%
I was surprised at this one:
- Media devices: Unique
What are media device identifiers for, exactly? Why does the browser supply it without JS?
EDIT: I just discovered ublock origin can disable js by default. Now I wish I could change the user agent...
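For a rough sense of what those percentages mean: an attribute shared by a fraction p of browsers contributes about -log2(p) bits of identifying information, and roughly 33 bits is enough to single out one person among the world population. A sketch using the figures quoted above (treating them as independent, which overstates the total, since attributes correlate):

```javascript
// Surprisal of an attribute shared by a fraction p of browsers: -log2(p) bits.
const bits = (p) => -Math.log2(p);

// Figures quoted in the comment above.
const attrs = {
  screenSize: 0.0001,          // "<0.01%"
  hardwareConcurrency: 0.017,  // 1.7%
  audioFormats: 0.002,         // 0.2%
  navigatorProperties: 0.002,  // 0.2%
  audioData: 0.001,            // 0.1%
};

const total = Object.values(attrs).reduce((sum, p) => sum + bits(p), 0);
console.log(total.toFixed(1), "bits"); // combined bits under the independence assumption
```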
1. Find out the top non-unique fingerprint [of the year.]
2. Create a Firefox add-on (or modify Firefox source if an add-on is not powerful enough), which uses the most non-unique fingerprint [this year.]
3. Targeted advertising ends for those who use the plug-in/add-on.
It's a cat and mouse game as fingerprint parameters keep increasing, but I think it's possible to win this one.
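Mechanically, step 2 would amount to overriding the getters that fingerprinting scripts read. A minimal sketch on a mock object (a real add-on would have to override `window.navigator` and friends in the page's context; the "most common" values here are assumptions, which step 1 would supply for real):

```javascript
// Mock of the navigator object; a real add-on would override window.navigator
// in the page context (e.g. from a content script or a Firefox patch).
const mockNavigator = { hardwareConcurrency: 16, platform: "Linux x86_64" };

// Assumed "most common" values for the year.
const crowdValues = { hardwareConcurrency: 4, platform: "Win32" };

for (const [key, value] of Object.entries(crowdValues)) {
  Object.defineProperty(mockNavigator, key, {
    get: () => value, // every read now reports the crowd value
    configurable: true,
  });
}

console.log(mockNavigator.hardwareConcurrency, mockNavigator.platform); // 4 "Win32"
```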
But I only care about filling in captchas on, say, banking websites, which happens approximately once a month.
The rest, I just close the tab: the content is never worth it.
The most obnoxious ads don't care about captchas, they just blast you with ads from your previous search keywords/browsing history.
p.s. Would the number of captchas go down if we took not the single most common fingerprint, but one "slightly below average"? It would still be severely non-unique for ad purposes.
I don't think it's possible to win this one without dramatic changes to the web (e.g. abandoning JS). Panopticlick was started 10 years ago, fingerprinting was already well known back then, and though browser vendors have started adding defenses, it has evidently done little: the amount of data you can gather via scripts keeps increasing, and blocking some vectors would break sites.
The situation was dire 10 years ago, it hasn't gotten better, it's not getting better. If anything it's now just worse because of so many idiots writing sites that don't really work at all with JS disabled.
Firefox has an about:config preference that would let you set up a font whitelist yourself, but it doesn’t have a standard set.
Does Safari actually do something similar? On my Mac, amiunique.org reports 344 fonts in Safari, 331 in Firefox (not using "font.system.whitelist"), and 309 in Chrome.
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.130 Safari/537.36
for what it's worth. Can it really be that only two in a thousand people visiting the site in the last seven days have the latest version of Chrome? Or does Chrome just update so often that very few people tend to have the same version at any one time? (If the latter, that suggests it's not a very strong tracking signal.) It seems more likely to me that the data is just stale, but I guess I don't know much about the true distribution of Chrome user agents.
I’m using the current iPad mini, which is 10 months old, with the current os version. I highly doubt I’m that unique.
User-Agent is divulged publicly with all requests to all sites, so no protections exist around using it, because it is freely offered up by the user without requiring consent.
Client Hints have to be specifically requested by the remote website, which demonstrates intent to collect data, and thus falls under data collection laws.
This seems like an argument you'd lose if your website failed to respond to requests that were missing the User-Agent.
Then sites that need some information can get it, but won't be allowed to get so much identifying information that you're unique.
(Disclosure: I work for Google)
I'm running a stock Windows 10. Microsoft Edge with Ublock Origin as only browser extension.
Software installed is Visual Studio, Visual Studio Code, NVidia CUDA development System, Erlang Dev System, and Microsoft Office. That's it.
You'd think there would at least be "dozens" of us. Nope:
> Your full fingerprint is unique among the 1572109 collected so far.
User Agent is < .01%
> Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/81.0.4023.0 Safari/537.36 Edg/81.0.396.0
and my fonts are < .01%
> Agency FB, Aharoni, Algerian, Arial, Arial Black and 171 others
Other < .01% are "WebGL Parameters", "Connection" and "Navigator Properties"
are they tracking that i've run the test before, verifying that i'm me, and telling me i'm still unique, or is my fingerprint differing between two subsequent tests?
also, i'm suspicious about some of these values - only 0.13% of people have a QWERTY keyboard layout? Only 7% of tests have no gyroscope? That doesn't sound right.
'AmIUniqueId', expires Mon, 25 May 2020 12:28:29 GMT
To be useful, fingerprinting techniques need to somehow be robust against always increasing version numbers of evergreen browsers in ways that this website is not.
(Though, to be fair, they go by UTC offset, so it's only UTC-8 for a few months of the year due to DST. That probably affects the number; maybe the percentage is higher for UTC-7.)
For example: I visited this same site a while ago with the same device, and both times it has said I was unique. A new version of iOS came out, so my user agent changed. Unless sites are also storing unique data on your machine (through cookies or localstorage), browser fingerprinting is a crapshoot. This goes double for mobile devices.
It happens a lot in the Tor mailing list. Often, a new major release of Tor Browser comes out, a user is shocked by the upgrade, "OMG! I'm unique and trackable!". Don't panic, just keep using it.
I'm sure someone's also devised a way to guess how to bind two fingerprints together when OS or browser updates but many other parameters (timezone, ip, language, screen size, fonts, etc) remain similar enough.
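One plausible (hypothetical) version of that re-linking: score the fraction of attributes two fingerprints share, and merge them into one profile when the score is high enough. Sketch:

```javascript
// Hypothetical re-linking heuristic: fingerprints taken before and after a
// browser update agree on most attributes, so a simple overlap score can
// re-associate them even though the exact hash changed.
function similarity(fpA, fpB) {
  const keys = Object.keys(fpA);
  const matches = keys.filter((k) => fpA[k] === fpB[k]).length;
  return matches / keys.length;
}

const before = { ua: "Firefox/68", tz: "UTC+2", lang: "en-US", screen: "1920x1080", fonts: "f31c" };
const after  = { ua: "Firefox/69", tz: "UTC+2", lang: "en-US", screen: "1920x1080", fonts: "f31c" };

console.log(similarity(before, after)); // 0.8 -- plausibly the same user after an update
```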
Plugging in an article and demo I wrote some time ago: https://jwlss.pw/ja3/
Also, I guess TLS fingerprints would change over time, with browser upgrades, although I'd expect changes to be relatively infrequent.
Wait, any website knows what sort of graphics card I have?! Or how many CPUs? That's just ridiculous!
No wonder folks can write and target exploits so perfectly nowadays! If it's an integrated graphics, you have the processor family, easy. Wow.
Why would any random site need to know this sorts of information when all I'm trying to do is read text and view images?
Images on the web are increasingly 3D graphics dynamically generated and rendered on the fly by WebGL (perhaps real-time ray tracing is coming soon). You can see a lot of applications of in-browser 3D graphics on data visualization websites, for example. The program naturally needs this information to render graphics, but it can be repurposed as a tracking tool.
I disable WebGL manually, and use it only when it's necessary. There are a few plugins that allow you to manage it as well, for example, NoScript, but it's overkill to use it just for this single task.
I think the solution should be an opt-in option. WebGL shouldn't be activated unless the user gives the permission to a specific, first-party website.
Is there a list of these useless things that any sane user should disable? I already have downloadable fonts disabled by toggling "gfx.downloadable_fonts.enabled" away from the default; are there other useless attack vectors, prone to things like buffer overflows, that could likewise be disabled?
I was really freaked to discover that all Debian family VMs on a given VirtualBox host have the same WebGL fingerprint. Same for all Windows VMs, all Red Hat family VMs, etc.
But then you can't see all the cool WebGL stuff that gets posted here.
More permission dialogs just train users to click "yes" on every permission dialog, so it really isn't a solution (remember Windows Vista?). I'd prefer it if there were some way to make browsers respond to WebGL queries in a similar predictable way to mitigate fingerprinting.
It's not perfect, but works reasonably well and has successfully protected a large number of users from Adobe Flash trackers.
> Content language 31.51% en-US,en;q=0.5
This is a default Firefox; I didn't change anything. Surely only Chinese has more users than this language?
I'm not even an American or in the US... (all time it was just below 30%, not teal-colored, but already orange..)
Also why is the JS result for Content language 35.40%?
Whereas my list of plugins is 46.2% (higher value seems better) - but I'm absolutely sure there's at least one uncommon one among them...
Hardware concurrency 16 is really interesting, 3.32%, probably all fellow Ryzen users.
TLDR: Surely there are some of those properties where it absolutely makes sense to avoid the uniqueness, but others seem a bit like bullshit metrics.
But on the other hand, my user agent (OS+Browser) = 7.37% is a really good sign for me. I find this a very, very high percentage. (Ok, it's Win10+Firefox, but still...)
A basic Microsoft Edge installation with en-GB content language header has a similarity ratio of 0.18% -- really?
Java enabled: true -- 0.26%
Java enabled: false (on Brave) -- 8.75%
en-GB would start below 5%; having en-GB on Edge means someone would have to be using Edge (under 1% share) and not on mobile, for a mobile-first site.
Keeping a current Java version enabled is difficult. Every browser update and every Java update disables Java by default now.
Could be a small Brave sample size.
You can browse behind a generic user-agent firewall, but you'll get a severely degraded experience that treats an Apple Watch the same as a desktop workstation.
Another approach would be to use standardized VPN's that hide client machine differences.
There are downsides, of course. I'm not sure people care about fingerprinting enough to do all that.
Perhaps their data is under-representing mobiles?
I can see how fingerprinting would work on a Windows PC with its vast number of possible combinations of CPU/GPU/installed system fonts/add-ons etc. But how would this work on mobile?
My list of fonts is only 0.55% similar
And user agent is only 0.34% similar
Those are hard to believe on a device where these things really don’t get customized.
The tests you can spoof, like canvas fingerprinting, where spoofing doesn't break the canvas and the results are already so spread out that being unique isn't an identifier by itself, are already built into a lot of browsers. The site doesn't really acknowledge this, though; it just says "you're unique" without checking whether it's a different unique value each time, in which case it doesn't identify you at all.
The Canvas fingerprint is the biggest one, as it relies on different forms of hardware acceleration which is based on GPU+CPU+configs for your computer and browser.
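The spoofing described above is usually implemented as small, session-seeded noise in the low bits of the canvas pixel data (Brave calls this "farbling"). A toy sketch on a plain array standing in for `ctx.getImageData(...).data`, with a deliberately simplistic stand-in for the seeded PRNG:

```javascript
// Toy, deterministic stand-in for a per-session seeded PRNG.
const noiseBit = (seed, i) => (seed * 31 + i * 17) % 2;

// Stand-in for the RGBA byte array from ctx.getImageData(...).data; farbling
// flips low bits so the same drawing hashes differently in each session.
function farbleCanvas(pixels, sessionSeed) {
  return pixels.map((v, i) => v ^ noiseBit(sessionSeed, i));
}

const pixels = [12, 200, 33, 255, 12, 200, 33, 255];
const a = farbleCanvas(pixels, 1);
const b = farbleCanvas(pixels, 2);
console.log(a.join() !== b.join()); // true: same pixels, different "fingerprint"
```

The noise is invisible to the eye but changes the hash a tracker computes, so each session looks like a fresh unique visitor.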
But for the last 7 days, Firefox has over 300%, and Windows over 285%. Gecko is over 700%. Screen left of 0 is 250%. -- I'm guessing some kind of calculation error?
It claims my user agent (the latest stable Google Chrome - default browser - on a stock Umidigi android device) is unique.
Considering there are probably 5 million people worldwide with my setup, I am surprised it's unique.
> User agent: Mozilla/5.0 (Windows NT 10.0; rv:68.0) Gecko/20100101 Firefox/68.0 (all time: 2.19% / 30 days: 10.80%)
> Accept: text/html,application/xhtml+xml,application/xml;q=0.9,/;q=0.8 (all time: 52.15% / 30 days: 37.04%)
> Content encoding: gzip, deflate, br (all time: 66.35% / 30 days: 92.02%)
> Upgrade Insecure Requests: 1 (all time: 27.33% / 30 days: 85.86%)
> Referer: https://amiunique.org/ (all time: 16.55% / 30 days: 60.97%)
But because the sample size of this website is limited, it only sees that these attributes account for a very small percentage of the overall traffic; ignoring the fact that these headers are generic, it draws the conclusion:
> "You can most certainly be tracked"
The website actually tells you that,
> But only 1440 browsers out of the 1560340 observed browsers (<0.01 %) have exactly the same fingerprint as yours.
Interestingly, when a new Tor Browser release comes out, there will always be a user who upgrades the browser before everyone else, goes to a fingerprint testing website, tests the browser, and says "OMG! I'm unique and trackable!" In this case, don't panic, just keep using it; you won't be unique anymore within a week.
The accelerometer, gyroscope and proximity sensor each have a ~9% "false" ratio here, which I interpret as 9% laptops/desktops and 91% phones. That's a bit too many phones. But use of Adblock is over 34%. Also, I see 15.36% Chrome PDF Plugin and 23.81% Native Client, both of which only exist on PCs and again don't really mesh with only 9% non-phones. What's going on...? Are there laptops with all three sensors listed...?
Also, everything else puts Chrome market share above 60% these days, only 38% here.
Some of the stats are clearly broken. For the 7 day duration, those stats are closer to 95% false. For the 90 days duration, I see values like
Platform 153.00% Linux x86_64
Screen height 253.53% 1080
Media devices 159.95% Timeout
I don't see PDF reader stats at all.
> Also, everything else puts Chrome market share above 60% these days, only 38% here.
Easily explained by bias if it's even correct. The kinds of people who would try this service are also the kinds of people who would use a niche browser.
Is there any way to have a language installed in the operating system without leaking that information to the browser?
You can do that in Chrome from Settings > Languages, remove what you don't need, but i guess you will lose auto-correct for those languages
On Firefox: Options -> Language -> Choose your preferred language for displaying pages
If my unique identifier changes with each browser version + os version + timezone + build id + fontpack + screen size... am I really trackable through this vector? Those things change all the time for me at least. Seems like a very short term method of tracking at best. Why should we be afraid of these things? I had the same question about panopticlick (or whatever EFF called it).
Too many people have similar devices for it to be the primary way to track people.
It is really a backup method of tracking that works some of the time based on a probabilistic match.
Your IP address is also a big factor in finding a match.
I am hesitant to extend the result that I can be uniquely tracked to category “iOS developers” beyond that.
I recommend trying to visit the same page on your computer; the slightest variation in your hardware or software configuration will pinpoint your identity with 99% certainty: What is the position tolerance of your canvas? What is the list of fonts or browser plugins you have installed? What is the latency of your audio interface?
(Yes, I know how tracking works. It’s ironic to watch people realize that all their protest actions are tracking beacons. I wish it was more widely understood that being obvious is a consequence of being obviously different.)
So if I hit this site from the VM that one of my evil twins uses, its Firefox fingerprint will also be ~unique.
And I can have lots of VMs. Maybe not all running at once, but still. And yeah, they use different exit VPNs.
A bit surprised that it doesn't even need to do any fancy combinations of identifiers to get a uid in my case.
Really wonder how I can hide that information from my browser...
Edit: To block canvas fingerprinting I set `privacy.resistFingerprinting` to true in Firefox's about:config. Now I am trying to figure out how to disable the font list and Media devices, etc.
Wow I didn't know it reveals the GPU
It's therefore like a cookie, but weaker in that it gets cleared on browser restart (and, in private browsing, it's always changing). So it's not very useful for tracking, since a site could just use a cookie instead.
If that's your only Unique attribute, then your browser might be less unique than the website claims.
In case it's not clear, uniqueness should be seen as bad in this case. It means you can be tracked.
Version 1.2.43 Chromium: 79.0.3945.130 (Official Build) (64-bit)
I think it might just mean that we leak waaaay more data than most of us realize...
I'm surprised to find that, simply by having Ubuntu that I tweaked to my liking on a laptop, you can uniquely identify me by how much screen space I have left.
Can anyone give me an explanation of how the media devices thing works, though? Is video in and audio in given some random hash which websites can then use as the random hash which identifies me?
I’m running safari, private mode, plus a single content blocker app.
But: seems to me that if you do something unusual (turn off JS, use weird and wonderful browser, hide browser stuff from The Internet) then you make yourself even "more unique". Turning JS off still doesn't really help - and more to the point makes the modern web basically unusable.
I ended up with: meh, nothing anyone can do, why worry?
I have the opposite experience. Turning JS off makes the modern web much more usable!
Regarding "uniqueness", if you browse from a fixed IP address then you are unique anyway and your browser fingerprint does not really matter.
Look at this link:
If Apple cares about privacy, they need to get on this in mobile Safari ASAP.
Android - default Chrome.
Fyi, this whole tracking thing isn't that hard to circumvent. You only need one parameter to change to generate a new, unique hash.
With all the parameters they include, this should be fairly easy to do.
So anything that's not QWERTY nor "unsupported" makes up the remaining ~95%?
Keyboard layout 0.09% Qwerty
> Content language <0.01% en-US,en,bg,es
Bruh... 0.01%?!!? Seriously?!?!?
For me, it just detects that it's Linux (all time: 8.15%), while Platform is Linux x86_64 (all time: 10.93%). Kinda funny that there are more x86_64 Linux users than Linux users. Makes me wonder whether Ubuntu users are considered Linux users at all..
Updated iPhone Safari. En locale. I have hard time believing it's that unique.
A better site with similar functioning is: https://panopticlick.eff.org/
this tool said it is.
It'll cheerfully tell an unlimited number of identical browsers that they're "unique" because that's the message it is here to sell.
If it gave these supposedly "unique" browsers an actually unique identifier, they'd be able to compare it and see that they're not so "unique" as claimed, or worse, that the same browser gets different "unique" identifiers and so they aren't identifiers at all. So that's why it doesn't do that.