I'm reading a ton of posts saying how terrible this is, asking why anyone would do it, and so on. If you don't know, Samy also created the MySpace worm; OWASP built a project called AntiSamy to combat that kind of attack. He was sentenced to three years' probation, 90 days of community service, and an undisclosed amount of restitution. I'm pretty sure he knows how terrible it is, and that's the point. He spoke at Black Hat USA 2010 as well...
Storing cookies in RGB values of auto-generated, force-cached PNGs using HTML5 Canvas tag to read pixels (cookies) back out
That's pretty "nice". It might be possible to "improve" it by storing metadata inside the PNG, and then reading it back by parsing it out of the raw data returned by toDataURL().
I haven't tried this, though, and it's possible browsers drop the metadata when they recreate the image. The spec says: "A future version of this specification will probably define other parameters to be passed to toDataURL() to allow authors to more carefully control compression settings, image metadata, etc."
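Setting the metadata question aside, the core pixel-storage idea is simple to sketch without the canvas plumbing: the cookie string is packed into RGB triples when the PNG is generated server-side, and unpacked from the flat pixel array that getImageData() returns in the browser. A minimal round-trip sketch, with invented helper names (encodeToPixels, decodeFromPixels) standing in for the real generation and canvas-read steps:

```javascript
// Pack a cookie string into flat RGBA pixel data: 3 payload bytes per pixel,
// alpha fixed at 255 so the PNG stays fully opaque.
function encodeToPixels(cookie) {
  const bytes = Array.from(cookie, c => c.charCodeAt(0));
  while (bytes.length % 3 !== 0) bytes.push(0); // pad to whole pixels
  const pixels = [];
  for (let i = 0; i < bytes.length; i += 3) {
    pixels.push(bytes[i], bytes[i + 1], bytes[i + 2], 255); // R, G, B, A
  }
  return pixels;
}

// Reverse: read R, G, B from each 4-entry pixel, skip alpha, strip the
// zero-byte padding (cookie strings never contain NUL bytes, so this is safe).
function decodeFromPixels(pixels) {
  const bytes = [];
  for (let i = 0; i < pixels.length; i += 4) {
    bytes.push(pixels[i], pixels[i + 1], pixels[i + 2]);
  }
  return bytes.filter(b => b !== 0).map(b => String.fromCharCode(b)).join('');
}
```

In the browser, decodeFromPixels would be fed the `data` array from `ctx.getImageData(...)` after drawing the force-cached PNG onto a canvas; the round trip `decodeFromPixels(encodeToPixels("id=12345"))` returns the original string.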
It's unlikely they'd use a logo, because of the brittleness of the technique (i.e., it relies on sending a 304 Not Modified response due to the absence of the special tracking cookie, not due to the actual cache status).
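That brittleness comes from the server-side rule. A hedged sketch of how such an endpoint might decide (the function name, cookie argument, and stub PNG encoder are all invented for illustration): if the request carries the tracking cookie, regenerate the PNG from it with far-future cache headers; if the cookie is absent, answer 304 so the browser falls back on whatever PNG it cached earlier, which still encodes the old ID.

```javascript
// Stand-in for real PNG generation from a cookie value.
const encodePng = value => `png(${value})`;

// The 304 is keyed to the cookie's absence, not to whether the client's
// cached copy actually still exists -- hence the brittleness: if the cache
// was evicted, the 304 leaves the client with nothing to read.
function pngEndpoint(trackingCookie) {
  if (trackingCookie) {
    return {
      status: 200,
      headers: { 'Cache-Control': 'private, max-age=31536000' },
      body: encodePng(trackingCookie),
    };
  }
  return { status: 304 }; // force the browser to reuse its cached PNG
}
```

So a returning visitor who has cleared cookies (but not cache) gets the 304, redraws the cached PNG on a canvas, and the old ID is recovered from the pixels.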
Also, it's not clear whether you get access to the actual binary data of the image as it was served, or to new data generated from the image as it is displayed; hence my question as to whether using the metadata would work.
It seems to work remarkably well. And they don't even use all the tricks imaginable; e.g., you could add a lot more volatile information to the fingerprint if you also added some statistical intelligence.
What's truly disturbing is that this absolutely meets a need for a paid gig I'm working on now, where the client wants persistent identity tracking for the purposes of marketing and analytics. (I'm part of the problem, aren't I?)
You're not the only one; the SEO people in my company got onto this surprisingly quickly. I sometimes feel black (hat) has become the new white: for whatever reason, companies seem to be more willing to accept the less ethical sides of doing business on the web. Anyone else notice this, or is it just me?
This technology clearly goes way too far. It might be a good idea to remind employers of when the state of Texas sued DoubleClick for stalking. Knowing how to do something doesn't mean that you should do it.
There will be some backlash to this sort of thing, seen as an increasing number of surfers strip back enabled browser functionality.
It totally meets a need I have, too, for storing user data on a site that doesn't have accounts. Really, just a couple of numeric IDs that people actually WANT stored.
Nothing evil about it in the slightest. We store them in cookies, people lose them right and left and are then all confused about it. I could use Flash myself, but really, this sounds like a great system.
The essential point of this kind of persistence is that it has to survive explicit deletion of cookies by the user.
Browsers offer users the ability to remove cookies (individually or all at once), and that's because users want privacy.
This clever library manages to exploit browser features to get around that and store identification information persistently, against the will of the user.
If the user doesn't want to delete or disable cookies, then you can use normal cookies for legitimate "remember me" functionality.
However, I can understand that not all users know what cookies are, and there are many people who might have cookies disabled by default (a sysadmin's choice in a company) or because somebody told them to do that.
So, you could reason this way:
"There is no point disabling cookies anymore, since anybody could employ this trick to circumvent it. There are people who disable cookies because of an obsolete 'security policy' which is no longer effective. I want to make a webapp that works for everybody. It requires cookies. People are paranoid but employ obsolete security policies which don't protect them anymore. I can exploit the same trick to circumvent their default security policy for morally good reasons."
Your exploiting a security hole in my browser and overriding my explicit wishes to benefit your company is no more ethical than my exploiting a security hole in your website and "fixing" your database.
How much software includes tricks to get around firewalls by punching holes (http://www.h-online.com/security/features/How-Skype-Co-get-r...)? Is this unethical because it circumvents an explicit user wish? Do people even know that they have a firewall, or know what a firewall is, or do people even have control over their firewall settings (at work, for example)?
Of course they want to run, e.g., Skype; who doesn't, right?! I know there is something arbitrary in all that, and that's the point.
I didn't say it was ethical to circumvent the user's wishes. I said that some people might reason in such a way that it makes them feel morally excused for exploiting something which is perceived as an unethical technique in order to achieve a legitimate goal.
The main points behind this mindset are (here "you" means the application developer, not the evil guy, of course):
* Point out that the user de facto has no control over his privacy settings by disabling cookies, since the Bad Guys (TM) already have a hack to get around it.
* Point out that the user is often not even conscious of what the privacy and security risks are, and often runs a browser preconfigured by a sysadmin, nephew, or whoever, who might decide to conservatively block cookies "because they are bad".
* You are not exploiting the cookies with the purpose of invading user privacy. You are just building an application X (see the grandparent question) which exploits the same hack to get around the "default paranoid settings".
* You feel stupid limiting your application's functionality just to obey some obviously broken rule. It would be like Skype saying "oh, there is a firewall, I know how to get around it, but I won't, because it's unethical since people have the right to set up a firewall according to their wishes".
(Of course, these points are only valid once this technique becomes mainstream and all tracking sites employ it.)
I'm not saying that behaving this way is ethical, or even less unethical. I'm just supposing that there might be some uses of this technique which are not directly intended to track the identity of a user for malicious reasons (marketing, etc.) but to provide some functionality to the average user of a particular product (who asks for it).
People might be pissed off because some features don't work; they don't care why. Application providers are also pissed off when half of their users cannot use a given feature because some sysadmin/security software/nephew hacker decided to impose some restriction, even if there are valid reasons for the restriction (settings, firewall rules, etc.) to be there.
All of the methods he uses have been known to the web-app security community for a while. He's simply raising awareness of what's already broken.
Keeping these things quiet helps nobody. We need more privacy and security issues to be publicly demonstrated so that they'll get fixed instead of ignored.
As an example, his work exploiting wireless routers to get location is genius. Who would have thought that having your router's wireless MAC available to your internal network allows a website to determine your location to within a few hundred feet? It uses well known and oft ignored attack methods to produce a sensational result with which everyone can immediately identify.
It's not evil. It just shows that the "Clear cookies" button is no longer an effective privacy tool.
Browser vendors are already aware of this and are working to make evercookie no worse than a regular cookie; e.g., Mozilla blocked reading of visited-link history, and Chrome's privacy window has a link to the Flash LSO controls. All vendors are working towards making their privacy controls better integrated and more effective against all "evercookies".
I know that in Chrome's incognito mode, nothing gets written to the disk at all (including Flash's Shared Objects). So if I open an incognito window, browse, then close Chrome, then open another incognito window and return to the page, does this defeat all this?
Did you completely quit the browser between your visits? Because if the instance of Chrome from which you opened the incognito windows had been running between your visits, it may have retrieved the data from memory, even if you closed the incognito window after the first visit.
Why doesn't anyone just try this? I did, and it seems that incognito mode does defeat it. However, since the demo always sets the same cookie, I couldn't tell whether it read it or set it. From the looks of things, it only read the cookie, which means that closing incognito mode deletes it.
I think the take-away here is that if you're going to use a trick like this, it might be in your best interest to be transparent with your users and offer a way for them to remove all of this information. Of course, if you're using this particular hack then you probably don't want your users to remove the cookie to begin with.
Just tried this in Opera. Having never visited the site before, I opened it in a "New Private Tab". Set Opera to reject all [normal] cookies from that domain. Saw an ID number on the page; recorded it. Opened another private tab. Saw a different ID number. Refreshed page; got yet another (different) ID number. Revisited page within same [private] tab (pressed Enter in address bar): got yet another (different) ID number. Did the same in Chrome (regular tabs): saw same behaviour.
Using two different private tabs in Opera, I get two different IDs to start with, but when using the "click to rediscover" buttons, both allegedly private tabs [eventually] end up with the same ID.
For me, that's the major question here. What are the legal implications, since using this sort of cookie involves a set of hacks that divert the normal use of various systems toward a purpose they were not intended for in the first place, and since it is intended to defeat some of the privacy protections of browsers? I am not condemning this clever system, but I am curious about the privacy and other legal issues here...
"Starting with Flash Player 10.1, Flash Player actively supports the browser's private browsing mode, managing data in local storage so that it is consistent with private browsing. So when a private browsing session ends, Flash Player will automatically clear any corresponding data in local storage."
Well, if you're not writing anything to disk, nothing should get cached, so PNG caching wouldn't work. But I'm not so sure about Flash storage: if it's using a bit of Flash to store the data, then Flashblock should stop it (assuming Flash is allowed to run in incognito at all), but if there's some other way then perhaps not.
Seems like this will be pretty effective even if widely known, since clearing one of these 'cookies' will require deleting a lot of info, including some you were probably using.
You'll be simultaneously clearing history and cache, logging yourself out of every site you're logged into, clearing all offline state in every web app you use, etc. Most people won't want to do that often.
The privacy tools built into all current browsers can clear all but one of evercookie's storage methods. Specifically, cookies, cache, history & HTML5 storage should all be included in your browser's "clear private data" feature. Flash cookies are a bit more of a problem: they're in a plugin, so the browser doesn't know about them. A tool like CCleaner would work, or you could clear them manually with Adobe's Flash control panel: http://www.macromedia.com/support/documentation/en/flashplay...
Flash Cookies: I imagine that if he can create them, then you can remove them, though I'm doubtful that extensions have full access to Flash internals.
HTML5 Storage: I'm not an expert on the different types of HTML5 storage, though since this is at the browser level, I imagine it would be easy for an extension to access them.
Regular Cookies: Obviously extensions have access to these.
Force-Cached PNGs: Not sure what access extensions have, though I imagine that Firefox extensions have a higher likelihood of access than Chrome/Chromium extensions. This is also hard to detect automatically though, unless you want to take the NoScript route and block all force-cached images unless they meet a whitelist.
Web History: Extensions obviously have access to web history, though this is something that would vary from implementation to implementation of evercookie, so it would be fighting an endless battle, like spam email filters. The best fix here would be to close the CSS history-sniffing hole.
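For the stores a page script (or a simple content-script extension) can reach itself, clearing mostly means overwriting: an ordinary cookie, for instance, is removed by re-setting it with an expiry date in the past. A small sketch of that one piece, with an invented helper name; in a browser you would assign the result to document.cookie, while here we only construct the string:

```javascript
// Build the cookie string that deletes a cookie: same name and path,
// empty value, expiry in the past. Assigning this to document.cookie
// makes the browser drop the cookie immediately.
function expireCookieString(name, path = '/') {
  return `${name}=; path=${path}; expires=Thu, 01 Jan 1970 00:00:00 GMT`;
}
```

For example, `document.cookie = expireCookieString('evercookie_id')` would remove one of evercookie's ordinary-cookie copies; the point of the list above, of course, is that the other storage vectors each need their own equivalent of this.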
I don't know how you work, but I keep the browser open 12 hours a day, which is a very long session. During that time they can track you across their network of sites, if you allowed those types of storage by default in the first place.
There was a hack to see what pages someone has visited; I see that they actually link to a page about it. The problem with that approach would seem to be that it is a cross-domain vulnerability, so other domains could detect the history and thus the evercookie data.
Not sure if they have some other mechanism for preventing this problem. I actually thought this had been resolved in some manner in many browsers, but that doesn't appear to be the case.