>Storing cookies in RGB values of auto-generated, force-cached PNGs using HTML5 Canvas tag to read pixels (cookies) back out
>Storing cookies in Web History (seriously. see FAQ)
Brilliant and EVIL. Wow.
Storing the cookie as a simple variable in a cached JS file is IMHO a better solution if you're trying to be sneaky; there's nothing unusual about a variable assignment or a cacheable JS file.
That's pretty "nice". It might be possible to "improve" it by storing metadata inside the PNG, and then reading it back by parsing the raw data returned by toDataURL().
I haven't tried this, though, and it's possible browsers drop the metadata when they recreate the image. The spec says: "A future version of this specification will probably define other parameters to be passed to toDataURL() to allow authors to more carefully control compression settings, image metadata, etc."
http://www.nihilogic.dk/labs/imageinfo/ shows how to extract EXIF data from JPEG files, so using EXIF + the cache hack is possible for sure.
Can anyone think why just using the cache hack + a JSON data file wouldn't work?
Also, it's not clear whether you get access to the actual binary data of the image as it was served, or to new data generated from the image as it is displayed; hence my question about whether using the metadata would work.
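For the pixel route specifically, the round-trip can be sketched with two plain functions (the function names and packing scheme are my own, not evercookie's): the server bakes the cookie string into RGB triples when it generates the PNG, and the client recovers it from the byte array that `ctx.getImageData(...)` returns after drawing the force-cached image onto a canvas.

```javascript
// Pack a cookie string into an RGBA byte array, 3 payload bytes per
// pixel; the alpha channel is kept opaque so a lossless PNG round-trips
// the values exactly. This is what the server-side PNG generator would
// encode.
function packCookie(str) {
  const pixels = Math.ceil(str.length / 3);
  const data = new Uint8Array(pixels * 4);
  for (let i = 0; i < pixels; i++) {
    for (let c = 0; c < 3; c++) {
      const ch = str.charCodeAt(i * 3 + c);
      data[i * 4 + c] = Number.isNaN(ch) ? 0 : ch; // 0 pads the tail
    }
    data[i * 4 + 3] = 255; // opaque alpha
  }
  return data;
}

// Inverse: what you'd run in the browser on
// ctx.getImageData(0, 0, w, h).data after drawing the cached PNG.
function unpackCookie(data) {
  let out = '';
  for (let i = 0; i < data.length; i += 4) {
    for (let c = 0; c < 3; c++) {
      if (data[i + c] !== 0) out += String.fromCharCode(data[i + c]);
    }
  }
  return out;
}
```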
If all you want to do is track users, it's far easier to use UserAgent/screensize/plugins/etc to uniquely identify users.
You can then store anything heavier server side.
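A rough sketch of that fingerprinting idea, hashing a few browser traits into one short ID (the field list is illustrative and far from exhaustive; real fingerprinting scripts also use fonts, timezone, canvas rendering, etc., and the hash choice here, djb2, is mine):

```javascript
// djb2 string hash, kept in 32-bit range via shifts (h*33 + c mod 2^32).
function djb2(str) {
  let h = 5381;
  for (let i = 0; i < str.length; i++) {
    h = (((h << 5) + h) + str.charCodeAt(i)) >>> 0;
  }
  return h.toString(16);
}

// In a browser you'd pass e.g.:
//   [navigator.userAgent,
//    screen.width + 'x' + screen.height,
//    navigator.plugins.length,
//    navigator.language]
function fingerprint(parts) {
  return djb2(parts.join('||'));
}
```

The same traits hash to the same ID on every visit, with no state stored on the client at all, which is why clearing cookies doesn't help against it.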
There will be some backlash against this sort of thing, seen in an increasing number of surfers stripping back enabled browser functionality.
Nothing evil about it in the slightest. We store it in cookies, people lose them right and left and then are all confused about it. I could use Flash myself, but really, this sounds like a great system.
I'm curious more than anything else. For example, using this persistent cookie as an alternative to having users login?
Browsers offer the user the ability to remove cookies (manually, or all at once), because users want privacy.
This clever library manages to exploit browser features to get around that and store identification information persistently, against the will of the user.
If the user doesn't want to delete or disable cookies, then you can use normal cookies for the purpose of legitimate "remember me" functionalities.
However, I can understand that not all users know what cookies are, and there are many people who might have cookies disabled by default (a sysadmin's choice in a company) or because somebody told them to do so.
So, you could reason this way:
"There is no point disabling cookies anymore, since anybody could employ this trick to circumvent it. There are people who disable cookies because of obsolete 'security policies' which are no longer secure. I want to make a webapp that works for everybody. It requires cookies. People are paranoid and employ obsolete security policies which no longer protect them. I can exploit the same trick to circumvent their default security policy for morally good reasons."
Of course they want to run e.g. Skype; who doesn't, right?! I know there is something arbitrary in all that; that's the point.
I didn't say it was ethical to circumvent the user's wishes. I said that some people might reason in such a way that they feel morally excused for exploiting something perceived as an unethical technique in order to achieve a licit goal.
The main points behind this mindset are:
(here "you" are the application developer, not the evil guy, of course)
* point out that the user de facto has no control over his privacy by disabling cookies, since the Bad Guys (TM) already have a hack to get around it.
* point out that the user is often not even conscious of what the privacy and security risks are, and runs a browser preconfigured by a sysadmin, nephew, or whoever, who may have decided to conservatively block cookies "because they are bad".
* you are not exploiting the cookies with the purpose of invading the user's privacy. You are just building an application X (see the grandparent question) which exploits the same hack to get around the 'default paranoid settings'.
* you feel stupid limiting your application's functionality just to obey an obviously broken rule. It would be like Skype saying "oh, there is a firewall; I know how to get around it, but I won't, because it's unethical, since people have the right to set up a firewall according to their wishes".
(of course these points are valid once this technique becomes mainstream, and all tracking sites employ it)
I'm not saying that behaving this way is ethical, or even less unethical. I'm just supposing that there might be uses of this technique which are not intended to trace the identity of a user for malicious reasons (marketing etc.) but to provide some functionality to the average user of a particular product (who asks for it).
People might be pissed off because some features don't work; they don't care why. Application providers are also pissed off when half of their users cannot use a given feature because some sysadmin/security software/nephew hacker decided to impose a restriction (settings, firewall rules, etc.), even if there are valid reasons for the restriction to be there.
Perhaps one day Samy will look back and reflect that he isn't an evil man, though he has done evil things.
(The thing is I'm not even sure how serious I am. On the one hand, damn, clever. But on the other hand, I can see some truly miserable privacy issues at play here.)
Keeping these things quiet helps nobody. We need more privacy and security issues to be publicly demonstrated so that they'll get fixed instead of ignored.
As an example, his work exploiting wireless routers to get location is genius. Who would have thought that having your router's wireless MAC available to your internal network allows a website to determine your location to within a few hundred feet? It uses well known and oft ignored attack methods to produce a sensational result with which everyone can immediately identify.
Still, with it all packed up so tidily, a few rascals will do something interesting with it.
>400 Bad Request
Cross Site Action detected!
Sweet :) Though that's vs the vanilla script. Anyone know if there's one that works against DD-WRT?
Browser vendors are already aware of this and are working to make evercookie no worse than a regular cookie; e.g. Mozilla blocked reading of visited-link history, and Chrome's privacy window has a link to the Flash LSO controls. All vendors are working towards making privacy control better integrated and more effective against all "evercookies".
It has never been. The vast majority (90%+) of browsers are uniquely identifiable simply from user agent, plugins, capabilities, etc.
Will have to repeat this experiment with the current version.
I think the take-away here is that if you're going to use a trick like this, it might be in your best interest to be transparent with your users and offer a way for them to remove all of this information. Of course, if you're using this particular hack then you probably don't want your users to remove the cookie to begin with.
I don't know why people allow cookies to persist between browser sessions. I've been clearing them on exit for years now and it really doesn't make it more difficult to use the Web.
Could a Greasemonkey script automatically clean up the supercookies after they have been planted?
Before you do that though, have a look at what info is stored under samy.pl. Nice to see Chrome list HTML5 storage and cookies etc in one place.
* I'm not sure whether the above is really effective.
* Repeating this for N sites that uses this is going to be fun. There's always the whitelisting approach, which is available in Chrome too.
Using two different private tabs in Opera, I get two different IDs to start with, but when using the "click to rediscover" buttons, both allegedly private tabs [eventually] end up with the same ID.
"Starting with Flash Player 10.1, Flash Player actively supports the browser's private browsing mode, managing data in local storage so that it is consistent with private browsing. So when a private browsing session ends, Flash Player will automatically clear any corresponding data in local storage."
You'll be simultaneously clearing history and cache, logging yourself out of every site you're logged into, clearing all offline state in every web app you use, etc. Most people won't want to do that often.
Would a browser extension be able to clear everything?
i.e. The Escapist thinks you're a scraping bot and bans you from viewing their videos for a week.
HTML5 Storage: I'm not an expert on the different types of HTML5 storage, though since this is at the browser level, I imagine it would be easy for an extension to access them.
Regular Cookies: Obviously extensions have access to these.
Force-Cached PNGs: Not sure what access extensions have, though I imagine that Firefox extensions have a higher likelihood of access than Chrome/Chromium extensions. This is also hard to detect automatically though, unless you want to take the NoScript route and block all force-cached images unless they meet a whitelist.
Web History: Extensions obviously have access to web history, though this is something that would vary from implementation to implementation of evercookie, so it would be fighting an endless battle, like spam email filters. The best fix here would be to close the CSS history hack hole.
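A userscript along the lines suggested above could at least sweep the script-visible stores. A hedged sketch (the script name and helper are mine): from page context it can only reach cookies, localStorage, and sessionStorage; Flash LSOs, the HTTP cache, and the force-cached PNGs need extension- or browser-level APIs.

```javascript
// ==UserScript==
// @name  supercookie-sweep (sketch)
// ==/UserScript==

// Build the string that expires a cookie when assigned back to
// document.cookie (kept as a pure function so it's testable).
function expireCookie(name) {
  return name + '=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/';
}

// Wipe every store a page script can see.
function sweep() {
  for (const pair of document.cookie.split(';')) {
    const name = pair.split('=')[0].trim();
    if (name) document.cookie = expireCookie(name);
  }
  localStorage.clear();
  sessionStorage.clear();
}

// Only wire up the listener in a real browser context.
if (typeof document !== 'undefined') {
  window.addEventListener('unload', sweep); // sweep when leaving the page
}
```

Note this still loses the endless-battle problem described above: evercookie re-plants from whichever store the sweep missed.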
Firefox has the BetterPrivacy extension. https://addons.mozilla.org/en-US/firefox/addon/6623/
about:config -> dom.storage.enabled -> false
The above causes evercookie to fail for me.
Not sure if they have some other mechanism for preventing this problem. I actually thought this problem had been resolved in some manner in many browsers but that doesn't appear to be the case.
// sorry google.
var url = 'http://www.google.com/evercookie/cache/' + this.getHost() + '/' + name;
Chrome Incognito mode.
New cookie every F5.