Workers Sites seems nice, but it's still limited to 2 MB per file (the same as KV's maximum value size): https://developers.cloudflare.com/workers/sites/reference/

Depending on the frameworks used and how big your application's own code base is, webpack and other build systems can easily produce artifacts above 2 MB, not to mention that images can also exceed 2 MB.
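
If you control the build, one build-side workaround is asking webpack to keep chunks under the cap. A rough sketch of the config (maxSize is a hint, not a hard guarantee, so in practice you'd leave some headroom):

  // webpack.config.js (sketch): cap emitted chunk size so no single
  // asset exceeds the KV value limit. maxSize is best-effort, so
  // you'd realistically target something below the actual cap.
  module.exports = {
    optimization: {
      splitChunks: {
        chunks: 'all',
        maxSize: 2 * 1024 * 1024, // ~2 MB per chunk
      },
    },
  };

That doesn't help with large images, of course.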

FD: I worked on a Workers-based file-hosting project before this first-party solution came out; my system splits and stitches files, which makes files >2 MB possible, though incrementally more expensive. https://github.com/judge2020/cloudflare-file-hosting


I'm the PM on KV, and you're absolutely right. I'm interested in possibly raising this limit. Large media like videos will probably never make sense for KV, but setting cases like that aside, what limit would be high enough that you'd never have to worry about hitting it 99.9% of the time?


Perhaps 5–10 MB, though websites range from really heavy to really light. I don't think Workers should accommodate installing all ten of the top ten most-used frameworks with tree shaking disabled, but 2 MB is limiting, and raising it would make this viable for many more existing sites.


Thanks for the input :)


Building a non-prod Angular project produces a 7 MB JS file, so the 2 MB limit definitely needs to be raised. I think it would make sense to have a higher limit for certain file types, like JS and HTML.

I am very interested in this feature, but the limit is a bit concerning right now.


Thanks! To give some context here, KV is itself implemented as a Worker, and we encrypt your contents, so we're also subject to CPU limits. There's a trade-off here, and experience reports are super valuable, so thanks :)


As a workaround, maybe you could split large files into two (or more) key-value pairs behind the scenes, and then bill appropriately for the use of two key-value pairs instead of just one.
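
Something like this minimal sketch, from a user-land Worker's point of view (FILES here is an assumed KV namespace binding, and the "name:<i>" key scheme is just an illustration, not anything KV provides):

  // Sketch: write a large payload as multiple <=2 MB KV values.
  // FILES is an assumed KV namespace binding; the "name:<i>" and
  // "name:meta" keys are a made-up convention for this example.
  const CHUNK_SIZE = 2 * 1024 * 1024;

  async function putLargeFile(name, buffer) {
    const chunkCount = Math.ceil(buffer.byteLength / CHUNK_SIZE);
    for (let i = 0; i < chunkCount; i++) {
      await FILES.put(`${name}:${i}`,
        buffer.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE));
    }
    // Record how many pieces there are so reads know what to fetch.
    await FILES.put(`${name}:meta`, JSON.stringify({ chunkCount }));
  }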


Possibly, yeah! You won't necessarily get a ton of speed that way...


This is how my file-stitching logic works, and it's actually pretty fast (some users reported 2–30 Mbps, depending on whether the DC had those keys locally cached); it's just incrementally more expensive for each additional 2 MB of file size. https://github.com/judge2020/cloudflare-file-hosting/tree/dd... But it might be a little much for a first-party solution.


Thanks for the tip. By the way, line 84 introduces some unneeded serialization; you could get some more speed there by fetching the bits in parallel.
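
Roughly, kick off every get() at once and let Promise.all keep them ordered (same assumed FILES binding and hypothetical key scheme as the splitting sketch upthread):

  // Sketch: fetch every chunk concurrently instead of one at a time.
  async function getLargeFile(name) {
    const meta = JSON.parse(await FILES.get(`${name}:meta`));
    const chunks = await Promise.all(
      Array.from({ length: meta.chunkCount }, (_, i) =>
        FILES.get(`${name}:${i}`, 'arrayBuffer')));
    // Stitch the chunks back into a single buffer for the response.
    const out = new Uint8Array(
      chunks.reduce((n, c) => n + c.byteLength, 0));
    let offset = 0;
    for (const c of chunks) {
      out.set(new Uint8Array(c), offset);
      offset += c.byteLength;
    }
    return out.buffer;
  }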
