
Loving R2. I am having an issue of uploading larger files though, like 100MB+. The error I get is:

Unable to write file at location: JrF3FnkA9W.webm. An exception occurred while uploading parts to a multipart upload. The following parts had errors: - Part 17: Error executing "UploadPart" on {URL}

with the message:

"Reduce your concurrent request rate for the same object."

Is this an issue on my end or Cloudflare's? I'm not doing anything aggressive: I'm uploading one video at a time using Laravel's S3 filesystem driver. It works great on smaller files.




Known issue. Currently, multipart uploads can only upload 2 parts per upload ID concurrently. We have a fix pending that should remove that bottleneck within the next month or so (maybe sooner). The change will show up on https://developers.cloudflare.com/r2/platform/changelog/

For now, most clients have settings you can configure to lower the upload concurrency.


Does this mean there's currently a max upload size of 1GB, since each part can be at most 500MB and you can only upload 2 parts?


To answer my own question: no. The limit is on concurrent requests, so only 2 parts are in flight at once, but the client keeps uploading parts of up to 500MB each, as many times as it needs to, e.g. to upload a 20GB file.
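To make that arithmetic concrete (assuming the 500MB-per-part figure from the question above), a 20GB file splits into 41 parts, uploaded 2 at a time:

```python
import math

PART_SIZE_MB = 500        # max part size assumed in the question above
CONCURRENCY = 2           # R2's current parts-in-flight limit per upload ID
file_size_mb = 20 * 1024  # a 20GB file

parts = math.ceil(file_size_mb / PART_SIZE_MB)
rounds = math.ceil(parts / CONCURRENCY)

print(parts)   # 41 parts total
print(rounds)  # 21 rounds of up to 2 concurrent part uploads
```

So the concurrency cap slows large uploads down but does not bound the total object size.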


Awesome, thank you for the answer :)



