Loving R2. I'm having an issue uploading larger files though, like 100MB+. The error I get is:
Unable to write file at location: JrF3FnkA9W.webm. An exception occurred while uploading parts to a multipart upload. The following parts had errors: - Part 17: Error executing "UploadPart" on {URL}
with the message:
"Reduce your concurrent request rate for the same object."
Is this an issue on my end or Cloudflare's? I'm not doing anything aggressive, just trying to upload one video at a time using Laravel's S3 filesystem driver. It works great on smaller files.
Known issue. Currently, multipart uploads are limited to 2 parts being uploaded concurrently per upload ID. We have a fix pending that should remove that bottleneck within the next month or so (maybe sooner). The change will show up on https://developers.cloudflare.com/r2/platform/changelog/
For now, there are typically settings you can configure in whatever client you're using to lower the upload concurrency.
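As a workaround in Laravel, one option is to bypass `Storage::put()` for big files and drive the multipart upload yourself with the AWS SDK's `MultipartUploader`, capping the concurrency at 2. This is only a minimal sketch under some assumptions: an `r2` disk configured with the `s3` driver, Laravel 9+ (which exposes `getClient()` on the S3 adapter), and an illustrative `$localPath` variable for the file being uploaded.

```php
<?php

use Aws\Exception\MultipartUploadException;
use Aws\S3\MultipartUploader;
use Illuminate\Support\Facades\Storage;

// Underlying Aws\S3\S3Client from the configured "r2" disk (Laravel 9+).
$client = Storage::disk('r2')->getClient();

$uploader = new MultipartUploader($client, fopen($localPath, 'rb'), [
    'bucket'      => config('filesystems.disks.r2.bucket'),
    'key'         => 'JrF3FnkA9W.webm',
    'concurrency' => 2,                 // stay within R2's current per-upload-ID limit
    'part_size'   => 50 * 1024 * 1024,  // bigger parts => fewer requests overall (min 5 MiB)
]);

try {
    $uploader->upload();
} catch (MultipartUploadException $e) {
    report($e); // the exception lists which parts failed
}
```

Larger `part_size` values also help here, since fewer parts means fewer chances to hit the concurrency cap on a 100MB+ file.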
To answer my own question: no, this is about concurrent requests. It's 2 at once, but it will do up to 500 for each, as many times as it needs to, to upload a 20GB file.