DigitalOcean Space Backup Failed - SlowDown: Reduce your request rate.
-
@andirahmat I don't see any downtime from their status page. Mmm, I am not sure why it thinks you need to slow down, since you are only uploading tar.gz archives (so it's really just one backup file per app). Do you think you can ask their support?
-
You're getting rate limited. https://www.digitalocean.com/docs/spaces/#limits clarifies when the limits kick in. The one that's probably coming into play is "150 PUTs, 150 DELETEs, 150 LISTs, and 240 other requests per second to any individual Space". If you're using the rsync backup mechanism, that's dead easy to hit with a reasonably sized collection of files, especially small ones that the box can rip through faster than that rate limit. DO Spaces is very...technically acceptable...in most cases, but this is one where it just isn't up to the job. The "limits" section goes on and on about the various limits; that's the tradeoff for the simplicity in pricing (e.g. lack of complex/nuanced billing logic) it offers.
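To make that concrete: a file-by-file (rsync-style) backup issues at least one PUT per file, so a machine that can push a few hundred small files a second blows past the 150 PUT/s cap almost immediately. Purely as an illustration (this is not Cloudron's actual backup code; the endpoint, keys and bucket are placeholders), a client-side throttle would look roughly like this:

```python
# Sketch only: upload many small files, one PUT each, while staying under
# the per-Space limit of ~150 PUTs per second. All names are placeholders.
import time
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://sgp1.digitaloceanspaces.com",  # placeholder region
    aws_access_key_id="SPACES_KEY",
    aws_secret_access_key="SPACES_SECRET",
)

MAX_PUTS_PER_SECOND = 150

def upload_files(paths, bucket):
    window_start = time.monotonic()
    puts_in_window = 0
    for path in paths:
        if puts_in_window >= MAX_PUTS_PER_SECOND:
            # Wait out the remainder of the current one-second window.
            elapsed = time.monotonic() - window_start
            if elapsed < 1.0:
                time.sleep(1.0 - elapsed)
            window_start = time.monotonic()
            puts_in_window = 0
        with open(path, "rb") as f:
            s3.put_object(Bucket=bucket, Key=path, Body=f)
        puts_in_window += 1
```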
-
@jimcavoli Ah I see. Is there anything we can do on the Cloudron side? I do not think I use rsync btw. I've also enabled the CDN. Or would deleting existing backup files get around the limitation?
-
@andirahmat Unfortunately, no, there's not much to be done. The limitation of the service is there all the time. The backup mechanism could be made to back off a little more aggressively in the case of 503 errors specifically, trading speed for success, I suppose. @girish could help make the call on whether this has been an issue with other backends as well.
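For what it's worth, by "back off more aggressively" I mean something along these lines, as a sketch only against a boto3 client (this is not what Cloudron actually ships, and the endpoint/credentials are placeholders):

```python
# Sketch: retry a PUT with exponential back-off when Spaces answers with
# SlowDown / HTTP 503. Illustrative only, not Cloudron's backup code.
import time
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client(
    "s3",
    endpoint_url="https://sgp1.digitaloceanspaces.com",  # placeholder
    aws_access_key_id="SPACES_KEY",
    aws_secret_access_key="SPACES_SECRET",
)

def put_with_backoff(bucket, key, body, max_attempts=8):
    delay = 1.0
    for attempt in range(1, max_attempts + 1):
        try:
            return s3.put_object(Bucket=bucket, Key=key, Body=body)
        except ClientError as err:
            code = err.response.get("Error", {}).get("Code")
            status = err.response.get("ResponseMetadata", {}).get("HTTPStatusCode")
            if (code == "SlowDown" or status == 503) and attempt < max_attempts:
                time.sleep(delay)  # trade speed for success
                delay *= 2         # back off harder each time
            else:
                raise
```

boto3's built-in retry modes already cover some of this; the point is simply that waiting longer between attempts beats failing the whole backup.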
-
@andirahmat It's quite strange that you hit rate limits when using the tgz format. Are you able to reach out to us on support@cloudron.io so we can debug this further? We should be far below the request limit. Also, AFAIK, the rate limits are per account/token. Could something other than Cloudron be using the same DO Space?
-
@girish said in DigitalOcean Space Backup Failed - SlowDown: Reduce your request rate.:
Could something other than Cloudron be using the same DO Space?
Hi. I only have one Space and only use Cloudron with it.
Thank you Girish, I'm opening a ticket with Cloudron support. Also, here is the response from DO support:
Thank you for your cooperation.
Can you please confirm exactly how you are trying to upload the content? Is it through the control panel? If yes, then can you please try with the s3cmd tool?
Please look at the article below, which will help you understand how to upload files using s3cmd.
https://www.digitalocean.com/docs/spaces/resources/s3cmd-usage/#upload-files-to-a-space
-
@girish I've sent an email to support@cloudron.io.
There is a follow-up response from DO support; I tried again but it still failed:
Thanks for reaching out to us. We have escalated this issue to our engineering team. They mentioned that there was a dip in API availability around the time of this issue. Just to confirm, are you still experiencing these issues? If so, we will need you to try to upload using s3cmd, and we will need to see the output of any errors that occur:
https://www.digitalocean.com/docs/spaces/resources/s3cmd/
https://www.digitalocean.com/docs/spaces/resources/s3cmd-usage/
-
@andirahmat Are you still seeing the rate limit error regularly?
-
This turned out to be a DO SGP region issue. If you try a multi-part upload with a part size > 15MB, it fails (!). I was able to reproduce this with the s3cmd tool.
root@localhost:~# s3cmd --access_key=key --secret_key=secret --host=sgp1.digitaloceanspaces.com --host-bucket=bucket --multipart-chunk-size-mb=20 put 500MB.bin s3://bucket/500MB.bin
upload: '500MB.bin' -> 's3://bgsmd/500MB.bin'  [part 1 of 25, 20MB] [1 of 1]
 20971520 of 20971520   100% in    0s    42.81 MB/s  done
WARNING: Upload failed: /500MB.bin?partNumber=1&uploadId=2~mT5ZeflCYTchMXHNZc5EZUyc9ABNpT- (500 (UnknownError))
WARNING: Waiting 3 sec...
upload: '500MB.bin' -> 's3://bgsmd/500MB.bin'  [part 1 of 25, 20MB] [1 of 1]
 20971520 of 20971520   100% in    5s     3.82 MB/s  done
WARNING: Upload failed: /500MB.bin?partNumber=1&uploadId=2~mT5ZeflCYTchMXHNZc5EZUyc9ABNpT- (500 (UnknownError))
WARNING: Waiting 6 sec...
Reducing the chunk size all the way down to 15MB makes it work.
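If anyone wants to reproduce the same thing from code rather than s3cmd, here is a rough boto3 sketch that pins multipart parts to 15MB (bucket name, credentials and file name are placeholders):

```python
# Sketch: force multipart uploads to use 15MB parts, which is what worked
# against the SGP1 endpoint in the test above. All names are placeholders.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client(
    "s3",
    endpoint_url="https://sgp1.digitaloceanspaces.com",
    aws_access_key_id="SPACES_KEY",
    aws_secret_access_key="SPACES_SECRET",
)

config = TransferConfig(
    multipart_threshold=15 * 1024 * 1024,  # switch to multipart above 15MB
    multipart_chunksize=15 * 1024 * 1024,  # 15MB parts; 20MB parts failed
)

s3.upload_file("500MB.bin", "bucket", "500MB.bin", Config=config)
```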
-
@andirahmat if you open a support ticket with them, let us know the outcome. I cannot open a ticket from our account because SGP1 Spaces is disabled for new spaces creation. Looks like they hit some capacity problems there and are thus limiting uploads.