Backup failed: Task 39 timed out.
-
Who knows if this is related, but I had backups start failing a little while ago. I turned off a few apps that seemed to be culprits and narrowed it down by running backups: if a backup failed, I turned off another app until it succeeded, then turned those apps back on one at a time to see which would trigger the failure. Long story short, I tracked it down to some images in a WordPress uploads directory, deleted those images, added the app back onto the backup list, and then it worked. If I remember correctly it had something to do with a corrupted jpeg/jpg/JPG/JPEG (the suffixes matter, sometimes). Anyway, it is a hassle, but you might have to do similar sleuthing. This is one reason I use another cloud service, cloudinary.com. The free tier has been perfect.
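(Not from the original post, but as a sketch of that kind of sleuthing: a small Node.js script that walks an uploads directory and flags JPEGs missing the standard start-of-image (FF D8) or end-of-image (FF D9) markers, a common sign of truncation or corruption. The default path and the heuristic are my assumptions, not anything Cloudron or WordPress ships.)

// find-bad-jpegs.js -- hedged sketch; the default uploads path below is hypothetical
const fs = require('fs');
const path = require('path');

// flag files that do not begin with the JPEG SOI marker or end with the EOI marker
function checkJpeg(file) {
    const buf = fs.readFileSync(file);
    const hasSoi = buf.length > 4 && buf[0] === 0xFF && buf[1] === 0xD8;
    const hasEoi = buf.length > 4 && buf[buf.length - 2] === 0xFF && buf[buf.length - 1] === 0xD9;
    if (!hasSoi || !hasEoi) console.log('possibly corrupt:', file);
}

// recurse through the directory tree, checking every .jpg/.jpeg (any case)
function walk(dir) {
    for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
        const full = path.join(dir, entry.name);
        if (entry.isDirectory()) walk(full);
        else if (/\.jpe?g$/i.test(entry.name)) checkJpeg(full);
    }
}

walk(process.argv[2] || './wp-content/uploads');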
-
Do you have an idea how long that task ran before timing out? The task timeout is currently set to 12h; maybe that is not enough in your case?
-
@zjuhasz It's hardcoded. For now, you can edit /home/yellowtent/box/src/backups.js. On line 1217 it says:

tasks.startTask(taskId, { timeout: 12 * 60 * 60 * 1000 /* 12 hours */ }, function (error, backupId) {

Change the above time to, say, 20 hours or something:

tasks.startTask(taskId, { timeout: 20 * 60 * 60 * 1000 /* 20 hours */ }, function (error, backupId) {

After changing, do a sudo systemctl restart box. (I am looking for a better fix to this in Cloudron 5.5)
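(If you'd rather script that edit than open an editor, a minimal Node sketch along these lines should do it; the path and the 12h/20h values come from the post above, everything else is my assumption, and the exact source line may differ between Cloudron versions.)

// patch-timeout.js -- unofficial sketch; run with node, then restart box
const fs = require('fs');

const file = '/home/yellowtent/box/src/backups.js';
const src = fs.readFileSync(file, 'utf8');
const patched = src.replace(
    'timeout: 12 * 60 * 60 * 1000 /* 12 hours */',
    'timeout: 20 * 60 * 60 * 1000 /* 20 hours */'
);
if (patched === src) throw new Error('timeout line not found; the source may have changed');
fs.writeFileSync(file, patched);
console.log('patched; now run: sudo systemctl restart box');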
-
@girish OK, I just finished a backup after changing the timeout value and it's not timing out anymore, but I am getting a new error. I've put *** in place of the long encrypted app name.
Error uploading backup/snapshot/app.***.tar.gz.enc. Message: Part number must be an integer between 1 and 10000, inclusive HTTP Code: InvalidArgument
-
-
I am guessing this is because of the large file size (over the 5GB S3 single-upload limit), so we do a multi-part upload, and Wasabi is unable to handle it. Can you check with them how many parts of a multi-part upload they support, and what part size they recommend? We then have to find out whether they are capable of handling 300GB files as well.
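(A back-of-the-envelope sketch of the arithmetic, mine rather than from the thread: with the S3-style limits of 10,000 parts and 5GB per part, a 300GB object only needs 60 parts at 5GB each, so hitting the part-number cap usually means the uploader is using a much smaller part size, e.g. a 5MB SDK default.)

// part-math.js -- sketch; the 5MB default part size is an assumption about the uploader
const GB = 1024 * 1024 * 1024;
const MB = 1024 * 1024;

const fileSize = 300 * GB;      // the backup size mentioned above
const maxParts = 10000;         // the limit named in the error message
const smallPartSize = 5 * MB;   // a common SDK default part size

console.log('parts at 5MB each:', Math.ceil(fileSize / smallPartSize));        // 61440 -> rejected
console.log('minimum part size:', Math.ceil(fileSize / maxParts) / MB, 'MB');  // ~30.72 MB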
-
@girish Seems to be the same: 5GB maximum part size and a 10,000 part limit. They do support some other mechanisms that are not part of S3, though, like composing objects and appending to objects. Why is Cloudron going over 10,000 parts on a 300GB upload?
I don't have any particular preference for Wasabi; I just chose them because they were the cheapest I could find, and I'm expecting my backups to get quite large over time. While we investigate Wasabi, could you recommend a storage provider? Whichever one is the most well tested. Would it be Amazon?
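(Not a confirmed description of Cloudron's internals, but if the uploader is the AWS SDK v2 managed upload with its default 5MB part size, that alone would push a 300GB object past 10,000 parts. The SDK accepts a larger partSize option; a hedged sketch, with the bucket and file names made up:)

// hedged sketch assuming the aws-sdk v2 ManagedUpload; credentials come from the environment
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3({ endpoint: 'https://s3.wasabisys.com' }); // Wasabi's S3 endpoint
const file = 'backup.tar.gz.enc'; // hypothetical file name
const fileSize = fs.statSync(file).size;

// keep the part count under the 10,000-part cap, but never below the 5MB S3 minimum
const partSize = Math.max(5 * 1024 * 1024, Math.ceil(fileSize / 10000));

s3.upload(
    { Bucket: 'my-backups', Key: file, Body: fs.createReadStream(file) }, // bucket name is made up
    { partSize: partSize, queueSize: 4 },
    function (err, data) {
        if (err) return console.error('upload failed:', err);
        console.log('uploaded to', data.Location);
    }
);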
-
@zjuhasz Guessing that all the S3-compatible providers may have the same limits. We have 120TB on Wasabi with other systems, and we've always been happy with Backblaze B2 too; it is also S3 compatible and is integrated and tested with Cloudron now (I only tested smaller backups, though), if that helps.
-
-
@girish I've been using this method of manually editing the js file to set a longer timeout for a while now for my uploads, and as of the most recent update it no longer works. Any ideas?
-