Cloudron Forum

Backup failed: Task 39 timed out.

Solved | Support
Tags: backups, timeout, wasabi
12 posts, 7 posters, 2.2k views
#1 zjuhasz (last edited by girish)

My backups are failing to complete with the error "task 39 timed out". I'm using Wasabi, and my backup is about 300GB. It always fails at a little less than 83%.

#2 nebulon (Staff)

      Do you have an idea how long that task ran before timing out? The current task timeout should be set to 12h, maybe that is not enough in your case?

#3 scooke (last edited by scooke)

Who knows if this is related, but I had backups start failing a little while ago. I turned off a few apps that seemed to be culprits and narrowed it down by running backups: if a backup failed, I turned off another app until one succeeded, then turned those apps back on to see which would trigger the failure. Long story short, I tracked it down to some images in a WordPress uploads directory; I deleted those images, added the app back onto the backup list, and then it worked. If I remember correctly it had something to do with a corrupted jpeg/jpg/JPG/JPEG (the suffixes matter, sometimes). Anyway, it is a hassle, but you might have to do similar sleuthing. This is one reason I use another 'cloud-' service, cloudinary.com. The free tier has been perfect.

        A life lived in fear is a life half-lived


#4 zjuhasz

          @nebulon that's probably it. It seems like it's happening right around 12 hours. Is there a setting I can change for the timeout?

#5 girish (Staff, last edited by girish)

@zjuhasz It's hardcoded. For now, you can edit /home/yellowtent/box/src/backups.js. On line 1217 it says:

                    tasks.startTask(taskId, { timeout: 12 * 60 * 60 * 1000 /* 12 hours */ }, function (error, backupId) {
            

Change the above time to, say, 20 hours:

                    tasks.startTask(taskId, { timeout: 20 * 60 * 60 * 1000 /* 20 hours */ }, function (error, backupId) {
            

After changing it, run sudo systemctl restart box.

            (I am looking for a better fix to this in Cloudron 5.5)
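
For anyone curious what that value actually governs: the timeout is a plain milliseconds number handed to the task runner, and a task that outlives it is killed, which surfaces as "task 39 timed out". A minimal sketch of the general pattern (runBackup is a hypothetical stand-in here; Cloudron's real tasks.startTask API differs):

    // Generic task-timeout pattern: race the task against a timer.
    const runBackup = () => new Promise((resolve) => setTimeout(resolve, 1000)); // hypothetical stand-in task

    function withTimeout(taskPromise, timeoutMs) {
        let timer;
        const timeout = new Promise((resolve, reject) => {
            timer = setTimeout(() => reject(new Error('task timed out')), timeoutMs);
        });
        return Promise.race([taskPromise, timeout]).finally(() => clearTimeout(timer));
    }

    // Same hours-to-milliseconds arithmetic as the backups.js line above:
    withTimeout(runBackup(), 20 * 60 * 60 * 1000 /* 20 hours */)
        .then(() => console.log('backup finished'))
        .catch((error) => console.error(error.message));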


#6 zjuhasz

@girish OK, the backup just finished after changing the timeout value. It's not timing out anymore, but I am getting a new error. I put in *** instead of the long encrypted app name.

              Error uploading backup/snapshot/app.***.tar.gz.enc. Message: Part number must be an integer between 1 and 10000, inclusive HTTP Code: InvalidArgument


#7 d19dotca

                @zjuhasz Shouldn't it just be backup/snapshot/app.*.tar.gz.enc (one asterisk)?

                --
                Dustin Dauncey
                www.d19.ca

#8 girish (Staff, last edited by girish)

I am guessing this is because of the large file size (above 5GB, the S3 single-upload limit, we do a multi-part upload) and Wasabi is unable to handle it. Can you check with them how many parts of a multi-part upload they support, and what part size they recommend? We then have to find out whether they can handle 300GB files as well.
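
To put rough numbers on that guess (illustrative arithmetic only, not Cloudron's actual uploader): S3-compatible stores accept part numbers 1 through 10,000, with each part between 5MiB and 5GiB (the last part may be smaller). An uploader that splits a file into fixed-size parts exceeds the cap as soon as the file outgrows partSize * 10000; the partSize below is a hypothetical example:

    const MiB = 1024 * 1024;
    const GiB = 1024 * MiB;

    // Number of parts a fixed-size splitter produces for a given file.
    const partCount = (fileSize, partSize) => Math.ceil(fileSize / partSize);

    const backupSize = 300 * GiB; // roughly the backup size in this thread
    const partSize = 16 * MiB;    // hypothetical fixed part size

    console.log(partCount(backupSize, partSize)); // 19200 parts -> "Part number must be
                                                  // an integer between 1 and 10000"
    // Smallest part size that keeps 300GiB under the 10,000-part cap:
    console.log(Math.ceil(backupSize / 10000 / MiB) + ' MiB'); // 31 MiB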


#9 zjuhasz

@girish Seems to be the same: <5GB part size and a 10,000-part limit. They do support some other mechanisms not included in S3, though, such as composing objects and appending to objects. Why is Cloudron going over 10,000 parts on a 300GB upload?

I don't have any particular preference for Wasabi; I just chose them because they were the cheapest I could find, and I'm expecting my backups to get quite large over time. While we investigate Wasabi, could you recommend a storage provider, whichever one is the most well tested? Would it be Amazon?


#10 marcusquinn

@zjuhasz Guessing that all the S3-compatible providers may have the same limits. We have 120TB on Wasabi with other systems, and we've always been happy with Backblaze B2 too; it's also S3-compatible, and it's now Cloudron-integrated and tested (I only tested smaller backups though), if that helps.

                      Web Design https://www.evergreen.je
                      Development https://brandlight.org
                      Life https://marcusquinn.com


#11 shan (last edited by shan)

@girish I've been using this method of manually editing the js file to a longer timeout for a while now for my uploads, and as of the most recent update it no longer works. Any ideas?


#12 girish (Staff)

                          @shan I will mark this thread as solved. Let's follow up at https://forum.cloudron.io/topic/7750/backup-uploads-time-out-after-12-hours-can-no-longer-manually-adjust-timeout-to-be-longer

girish marked this topic as a question.
girish marked this topic as solved.