Cloudron Forum
    Tar Backups timing out on too large part number

Category: Support | Tags: backblaze, backups, feature-request
adrw (replying to @girish):

@girish Yes, pretty sure it was.
girish (Staff):

@adrw In that case, the real issue is https://git.cloudron.io/cloudron/box/-/blob/master/src/backups.js#L1235 . There is a 12-hour timeout for backup tasks. Can you bump it to something like 36?

For copy concurrency, there is already a slider for that. Don't you see it under Advanced? Note that where it is failing now, it has to copy the parts of a single file: the file is so big that it has to be split into many parts. I have to check whether parallel multi-part copy is allowed; if it is, it's easy to do.
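The task timeout being discussed can be sketched as a promise wrapper. This is a minimal illustration, not Cloudron's actual backups.js code; the `withTimeout` name and error message are invented for the example:

```javascript
// Hypothetical sketch of a task timeout: race the task's promise against a timer.
function withTimeout(promise, ms, label) {
    let timer;
    const timeout = new Promise((resolve, reject) => {
        timer = setTimeout(() => reject(new Error(`${label} timed out after ${ms}ms`)), ms);
    });
    // Whichever settles first wins; always clear the timer so the process can exit.
    return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage: wrap a long-running backup task with, say, a 36-hour limit.
// withTimeout(runBackupTask(), 36 * 60 * 60 * 1000, 'backup-task');
```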

girish (Staff):

@adrw It seems parallel multi-part copy will work, per https://dzone.com/articles/amazon-s3-parallel-multipart . Looks like a good change to make.
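A parallel multi-part copy along those lines might look like the sketch below. The `s3` client object, the bucket/key parameters, and the part-size choice are assumptions for illustration; the calls shown (`createMultipartUpload`, `uploadPartCopy`, `completeMultipartUpload`) follow the AWS SDK v2 style, which S3-compatible backends such as Backblaze B2's S3 API generally also accept.

```javascript
// Split an object of `size` bytes into inclusive [start, end] byte ranges of at
// most `partSize` bytes each, as required by UploadPartCopy's CopySourceRange.
function computePartRanges(size, partSize) {
    const ranges = [];
    for (let start = 0; start < size; start += partSize) {
        ranges.push({ start, end: Math.min(start + partSize, size) - 1 });
    }
    return ranges;
}

// Hypothetical sketch: copy all parts in parallel instead of one after another.
// `s3` is assumed to be an AWS SDK v2 style client (methods return .promise()).
async function parallelMultipartCopy(s3, bucket, fromKey, toKey, size, partSize) {
    const { UploadId } = await s3.createMultipartUpload({ Bucket: bucket, Key: toKey }).promise();
    const ranges = computePartRanges(size, partSize);
    const parts = await Promise.all(ranges.map(async ({ start, end }, i) => {
        const res = await s3.uploadPartCopy({
            Bucket: bucket,
            Key: toKey,
            UploadId,
            PartNumber: i + 1, // part numbers are 1-based
            CopySource: `${bucket}/${fromKey}`,
            CopySourceRange: `bytes=${start}-${end}`
        }).promise();
        return { PartNumber: i + 1, ETag: res.CopyPartResult.ETag };
    }));
    await s3.completeMultipartUpload({
        Bucket: bucket,
        Key: toKey,
        UploadId,
        MultipartUpload: { Parts: parts }
    }).promise();
}
```

Note that S3 caps multipart uploads at 10,000 parts, so the part size has to grow with the object size; a real implementation would also cap how many `uploadPartCopy` calls are in flight at once.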

adrw (replying to @girish):

@girish The only advanced setting I see for tar backups is the memory limit; none of the parallel copy or upload/download options that appear in rsync mode are shown.
adrw (replying to @girish):

@girish Thanks! That's what I was looking for; I'll give that a shot.
girish (Staff, replying to @adrw):

@adrw Ah, indeed. When doing a tar backup, there is only one file to upload and copy, so there is nothing to parallelize (apart from the multi-part copy, which I think can be hardcoded).
adrw (replying to @girish):

@girish Could the copy that happens at the end of the task (I think from the snapshot folder to the timestamped one) be done in parallel? It seems to be done serially right now, which contributes to the longer task time.
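Copying the snapshot's objects in parallel, as suggested, could be done with a small concurrency pool. This is a hypothetical sketch; the `mapWithConcurrency` helper and the `limit` parameter are illustrative, not part of Cloudron's code:

```javascript
// Hypothetical sketch: run `fn` over `items` with at most `limit` calls in
// flight, e.g. copying each snapshot object to the timestamped folder.
async function mapWithConcurrency(items, limit, fn) {
    const results = new Array(items.length);
    let next = 0;
    async function worker() {
        // Single-threaded JS: `next++` is safe because there is no await
        // between reading and incrementing it.
        while (next < items.length) {
            const i = next++;
            results[i] = await fn(items[i], i);
        }
    }
    const workers = Array.from({ length: Math.min(limit, items.length) }, () => worker());
    await Promise.all(workers);
    return results; // same order as `items`
}

// Usage: copy each object with at most 10 concurrent copies.
// await mapWithConcurrency(objectKeys, 10, (key) => copyObject(key));
```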

adrw (replying to @girish):

@girish Could this timeout be made configurable in a future release?
girish (Staff):

@adrw I have now made the timeout 24 hours. The timeout is really just there to kill "stuck" backups, which usually only happens when there is a bug in our code.
adrw (replying to @girish):

@girish Great! Thanks again for your help debugging this and adding more configuration. It's a huge help for larger backups like mine.