Cloudron Forum

Backup fails

Support · 8 Posts · 2 Posters · 659 Views
charlesnw #1
    /home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712
    resp.error = AWS.util.error(new Error(), {
    ^

    Part number must be in the range 1 - 10000

charlesnw #2

      https://github.com/s3tools/s3cmd/issues/613

      I can provide the full log if needed.

nebulon (Staff) #3

This looks like a long-fixed issue. Which S3 provider are you using?

For the moment, you could try setting a higher part size in the advanced backup settings, in case something in your setup produces a part number above 10000. Maybe that helps.

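The 10,000-part cap mentioned in the error is a hard S3 multipart-upload limit, so the part size has to grow with the total backup size. As a rough sketch (not Cloudron's actual code) of why raising the part size helps, the smallest workable part size can be computed like this:

```python
# Sketch: smallest multipart "part size" that keeps an upload of a given
# total size within S3's 10,000-part limit. The 10,000-part cap and the
# 5 MiB minimum part size are S3 multipart-upload limits; the 600 GiB
# backup size below is just a made-up example.

MAX_PARTS = 10_000
MIN_PART_SIZE = 5 * 1024 * 1024  # 5 MiB, S3 minimum for non-final parts

def min_part_size(total_bytes: int) -> int:
    """Smallest part size (bytes) so the part count stays <= MAX_PARTS."""
    ceil_div = -(-total_bytes // MAX_PARTS)  # ceiling division
    return max(ceil_div, MIN_PART_SIZE)

# Example: a 600 GiB backup needs parts of at least ~61.4 MiB.
size = 600 * 1024**3
print(round(min_part_size(size) / 1024**2, 1))  # part size in MiB → 61.4
```

So a backup larger than roughly 48 GiB (10,000 × 5 MiB) already exceeds the default minimum part size, which matches the advice to raise the part size in the advanced backup settings.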
charlesnw #4

          @nebulon I am using Backblaze

charlesnw #5

            I tried to adjust the part size. I now get:

            Task 3427 crashed with code 1

Looking at the logs, there is no trace of a crash.

nebulon (Staff) #6

It is strange not to see any errors. Maybe also increase the backup memory limit, to make sure it doesn't run out of memory. We have also heard that Backblaze introduced rate limits, so I'm not sure how viable it is in the long run. Sadly, it might be worth looking into other backup storage.

charlesnw #7

@nebulon I have bumped up the memory as well. My Nextcloud has gotten quite large and I hadn't emptied the recycle bin. I'm now trying a backup after deleting those files permanently.

                Any recommended backup host?

nebulon (Staff) #8

                  Especially if you have a lot of files, we have had good experience with Hetzner Storage boxes using the rsync SSH backend on Cloudron.

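For reference, Cloudron's rsync backend drives the transfer itself once configured, but a Hetzner Storage Box is reachable with plain rsync over SSH. A minimal sketch, where the `u123456` account name, the local path, and the remote directory are all placeholders for your own setup:

```shell
# Sketch only: u123456 is a placeholder Storage Box account.
# Hetzner Storage Boxes expose SSH on port 23 for rsync/scp access.
rsync -az --delete \
    -e "ssh -p 23" \
    /path/to/backups/ \
    u123456@u123456.your-storagebox.de:cloudron-backups/
```

Unlike S3 multipart uploads, rsync transfers files individually, so there is no part-number limit to hit with large backups.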