Cloudron Forum


Backup Failures on Large Installations – MinIO Multipart-Upload Limit

CBCUN wrote (#1, last edited by CBCUN):

      Hello everyone,

      I ran into an issue with a large Cloudron/Box installation where the backup aborts with this log entry:

      … Upload progress: {"loaded":104878571520,"part":10002,…}
      BoxError: backuptask exited with code 1
      reason: "Internal Error"
      

      Investigation
      • Both AWS S3 and MinIO enforce a maximum of 10,000 parts per multipart upload.
      • Default part size in Cloudron/Box is 10 MB → max transferable per upload ≈ 100 GB (10 000 × 10 MB).
      • In my case part number 10 002 was attempted → limit exceeded → upload aborted.
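      The arithmetic in these bullets can be checked with a short sketch. Assumption: the 10 MB default part size is actually 10 MiB (10 × 1024 × 1024 bytes), which makes the "loaded" byte count in the log come out to exactly 10,002 parts:

      ```python
      import math

      S3_MAX_PARTS = 10_000  # part limit enforced by both AWS S3 and MinIO

      def parts_needed(total_bytes: int, part_size_bytes: int) -> int:
          """How many multipart-upload parts an object of total_bytes requires."""
          return math.ceil(total_bytes / part_size_bytes)

      loaded = 104_878_571_520       # "loaded" value from the failing backup log
      part_size = 10 * 1024 * 1024   # default 10 MiB part size

      print(parts_needed(loaded, part_size))                 # 10002
      print(parts_needed(loaded, part_size) > S3_MAX_PARTS)  # True: limit exceeded
      print(parts_needed(loaded, 50 * 1024 * 1024))          # 2001: well under the limit
      ```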

      Part Size vs. Maximum Backup Size

      Part Size   Max Parts   Max Data Volume
      10 MB       10 000      100 GB
      50 MB       10 000      500 GB
      100 MB      10 000      1 000 GB
      150 MB      10 000      1 500 GB

      Solution: Increase Part Size
      Raising the part size (e.g. to 50 MB) reduces the number of parts per upload and allows backups of up to ~500 GB without hitting the 10,000-part limit.
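      As a sanity check, the part-size table above can be reproduced in a couple of lines (using decimal MB/GB, as the table does):

      ```python
      S3_MAX_PARTS = 10_000  # multipart-upload part limit (AWS S3 and MinIO)

      def max_backup_gb(part_size_mb: int) -> int:
          """Largest backup (decimal GB) that fits into 10,000 parts at this part size."""
          return part_size_mb * S3_MAX_PARTS // 1000

      for mb in (10, 50, 100, 150):
          print(f"{mb} MB parts -> up to {max_backup_gb(mb)} GB")
      # 10 MB parts -> up to 100 GB
      # 50 MB parts -> up to 500 GB
      # 100 MB parts -> up to 1000 GB
      # 150 MB parts -> up to 1500 GB
      ```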

      Sources
      • AWS S3 Multipart-Upload Overview: https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html
      • MinIO Limits (10 000 parts, part size 5 MiB–5 TiB): https://github.com/minio/minio/blob/master/docs/minio-limits.md

      Greetings
      Christian


BrutalBirdie (Partner) wrote (#2):

        @CBCUN said in Backup Failures on Large Installations – MinIO Multipart-Upload Limit:

        Run in your Cloudron/Box environment:

        export CLOUDRON_BACKUP_S3_PART_SIZE=50mb

        Could you follow up on that with a bit more explanation?

        "Run in your Cloudron/Box environment" — so SSH into the server and, as root, just run export CLOUDRON_BACKUP_S3_PART_SIZE=50mb?
        This sounds wrong 🤔
        So could you please provide a more in-depth guide? That would be awesome!

        Like my work? Consider donating a drink. Cheers!

joseph (Staff) wrote (#3):

           If you want to increase the part size, it's in Backups -> Configure -> Advanced. The suggested solution does not seem correct, as @BrutalBirdie pointed out.

CBCUN wrote (#4):

             Yes, that solved it. I'm posting this just for others who run into this kind of problem.
