Cloudron Forum

Error 400 in backup process with Ionos S3 Object Storage

dsp76 wrote:

    Hi,
1.) We used Hetzner Storage Box before, and it was pretty stable. But that was a while ago. We needed to switch because we didn't want our backups to sit with the same provider as our server.

2.) The region of the IONOS S3 bucket is eu-central-3 (Berlin).
3.) Not sure what this means?
4.) Just now, not in the days before.

p44 (translator) wrote (#4):

    @dsp76

1. I meant the origin of the data: in which data center (EU, US, and so on) is the source stored, and is it far away from the IONOS S3 region (the destination)?

What I have in mind is that there could be some link issue between source and destination...

If I'm not wrong, the Hetzner Storage Box service does not have any S3 endpoint, just SFTP, WebDAV, and others.

So what I can suggest is to debug using an alternative S3 service. You could try Backblaze, IDrive e2, or Exoscale.

Can you also tell us more about the Cloudron instance's resources? How much RAM do you have? CPU? Is it a bare-metal server or a VPS?

I think all this information will help to better understand the issue.

nebulon (Staff) wrote (#5):

If we have a way to reliably reproduce the issue, we can see if there is a workaround for that S3 provider, but most likely it is some issue on the provider's end, given that it is not happening all the time. Sadly, S3 implementations do not always behave the same way across providers. We use the AWS S3 SDK for all requests, which is the standard.

dsp76 wrote (#6):

Hi there,
it's a virtual dedicated server at Hetzner, decently sized with 32 GB RAM and 8 cores.
When I feed the error message to an AI, it also points towards the provider: a sudden HTTP 400 while uploading...
The affected app is the largest one, our own Docker registry with about 40 GB.

        https://docs.ionos.com/cloud/storage-and-backup/ionos-object-storage/overview/limitations

Currently the upload runs in multipart parts of max. 512 MB. The backup process may use up to 10 GB of memory, which should be enough. For testing I reduced the part size to 128 MB. Let's see if that fixes it.
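For reference, this is roughly how a part size limit maps onto the aws-sdk v2 managed uploader; a minimal sketch under assumptions, not Cloudron's actual code. The endpoint, bucket, key, and file path are placeholders:

    // Sketch: multipart upload with an explicit part size via the
    // aws-sdk v2 managed uploader. All names and values are assumptions.
    const AWS = require('aws-sdk');
    const fs = require('fs');

    const s3 = new AWS.S3({
        endpoint: 'https://s3.eu-central-3.ionoscloud.com', // assumed IONOS endpoint
        accessKeyId: process.env.S3_ACCESS_KEY,
        secretAccessKey: process.env.S3_SECRET_KEY,
        signatureVersion: 'v4',
        s3ForcePathStyle: true
    });

    const upload = s3.upload({
        Bucket: 'my-backup-bucket',
        Key: 'snapshot/app_test.tar.gz',
        Body: fs.createReadStream('/tmp/app_test.tar.gz')
    }, {
        partSize: 128 * 1024 * 1024, // 128 MB parts (SDK default is 5 MB)
        queueSize: 4                 // parts uploaded in parallel
    });

    upload.on('httpUploadProgress', (p) => console.log('progress', p.loaded, p.part));
    upload.send((err, data) => {
        if (err) return console.error('upload failed:', err.statusCode, err.code);
        console.log('uploaded to', data.Location);
    });

Since the S3 API caps a multipart upload at 10,000 parts, 128 MB parts still comfortably cover a 40 GB archive (around 320 parts).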

I also opened a ticket with IONOS to find out what caused the error 400 at the given timestamp. I hope they find some helpful information in their logs.


dsp76 wrote (#7):

OK, after reducing the part size to 128 MB, the manual backup now finished successfully. Let's see if the regular nightly backup also works from now on.


joseph has marked this topic as solved
dsp76 wrote (#8):

Sorry, it still happens. Even reducing the chunk size to 128 MB didn't solve it. For whatever reason, the backup task crashed on Cloudron with an "internal error". See the last lines...

What else could be the reason? Can I help to identify the root cause?

            Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256] directory
            Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:34 box:tasks update 7013: {"percent":77.66666666666667,"message":"Uploading backup 4803M@15MBps ([REDACTED])"}
            Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
            Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
            Sep 05 03:34:39 box:storage/s3 Upload progress: {"loaded":4966055936,"part":38,"key":"cloudron_tz/snapshot/app_[UUID].tar.gz.enc"}
            Sep 05 03:34:39 /home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712
            Sep 05 03:34:39 resp.error = AWS.util.error(new Error(), {
            Sep 05 03:34:39 ^
            Sep 05 03:34:39
            Sep 05 03:34:39 400: null
            Sep 05 03:34:39 at Request.extractError (/home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712:35)
            Sep 05 03:34:39 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
            Sep 05 03:34:39 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
            Sep 05 03:34:39 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:686:14)
            Sep 05 03:34:39 at Request.transition (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:22:10)
            Sep 05 03:34:39 at AcceptorStateMachine.runTo (/home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:14:12)
            Sep 05 03:34:39 at /home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:26:10
            Sep 05 03:34:39 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:38:9)
            Sep 05 03:34:39 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:688:12)
            Sep 05 03:34:39 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:116:18) {
            Sep 05 03:34:39 code: 400,
            Sep 05 03:34:39 region: null,
            Sep 05 03:34:39 time: 2025-09-05T01:34:41.375Z,
            Sep 05 03:34:39 requestId: null,
            Sep 05 03:34:39 extendedRequestId: undefined,
            Sep 05 03:34:39 cfId: undefined,
            Sep 05 03:34:39 statusCode: 400,
            Sep 05 03:34:39 retryable: false,
            Sep 05 03:34:39 retryDelay: 20000
            Sep 05 03:34:39 }
            Sep 05 03:34:39
            Sep 05 03:34:39 Node.js v20.18.0
            Sep 05 03:34:41 box:shell backuptask: /usr/bin/sudo -S -E --close-from=4 /home/yellowtent/box/src/scripts/backupupload.js snapshot/app_[UUID] tgz {"localRoot":"/home/yellowtent/appsdata/[UUID]","layout":[]} errored BoxError: backuptask exited with code 1 signal null
            Sep 05 03:34:41 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
            Sep 05 03:34:41 at ChildProcess.emit (node:events:519:28)
            Sep 05 03:34:41 at ChildProcess.emit (node:domain:488:12)
            Sep 05 03:34:41 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
            Sep 05 03:34:41 reason: 'Shell Error',
            Sep 05 03:34:41 details: {},
            Sep 05 03:34:41 code: 1,
            Sep 05 03:34:41 signal: null
            Sep 05 03:34:41 }
            Sep 05 03:34:41 box:backuptask runBackupUpload: backuptask crashed BoxError: backuptask exited with code 1 signal null
            Sep 05 03:34:41 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
            Sep 05 03:34:41 at ChildProcess.emit (node:events:519:28)
            Sep 05 03:34:41 at ChildProcess.emit (node:domain:488:12)
            Sep 05 03:34:41 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
            Sep 05 03:34:41 reason: 'Shell Error',
            Sep 05 03:34:41 details: {},
            Sep 05 03:34:41 code: 1,
            Sep 05 03:34:41 signal: null
            Sep 05 03:34:41 }
            Sep 05 03:34:41 box:backuptask fullBackup: app [REDACTED] backup finished. Took 331.629 seconds
            Sep 05 03:34:41 box:locks write: current locks: {"backup_task":null}
            Sep 05 03:34:41 box:locks release: app_[UUID]
            Sep 05 03:34:41 box:taskworker Task took 2080.827 seconds
            Sep 05 03:34:41 box:tasks setCompleted - 7013: {"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
            Sep 05 03:34:41 box:tasks update 7013: {"percent":100,"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
            Sep 05 03:34:41 BoxError: Backuptask crashed
            Sep 05 03:34:41 at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)
            Sep 05 03:34:41 at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
            Sep 05 03:34:41 at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)
            Sep 05 03:34:41 at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)
            


p44 (translator) wrote (#9):

@dsp76 did you try debugging with another S3 storage provider?

dsp76 wrote (#10):

                I don't have others...


dsp76 wrote (#11):

The logging doesn't show that it's related to the external system. It looks more like an error in Cloudron's internal backup task?
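One way to tell the two apart (a debugging sketch under assumptions, not an official Cloudron procedure): run the same kind of multipart upload against the same IONOS bucket from the server, but outside of Cloudron, in a loop. If a bare 400 eventually shows up here too, the provider end is the likely culprit; if it never does, the box task becomes more suspect. Endpoint, bucket, and test file are placeholders:

    // Repro sketch: repeatedly upload one large file with aws-sdk v2 and
    // log any failure, to separate provider-side 400s from Cloudron itself.
    const AWS = require('aws-sdk');
    const fs = require('fs');

    const s3 = new AWS.S3({
        endpoint: process.env.S3_ENDPOINT, // e.g. the IONOS eu-central-3 endpoint
        accessKeyId: process.env.S3_ACCESS_KEY,
        secretAccessKey: process.env.S3_SECRET_KEY,
        signatureVersion: 'v4',
        s3ForcePathStyle: true
    });

    async function run() {
        for (let i = 0; i < 10; i++) {
            try {
                await s3.upload({
                    Bucket: process.env.S3_BUCKET,
                    Key: `repro/test-${i}.bin`,
                    Body: fs.createReadStream('/tmp/large-test-file.bin') // several GB, like the registry backup
                }, { partSize: 128 * 1024 * 1024, queueSize: 4 }).promise();
                console.log(`run ${i}: ok`);
            } catch (err) {
                console.error(`run ${i}: failed`, err.statusCode, err.code);
            }
        }
    }

    run();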


dsp76 wrote (#12):

It's still happening, every day with a different app in the backup run, and at different times.
@nebulon could we please remove the "solved" tag? I'm happy to help with debugging if I know what to look for.
I'm now also looking into timing and potential overlaps with other tasks.


dsp76 wrote (#13):

I moved the backup to another time and made sure no update processes overlap. Still:

"Backuptask crashed\n at runBackupUpload"

As mentioned before, there was no error at the given time in the IONOS S3 logs.

The root cause may be this:

                      Sep 10 04:00:28 box:shell backuptask: /usr/bin/sudo -S -E --close-from=4 /home/yellowtent/box/src/scripts/backupupload.js snapshot/app_UUID tgz {"localRoot":"/home/yellowtent/appsdata/UUID","layout":[]} errored BoxError: backuptask exited with code 1 signal null
                      Sep 10 04:00:28 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
                      Sep 10 04:00:28 at ChildProcess.emit (node:events:519:28)
                      Sep 10 04:00:28 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
                      Sep 10 04:00:28 reason: 'Shell Error',
                      Sep 10 04:00:28 details: {},
                      Sep 10 04:00:28 code: 1,
                      Sep 10 04:00:28 signal: null
                      

                      System log:

                      {
                        "taskId": "7060",
                        "errorMessage": "Backuptask crashed",
                        "timedOut": false,
                        "backupId": null
                      }
                      

The problem: the system backup log still lists the full number of apps that were supposed to be backed up, but checking them shows they haven't been backed up. So it looks like, instead of a single app's task crashing, the complete backup task crashed and the whole run ended in the same minute it started, @nebulon ...
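To check which app archives actually reached the bucket, listing the snapshot prefix with the same SDK works; a sketch under assumptions, where bucket and prefix are placeholders (the log above suggests keys like cloudron_tz/snapshot/app_<UUID>.tar.gz.enc):

    // Sketch: list uploaded snapshot objects to see which apps actually landed.
    const AWS = require('aws-sdk');

    const s3 = new AWS.S3({
        endpoint: process.env.S3_ENDPOINT,
        accessKeyId: process.env.S3_ACCESS_KEY,
        secretAccessKey: process.env.S3_SECRET_KEY,
        signatureVersion: 'v4',
        s3ForcePathStyle: true
    });

    s3.listObjectsV2({ Bucket: process.env.S3_BUCKET, Prefix: 'cloudron_tz/snapshot/' }, (err, data) => {
        if (err) return console.error('list failed:', err.statusCode, err.code);
        for (const obj of data.Contents) console.log(obj.Key, obj.Size, obj.LastModified);
    });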


joseph has marked this topic as unsolved
joseph (Staff) wrote (#14):

@dsp76 I have marked this as unsolved. Is there nothing else in the task/backup logs, i.e. right above the crash messages?
