Error 400 in backup process with Ionos S3 Object Storage
-
Hi there,
we back up to Ionos S3 Storage. The backup task crashed with the following error:
Aug 29 03:04:58 /home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712
Aug 29 03:04:58 resp.error = AWS.util.error(new Error(), {
Aug 29 03:04:58 ^
Aug 29 03:04:58
Aug 29 03:04:58 400: null
Aug 29 03:04:58 at Request.extractError (/home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712:35)
Aug 29 03:04:58 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
Aug 29 03:04:58 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
Aug 29 03:04:58 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:686:14)
Aug 29 03:04:58 at Request.transition (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:22:10)
Aug 29 03:04:58 at AcceptorStateMachine.runTo (/home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:14:12)
Aug 29 03:04:58 at /home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:26:10
Aug 29 03:04:58 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:38:9)
Aug 29 03:04:58 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:688:12)
Aug 29 03:04:58 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:116:18) {
Aug 29 03:04:58 code: 400,
Aug 29 03:04:58 region: null,
Aug 29 03:04:58 time: 2025-08-29T01:04:58.491Z,
Aug 29 03:04:58 requestId: null,
Aug 29 03:04:58 extendedRequestId: undefined,
Aug 29 03:04:58 cfId: undefined,
Aug 29 03:04:58 statusCode: 400,
Aug 29 03:04:58 retryable: false,
Aug 29 03:04:58 retryDelay: 20000
Aug 29 03:04:58 }
Aug 29 03:04:58
Aug 29 03:04:58 Node.js v20.18.0
Aug 29 03:04:58 box:shell backuptask: /usr/bin/sudo -S -E --close-from=4 /home/yellowtent/box/src/scripts/backupupload.js snapshot/app_2315967d-42f4-4e64-9935-f62c3e6e858e tgz {"localRoot":"/home/yellowtent/appsdata/2315967d-42f4-4e64-9935-f62c3e6e858e","layout":[]} errored BoxError: backuptask exited with code 1 signal null
Aug 29 03:04:58 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
Aug 29 03:04:58 at ChildProcess.emit (node:events:519:28)
Aug 29 03:04:58 at ChildProcess.emit (node:domain:488:12)
Aug 29 03:04:58 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
Aug 29 03:04:58 reason: 'Shell Error',
Aug 29 03:04:58 details: {},
Aug 29 03:04:58 code: 1,
Aug 29 03:04:58 signal: null
Aug 29 03:04:58 }
Aug 29 03:04:58 box:backuptask runBackupUpload: backuptask crashed BoxError: backuptask exited with code 1 signal null
Aug 29 03:04:58 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
Aug 29 03:04:58 at ChildProcess.emit (node:events:519:28)
Aug 29 03:04:58 at ChildProcess.emit (node:domain:488:12)
Aug 29 03:04:58 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
Aug 29 03:04:58 reason: 'Shell Error',
Aug 29 03:04:58 details: {},
Aug 29 03:04:58 code: 1,
Aug 29 03:04:58 signal: null
Aug 29 03:04:58 }
Aug 29 03:04:58 box:backuptask fullBackup: app www.REDACTED.com backup finished. Took 131.683 seconds
Aug 29 03:04:58 box:locks write: current locks: {"backup_task":null}
Aug 29 03:04:58 box:locks release: app_2315967d-42f4-4e64-9935-f62c3e6e858e
Aug 29 03:04:58 box:taskworker Task took 298.068 seconds
Aug 29 03:04:58 box:tasks setCompleted - 6937: {"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
Aug 29 03:04:58 box:tasks update 6937: {"percent":100,"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
Aug 29 03:04:58 BoxError: Backuptask crashed
Aug 29 03:04:58 at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)
Aug 29 03:04:58 at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
Aug 29 03:04:58 at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)
Aug 29 03:04:58 at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)
Is there a reason why it failed?
It doesn't always crash...
-
Hello @dsp76
- Have you tested with alternative storage providers or alternative regions to isolate whether this is specific to Ionos S3 or to that zone? (See the sketch after these questions.)
- Which region are you using for your S3 bucket?
- Where is the source data stored?
- How frequently does this issue occur?
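To isolate this outside of Cloudron, a standalone upload probe against the same bucket can help. Below is a minimal sketch using the same aws-sdk v2 that appears in the stack trace; the endpoint, bucket, credentials and part size are placeholders you would fill in from your own backup configuration, and this is not Cloudron's actual upload code.

```js
// probe-upload.js - standalone multipart upload test against an S3-compatible endpoint (aws-sdk v2).
// All values read from the environment are assumptions/placeholders for your own setup.
const AWS = require('aws-sdk');

const s3 = new AWS.S3({
    endpoint: process.env.S3_ENDPOINT,          // e.g. the IONOS endpoint configured in Cloudron
    region: process.env.S3_REGION || 'eu-central-3',
    accessKeyId: process.env.S3_ACCESS_KEY,
    secretAccessKey: process.env.S3_SECRET_KEY,
    signatureVersion: 'v4',
    s3ForcePathStyle: true                      // many S3-compatible providers expect path-style URLs
});

async function main() {
    // 256 MiB of zeros is enough to force a multipart upload with 64 MiB parts
    const body = Buffer.alloc(256 * 1024 * 1024);
    const upload = s3.upload(
        { Bucket: process.env.S3_BUCKET, Key: 'cloudron-debug/probe.bin', Body: body },
        { partSize: 64 * 1024 * 1024, queueSize: 3 }
    );
    upload.on('httpUploadProgress', (p) => console.log('progress', p.loaded, 'part', p.part));
    try {
        const res = await upload.promise();
        console.log('upload ok:', res.Key);
    } catch (err) {
        // These are the same fields that show up as "400: null" in the Cloudron log
        console.error('upload failed:', err.statusCode, err.code, err.message);
    }
}

main();
```

If such a probe fails with a bare 400 against the IONOS endpoint but succeeds against another provider, that would point at the provider rather than at the backup task.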
-
Hi,
1.) We used Hetzner Storage Box before, and it was pretty stable. But that's a while ago. We needed to switch, as we didn't want our backup location to be at the same provider as our server.
2.) The region of the IONOS S3 bucket is eu-central-3 (Berlin).
3.) I'm not sure what this means?
4.) Just now, not on the days before.
-
I meant the origin of the data: in which data center (EU, US, and so on) is the source stored, and is it far away from the Ionos S3 region (the destination)?
What I have in mind is that it could be a link issue between source and destination...
If I'm not wrong, the Hetzner Storage Box service does not have any S3 endpoint, just SFTP, WebDAV, and a few others.
So what I can suggest is to debug using an alternative S3 service. You could try Backblaze, iDrive e2, or Exoscale.
Can you also tell us more about the Cloudron instance resources? How much RAM do you have? CPU? Is it a bare-metal server or a VPS?
I think all this information would help to better understand the issue.
-
If we have a way to reliably reproduce the issue, we can see if there is a workaround for that S3 provider, but if this is not happening all the time, it is most likely some issue on the provider's end. Sadly, the S3 implementations of different providers do not always behave the same way. We use the AWS S3 SDK for all requests, which is the standard.
-
Hi there,
it's a Virtual Dedicated Server at Hetzner, decently sized with 32 GB RAM and 8 cores.
When I drop the error message into an AI, it also points towards the provider. Suddenly an HTTP 400 while uploading...
The affected app ID is the largest one - our own Docker registry with about 40 GB. IONOS documents its object storage limitations here: https://docs.ionos.com/cloud/storage-and-backup/ionos-object-storage/overview/limitations
Currently I upload in multipart parts of max. 512 MB. The backup process may use up to 10 GB of memory, which should be enough. For testing, I reduced the part size to 128 MB. Let's see if that fixes it.
I also opened a ticket with IONOS to find out what caused the error 400 at the given timestamp. I hope they find some helpful information in their logs.
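For what it's worth, the part-size arithmetic alone looks harmless if IONOS follows the usual S3 multipart limits (5 MiB minimum part size, at most 10,000 parts per upload); whether IONOS uses exactly those numbers is an assumption, the linked limitations page is the authoritative source. A quick sanity check with the sizes mentioned above (the 40 GB figure is approximate):

```js
// Rough part-count check for a multipart upload (illustrative numbers, not measured values).
function partCount(objectBytes, partBytes) {
    return Math.ceil(objectBytes / partBytes);
}

const MiB = 1024 * 1024;
const GiB = 1024 * MiB;

console.log(partCount(40 * GiB, 512 * MiB)); // 80 parts for a ~40 GB archive at 512 MB parts
console.log(partCount(40 * GiB, 128 * MiB)); // 320 parts at 128 MB parts
// Both are far below the usual 10,000-part limit of S3-compatible services,
// so the part count by itself is an unlikely cause for the 400 response.
```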
-
Sorry, it still happens. Even reducing the chunk size to 128 MB didn't solve it. For whatever reason, the backup task crashed on Cloudron due to an "internal error". See the last lines...
What else could be the reason? Can I help to identify the root cause?
Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256] directory
Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:32 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:32 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:34 box:tasks update 7013: {"percent":77.66666666666667,"message":"Uploading backup 4803M@15MBps ([REDACTED])"}
Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:38 box:backupformat/tgz tarPack: processing /home/yellowtent/appsdata/[UUID]/data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]
Sep 05 03:34:38 box:backupformat/tgz addToPack: added ./data/storage/docker/registry/v2/blobs/sha256/[XX]/[SHA256]/data file
Sep 05 03:34:39 box:storage/s3 Upload progress: {"loaded":4966055936,"part":38,"key":"cloudron_tz/snapshot/app_[UUID].tar.gz.enc"}
Sep 05 03:34:39 /home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712
Sep 05 03:34:39 resp.error = AWS.util.error(new Error(), {
Sep 05 03:34:39 ^
Sep 05 03:34:39
Sep 05 03:34:39 400: null
Sep 05 03:34:39 at Request.extractError (/home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712:35)
Sep 05 03:34:39 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
Sep 05 03:34:39 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
Sep 05 03:34:39 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:686:14)
Sep 05 03:34:39 at Request.transition (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:22:10)
Sep 05 03:34:39 at AcceptorStateMachine.runTo (/home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:14:12)
Sep 05 03:34:39 at /home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:26:10
Sep 05 03:34:39 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:38:9)
Sep 05 03:34:39 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:688:12)
Sep 05 03:34:39 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:116:18) {
Sep 05 03:34:39 code: 400,
Sep 05 03:34:39 region: null,
Sep 05 03:34:39 time: 2025-09-05T01:34:41.375Z,
Sep 05 03:34:39 requestId: null,
Sep 05 03:34:39 extendedRequestId: undefined,
Sep 05 03:34:39 cfId: undefined,
Sep 05 03:34:39 statusCode: 400,
Sep 05 03:34:39 retryable: false,
Sep 05 03:34:39 retryDelay: 20000
Sep 05 03:34:39 }
Sep 05 03:34:39
Sep 05 03:34:39 Node.js v20.18.0
Sep 05 03:34:41 box:shell backuptask: /usr/bin/sudo -S -E --close-from=4 /home/yellowtent/box/src/scripts/backupupload.js snapshot/app_[UUID] tgz {"localRoot":"/home/yellowtent/appsdata/[UUID]","layout":[]} errored BoxError: backuptask exited with code 1 signal null
Sep 05 03:34:41 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
Sep 05 03:34:41 at ChildProcess.emit (node:events:519:28)
Sep 05 03:34:41 at ChildProcess.emit (node:domain:488:12)
Sep 05 03:34:41 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
Sep 05 03:34:41 reason: 'Shell Error',
Sep 05 03:34:41 details: {},
Sep 05 03:34:41 code: 1,
Sep 05 03:34:41 signal: null
Sep 05 03:34:41 }
Sep 05 03:34:41 box:backuptask runBackupUpload: backuptask crashed BoxError: backuptask exited with code 1 signal null
Sep 05 03:34:41 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
Sep 05 03:34:41 at ChildProcess.emit (node:events:519:28)
Sep 05 03:34:41 at ChildProcess.emit (node:domain:488:12)
Sep 05 03:34:41 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
Sep 05 03:34:41 reason: 'Shell Error',
Sep 05 03:34:41 details: {},
Sep 05 03:34:41 code: 1,
Sep 05 03:34:41 signal: null
Sep 05 03:34:41 }
Sep 05 03:34:41 box:backuptask fullBackup: app [REDACTED] backup finished. Took 331.629 seconds
Sep 05 03:34:41 box:locks write: current locks: {"backup_task":null}
Sep 05 03:34:41 box:locks release: app_[UUID]
Sep 05 03:34:41 box:taskworker Task took 2080.827 seconds
Sep 05 03:34:41 box:tasks setCompleted - 7013: {"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
Sep 05 03:34:41 box:tasks update 7013: {"percent":100,"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
Sep 05 03:34:41 BoxError: Backuptask crashed
Sep 05 03:34:41 at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)
Sep 05 03:34:41 at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
Sep 05 03:34:41 at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)
Sep 05 03:34:41 at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)
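One thing that might help identify the root cause: the "400: null" in the stack trace suggests the SDK received a 400 response whose body it could not turn into a proper S3 error (no error code, no request id), so the provider's actual error message is lost. Capturing the raw response of a failing request would show what IONOS really returns. Below is a rough sketch for a standalone probe script, assuming placeholder environment variables for endpoint, bucket and credentials; it is not a patch for Cloudron itself.

```js
// Sketch: make aws-sdk v2 log every request and dump the raw body of a failing response.
const AWS = require('aws-sdk');

AWS.config.logger = console;                  // one log line per request with status code and timing

const s3 = new AWS.S3({
    endpoint: process.env.S3_ENDPOINT,        // placeholder: the IONOS endpoint used for backups
    region: process.env.S3_REGION || 'eu-central-3',
    accessKeyId: process.env.S3_ACCESS_KEY,
    secretAccessKey: process.env.S3_SECRET_KEY,
    s3ForcePathStyle: true
});

const req = s3.putObject({ Bucket: process.env.S3_BUCKET, Key: 'cloudron-debug/probe.txt', Body: 'hello' });
req.on('complete', (resp) => {
    if (resp.error) {
        // When the body is empty or not XML, the SDK reports "400: null" like in the log above;
        // the raw body (if any) may still contain the provider's actual error message.
        console.error('status :', resp.httpResponse.statusCode);
        console.error('headers:', resp.httpResponse.headers);
        console.error('body   :', resp.httpResponse.body && resp.httpResponse.body.toString());
    }
});
req.send();
```

Running something like this around the time the backups fail might reveal whether IONOS returns an empty body, an HTML error page, or an XML error the SDK cannot parse.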
-
It's still happening. Every day a different app fails during the backup process, at different times.
@nebulon could we please remove the "solved" tag? I'm happy to help with debugging, if I know what to look for.
I'm now also looking into timing and potential overlaps with other tasks.
-
I moved it to another time and made sure no update processes are overlapping. Still:
"Backuptask crashed\n at runBackupUpload"
As mentioned before - there was no error in the log at Ionos S3.
The root cause may be this:
Sep 10 04:00:28 box:shell backuptask: /usr/bin/sudo -S -E --close-from=4 /home/yellowtent/box/src/scripts/backupupload.js snapshot/app_UUID tgz {"localRoot":"/home/yellowtent/appsdata/UUID","layout":[]} errored BoxError: backuptask exited with code 1 signal null
Sep 10 04:00:28 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
Sep 10 04:00:28 at ChildProcess.emit (node:events:519:28)
Sep 10 04:00:28 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
Sep 10 04:00:28 reason: 'Shell Error',
Sep 10 04:00:28 details: {},
Sep 10 04:00:28 code: 1,
Sep 10 04:00:28 signal: null
System log:
{ "taskId": "7060", "errorMessage": "Backuptask crashed", "timedOut": false, "backupId": null }
The problem: the system backup log still shows the complete number of apps that were supposed to be backed up. Checking them shows they haven't been backed up. It looks like, instead of only one app's task crashing, the complete backup task crashed and ended the whole operation in the same minute it started, @nebulon...
-
Hi @joseph - the complete log is in #8 https://forum.cloudron.io/topic/14253/error-400-in-backup-process-with-ionos-s3-object-storage/8?_=1757661856982
This is today's error log of the backup task:
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg3.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-1024x335.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-1200x393.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-150x150.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-1536x502.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-177x142.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-200x65.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-300x214.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-300x98.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-320x202.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-400x131.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-460x295.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-540x272.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-600x196.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-669x272.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-66x66.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-700x441.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-768x251.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-800x262.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4-940x400.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg4.jpg file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-1024x342.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-1200x401.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-150x150.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-1536x513.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-177x142.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-200x67.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-2048x684.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-300x100.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-300x214.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-320x202.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-400x134.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-460x295.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-540x272.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-600x200.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-669x272.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-66x66.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-700x441.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-768x256.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-800x267.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-940x400.png file
Sep 12 04:17:21 box:backupformat/tgz addToPack: added ./data/public/wp-content/uploads/2024/03/testbg5-scaled.jpg file
Sep 12 04:17:21 /home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712
Sep 12 04:17:21 resp.error = AWS.util.error(new Error(), {
Sep 12 04:17:21 ^
Sep 12 04:17:21
Sep 12 04:17:21 400: null
Sep 12 04:17:21 at Request.extractError (/home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712:35)
Sep 12 04:17:21 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
Sep 12 04:17:21 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
Sep 12 04:17:21 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:686:14)
Sep 12 04:17:21 at Request.transition (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:22:10)
Sep 12 04:17:21 at AcceptorStateMachine.runTo (/home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:14:12)
Sep 12 04:17:21 at /home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:26:10
Sep 12 04:17:21 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:38:9)
Sep 12 04:17:21 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:688:12)
Sep 12 04:17:21 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:116:18) {
Sep 12 04:17:21 code: 400,
Sep 12 04:17:21 region: null,
Sep 12 04:17:21 time: 2025-09-12T02:17:21.579Z,
Sep 12 04:17:21 requestId: null,
Sep 12 04:17:21 extendedRequestId: undefined,
Sep 12 04:17:21 cfId: undefined,
Sep 12 04:17:21 statusCode: 400,
Sep 12 04:17:21 retryable: false,
Sep 12 04:17:21 retryDelay: 20000
Sep 12 04:17:21 }
Sep 12 04:17:21
Sep 12 04:17:21 Node.js v20.18.0
Sep 12 04:17:21 box:shell backuptask: /usr/bin/sudo -S -E --close-from=4 /home/yellowtent/box/src/scripts/backupupload.js snapshot/app_{UUID} tgz {"localRoot":"/home/yellowtent/appsdata/{UUID}","layout":[]} errored BoxError: backuptask exited with code 1 signal null
Sep 12 04:17:21 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
Sep 12 04:17:21 at ChildProcess.emit (node:events:519:28)
Sep 12 04:17:21 at ChildProcess.emit (node:domain:488:12)
Sep 12 04:17:21 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
Sep 12 04:17:21 reason: 'Shell Error',
Sep 12 04:17:21 details: {},
Sep 12 04:17:21 code: 1,
Sep 12 04:17:21 signal: null
Sep 12 04:17:21 }
Sep 12 04:17:21 box:backuptask runBackupUpload: backuptask crashed BoxError: backuptask exited with code 1 signal null
Sep 12 04:17:21 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
Sep 12 04:17:21 at ChildProcess.emit (node:events:519:28)
Sep 12 04:17:21 at ChildProcess.emit (node:domain:488:12)
Sep 12 04:17:21 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
Sep 12 04:17:21 reason: 'Shell Error',
Sep 12 04:17:21 details: {},
Sep 12 04:17:21 code: 1,
Sep 12 04:17:21 signal: null
Sep 12 04:17:21 }
Sep 12 04:17:21 box:backuptask fullBackup: app {DOMAIN} backup finished. Took 364.294 seconds
Sep 12 04:17:21 box:locks write: current locks: {"backup_task":null}
Sep 12 04:17:21 box:locks release: app_{UUID}
Sep 12 04:17:21 box:taskworker Task took 1041.04 seconds
Sep 12 04:17:21 box:tasks setCompleted - 7083: {"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
Sep 12 04:17:21 box:tasks update 7083: {"percent":100,"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
Sep 12 04:17:21 BoxError: Backuptask crashed
Sep 12 04:17:21 at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)
Sep 12 04:17:21 at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
Sep 12 04:17:21 at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)
Sep 12 04:17:21 at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)
-
Switching back to SSHFS on the Hetzner Storage Box seems to run the backup reliably.
So what's wrong with S3 in combination with Ionos then? It looks like the process dies in Cloudron, but the error log for the task itself doesn't seem to help. Ionos found no issues in their logs.
-
Hello @dsp76
Good to read that you found a solution for you that works for now.
Regarding the issues with IONOS S3, we will have to look into it. I just read the topic Backup to Hetzner Object Storage failing regularly, and now I'm reading that you have no issues with Hetzner SSHFS.
It seems the experiences between users are worlds apart. Maybe some insights about:
- amount of apps
- what apps
- tgz or rsync
- backup size
- schedule
- full encryption
- file name encryption
from you @dsp76, @p44 and @ccfu might reveal some common issues?
-
In this case, it seems that something is interfering with the backup scripts... What I would like to test, if possible, is to stop all the apps and retry with a "lighter" backup to see what happens.
I suspect that the problem is on Hetzner's side, but, of course, more research is needed to prove this.
Because of a lack of time, when I experienced issues (e.g. unable to restore a backup, various errors), I moved to another ISP and that solved all the problems.