Error 400 in backup process with Ionos S3 Object Storage
-
Hello @dsp76
Good to read that you found a solution that works for you for now.
Regarding the issues with IONOS S3, we will have to look into it. I just read the topic "Backup to Hetzner Object Storage failing regularly", and now I am reading that you have no issues with Hetzner via SSHFS.
It seems the experiences between users are worlds apart. Maybe some insights about:
- amount of apps
- what apps
- tgz or rsync
- backup size
- schedule
- full encryption
- file name encryption
from you @dsp76, @p44 and @ccfu might reveal some common issues?
Hi @james,
Hetzner Object Storage ≠ Hetzner Storage Box (that's what I was using as an alternative backup target). I didn't try the Object Storage. @halima632 that was a good hint! I just switched to the IONOS (Profitbricks) entry and it seems to run fine. I didn't know it existed and had assumed the generic "S3 compatible" option would be fine.

-
dsp76 has marked this topic as solved
-
Unfortunately it fails even with the correct IONOS settings on my Hetzner-based Cloudron.

It started crashing again every night about a week ago. This is the anonymised log:
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/autoptimize_snippet_********.js file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/autoptimize/js/index.html file
Oct 15 04:12:36 box:backupformat/tgz tarPack: processing /home/***/appsdata/********/data/public/wp-content/cache/borlabs-cookie
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/borlabs-cookie/1 directory
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/borlabs-cookie/borlabs-cookie_1_de.css file
Oct 15 04:12:36 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/borlabs-cookie/borlabs-cookie_1_en.css file
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/borlabs-cookie/geo-ip-database-********.mmdb file
Oct 15 04:12:37 box:backupformat/tgz tarPack: processing /home/***/appsdata/********/data/public/wp-content/cache/borlabs-cookie/1
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/borlabs-cookie/1/borlabs-cookie-1-de.css file
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/borlabs-cookie/1/borlabs-cookie-1-en.css file
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/borlabs-cookie/1/borlabs-cookie-config-de.json.js file
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/cache/borlabs-cookie/1/borlabs-cookie-config-en.json.js file
Oct 15 04:12:37 box:backupformat/tgz tarPack: processing /home/***/appsdata/********/data/public/wp-content/cache/matomo
Oct 15 04:12:37 box:backupformat/tgz tarPack: processing /home/***/appsdata/********/data/public/wp-content/infinitewp
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/infinitewp/backups directory
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/infinitewp/temp directory
Oct 15 04:12:37 box:backupformat/tgz tarPack: processing /home/***/appsdata/********/data/public/wp-content/infinitewp/backups
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/infinitewp/backups/DE_clCPUUsage.****.txt file
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/infinitewp/backups/DE_clMemoryPeak.****.txt file
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/infinitewp/backups/DE_clMemoryUsage.****.txt file
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/infinitewp/backups/DE_clTimeTaken.****.txt file
Oct 15 04:12:37 box:backupformat/tgz addToPack: added ./data/public/wp-content/infinitewp/backups/index.php file
Oct 15 04:12:42 box:tasks update ****: {"percent":21.3,"message":"Uploading backup 134M@9MBps (example.com)"}
Oct 15 04:12:42 box:storage/s3 Upload progress: {"loaded":134217728,"part":1,"key":"snapshot/app_********.tar.gz.enc"}
Oct 15 04:12:51 box:storage/s3 Upload progress: {"loaded":268435456,"part":2,"key":"snapshot/app_********.tar.gz.enc"}
Oct 15 04:12:52 box:tasks update ****: {"percent":21.3,"message":"Uploading backup 277M@14MBps (example.com)"}
Oct 15 04:13:02 box:tasks update ****: {"percent":21.3,"message":"Uploading backup 435M@16MBps (example.com)"}
Oct 15 04:13:03 box:backupformat/tgz addToPack: added ./data/public/wp-content/infinitewp/backups/example.com-de_backup_full_****-**-**_********.zip file
Oct 15 04:13:08 box:storage/s3 Upload progress: {"loaded":402653184,"part":4,"key":"snapshot/app_********.tar.gz.enc"}
Oct 15 04:13:12 box:tasks update ****: {"percent":21.3,"message":"Uploading backup 584M@15MBps (example.com)"}
Oct 15 04:13:12 AWS SDK Error: 400 Bad Request
Oct 15 04:13:12 code: 400, statusCode: 400, retryable: false
Oct 15 04:13:12 Node.js v20.18.0
Oct 15 04:13:19 box:shell backuptask: errored BoxError: backuptask exited with code 1 signal null
Oct 15 04:13:19 reason: 'Shell Error', code: 1
Oct 15 04:13:19 box:backuptask runBackupUpload: backuptask crashed BoxError: backuptask exited with code 1 signal null
Oct 15 04:13:19 box:backuptask fullBackup: app example.com backup finished. Took 67.438 seconds
Oct 15 04:13:19 box:tasks update ****: {"percent":100,"error":{"message":"Backuptask crashed"}}
Oct 15 04:13:19 BoxError: Backuptask crashed

Not sure what to do with it. I just started a manual backup, and it also crashed again while uploading to IONOS.
Oct 15 11:00:05 400: null
Oct 15 11:00:05 at Request.extractError (/home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712:35)
Oct 15 11:00:05 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
Oct 15 11:00:05 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
Oct 15 11:00:05 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:686:14)
Oct 15 11:00:05 at Request.transition (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:22:10)
Oct 15 11:00:05 at AcceptorStateMachine.runTo (/home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:14:12)
Oct 15 11:00:05 at /home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:26:10
Oct 15 11:00:05 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:38:9)
Oct 15 11:00:05 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:688:12)
Oct 15 11:00:05 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:116:18) {
Oct 15 11:00:05 code: 400,
Oct 15 11:00:05 region: null,
Oct 15 11:00:05 time: 2025-10-15T09:00:05.031Z,
Oct 15 11:00:05 requestId: null,
Oct 15 11:00:05 extendedRequestId: undefined,
Oct 15 11:00:05 cfId: undefined,
Oct 15 11:00:05 statusCode: 400,
Oct 15 11:00:05 retryable: false,
Oct 15 11:00:05 retryDelay: 20000
Oct 15 11:00:05 }

and

Oct 15 11:00:05 box:tasks setCompleted - 7460: {"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
Oct 15 11:00:05 box:tasks update 7460: {"percent":100,"result":null,"error":{"stack":"BoxError: Backuptask crashed\n at runBackupUpload (/home/yellowtent/box/src/backuptask.js:170:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async uploadAppSnapshot (/home/yellowtent/box/src/backuptask.js:369:5)\n at async backupAppWithTag (/home/yellowtent/box/src/backuptask.js:391:5)","name":"BoxError","reason":"Internal Error","details":{},"message":"Backuptask crashed"}}
Oct 15 11:00:05 BoxError: Backuptask crashed

The snapshot folder within the bucket (created by Cloudron?) contains a couple of up-to-date uploads.
- Who should I raise this with - is it IONOS? Is it Cloudron? Is it Hetzner?
- What's causing the "internal error" while uploading?
- Why is it "retryable: false"?
-
james has marked this topic as unsolved
-
@dsp76 the retryable flag comes from the AWS SDK. It indicates whether we should retry a failing request or not. Since retryable is false, the code gives up immediately.
Can you write to us at support@cloudron.io? We have to debug this further to understand which request is failing and why.
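To make that behaviour concrete, here is a minimal sketch of a retry wrapper that honours such a `retryable` flag. This is my own illustration of the pattern (the function name, options, and error shape are assumptions), not the SDK's or Cloudron's actual code:

```javascript
// Retry an async operation, but give up immediately when the thrown error
// carries retryable === false (as the 400 in the log above does).
async function withRetries(operation, { maxAttempts = 3, delayMs = 1000 } = {}) {
  for (let attempt = 1; ; attempt++) {
    try {
      return await operation();
    } catch (err) {
      // Non-retryable errors are surfaced at once; retryable ones are
      // retried until the attempt budget is exhausted.
      if (err.retryable === false || attempt >= maxAttempts) throw err;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

The SDK generally treats 4xx responses as client errors and marks them non-retryable, which matches the `retryable: false` in the log.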
-
Hello @dsp76, I have marked the topic as unsolved again.
Were you unable to do this yourself, or was this more of a courtesy question?
-
I've had the same problem for about a week now. The strange thing is that everything had been running flawlessly for a year. However, we recently set up a second Cloudron; its backup is stored in a different bucket with its own key. The problem occurs on both servers. Unfortunately, it's not deterministic.
03:00:33 box:storage/s3 Upload progress: {"loaded":95400493056,"part":920,"key":"backup/snapshot/app_da39dd94-29b5-4049-9aa5-76864ebc4608.tar.gz.enc"}
Oct 16 03:00:33 /home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712
Oct 16 03:00:33 resp.error = AWS.util.error(new Error(), {
Oct 16 03:00:33 ^
Oct 16 03:00:33
Oct 16 03:00:33 400: null
Oct 16 03:00:33 at Request.extractError (/home/yellowtent/box/node_modules/aws-sdk/lib/services/s3.js:712:35)
Oct 16 03:00:33 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
Oct 16 03:00:33 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
Oct 16 03:00:33 at Request.emit (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:686:14)
Oct 16 03:00:33 at Request.transition (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:22:10)
Oct 16 03:00:33 at AcceptorStateMachine.runTo (/home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:14:12)
Oct 16 03:00:33 at /home/yellowtent/box/node_modules/aws-sdk/lib/state_machine.js:26:10
Oct 16 03:00:33 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:38:9)
Oct 16 03:00:33 at Request.<anonymous> (/home/yellowtent/box/node_modules/aws-sdk/lib/request.js:688:12)
Oct 16 03:00:33 at Request.callListeners (/home/yellowtent/box/node_modules/aws-sdk/lib/sequential_executor.js:116:18) {
Oct 16 03:00:33 code: 400,
Oct 16 03:00:33 region: null,
Oct 16 03:00:33 time: 2025-10-16T01:00:34.559Z,
Oct 16 03:00:33 requestId: null,
Oct 16 03:00:33 extendedRequestId: undefined,
Oct 16 03:00:33 cfId: undefined,
Oct 16 03:00:33 statusCode: 400,
Oct 16 03:00:33 retryable: false,
Oct 16 03:00:33 retryDelay: 20000
Oct 16 03:00:33 }
Oct 16 03:00:33
Oct 16 03:00:33 Node.js v20.18.0
Oct 16 03:00:34 box:shell backuptask: /usr/bin/sudo -S -E --close-from=4 /home/yellowtent/box/src/scripts/backupupload.js snapshot/app_da39dd94-29b5-4049-9aa5-76864ebc4608 tgz {"localRoot":"/home/yellowtent/appsdata/da39dd94-29b5-4049-9aa5-76864ebc4608","layout":[]} errored BoxError: backuptask exited with code 1 signal null
Oct 16 03:00:34 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
Oct 16 03:00:34 at ChildProcess.emit (node:events:519:28)
Oct 16 03:00:34 at ChildProcess.emit (node:domain:488:12)
Oct 16 03:00:34 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
Oct 16 03:00:34 reason: 'Shell Error',
Oct 16 03:00:34 details: {},
Oct 16 03:00:34 code: 1,
Oct 16 03:00:34 signal: null
Oct 16 03:00:34 }
Oct 16 03:00:34 box:backuptask runBackupUpload: backuptask crashed BoxError: backuptask exited with code 1 signal null
Oct 16 03:00:34 at ChildProcess.<anonymous> (/home/yellowtent/box/src/shell.js:137:19)
Oct 16 03:00:34 at ChildProcess.emit (node:events:519:28)
Oct 16 03:00:34 at ChildProcess.emit (node:domain:488:12)
Oct 16 03:00:34 at ChildProcess._handle.onexit (node:internal/child_process:294:12) {
Oct 16 03:00:34 reason: 'Shell Error',
Oct 16 03:00:34 details: {},
Oct 16 03:00:34 code: 1,
Oct 16 03:00:34 signal: null
Oct 16 03:00:34 }
-
james has marked this topic as solved
-
@james it still happens sometimes. I did more investigation in the log. Partway through, it says it couldn't find the file ("Old backup not found").
Jan 14 05:26:29 box:storage/s3 Upload progress: {"loaded":40692513140,"part":304,"Key":"snapshot/app_APP_UUID_01.tar.gz.enc","Bucket":"ACME-BACKUP"}
Jan 14 05:47:09 box:storage/s3 Upload finished. {"$metadata":{"httpStatusCode":200,"requestId":"REQUEST_ID_01-ACCOUNT_01-REGION_01","attempts":3,"totalRetryDelay":40000},"Bucket":"ACME-BACKUP","ETag":"\"\"","Key":"snapshot/app_APP_UUID_01.tar.gz.enc","Location":"S3_ENDPOINT_01/ACME-BACKUP/snapshot/app_APP_UUID_01.tar.gz.enc"}
Jan 14 05:47:09 box:backuptask upload: path snapshot/app_APP_UUID_01.tar.gz.enc site SITE_UUID_01 uploaded: {"fileCount":11571,"size":40692513140,"transferred":40692513140}
Jan 14 05:47:09 box:tasks updating task TASK_ID_01 with: {"percent":75.1935483870967,"message":"Uploading integrity information to snapshot/app_APP_UUID_01.tar.gz.enc.backupinfo (REGISTRY.DOMAIN.TLD)"}
Jan 14 05:47:10 box:storage/s3 Upload progress: {"loaded":146,"total":146,"part":1,"Key":"snapshot/app_APP_UUID_01.tar.gz.enc.backupinfo","Bucket":"ACME-BACKUP"}
Jan 14 05:47:10 box:storage/s3 Upload finished. {"$metadata":{"httpStatusCode":200,"requestId":"REQUEST_ID_02-ACCOUNT_02-REGION_01","attempts":1,"totalRetryDelay":0},"ETag":"\"ETAG_01\"","Bucket":"ACME-BACKUP","Key":"snapshot/app_APP_UUID_01.tar.gz.enc.backupinfo","Location":"https://ACME-BACKUP.s3.REGION_01.ionoscloud.com/snapshot/app_APP_UUID_01.tar.gz.enc.backupinfo"}
Jan 14 05:47:10 box:backupupload upload completed. error: null
Jan 14 05:47:10 box:backuptask runBackupUpload: result - {"result":{"stats":{"fileCount":11571,"size":40692513140,"transferred":40692513140},"integrity":{"signature":"SIGNATURE_01"}}}
Jan 14 05:47:10 box:backuptask uploadAppSnapshot: REGISTRY.DOMAIN.TLD uploaded to snapshot/app_APP_UUID_01.tar.gz.enc. 4202.695 seconds
Jan 14 05:47:10 box:backuptask backupAppWithTag: rotating REGISTRY.DOMAIN.TLD snapshot of SITE_UUID_01 to path 2026-01-14-030000-896/app_REGISTRY.DOMAIN.TLD_VERSION_01.tar.gz.enc
Jan 14 05:47:10 box:tasks updating task TASK_ID_01 with: {"percent":75.1935483870967,"message":"Copying (multipart) snapshot/app_APP_UUID_01.tar.gz.enc"}
Jan 14 05:47:10 box:tasks updating task TASK_ID_01 with: {"percent":75.1935483870967,"message":"Copying part 1 - ACME-BACKUP/snapshot/app_APP_UUID_01.tar.gz.enc bytes=0-1073741823"}
Jan 14 05:47:10 box:tasks updating task TASK_ID_01 with: {"percent":75.1935483870967,"message":"Copying part 2 - ACME-BACKUP/snapshot/app_APP_UUID_01.tar.gz.enc bytes=1073741824-2147483647"}
Jan 14 05:47:10 box:tasks updating task TASK_ID_01 with: {"percent":75.1935483870967,"message":"Copying part 3 - ACME-BACKUP/snapshot/app_APP_UUID_01.tar.gz.enc bytes=2147483648-3221225471"}
Jan 14 05:47:10 box:tasks updating task TASK_ID_01 with: {"percent":75.1935483870967,"message":"Aborting multipart copy of snapshot/app_APP_UUID_01.tar.gz.enc"}
Jan 14 05:47:10 box:storage/s3 copy: s3 copy error when copying snapshot/app_APP_UUID_01.tar.gz.enc: NoSuchKey: UnknownError
Jan 14 05:47:10 box:backuptask copy: copy to 2026-01-14-030000-896/app_REGISTRY.DOMAIN.TLD_VERSION_01.tar.gz.enc errored. error: Old backup not found: snapshot/app_APP_UUID_01.tar.gz.enc
Jan 14 05:47:10 box:backuptask fullBackup: app REGISTRY.DOMAIN.TLD backup finished. Took 4203.103 seconds
Jan 14 05:47:10 box:locks write: current locks: {"full_backup_task_SITE_UUID_01":null}
Jan 14 05:47:10 box:locks release: app_backup_APP_UUID_01
Jan 14 05:47:10 box:tasks setCompleted - TASK_ID_01: {"result":null,"error":{"message":"Old backup not found: snapshot/app_APP_UUID_01.tar.gz.enc","reason":"Not found"},"percent":100}
Jan 14 05:47:10 box:tasks updating task TASK_ID_01 with: {"completed":true,"result":null,"error":{"message":"Old backup not found: snapshot/app_APP_UUID_01.tar.gz.enc","reason":"Not found"},"percent":100}
Jan 14 05:47:10 box:taskworker Task took 6429.474 seconds
Jan 14 05:47:10 BoxError: Old backup not found: snapshot/app_APP_UUID_01.tar.gz.enc
Jan 14 05:47:10 at throwError (/home/yellowtent/box/src/storage/s3.js:387:49)
Jan 14 05:47:10 at copyInternal (/home/yellowtent/box/src/storage/s3.js:454:16)
Jan 14 05:47:10 at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
Jan 14 05:47:10 at async Object.copy (/home/yellowtent/box/src/storage/s3.js:488:12)
Jan 14 05:47:10 at async Object.copy (/home/yellowtent/box/src/backupformat/tgz.js:282:5)
Jan 14 05:47:10 Exiting with code 0

I checked the bucket and can see it's still there:
app_APP_UUID_01.tar.gz.enc             37.90 GB    14.1.2026, 05:36:49
app_APP_UUID_01.tar.gz.enc.backupinfo  146 bytes   14.1.2026, 05:47:09

Please also check the timestamps.
What's causing the copy to not find the file and abort the process?
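For reference, the "Old backup not found" message seems to be the copy path's reaction to that NoSuchKey. A hedged sketch of an existence probe with the S3 client injected (the function name and client shape are my own illustration, loosely modelled on HeadObject, not Cloudron's code), which could help tell a real deletion apart from a brief visibility hiccup:

```javascript
// Illustrative probe: does the snapshot object currently exist? The `s3`
// client is injected and only needs a headObject({ Bucket, Key }) method that
// rejects with err.name === 'NotFound' / 'NoSuchKey' when the key is absent.
async function oldBackupExists(s3, bucket, key) {
  try {
    await s3.headObject({ Bucket: bucket, Key: key });
    return true;
  } catch (err) {
    if (err.name === 'NotFound' || err.name === 'NoSuchKey') return false;
    throw err; // auth/network problems are real errors, not "missing"
  }
}
```

Running such a probe right before and right after a failed copy would show whether the object really disappears or the provider just returned a spurious NoSuchKey.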
-
james has marked this topic as unsolved
-
So the actual app_APP_UUID_01.tar.gz.enc was created prior to the multipart copy attempt. Also, it seems the first two parts were copied fine, and then suddenly the S3 object could not be found anymore for a brief period? Is this easily reproducible or more of an occasional hiccup?
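The byte ranges in those "Copying part N" log lines look like a ranged multipart copy with 1 GiB parts. A small sketch of that range math (the 1 GiB part size is an assumption read off the log, not a confirmed Cloudron setting, and the function is my own illustration):

```javascript
// Compute inclusive byte ranges for a ranged multipart copy, matching the
// "bytes=start-end" strings in the log above. Part size defaults to 1 GiB,
// which is what the logged ranges suggest.
function copyPartRanges(objectSize, partSize = 1024 * 1024 * 1024) {
  const ranges = [];
  for (let start = 0, part = 1; start < objectSize; start += partSize, part++) {
    const end = Math.min(start + partSize, objectSize) - 1; // inclusive end
    ranges.push({ part, range: `bytes=${start}-${end}` });
  }
  return ranges;
}
```

Each range becomes one server-side part copy, and, as the "Aborting multipart copy" line shows, a single failed part is enough to abort the whole copy.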