Backup fails repeatedly because of mongodbdump timeout
-
Hi all
The backups in my Cloudron fail repeatedly with the following message:
Backup failed: Could not pipe http://172.18.0.5:3000/databases/<omitted>/backup?access_token=<omitted> to /home/yellowtent/appsdata/<omitted>/mongodbdump: Request timed out. Logs are available here. In the last 7 days the backup failed 5 times.
Usually, it is enough to restart the mongodb service and initiate a backup right after the service has started up. Unfortunately, this workaround has stopped working and the backups fail anyway.
Memory-wise, I have already upgraded the mongodb service twice (it is now at 8 GB!):

I somehow doubt that increasing the memory further is a long-term solution. The app causing the mongodbdump timeout is a Rocket.Chat instance.
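As a side note, one way to narrow this down is to time a manual dump of the Rocket.Chat database and see whether the dump itself is slow, or whether only Cloudron's piped backup path hits the timeout. This is just a sketch: the database name, the output path, and running it from inside the mongodb container are all assumptions about your setup.

```shell
# Hypothetical sketch: time a manual mongodump of the Rocket.Chat database.
# DB_NAME and the archive path are placeholders -- adjust to your setup.
DB_NAME="rocketchat"   # assumption: the Rocket.Chat database name

if command -v mongodump >/dev/null 2>&1; then
  # --archive streams everything into one file; --gzip compresses on the fly,
  # which often reduces wall-clock time enough to stay under a timeout.
  time mongodump --db "$DB_NAME" --archive=/tmp/rocketchat.dump --gzip
else
  echo "mongodump not found here; run this inside the mongodb container"
fi
```

If the manual dump already takes longer than Cloudron's backup timeout, the problem is dump duration rather than the backup pipeline itself.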
Any ideas on how to tackle and solve this?
-
@gml do you actually get out-of-memory notifications in your dashboard for the mongodb service? The limit set in Cloudron is just an upper bound; it does not mean 8 GB are actually reserved for it, and the system as a whole has to have memory available as well.
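To check that distinction in practice, you can compare what the host actually has available with what the container really uses. A rough sketch, assuming a Linux host and that the service container is named "mongodb":

```shell
# Hypothetical sketch: the 8 GB limit is only an upper bound, not a
# reservation, so inspect actual availability and actual usage.

# Memory actually available on the host (Linux /proc interface)
if [ -r /proc/meminfo ]; then
  avail_kb=$(awk '/MemAvailable/ {print $2}' /proc/meminfo)
  echo "host MemAvailable: $((avail_kb / 1024)) MB"
fi

# Real memory usage of the mongodb container, if docker is on the PATH
# (container name "mongodb" is an assumption -- list containers to confirm)
if command -v docker >/dev/null 2>&1; then
  docker stats --no-stream --format '{{.Name}}: {{.MemUsage}}' mongodb || true
fi
```

If MemAvailable on the host is far below the configured limit, raising the service's limit further cannot help, because the memory simply is not there to hand out.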
-
nebulon marked this topic as a question.
-
@gml that is sensible. You could enable remote SSH support for us (https://docs.cloudron.io/support/#remote-support) and send us a mail at support@cloudron.io with your dashboard domain; then we can locally patch that timeout value, which should finally make it work again for you.
-
gml has marked this topic as solved.