  • 0 Votes
    7 Posts
    2k Views
    rmdes
    glad to have helped
  • ODrive Sync can use any number of backends!

    Discuss backups
    3
    2 Votes
    3 Posts
    703 Views
    robi
    @girish Oh, I had the URL copied in the paste buffer but got distracted. It's much simpler: https://odrive.com. I've updated the post with the link.
  • Best AWS S3 backup storage class

    Discuss backups glacier aws
    15
    0 Votes
    15 Posts
    3k Views
    robi
    Good points @mehdi @CarbonBee. See if you can find another project that already integrates with Glacier and how they handle it. If it's OSS, the code will be reusable, easing integration for devs.
  • Why is my backup drive full?

    Solved Support backups backup
    20
    1 Vote
    20 Posts
    2k Views
    girish
    @ei8fdb said in Why is my backup drive full?: I've moved the backup directories from /var/backups/ to my external drive. If I want to restore from those backup directories, do I need the contents of the /var/backups/snapshot directory also? The snapshot directory is not required for restoring. BUT it's required for the actual backups to work (so think of it as a working directory for backups). It's important not to remove that directory while backups are in use! Since Cloudron uses hard links from the actual backups to the snapshot directory, it's not really taking up extra space.
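    The hard-link behaviour described above is easy to see in isolation. Below is a minimal sketch, assuming a filesystem that supports hard links; the paths are invented for the demo and are not Cloudron's actual backup layout.

    ```python
    # Minimal sketch (hypothetical paths): a hard-linked snapshot entry
    # does not consume extra disk space.
    import os

    backup = "/tmp/backup-demo/app-data.bin"         # stand-in for a file in a backup
    snapshot = "/tmp/backup-demo/snapshot-data.bin"  # stand-in for the snapshot copy

    os.makedirs(os.path.dirname(backup), exist_ok=True)
    with open(backup, "wb") as f:
        f.write(os.urandom(1024 * 1024))             # 1 MiB of dummy data

    os.link(backup, snapshot)                        # hard link: a second name for the same inode

    b, s = os.stat(backup), os.stat(snapshot)
    assert b.st_ino == s.st_ino                      # same inode, so the data is stored only once
    print("link count:", b.st_nlink)                 # prints 2: both names share one copy
    ```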
  • 0 Votes
    5 Posts
    1k Views
    Lonkle
    Thank you for the quick reply. But even in my testing environment, being told once a day that my backup method is unsafe is just a hassle, since I have to weed those messages out of the (valuable) Cloudron update notifications. I bet I can just trick Ubuntu into seeing the path as a volume to get rid of the notification. But for normal Cloudron users, I bet an option to acknowledge and disable the “backup is unsafe” notification would be valuable, especially in their first non-production use of Cloudron (where I’m personally at for now).
  • 0 Votes
    5 Posts
    1k Views
    nebulon
    Those sync.cache files are state files of the rsync backup uploader. It has to run as root to be able to list and read the app's files; the app file permissions here depend on the app itself and how it manages permissions. The yellowtent user as such does not have root rights on the system and uses helper scripts in such cases. The list of scripts which need, and thus run with, elevated permissions can be found at https://git.cloudron.io/cloudron/box/-/blob/master/setup/start/sudoers
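    As a rough illustration of the pattern described above (the script path below is hypothetical; the real whitelist is the sudoers file linked in the post): an unprivileged service process never holds root itself and only invokes specific, pre-approved helpers through sudo.

    ```python
    # Sketch only: the helper path is hypothetical, not Cloudron's actual script.
    import subprocess

    HELPER = "/home/yellowtent/box/src/scripts/backup-helper.sh"  # hypothetical whitelisted helper

    def run_privileged_helper(*args):
        # "sudo -n" fails instead of prompting for a password; the call only
        # succeeds if the sudoers whitelist permits this exact command.
        return subprocess.run(["sudo", "-n", HELPER, *args], check=True)
    ```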
  • 0 Votes
    10 Posts
    3k Views
    M
    I can confirm that, now that the lifecycle settings are correct, backups get physically deleted from B2.
  • Deleting Backups - unexpected behaviour

    Solved Support backups
    6
    0 Votes
    6 Posts
    1k Views
    luckow
    Reading documentation is a sysadmin's nightmare. Thank you for the clarification. That makes sense, and it looks like a clever solution. I guess that in the case of a Cloudron trivia night, the question "How many backups do you expect in your storage location?" would be answered in many different ways, according to the "important rules" from the documentation.
  • Backups: protocol connection lost

    Solved Support backups mysql
    3
    0 Votes
    3 Posts
    742 Views
    luckow
    OK, I gave MySQL more RAM. Since then the backups have been running smoothly.
  • 4 Votes
    11 Posts
    2k Views
    girish
    @Jan-Macenka OK, I have made filename encryption optional in 7.3. [image: 1656387319637-fbb205df-5917-403b-92c5-9084a856c656-image-resized.png]
  • 0 Votes
    28 Posts
    6k Views
    A
    @girish Great! Thanks again for your help debugging this and adding more configuration. Huge help for larger backups like mine.
  • Backup feedback over sshfs

    Solved Support sshfs backups
    35
    0 Votes
    35 Posts
    7k Views
    avatar1024
    @girish Sounds great, thanks Girish.
  • 0 Votes
    3 Posts
    772 Views
    nebulon
    OK, this is fixed; however, it requires a new Cloudron release, since the tasks API needs a proper pending state.
    1 Vote
    18 Posts
    4k Views
    mfcodeworks
    Hey guys, sorry for the delay. I've updated the fork and the providers function. If I get permission to create a new branch, or you open one, I can open an MR with the new branch and let you review.
  • 0 Votes
    11 Posts
    2k Views
    girish
    @d19dotca I have added some docs now. https://cloudron.io/documentation/backups/#schedule has info on the timeout and nice settings, and https://cloudron.io/documentation/backups/#concurrency-settings covers the concurrency settings.
  • Backup Fails: "Unknown system error -74"

    Support backups sshfs
    10
    0 Votes
    10 Posts
    3k Views
    nebulon
    @rlp10 There is no clear-cut answer as to which backup target is preferred. We try to offer a few options to cover a range of use-cases, but all of them should work. If you own the backup machine, I would not advise encrypting backups anyway. Unencrypted files on trusted storage are a lot easier to restore if worst-case trouble hits.
  • Faster backups or more settings options for backups

    Solved Feature Requests backups
    8
    5 Votes
    8 Posts
    2k Views
    girish
    For tgz, what we found is that the slowness is mostly because of the gz part, and most cloud VPSes are not very fast at this. And the whole tgz is, by its nature, single-core. For rsync, parallelism and buffer size were indeed a constraint, but these are both now configurable in 5.5 and 5.6. Note that the concurrency here also depends very much on the storage backend. For example, DO can handle only 20 at a time, but S3 can handle 1000s at a time. One has to experiment with the values a bit to figure out the right number. Unfortunately, most S3 connectors don't publish ideal sizes and concurrency.
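    To illustrate the concurrency knob mentioned above, here is a minimal sketch (not Cloudron's actual uploader; upload_part is a placeholder): parts are uploaded in parallel under a configurable limit, and the right limit depends on the storage backend, e.g. far lower for DO Spaces than for S3.

    ```python
    # Sketch only: upload_part stands in for a PUT of one chunk to the backend.
    from concurrent.futures import ThreadPoolExecutor

    def upload_part(part):
        ...  # upload one chunk to the storage backend

    def upload_all(parts, concurrency=20):
        # "concurrency" is the tunable discussed above; experiment per backend.
        with ThreadPoolExecutor(max_workers=concurrency) as pool:
            futures = [pool.submit(upload_part, p) for p in parts]
            for f in futures:
                f.result()  # re-raises here if any part upload failed
    ```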
  • 0 Votes
    26 Posts
    984 Views
    d19dotca
    @girish That’s perfect, great detective work! Thanks, Girish, for working to solve that. I really appreciate it. I’ll keep an eye on the server you patched and see if the issue comes back. Thanks again!
  • Backup failed: Task 39 timed out.

    Solved Support backups timeout wasabi
    12
    0 Votes
    12 Posts
    3k Views
    girish
    @shan I will mark this thread as solved. Let's follow up at https://forum.cloudron.io/topic/7750/backup-uploads-time-out-after-12-hours-can-no-longer-manually-adjust-timeout-to-be-longer
  • Cloudron Backups to GitLab/GitHub Private Repos

    Feature Requests backups
    7
    1 Vote
    7 Posts
    2k Views
    girish
    @marcusquinn said in Cloudron Backups to GitLab/GitHub Private Repos: Not a high priority - but might be nice for both the really tight and multi-location/provider redundancy aims. I have to think through the rest, but multi-location backups are on our radar. We have a long-pending issue about this: https://git.cloudron.io/cloudron/box/-/issues/528