Cloudron Forum

Long backups, local and remote, failing consistently

Support · Unsolved
Tags: backups, sshfs, rsync
  • jadudm wrote (last edited by jadudm):
    #1

    (I have a suspicion that this is a variation on this post from a while back.)

    I have configured backups as follows:

    | backup set | encrypted? | target | day(s) | time | files | size |
    |---|---|---|---|---|---|---|
    | bitwarden | Y | storage box | daily | 20:00 | 800 | 7 MB |
    | photos | N | storage box | S | 03:00 | 300K | 200 GB |
    | photos | N | NAS | Su | 03:00 | 300K | 200 GB |
    | full (-music, -photos) | Y | NAS | MWF | 03:00 | 18K | 12 GB |
    | music | N | NAS | T | 03:00 | ? | 600 GB |

    What I'm finding is that my Immich (photos) instance does not want to back up. To be more precise: Immich consistently fails a long way into the backup. In both cases, whether it is talking to the storage box (overseas, for me) or to my local NAS, the target is configured as an SSHFS mount. In each location I have set up a folder called $HOME/backups, with a subpath for each backup (e.g. photos, so that the full path becomes $HOME/backups/photos, $HOME/backups/vaults, etc.). In all cases, I'm using rsync with hardlinks.
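    For reference, the hardlink scheme I mean can be sketched like this. All paths are local stand-ins, not my real config, and this is only an illustration of the rsync `--link-dest` mechanism, not of what Cloudron runs internally:

    ```shell
    #!/bin/sh
    # Illustrative hardlink-based rsync backup. In practice the destination
    # would live on the SSHFS mount (e.g. under $HOME/backups/photos).
    set -eu

    work=$(mktemp -d)
    src="$work/data"; snapA="$work/snapshot-1"; snapB="$work/snapshot-2"
    mkdir -p "$src"
    echo "original photo" > "$src/img001.jpg"

    # First backup: a full copy.
    rsync -a "$src/" "$snapA/"

    # Second backup: unchanged files become hardlinks into the previous
    # snapshot, so they cost no additional disk space.
    rsync -a --link-dest="$snapA" "$src/" "$snapB/"

    # An unchanged file shares one inode across both snapshots.
    stat -c %i "$snapA/img001.jpg"
    stat -c %i "$snapB/img001.jpg"
    ```

    The upshot is that every snapshot looks like a full copy, while unchanged files are stored only once.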

    I removed the photos (which is large/has many files) and the music from the full backup set, because I want to target them separately for backup. And, I want to make sure my full backup completes.

    I can back up the bitwarden instance, because it is small. I have not yet seen the photos backup complete: I get somewhere around 290K files in, and then an SSH error drops the connection. I don't know what the root cause is. (And I'm now waiting for another backup, because Immich kicked off an update... so, I have to wait.)

    I'll update this thread if/when it fails again. Possible root causes (that would be difficult for me to work around):

    1. Too many files. I would think rsync would have no problems.
    2. Files changing. Immich likes to touch things. Is the app paused during backup? If not, could that be the problem? (Tempfiles get created as part of its processing; could those land in the file set, then get processed/deleted before the backup reaches them, breaking the backup? But pausing a live system during backups is disruptive and not appropriate, so that's not actually a solution path. Ignore me.)
    3. Not enough RAM. Do I need to give the backup process more RAM?
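    One more angle on the SSH drop: long-running transfers over SSHFS are often killed by idle timeouts rather than by the data itself. A mount sketch with keepalive and reconnect options that are commonly used to mitigate this (host, user, and paths are placeholders, and I don't know how Cloudron constructs its own mount, so treat this as illustrative only):

    ```shell
    # Illustrative SSHFS mount with keepalive/reconnect options.
    # nas.example and the paths are placeholders.
    sshfs -o reconnect \
          -o ServerAliveInterval=15 \
          -o ServerAliveCountMax=3 \
          user@nas.example:/home/user/backups /mnt/backups
    ```

    `ServerAliveInterval`/`ServerAliveCountMax` are standard OpenSSH client options that keep an otherwise-quiet connection from being dropped, and `reconnect` tells SSHFS to reattach if the connection dies anyway.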

    The NAS is a TrueNAS (therefore Debian) machine sitting next to the Cloudron host. Neither seems to be under any kind of RAM pressure that I can see. Neither is doing anything else of substance while the backups are happening.

    Unrelated: I do not know what happens when Immich updates, because I am targeting it with two backup points. Does that mean an app update will trigger a backup to both locations? Will it do so sequentially, or simultaneously?

    Possible other solutions

    I would like the SSHFS backup to "just work." But, I'm aware of the complexity of the systems involved.

    Other solutions I could consider:

    1. Use object storage. I don't like this one. When using rsync with many files, I discovered (on B2) that I could end up paying a lot in transaction fees with a frequent backup schedule, because rsync likes to touch so many things. Avoiding that was the point of getting the NAS.
    2. Run my own object storage on the NAS. I really don't want to do that. And, it doesn't solve my off-site photos backup.
    3. Introduce JuiceFS on the Cloudron host. I could put JuiceFS on the Cloudron host. I dislike this for all of the obvious reasons. But, it would let me set up an SSHFS mount to my remote host, and Cloudron/rsync would think it was a local filesystem. This might only be pushing the problems downwards, though.
    4. Back up locally, and rsync the backup. I think I have the disk space for this. This is probably my most robust answer, but it is... annoying. It means I have to set up a secondary layer of rsync processes. On the other hand, I have confidence that if I set up a local volume, the Cloudron backup will "just work."
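    Option 4 would look roughly like this: Cloudron writes to a local volume, and a separate cron-driven rsync mirrors the finished backups out. Paths here are hypothetical stand-ins, and the NAS push is shown against a local directory so the mirroring step itself is runnable:

    ```shell
    #!/bin/sh
    # Sketch of option 4: mirror a local Cloudron backup volume elsewhere.
    # All paths are placeholders; a real run would target the NAS over SSH.
    set -eu

    demo=$(mktemp -d)
    local_backups="$demo/var/backups/cloudron"   # hypothetical local backup volume
    nas_mirror="$demo/nas/backups"               # stands in for user@nas:backups/
    mkdir -p "$local_backups" "$nas_mirror"
    echo "app snapshot" > "$local_backups/app_2024-01-01.tar.gz"

    # Mirror the finished local backups; --delete keeps the copy in sync.
    # Against a real NAS this would be:
    #   rsync -a --delete "$local_backups/" user@nas:backups/
    rsync -a --delete "$local_backups/" "$nas_mirror/"
    ```

    The nice property is that the Cloudron backup itself never touches the network, so any SSH flakiness only affects the mirroring step, which can simply be retried.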

    Ultimately, I'm trying to figure out how to reliably back things up. I think #4 is my best bet.

    I use Cloudron on a DXP2800 NAS w/ 8TB in ZFS RAID1
