-
Now I am still having trouble finding the optimal backup settings, and I am not well-versed in this, so maybe someone can help:
- Rsync: I'd love to use rsync, because it lets me go to Backblaze, find an app, find a specific file, and download it to my computer. But it is painfully slow, think 10 hours per backup, which blocks installing and updating other apps, and sometimes I need that. My current workaround is to schedule it entirely outside working hours, starting in the evening. At least with Backblaze I can't make it faster, because I apparently hit their rate limit already with relatively low settings.
- Tarball: I used to do this, and it worked faster and more reliably. However, it creates extremely large files. One tgz file, for example, was 170 GB, and it is difficult to download something that size to my computer.
Despite how easy it is to restore backups with Cloudron, I'd actually love to be able to access them conveniently without it too. But I am not really sure how to organise this effectively.
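For reference, my understanding is that the rsync format leaves each file individually addressable in the bucket, so in principle a single file can be pulled straight from Backblaze's S3-compatible endpoint without going through Cloudron. Here is a minimal sketch with boto3; the endpoint, bucket name, credentials and object path are placeholders, not my real settings:

```python
# Minimal sketch: fetch one file from a B2 bucket via the S3-compatible API.
# Endpoint, bucket, credentials and key are placeholders - adjust to your account.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.eu-central-003.backblazeb2.com",  # your B2 region endpoint
    aws_access_key_id="YOUR_KEY_ID",
    aws_secret_access_key="YOUR_APPLICATION_KEY",
)

# Path inside the rsync-format backup, e.g. <backup prefix>/<app>/data/<file>
s3.download_file(
    "my-cloudron-backups",                       # bucket name (placeholder)
    "snapshot/app_nextcloud/data/somefile.jpg",  # object key (placeholder)
    "somefile.jpg",                              # local destination
)
```

That way only the one file comes down, instead of a 170 GB archive.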
Background information about my setup: I use Nextcloud and Immich, which both create lots of small files, and I also run e-mail via Cloudron. So we are talking about 240 GB of data per server.
-
@ekevu123 said in Backup settings:
which blocks installing and updating other apps, and sometimes I need that
A fix for this is coming. Backups had a global lock, which is being removed.
As for the bigger question, there is no clear answer. Backups take time; they are meant to run in the background and use few resources, not to be something you actively monitor. Even on AWS and DO, where they have complete control of their servers, taking a snapshot of a 100 GB machine takes a LONG time - they suggest 1 to 3 minutes per GB. It's just the way it is. (240 GB is, in the worst case, 720 minutes or 12 hours, and that is with complete control of the disk and server.)
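A quick back-of-the-envelope check with that 1 to 3 minutes per GB figure:

```python
# Rough estimate of backup/snapshot time using the 1-3 minutes per GB rule of thumb.
size_gb = 240
best_case_min = size_gb * 1    # 240 minutes
worst_case_min = size_gb * 3   # 720 minutes
print(f"{best_case_min / 60:.0f} to {worst_case_min / 60:.0f} hours")  # "4 to 12 hours"
```

So the 10 hours you are seeing is well within the expected range for 240 GB.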