[Original post]
Recently I started evaluating whether the Duplicacy CLI (Linux binary) could be an option for backing up my Synology NAS to Backblaze B2, since the Synology Cloud Sync implementation for B2 has so many flaws: mainly, it doesn't encrypt file names, it is not block-based in any way (so it re-uploads whole files even if 1 KB changes in a 100 GB file), and it is wasteful with B2 space in other ways too.

So first of all I was happy to see the Linux binary just work on my NAS when I SSH into it. That makes it super easy to set up a manually scheduled task to back up the NAS with Duplicacy, which is great.

The initial backup (~1.5 TB of files of very varying sizes) took about 5 days, which is exactly what was to be expected given that my upload speed was used at nearly 100%. So no bottleneck there, and good job so far. However, the follow-up backups now take about 10 to 15 hours (two tests so far, with a third currently running). It is noteworthy that there is very little upload (about 1 to 10 GB, which should be uploaded in under one hour) and also nearly no disk IOPS, reads/writes, or utilization. The CPU is constant at 25% for the duplicacy task. As my Synology has an Intel quad core, my hunch is that Duplicacy is using only one core, and the process could maybe be accelerated a lot if it used all cores, or at least three. (Needless to say, I'm using Duplicacy with encryption.)

- Can I somehow achieve faster follow-up backups / use more CPU?
- Is this in general a good setup, or should I look for other solutions? (Which I wouldn't like, as I like the principles of Duplicacy very much! :-))

[Reply]
Are you running the backup with the -hash option? By default Duplicacy will find changed files by comparing timestamps and sizes, but if the -hash option is specified it will rescan every file, and 10 to 15 hours is a reasonable time to rescan 1.5 TB.

[Original poster]
Nope, this is all there is to my backup command:

duplicacy_linux backup -threads 8

And this was my init:

duplicacy_linux init -e -c 8M MyLib b2://Synology-MyLib

Keys and passwords are held in environment variables.

I was thinking of splitting MyLib into four evenly sized parts and running four separate duplicacy tasks in parallel, one for each part, but I don't really want to go through the big initial upload again, and I don't know whether I would run into other problems :-). By the way, regardless of the speed improvements (which would be nice to have, but if they're not possible I could live with it), do you see other problems with this scenario, or do you think it is okay?

Update: Hmm, looking at my CPU, I'm quite sure Duplicacy actually is rehashing. Maybe it can't access timestamps, or store them the way it's used to, and therefore falls back to hashing? Can I check this somehow? Also maybe interesting in this regard: MyLib is encrypted with the Synology encryption for shared folders (but Duplicacy obviously is backing up the already unlocked folder), and it is on a Btrfs volume.

[Reply, Jan 9, 2018, 2:18 PM]
Can you run

duplicacy_linux backup -threads 8 -stats

and send the output to …?

[Original poster]
Yes, but it will probably take some hours :-)

[Reply]
Maybe the Synology encryption is changing the timestamps, so Duplicacy thinks that a lot of files have changed.
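The default change detection described in the reply above (compare timestamps and sizes, and only re-read files that differ) can be sketched in shell. This is an illustration of the idea, not Duplicacy's actual code: `snapshot` and `changed_files` are hypothetical helper names, and `stat -c` assumes GNU coreutils, so a BusyBox `stat` on a Synology may need different flags.

```shell
#!/bin/sh
# Sketch of timestamp+size change detection (not Duplicacy's real code).
# A file is considered changed when its "size mtime" pair differs from
# the previous snapshot; only those files would need to be re-hashed.

snapshot() {
    # Record "size mtime path" for every file under $1 into $2.
    # %s = size in bytes, %Y = mtime as epoch seconds (GNU stat).
    find "$1" -type f -exec stat -c '%s %Y %n' {} + | sort > "$2"
}

changed_files() {
    # Lines unique to the new snapshot ($2) belong to files that are
    # new, or whose size/mtime changed since the old snapshot ($1).
    comm -13 "$1" "$2" | cut -d' ' -f3-
}
```

Note that a file whose mtime is rewritten shows up as changed even if its content is identical, which is exactly why the Synology encryption layer is a plausible culprit here: if it touches timestamps, every file looks changed and gets re-hashed.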
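Since the thread mentions keeping keys and passwords in environment variables, here is a minimal sketch of what a scheduled-task wrapper for the commands above might look like. The variable names DUPLICACY_PASSWORD, DUPLICACY_B2_ID, and DUPLICACY_B2_KEY are the ones Duplicacy documents for non-interactive use; the repository path, binary location, and log file are made-up examples.

```shell
#!/bin/sh
# Hypothetical scheduled-task wrapper (all paths are examples).
# Duplicacy reads these variables so it never has to prompt:
export DUPLICACY_PASSWORD='...'   # storage encryption password
export DUPLICACY_B2_ID='...'      # Backblaze B2 account/key ID
export DUPLICACY_B2_KEY='...'     # Backblaze B2 application key

cd /volume1/MyLib || exit 1       # repository root (example path)
./duplicacy_linux backup -threads 8 -stats >> /var/log/duplicacy.log 2>&1
```

On DSM this could be registered as a Scheduled Task running as root, which matches the "manual scheduled task" setup the poster describes.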