Duplicacy sometimes using insane memory/cpu

borgqueenx     Apr 5 6:42AM 2018 GUI

Here is a screenshot: http://prntscr.com/j17mem

This leaves me almost unable to perform basic tasks on my PC.


borgqueenx    Apr 5 6:43AM 2018

I should also note that I gave Duplicacy the command to close after this happens. It doesn't actually close, though; the task remains.


gchen    Apr 5 9:32AM 2018

How much memory was it using? It is hard to tell from the screenshot.

And how many files are in the directory to be backed up?


borgqueenx    Apr 8 6:40AM 2018

It was using 12.3 gigabytes of RAM in that screenshot. It's backing up almost all files on the PC except for temp/Windows files, so tens of thousands at least, but many of those are probably skipped since the initial backup has been completed.


gchen    Apr 9 1:38PM 2018

This is a design flaw. The entire file list has to be loaded into memory to create the current snapshot and to compare it against the last snapshot. I plan to fix this issue in the next major update.


Chris    Apr 18 11:54PM 2018

I think I may be seeing this problem on a FreeNAS box. After a while, the machine hard-locks as a result of running Duplicacy. It looks like an out-of-memory condition, but the machine ends up in such a terrible state that I can't do much to check RAM usage.

Is there a workaround?


gchen    Apr 19 9:02PM 2018

One workaround is to split the repository into multiple smaller repositories. This should significantly reduce memory usage.
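In case a sketch helps: the split can be done by initializing each subdirectory as its own repository with its own snapshot ID, all pointing at the same storage. The paths, bucket name, and snapshot IDs below are placeholders, not anything from this thread; adjust them to your setup.

```shell
# Hypothetical example: replace one large repository with two smaller ones
# that back up to the same storage under different snapshot IDs.

# Repository 1: documents only
cd /path/to/Documents
duplicacy init documents-backup b2://my-bucket
duplicacy backup

# Repository 2: photos only
cd /path/to/Photos
duplicacy init photos-backup b2://my-bucket
duplicacy backup
```

Each repository then only has to hold its own (smaller) file list in memory during a backup, which is what brings the peak usage down.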


Chris    Apr 23 11:24PM 2018

They can all then share the same B2 bucket, eh?

Do you have an ETA on the above-mentioned next major release?