thedaveCA Feb 4 3:09AM 2017 GUI
I'm currently running some local tests of Duplicity. One thing I observed is that if I remove a chunk file, I get an error on the next backup. While I understand this is a less-than-ideal situation, the reality is that things occasionally go wrong and a file may be lost or damaged, especially in cloud-hosted environments where the storage is controlled by a third party.
I'm glad that Duplicity noticed, but I was hoping that, since the data is still available locally, there would be some way to recover from this situation (possibly even just replacing the missing chunks on the fly).
Is there any way I could discover that a chunk file has gone missing in the real world, and if so, what would be my options to recover?
Would this situation stop me from restoring files? Assuming the files that rely on that chunk are gone forever, could I still restore other files?
gchen Feb 4 8:40PM 2017
If you delete a chunk that belongs to a file, this only affects the restoration of that file, which can be excluded from the restore; other files are unaffected. If that file still exists locally, you can run a backup and the missing chunk will be recreated.
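A minimal sketch (my own illustration, not the tool's actual code) of why re-running a backup recreates a missing chunk: in a content-addressed store the chunk's name is derived from its content, so re-chunking the same local file reproduces the same name, and the backup re-uploads any chunk the store no longer has.

```python
import hashlib

def chunk_id(data: bytes) -> str:
    # A chunk is stored under a hash of its content.
    return hashlib.sha256(data).hexdigest()

def backup(store: dict, chunks: list) -> None:
    """Upload any chunk whose ID is not already present in the store."""
    for data in chunks:
        cid = chunk_id(data)
        if cid not in store:      # deduplication: skip chunks already stored
            store[cid] = data

store = {}
file_chunks = [b"first chunk", b"second chunk"]
backup(store, file_chunks)

# Simulate losing one chunk on the storage backend.
lost = chunk_id(b"second chunk")
del store[lost]

# A new backup of the unchanged file recreates the missing chunk,
# because its content still hashes to the same ID.
backup(store, file_chunks)
assert lost in store
```

The same mechanism explains why only the affected file needs to still exist locally: recreating the chunk requires only its original content.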
But if you delete a chunk that is part of a snapshot file, the snapshot file will be corrupted and you won't be able to restore any file. However, if you're lucky (for instance, if the local files have not changed at all), a new backup will recreate the missing chunk, because snapshot files are deduplicated too.
There is a check command to verify that snapshots are complete.
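A hypothetical sketch of what such a completeness check amounts to (names and structure are my own, not the tool's internals): walk every snapshot's chunk list and report any referenced chunk that is missing from the storage.

```python
import hashlib

def check(store: dict, snapshots: dict) -> dict:
    """Return a map of snapshot name -> list of missing chunk IDs."""
    missing = {}
    for name, chunk_ids in snapshots.items():
        lost = [cid for cid in chunk_ids if cid not in store]
        if lost:
            missing[name] = lost
    return missing

cid_a = hashlib.sha256(b"a").hexdigest()
cid_b = hashlib.sha256(b"b").hexdigest()
store = {cid_a: b"a"}                      # cid_b was deleted from storage
snapshots = {"rev-1": [cid_a, cid_b]}

print(check(store, snapshots))             # rev-1 references one missing chunk
```

Running such a check periodically is how you would discover a missing chunk before you actually need to restore.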