B2 missing chunks (hidden files)

JarnoP     Mar 4 9:56PM 2018 CLI

One of my computers backing up to B2 is returning missing chunks errors:

URL request 'https://f001.backblazeb2.com/file/0020XAXAXAXAXAXAXA/chunks/CACACACACAXCXCXCXC53a325b2fb10f3337cXXXX.fsl' returned status code 404

There are multiple such files. All of the ones I have checked so far still exist on B2, but there is also a "hidden" version of each of them.

1) Why has this happened?

2) How can I get the backup working again? I ran prune -exhaustive, but that didn't help.

Please note that there are two computers backing up to the same storage. The B2 bucket's Lifecycle settings are set to keep all file versions (the default).

gchen    Mar 4 10:09PM 2018

This should be the same as https://github.com/gilbertchen/duplicacy/issues/323, and fixed by https://github.com/gilbertchen/duplicacy/commit/771323510d8718ce58705a1d244b90e86db06ec8.

If you build the source on the git master branch this error should go away.

JarnoP    Mar 4 11:03PM 2018

Yes, looks familiar. Thanks!

JarnoP    Mar 5 12:00AM 2018

Is version 2.0.11 far away? I tried to compile from source, but the command stopped after a few minutes with:

package context: unrecognized import path "context" (import path does not begin with hostname)

It seems the go get command would compile a large number of packages. I'd prefer to wait for the official release; I don't want to bother with this.

JarnoP    Mar 5 12:37AM 2018

I have two versions of the same file on B2. In some cases the newer file is hidden and zero-sized, but in other cases the two files are the same size yet their SHA1 checksums differ. Are both of these the same bug as GitHub #323?

What should I do about the files with non-zero size but different SHA1? They are shown as uploaded at the same time.

If I delete those chunks, is there a way to force Duplicacy to re-upload a missing chunk?

gchen    Mar 5 10:11PM 2018

If two versions are the same size, it means two clients uploaded the same chunk. This can happen if the file uploaded by one client isn't yet visible to the second client due to propagation delay on the B2 servers. The SHA1 hashes will differ, because a random salt is generated each time to encrypt the chunk before uploading it.
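To illustrate the salt effect: this is a minimal sketch, not Duplicacy's actual cipher, showing why two uploads of identical chunk content end up with equal sizes but different SHA1 hashes. Each upload prepends a freshly generated random salt and derives the (toy) keystream from it, so the ciphertext differs every time even though the plaintext is the same.

```python
import hashlib
import os

def encrypt_with_random_salt(plaintext: bytes) -> bytes:
    # Illustrative stand-in for real encryption: prepend a random 16-byte
    # salt and XOR the payload with a keystream derived from that salt.
    salt = os.urandom(16)
    keystream = hashlib.sha256(salt).digest() * (len(plaintext) // 32 + 1)
    body = bytes(p ^ k for p, k in zip(plaintext, keystream))
    return salt + body

chunk = b"identical chunk content"
upload_a = encrypt_with_random_salt(chunk)  # first client's upload
upload_b = encrypt_with_random_salt(chunk)  # second client's upload

# Same size, because the same plaintext plus a fixed-size salt is stored;
# different SHA1, because the random salts make the ciphertexts differ.
print(len(upload_a), hashlib.sha1(upload_a).hexdigest())
print(len(upload_b), hashlib.sha1(upload_b).hexdigest())
```

Decrypting either version would recover the same chunk content, which is why either duplicate is safe to keep.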

It should be safe to delete all but one version of each file, but it would make more sense for Duplicacy to do this when the -exhaustive option is provided to the prune command.

JarnoP    Mar 7 4:27AM 2018

Thanks for clarifying this! I had forgotten that the encryption causes the SHA1 checksums to differ.

Copyright © Acrosync LLC 2016-2017