Setting up Duplicacy for a headless job on Linux

nedtwigg     Mar 30 3:07PM 2018 CLI

Using Duplicacy CLI 2.1.0, I'm having a hard time setting up a headless task. I'm trying to use our existing CI infrastructure, so we can see that our backups are working in our dev dashboard.

If I do env | grep 'DUPLICACY', I see that I have variables set for:

    (list of DUPLICACY_* variables lost in transcription)
Then when I run sudo --preserve-env duplicacy backup -stats I still get:

Storage set to s3://<my bucket>
Enter S3 Access Key ID:

Any guesses what I'm doing wrong? Thanks!
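For reference, here is a minimal repro of the behavior I'm worried about: sudo's default env_reset policy strips most variables from the child environment, which I'm simulating below with env -i (MYVAR is a made-up variable name, not one duplicacy reads).

```shell
#!/bin/sh
export MYVAR=hello

# Simulate sudo's env_reset with env -i: the child process starts
# with an empty environment, so MYVAR disappears.
env -i sh -c 'echo "inside reset env: MYVAR=${MYVAR:-<dropped>}"'
# prints: inside reset env: MYVAR=<dropped>

# Explicitly forwarding the variable (what sudo --preserve-env does):
env -i MYVAR="$MYVAR" sh -c 'echo "forwarded: MYVAR=$MYVAR"'
# prints: forwarded: MYVAR=hello
```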

gchen    Mar 30 10:44PM 2018

You can run duplicacy -d list to see which environment variable it is trying to read from.

Most likely you assigned a name to the storage (either by the add command or the -storage-name option of the init command). If the storage name is not default, then the name of the variable includes the storage name. For example, if the storage name is second, the variable for storing the S3 id will be DUPLICACY_SECOND_S3_ID.
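A quick sketch of the naming rule: the storage name is uppercased and spliced into the middle of the variable name. "second" is the example storage name from above; the _S3_SECRET variable is my assumption, mirroring the _S3_ID pattern.

```shell
#!/bin/sh
# Derive the variable names duplicacy looks up for a named storage.
storage_name="second"
upper=$(printf '%s' "$storage_name" | tr '[:lower:]' '[:upper:]')

echo "DUPLICACY_${upper}_S3_ID"      # prints: DUPLICACY_SECOND_S3_ID
echo "DUPLICACY_${upper}_S3_SECRET"  # prints: DUPLICACY_SECOND_S3_SECRET (assumed name)
```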

nedtwigg    Apr 1 12:06AM 2018

Thanks! My problem has evolved. When I run this script:


sudo --preserve-env duplicacy -d list

I get this result:

panic: close of closed channel

goroutine 1 [running]:
(*Conn).Close(0xc42014d680, 0x12bdee0, 0xc4201e0000)
        /Users/chgang/zincbox/go/src/ +0x52
        (remaining stack frames garbled in transcription; all point into /Users/chgang/zincbox/go/src/)

gchen    Apr 2 3:02PM 2018

This is a bug in the dbus implementation, but I think if you add unset DBUS_SESSION_BUS_ADDRESS to the script before running the duplicacy command, it should avoid this bug.
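A minimal sketch of the workaround in a backup wrapper script (assuming the repository is already initialized and the DUPLICACY_* credential variables are exported; the guard around the duplicacy call is just so the sketch is safe to run where the CLI isn't installed):

```shell
#!/bin/sh
# With no session bus address set, duplicacy falls back to the
# environment variables instead of touching the broken dbus connection.
unset DBUS_SESSION_BUS_ADDRESS

# Run the backup only if the duplicacy CLI is actually present.
if command -v duplicacy >/dev/null 2>&1; then
    sudo --preserve-env duplicacy backup -stats
fi
```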

nedtwigg    Apr 9 6:43PM 2018

Thanks! That worked perfectly.

Copyright © Acrosync LLC 2016-2017