Duplicati integration on Windows is less stable than on Debian

As the title says, it looks to me like the Duplicati integration on Windows is less stable than on Debian.

On Windows, when I run a backup of a folder containing many files of various sizes (in my case: 750 files, from 250 KB to 130 MB), the backup often fails with an error saying that many of them could not be uploaded.

On Debian, by contrast, I have never experienced that kind of problem; the same task with the same number of files runs smoothly, without any glitch. It looks to me like Duplicati is better integrated on Linux than on Windows.

Does anyone know what causes this difference, and how I could debug the issue on Windows, if that is possible? Or should I rather report this to Duplicati?

Thanks,

I have had similar problems recently, but I have them on Debian as well. The strange thing is that my old backup job still runs fine; only new jobs are failing. I was playing around with a lot of settings and thought it might have been my fault. Now I am not sure anymore.

Maybe @TopperDEL can help us.

Regardless of the root cause, the next question is why the repair button doesn't work. I would expect it to re-upload the missing files.

I'm no Duplicati expert myself, though I made the Duplicati integration for Tardigrade.
I can only say that my Windows backups have been running very smoothly for months now. But I also have not created any new backup jobs since then.

Not sure how I can help.

The repair-button problem should be a Duplicati one, though. The Tardigrade backend only handles the transfer and listing of files; it does not do any of the management work.


Would you mind letting me know where to submit this issue? Issues · duplicati/duplicati · GitHub, GitHub - storj/storj, or somewhere else?

For your information:

You are currently running Duplicati - 2.0.5.111_canary_2020-09-26

Use the Duplicati GitHub repo or the Duplicati forums for this.
Your version should be fine; it is the latest, or at least it contains the latest Tardigrade backend.

That helped a lot already :smiley:

I also see a few similar-looking issues in the Duplicati forum. I don't feel like I could open a bug report yet; I would rather start a thread in the Duplicati forum and see how that works out. First I need to repeat my test with a higher logging level. Last time the only information I got was that the backup job had failed, and that was all.
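For anyone wanting to do the same: Duplicati has advanced options for writing a detailed log to a file. The option names below are what current Duplicati 2 builds accept as far as I know (the path is just a placeholder), so double-check them against your version:

```
--log-file=C:\temp\duplicati-backup.log
--log-file-log-level=Verbose
```

These can be added per backup job under "Advanced options" in the web UI, or appended to a command-line invocation.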


I did some additional tests with a higher log level, and something is off. I have the feeling Duplicati is now trying to upload 4 blocks at the same time; it seems to ignore my concurrent-upload limit. I would expect uploads to fail in that situation. I might have to try an older version and see whether it behaves differently.

Ohhh… now I see there are 2 options with similar names. I didn't know that and might have picked the wrong one. OK, let's try again and see if it works this time.


Did you find any clue?

Yes, my last backup ran fine. It looks like I fooled myself. From now on I will simply include both options to make sure I always have the correct one:

asynchronous-concurrent-upload-limit=1
asynchronous-upload-limit=1
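To make it concrete, this is roughly how both options would be passed to the Duplicati command-line client on Windows (the target URL and source path here are placeholders, not my real values):

```
Duplicati.CommandLine.exe backup "tardigrade://..." "C:\data" ^
  --asynchronous-concurrent-upload-limit=1 ^
  --asynchronous-upload-limit=1
```

Setting both should be harmless even if only one of them applies to your version.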

@Andisers could you retry your test with these options and see if that solves it for you as well?

I am not sure that was actually the root cause of my problem. I repeated my test with these settings on a different machine. On that one I had also specified only one of the two options, but it was the correct one. This would mean that I either always need both settings, or that something else has changed that I am not aware of.

Sorry, I have just finished replacing my Windows machine with Debian, so unfortunately I could not try your solution :frowning_face: