Folder with more than 1000 objects: the datetime of new objects is 0001-01-01 01:00:00

I have a directory with more than 16,324 objects.
When I try to add an object, the date of the new object shows as 0001-01-01 01:00:00.
I see the problem with rclone, uplink, and the web interface.
I have no problem in directories with fewer objects.
I can reproduce the problem by creating any folder with more than 1000 objects.
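A minimal sketch of that reproduction (the repro/ prefix and file names are just examples):

# create more than 1000 objects so the folder spans several listing pages
for i in $(seq 1 1001); do
  uplink cp test.data sj://backuppcloud/repro/file-$i.data
done
# any object uploaded after that then shows the zero date in the listing
uplink cp test.data sj://backuppcloud/repro/new.data
uplink ls sj://backuppcloud/repro/ | grep new.data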

Have you already encountered this problem?

Hello @zoltix ,
Welcome to the forum!

How is rclone configured? Using a native connection or Gateway MT?

How did you check the date of the objects? Does the date show as described in the subject when you use each of the tools?

If you can reproduce the issue, I suggest submitting it on our GitHub.
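For example, you could compare the date each tool reports for the same prefix with something like this (the remote name storj is a placeholder for however your rclone remote is named):

uplink ls sj://backuppcloud/configuration/
rclone lsl storj:backuppcloud/configuration/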

With rclone, I made a test configuring it both with the tardigrade backend and with the AWS (S3) backend. I copied a file to my bucket with each, and also with uplink. I get the same result every time.
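The two remotes looked roughly like this in rclone.conf (remote names and credentials are placeholders; depending on the rclone version, the native backend is called tardigrade or storj):

[tardigrade-native]
type = tardigrade
access_grant = <access grant>

[storj-s3]
type = s3
provider = Other
endpoint = gateway.storjshare.io
access_key_id = <access key>
secret_access_key = <secret key>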

As I said before, when I have fewer than 1000 files there is no problem. I get the same result on both Linux and Windows.
OK, I'll go post this on GitHub.

PS: it's not a big deal, I don't care about the dates.

Test script with uplink:

echo ""
echo ""
echo ""
echo "##################################################"
echo "################# small folder ###################"
echo "##################################################"
echo "folder list"
uplink ls sj://backuppcloud/configuration/
echo "upload test file"
uplink cp test.data sj://backuppcloud/configuration/test.data
echo "looking for the date"
uplink ls sj://backuppcloud/configuration/
echo "clean"
uplink rm sj://backuppcloud/configuration/test.data
echo "##################################################"
echo "################## huge folder ###################"
echo "##################################################"
echo "count number of objects"
uplink ls sj://backuppcloud/duplicatibackup/ | wc -l
echo "count number of objects with the bad date"
uplink ls sj://backuppcloud/duplicatibackup/ | grep '0001-01-01 00:17:30' | wc -l
echo "count number of objects with a different date"
uplink ls sj://backuppcloud/duplicatibackup/ | grep -v '0001-01-01 00:17:30' | wc -l
echo "upload test file"
uplink cp test.data sj://backuppcloud/duplicatibackup/test.data
echo "looking for the date"
uplink ls sj://backuppcloud/duplicatibackup/ | grep test.data
echo "clean"
uplink rm sj://backuppcloud/duplicatibackup/test.data
echo "############# end test ###########################"
echo "##################################################"
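(If the listing bug is present, the last grep in the huge-folder section shows test.data with the 0001-01-01 date, while the same upload in the small configuration folder shows a normal timestamp.)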


Thanks for reporting the issue.

Sounds like the bug discovered (and fixed) recently: Filtering metadata doesn't work well with more than one listing object page · Issue #85 · storj/uplink · GitHub. Object listings are returned in pages, which would explain why the problem only shows up once a folder needs more than one listing page (around 1000 objects).
