Deleted files appear as locked objects

This error occurred while transferring a few GB from another cloud to Storj using Rclone.

To analyze it better, I created a new empty bucket and only copied 2 files. When you delete one of the files, it “becomes” a locked object:



The above files were transferred to Storj using Rclone, but the file was deleted using the web interface.

Any ideas?


Adding: if I copy the files again from the other cloud, one by one, using Rclone, they “cease to be” locked objects. The images below are in chronological order:



+1. I have seen deleted files showing up “locked with different passphrase” occasionally as well, but could not reliably reproduce.

Partial uploads can be displayed as locked objects too.
Please check with:

uplink ls --partial --recursive sj://demo-bucket
uplink ls --recursive sj://demo-bucket | wc -l

uplink ls --recursive --encrypted sj://demo-bucket | wc -l


uplink ls --partial --recursive sj://demo-bucket
    argument error: unknown flag: "--partial"

@TowerBR You can try --pending instead of --partial. The latter flag could be from an older version of the uplink CLI.


It worked:

uplink ls --pending --recursive sj://demo-bucket
KIND    CREATED                SIZE    KEY
OBJ     2023-10-02 18:01:54    0       Enigma.mp4

So the “3 objects” from the previous command are the 2 files + this one with zero size?

Yes, this is a pending multipart upload (Understanding Multipart Upload - Storj Docs). rclone uses the multipart upload interface to upload files, as it’s often more reliable than a single-call upload. It’s possible that rclone didn’t clear a failed upload, for example, and that’s why you are seeing this. You can try removing it manually with the uplink rm command and the --pending flag.
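As a minimal sketch of that cleanup (the bucket name and the `ls --pending` output format are taken from this thread; the uplink output is simulated here so the parsing is reproducible without a live bucket), you could extract the pending keys and build the matching `rm` commands like this:

```shell
# Simulated output of `uplink ls --pending --recursive sj://demo-bucket`
pending='KIND    CREATED                SIZE    KEY
OBJ     2023-10-02 18:01:54    0       Enigma.mp4'

# Skip the header line, then take the last column (the object key)
keys=$(printf '%s\n' "$pending" | tail -n +2 | awk '{print $NF}')

# Print (rather than run) the removal command for each pending key
for key in $keys; do
    echo "uplink rm --pending sj://demo-bucket/$key"
done
```

Note this naive column split breaks on keys containing spaces; for real buckets, checking each key manually before removing is safer.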


In fact, AFAIR, rclone’s default number of simultaneous transfers is 4.

I will remove the partial file and redo the tests with --transfers 1.


I ran the command to remove:

uplink rm --pending --recursive --access all sj://demo-bucket/Enigma.mp4
removed sj://demo-bucket/Enigma.mp4

but it still shows:

uplink ls --recursive sj://demo-bucket | wc -l
uplink ls --recursive --encrypted sj://demo-bucket | wc -l

Shouldn’t it show “2 objects”?

wc -l counts all output lines, including the header line:

 uplink ls --recursive sj://demo-bucket
KIND    CREATED                SIZE         KEY

You can use -o json; it prints your objects without a header, so wc -l will give the expected count of 2:

uplink ls --recursive -o json sj://demo-bucket | wc -l
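If your uplink version doesn’t support -o json, skipping the header with tail works too. A sketch with simulated ls output (the two file names match this thread; the sizes are made up):

```shell
# Simulated output of `uplink ls --recursive sj://demo-bucket`
printf '%s\n' \
  'KIND    CREATED                SIZE         KEY' \
  'OBJ     2023-10-02 17:59:10    1048576      Enigma.mp4' \
  'OBJ     2023-10-02 17:59:12    524288       report.pdf' > /tmp/ls_output.txt

wc -l < /tmp/ls_output.txt              # 3: the header is counted too
tail -n +2 /tmp/ls_output.txt | wc -l   # 2: objects only
```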


uplink ls --recursive -o json sj://demo-bucket | wc -l

But when I delete the PDF using the web interface:

uplink ls --pending --recursive sj://demo-bucket
uplink ls --recursive -o json sj://demo-bucket | wc -l

After ~1h:

What is strange is that the “object locked” message appears immediately after deleting the file. Is it some kind of cache? (No, it’s not my browser; I tested in 3 different ones.)

I will resume testing.

After several tests, both with the new test bucket and with the files that originally led me to discover this bug (?), I give up. I’m just going to trust what uplink and rclone tell me; the web interface definitely has something fishy about it.

I copied exactly 294 files, and confirmed this number with rclone, with uplink, with direct mounting via Mountain Duck, and with the web interface itself:

uplink ls --recursive -o json --access all sj://bucket_name | wc -l


I made sure, using uplink, that there were no “ghost files”:

uplink ls --pending --recursive -o json --access all sj://bucket_name | wc -l

Still, the web interface shows 231 locked files:


Note: the copy of these files was made yesterday.

Even though this was done yesterday with rclone’s default number of simultaneous transfers (4), which may have created the multipart upload issue mentioned above, why does the ls --pending command show zero results? Is the command returning the wrong result, or is it the web interface? IMHO it’s the web interface. Therefore, I will continue to use only command-line access.


I think it’s a bug in the UI, a weird one, I agree.