"cannot delete the bucket because it's being used by another process"

I'm trying to delete a bucket. It is supposedly empty, but that shouldn't matter anyway; I'm getting this reply:

c:\Users\dell\Desktop>uplink rb --force sj://temp
Bucket temp has NOT been deleted
uplink: bucket: metaclient: cannot delete the bucket because it's being used by another process
Error: uplink: bucket: metaclient: cannot delete the bucket because it's being used by another process
c:\Users\dell\Desktop>uplink ls
BKT 2021-09-22 17:21:43 temp
c:\Users\dell\Desktop>uplink ls sj://temp
c:\Users\dell\Desktop>
c:\Users\dell\Desktop>uplink ls --encrypted sj://temp
c:\Users\dell\Desktop>
c:\Users\dell\Desktop>uplink version
Release build
Version: v1.34.3

What am I doing wrong? How can I delete it?

Can you please try uplink ls sj://temp --pending?

Blank reply:

C:\Users\dell\Desktop>uplink ls sj://temp --pending

C:\Users\dell\Desktop>

Just to make sure, I tried all the other commands above again, with the same results as before.

Btw., on the web interface it shows: (screenshot)

Thanks for the tip; however, I don't use Telegram. I expect Storj will reply here…
If it is a glitch in Storj's systems, it's good to let them know so they can investigate.

Hi @xsys I’ve forwarded your question to the team who should be able to help with this issue.

Could you please update the uplink and try again?

By the way, do you have an rclone mount somewhere for that bucket? Or s3fs?

I do not use rclone, S3, or any other app.
I updated uplink; the result is the same.

c:\Users\dell\Desktop>uplink rb --force sj://temp
Bucket temp has NOT been deleted
uplink: bucket: metaclient: cannot delete the bucket because it's being used by another process
Error: uplink: bucket: metaclient: cannot delete the bucket because it's being used by another process
c:\Users\dell\Desktop>uplink version
Release build
Version: v1.40.4
Build timestamp: 06 Oct 21 11:19 CEST
Git commit: 9cb33b0ab79b044c6da6b9d38b2dde6440dad781
c:\Users\dell\Desktop>

I can give you full access if it helps; the account is for testing only and empty anyway.
This is actually the result of my playing with the tools: I created various access grants, buckets, etc., and later removed everything in random order. I guess I broke it :wink:

These are interesting results; I have never seen such an error before.

Hi xsys, I'm trying to reproduce it. When you played with permissions, did you change the access grant duration?

Nope, I always left the defaults here


and I chose "Continue in browser".

It will be hard to reproduce. I have played with it many times, on eu1 and us2, and it never happened before, and I don't remember exactly which steps I performed with this folder before it got locked…

It seems this error means the bucket is not empty after the objects are deleted.
That's actually weird, because --force should remove any existing objects.

I also tried to reproduce it by uploading files with two different encryption secrets, but for me it worked well: --force deleted everything from the bucket…

The only case where such an error can occur is, as Andrii said, when your bucket is not empty. Try to delete everything from it. It's an interesting bug; I'll try to figure out how to prevent it.

Yes, I deleted all the content and it shows as empty, but the web interface says 0.00 GB and 725 objects; not sure what that means, though.

When trying to delete it on the web, instead of the usual "Bucket not empty" message, in this case it says "Internal error":

Please try running uplink ls sj://bucket --recursive and uplink ls sj://bucket --recursive --pending.

In both cases, an empty reply:

c:\Users\dell\Desktop>uplink ls sj://temp --recursive

c:\Users\dell\Desktop>uplink ls sj://temp --recursive --pending

Update:
I created a new access grant with the same passphrase, then tried to create the folder "testing new folder" and upload the file "apache-maven-3.3.9-bin.zip", to see if it even works.
The newly created folder and file show up correctly.
Interestingly, the results of the commands above have changed since then:
c:\Users\dell\Desktop>uplink ls sj://temp --recursive
OBJ 2021-10-12 16:49:24 0 testing new folder/.file_placeholder
OBJ 2021-10-12 16:49:41 8617253 apache-maven-3.3.9-bin.zip
c:\Users\dell\Desktop>
c:\Users\dell\Desktop>uplink ls sj://temp --recursive --pending
OBJ 2021-09-23 21:37:18 0 sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/openssl/LICENSE
OBJ 2021-09-23 21:44:43 0 sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/openssl/crypto/des/COPYRIGHT
OBJ 2021-09-23 21:44:42 0 sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/openssl/crypto/bf/COPYRIGHT
OBJ 2021-09-23 21:37:31 0 sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/sysvinit/COPYRIGHT
OBJ 2021-09-23 21:36:35 0 sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/dhcp/LICENSE
OBJ 2021-09-23 21:36:51 0 sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/iptables/COPYING
OBJ 2021-09-23 22:05:10 0 sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/linux/arch/sparc/lib/COPYING.LIB
OBJ 2021-09-23 21:44:42 0 sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/linux/fs/jffs2/LICENCE
—cut—
There are over 100 more rows; it looks like residue from one of my previous tests. How can I flush that?

Two ways:

  1. One by one:
uplink rm --pending "sj://temp/sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/iptables/COPYING"
  2. The whole bucket:
uplink rb --force sj://temp

(perhaps it would work this time)

If you are forced to use the "one by one" method, you can use a one-liner:

just add --pending to uplink ls and to uplink rm, and use your own bucket and prefix ("folder") name.
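A sketch of such a loop, assuming a POSIX shell (on Windows it could run under Git Bash or WSL) and using the bucket name from this thread. It relies on the uplink ls output format visible above ("OBJ <date> <time> <size> <key>"), so everything from the fifth space-separated field onward is taken as the object key:

```shell
# Remove every pending (unfinished) upload in the bucket, one by one.
# Listing lines look like: OBJ 2021-09-23 21:37:18 0 sw/.../COPYING
uplink ls "sj://temp/" --recursive --pending | while IFS= read -r line; do
  # The key starts at the 5th space-separated field; keys may contain
  # spaces, so keep field 5 through the end of the line.
  key=$(printf '%s\n' "$line" | cut -d' ' -f5-)
  uplink rm --pending "sj://temp/$key"
done
```

The exact column layout of uplink ls output may differ between versions, so double-check it against a plain listing before piping keys into uplink rm.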

This worked for the individual file: uplink rm --pending "sj://temp/sw/VIDEO UTILITY/Panasonic HD writer AE 5.0/OSS LICENSE (AE)/HC-V520M,V520,V510/iptables/COPYING".
Interestingly, uplink rb --force sj://temp did not work yesterday, but today it worked, although I didn't do anything else.
So it is finally resolved…