Extract Archives Remotely

Can I extract archives in buckets remotely?
I could send the files without an archive, but that consumes bandwidth (too many small files!).

Is there a command like extractFromArchive("path/to/archive", ".../destinationfolder")?
And ideally another command like compressToArchive(".../targetFolder", "path/to/archive"[, overwrite]).

Sorry if this is off topic, but I’m interested in making a GUI application that uploads larger files and extracts them!

With an old version of uplink you could do something like this:

tar zcv * | uplink put sj://bucket/file.tar.gz
and
uplink get sj://bucket/file.tar.gz | tar zxv

However, it seems that at some point uplink lost the ability to upload from stdin. Now I can only think of some “hacky” ways to avoid having the entire archive on local disk.
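If a small helper program is acceptable instead of the CLI, here is a minimal sketch (assuming the storj.io/uplink Go bindings and a serialized access grant passed on the command line; the program name is hypothetical) that streams stdin straight into an object, so the archive never has to exist on local disk:

```go
// stream-put.go — a minimal sketch (not an official tool) that streams stdin
// into a Storj object via the storj.io/uplink Go bindings, as a workaround
// for the CLI no longer accepting uploads from stdin.
//
// Assumed usage (hypothetical name):
//   tar zcv * | ./stream-put <access-grant> <bucket> file.tar.gz
package main

import (
	"context"
	"io"
	"log"
	"os"

	"storj.io/uplink"
)

func main() {
	if len(os.Args) != 4 {
		log.Fatal("usage: stream-put <access-grant> <bucket> <key>")
	}
	accessGrant, bucket, key := os.Args[1], os.Args[2], os.Args[3]
	ctx := context.Background()

	// Parse the serialized access grant (satellite address, API key and
	// encryption passphrase).
	access, err := uplink.ParseAccess(accessGrant)
	if err != nil {
		log.Fatalf("parse access: %v", err)
	}

	project, err := uplink.OpenProject(ctx, access)
	if err != nil {
		log.Fatalf("open project: %v", err)
	}
	defer project.Close()

	// Start the upload; the returned Upload implements io.Writer, so we can
	// copy stdin into it without buffering the whole archive locally.
	upload, err := project.UploadObject(ctx, bucket, key, nil)
	if err != nil {
		log.Fatalf("start upload: %v", err)
	}

	if _, err := io.Copy(upload, os.Stdin); err != nil {
		_ = upload.Abort()
		log.Fatalf("copy: %v", err)
	}

	if err := upload.Commit(); err != nil {
		log.Fatalf("commit: %v", err)
	}
}
```

Because UploadObject returns a writer, io.Copy streams the piped tar output in chunks, which is the same effect the old `uplink put` pipe had.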


Yes, but this uploads the archive, and it stays archived in the cloud.

What I want is to extract it in the cloud, like this:

Locally I have:

  • …/folder/file1
  • …/folder/file2
  • …/folder/…
  • …/file50
  • …/file…

The program archives it into “aaa.zip” and sends it to the cloud,

and then in Storj I run a command to extract it…

After extracting, I need to have this in Storj:

  • …/folder/file1
  • …/folder/file2
  • …/folder/…
  • …/file50
  • …/file…

not:

  • …/aaa.zip

One of the features of Storj is that your data is encrypted and the company does not have access to it. Extracting server-side would require Storj to have the encryption key to your data.
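As a client-side alternative, a GUI could produce exactly that layout without ever creating “aaa.zip”: walk the local folder and upload each file under its relative path. This is only a rough sketch assuming the storj.io/uplink Go bindings, an access grant in an assumed UPLINK_ACCESS environment variable, and example bucket/folder names; also keep in mind the per-segment fee caveat mentioned further down, since many small objects mean many segments.

```go
// upload-tree.go — a hypothetical helper (not part of uplink) that recreates
// a local directory tree in a bucket by uploading each file under its
// relative path as the object key.
package main

import (
	"context"
	"io"
	"io/fs"
	"log"
	"os"
	"path/filepath"

	"storj.io/uplink"
)

// uploadTree walks localRoot and uploads every regular file, so that
// folder/file1 on disk becomes the object key "folder/file1" in the bucket.
func uploadTree(ctx context.Context, project *uplink.Project, bucket, localRoot string) error {
	return filepath.WalkDir(localRoot, func(path string, d fs.DirEntry, err error) error {
		if err != nil || d.IsDir() {
			return err
		}
		rel, err := filepath.Rel(localRoot, path)
		if err != nil {
			return err
		}
		key := filepath.ToSlash(rel) // object keys use forward slashes

		f, err := os.Open(path)
		if err != nil {
			return err
		}
		defer f.Close()

		upload, err := project.UploadObject(ctx, bucket, key, nil)
		if err != nil {
			return err
		}
		if _, err := io.Copy(upload, f); err != nil {
			_ = upload.Abort()
			return err
		}
		return upload.Commit()
	})
}

func main() {
	ctx := context.Background()

	// Assumption: the serialized access grant is provided via UPLINK_ACCESS.
	access, err := uplink.ParseAccess(os.Getenv("UPLINK_ACCESS"))
	if err != nil {
		log.Fatalf("parse access: %v", err)
	}
	project, err := uplink.OpenProject(ctx, access)
	if err != nil {
		log.Fatalf("open project: %v", err)
	}
	defer project.Close()

	// Example bucket and folder names.
	if err := uploadTree(ctx, project, "my-bucket", "./folder"); err != nil {
		log.Fatalf("upload: %v", err)
	}
}
```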


Oh yes, I can give the command the passphrase argument.

There is an issue in the public roadmap. Maybe that is what you are looking for?
Unfortunately, that means the implementation has not been finished (yet):


It’s better not to do that: Per Segment Fee Calculation