How to send directories and sub-directories directly to Storj?

I just received my tardigrade invite!
I tried to upload a huge file (a 1.6 GB tar archive) and it seemed to work. I haven't tried downloading it yet, but I will test that with a smaller file.
But I didn't find how to send a whole directory in one command. I would like to create a script to back up a directory and all its files and sub-directories.

Is it possible so far?

Thank you!

Hello @jeremyfritzen! There currently isn't a command for recursive copy, but there will be in the future.

The developers suggest for now (if you're on Linux) that you could write a script to list the directory and then iterate over that list, or write a very small Go application using libuplink :slight_smile:
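That list-and-iterate idea can be sketched as a small POSIX shell script. This is only a sketch, not official tooling: it assumes `uplink` is already configured and the destination bucket exists, and the bucket name and the `UPLINK_CMD=echo` dry-run switch are my own illustrative additions.

```shell
# Sketch of a recursive upload for the current uplink CLI (no built-in recursion yet).
# upload_dir <local-dir> <bucket-prefix>
# Assumptions: uplink is already configured and the bucket exists.
# Set UPLINK_CMD=echo first to dry-run (print the commands instead of running them).
upload_dir() {
    src="$1"
    bucket="$2"
    # List every regular file under the source directory and copy it one by one,
    # preserving the relative path so the bucket mirrors the local layout.
    find "$src" -type f | while IFS= read -r file; do
        rel="${file#"$src"/}"
        "${UPLINK_CMD:-uplink}" cp "$file" "$bucket/$rel"
    done
}

# Example (hypothetical bucket name):
#   upload_dir /home/me/backup sj://mybucket
```

One caveat with this approach: each file is a separate `uplink cp` invocation, so a directory with many small files will be slow compared to a future native recursive copy.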


Thanks @Dylan !
I will work on it if I have some time in the coming days. If someone already did this kind of thing, it would be very nice to share it :smiley:

Hey @Dylan, it’s been a while since I played with a test storj-sim network and uplink, but I remember there being an option to mount storj to a path. Would that not also be a solution here?

Yes the mount command should work.

Another option is a local gateway, and then use the AWS CLI for uploads and downloads. The AWS CLI has two commands that will work in this situation:

aws s3 cp localfolder s3://storjbucket/ --recursive
aws s3 sync localfolder s3://storjbucket/

If you just want to upload files I would recommend the first command. The sync command has a different focus and you might not want that behavior.

Also useful is aws s3 rb s3://storjbucket/ --force, which deletes all files and the bucket. With the uplink CLI this would be a pain :slight_smile: (we will improve it over time)
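For reference, the gateway plus AWS CLI combination could look like the following. This is a sketch with assumptions: the gateway is already configured (via `gateway setup`), it listens on its default local endpoint `http://localhost:7777`, and the bucket name and folder are placeholders.

```shell
# Start the local S3 gateway in the background (assumes prior `gateway setup`).
gateway run &

# Point the AWS CLI at the local gateway and copy a directory recursively.
# localfolder and storjbucket are placeholder names.
aws s3 cp ./localfolder s3://storjbucket/ --recursive \
    --endpoint-url http://localhost:7777
```

Since everything here runs on your own machine, the traffic goes straight from your computer to the storage nodes, which is what avoids the double-charging scenario discussed below.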


I thought the mount command would only mount the Storj bucket in read-only mode (based on the official documentation)
Did I miss something? Or is the documentation already obsolete?


Hi @jeremyfritzen! Unfortunately yes, the documentation for the mount command is outdated, and mount was recently removed as an uplink command. For now, I would follow @littleskunk's advice using the local gateway and the aws s3 cp command for recursive uploading or downloading of your files. Hope this helps!


Using the AWS API could be good. But I am concerned about this warning in the Tardigrade documentation:

A download can become a chargeable event for 2 times the actual file size if the gateway is running on another cloud provider. We recommend interfacing with the network directly through the Uplink Library.

I don’t understand this warning.
Can someone help me to understand why I could be charged 2 times the actual file size?


Usually cloud providers charge for outgoing traffic. Data goes from the AWS CLI to the S3 gateway, then to the storage nodes. So when data is transferred from the gateway to the storage nodes, it leaves the datacenter and the cloud provider will charge you for that traffic.
If your AWS CLI (or libraries) are in another datacenter, the data transfer from your AWS CLI will be treated as outgoing traffic too, so the cloud provider will charge you for that traffic as well.
So you would pay twice for outgoing traffic if you used two different cloud providers for the AWS CLI (library) and the S3 gateway.
You will not have this problem if you use uplink or libuplink - there is no gateway, and you can transfer data directly to the storage nodes.


Ok, I understand, thanks.
But if I install the S3 gateway and the AWS CLI on my local computer, then I should only be charged normally, right?
I mean that with this kind of architecture, nothing is running on AWS datacenters and data transfers only happen between my local computer and Storj nodes, is that right?

Thank you for your help!


If you use your local computer, you will not be charged at all for that. The double-charging case only applies to an S3 gateway hosted in the cloud.
In that case you are charged by the cloud provider when data is transferred from the gateway to the storage nodes, and when you download data you are charged again: by the cloud provider for downloading from the S3 gateway, and by the Storj network for the download.
So I would not recommend such a configuration. It is better when both components - the AWS CLI and the S3 gateway - are on the local computer. In that case, bandwidth usage is charged only by your ISP (if your ISP charges for traffic) and by the Storj network when you download your data back. Uploads are free.
At the moment, your Tardigrade account includes 25 GB of storage and 25 GB of downloads for free. Uploads are also free.