[Testers Needed] FileZilla Onboarding Page

  • Special Character Support - This is only true for the bucket name. Directories within the bucket are fine. I created directories with spaces, umlauts, and é.
  • Unable to rename directories - That is indeed a bit sad, but a typical backup solution won’t need this, I guess.
  • No spaces allowed in directory names - This also only applies to the top-level directory, which is the bucket_name. Subdirectories can have spaces.

For easier onboarding I’d suggest creating a default bucket and using it as the root directory if possible. That avoids all bucket-name problems, which will seem quite strange to a normal user who isn’t used to S3 storage.
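The bucket-name restrictions mentioned above (no spaces, no special characters) match typical S3-style naming rules; I'm assuming the Tardigrade/Storj gateway enforces something close to AWS S3's conventions (3-63 characters, lowercase letters, digits, dots, and hyphens, starting and ending alphanumeric). A minimal sketch of a pre-flight check a client could run before creating a bucket:

```python
import re

# Rough S3-style bucket naming rules (AWS S3 flavor; the Tardigrade
# gateway is assumed here to be similar): 3-63 characters, lowercase
# letters, digits, dots and hyphens, starting and ending alphanumeric.
BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    """Return True if `name` looks like a legal S3-style bucket name."""
    return bool(BUCKET_RE.match(name))

print(is_valid_bucket_name("my-backups"))   # accepted
print(is_valid_bucket_name("My Backups"))   # uppercase and space rejected
print(is_valid_bucket_name("café"))         # non-ASCII rejected
```

A default bucket created by the onboarding flow would sidestep this entirely, since the user would never type a bucket name.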


My upload tests were not quite what I expected. It seemed to me that the upload would freeze all the time for 10-30 seconds. However, having experience with the uplink library, I know that the displayed upload speed is erratic. It always hangs for many seconds (not showing any progress or change in upload speed), then jumps to 50 MiB/s, then back to 2 MiB/s on a 5 MiB/s connection. So the upload seems to pause all the time, while in the background the upload actually continues and saturates my upload bandwidth (except between pieces, where there is always a gap).
I know this is an old problem of the uplink library.
But: this behaviour can make people nervous. It’s not what normal users expect, and it looks like freezing, unreliable uploads.
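One way a client UI could hide this kind of burstiness is to smooth the instantaneous throughput samples before displaying them, e.g. with an exponential moving average. This is just an illustration of the idea, not how FileZilla or uplink actually compute the displayed rate; the sample values are hypothetical:

```python
def smooth_speeds(samples, alpha=0.2):
    """Exponentially smooth raw throughput samples so a progress UI
    doesn't jump between 0 and 50 MiB/s. `alpha` controls how quickly
    the displayed value reacts to new samples."""
    smoothed = []
    ema = None
    for s in samples:
        # First sample seeds the average; afterwards blend new and old.
        ema = s if ema is None else alpha * s + (1 - alpha) * ema
        smoothed.append(ema)
    return smoothed

# Hypothetical samples in MiB/s: a long stall, a burst, then steady state.
raw = [0, 0, 0, 50, 2, 5, 5]
print([round(v, 1) for v in smooth_speeds(raw)])
```

The smoothed series never shows the 50 MiB/s spike or the hard zeroes at full magnitude, which is easier on the user even though the underlying transfer behaviour is unchanged.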

Downloading seems significantly better: since I can download at 20 MB/s, the pauses and the erratic display of progress/speed are not as visible. However, the real speed was 15 MB/s.


Same thing here. Uploads appear to be happening in fits and starts. Looks like it stalls but then goes back to normal.
Haven’t tried downloads yet.
5 simultaneous uploads seem to have killed my iMac’s Ethernet link, though. No idea why.

I don’t know why my feedback landed in the other thread: Future changes to Reed Solomon numbers

Perhaps it’s related to your router.
If it’s an Apple Wi-Fi router, those are known to be slow and unreliable with many simultaneous connections.

Upload seems to be stuck.

Doubt it. My router is a UniFi Dream Machine Pro. Mind you, I am using PiHole on a separate machine so I wonder if it’s being flooded with DNS requests…

Is your PiHole blocking outgoing ports?
They could be any of them.

No, PiHole is only a DNS resolver, not a router.

Is this running stably for anyone?
I have an upload hanging again, at 9.1% for half an hour now. :roll_eyes:

Speed: 0 B/s. :nauseated_face:

Another piece of feedback:
Maybe you should explain in the video how to get the latest version of FileZilla.

I am on Ubuntu and I basically run sudo apt update && sudo apt install filezilla.
It seems that the version I got doesn’t have the Tardigrade protocol implemented (3.46.3 instead of 3.51).

I know that the official Ubuntu repo is not in your hands, but maybe you could give some insight about it.

I tried it again with FileZilla 3.51 on Ubuntu 20.04 (last time, I tried on Windows 10). My uploads fail each time.
What I did:
  0. Download FileZilla 3.51 (here) and install it:
    • extracted the archive
    • ran bin/filezilla: it seems to work, but I got a message in the terminal, Failed to load module "canberra-gtk-module". I don’t know if it’s important. I installed the appropriate library and now it’s OK.

  1. Create a new site called “tardigrade-test”
  2. Set up the connector:
    • “europe-west-1.tardigrade.io” as the satellite
    • nothing in the Port field
    • the API key given by the site in the “API Key” field
    • A personal encryption passphrase
  3. Create a new folder: it’s working well.
  4. Upload a very tiny file (an jpg file). The only error messages I got:
    Erreur : Could not open local file
    Erreur : Échec du transfert du fichier (sorry it’s in French. In English, it means “Failed to transfer the file”).
    I also get some messages on the terminal, saying invalid source position for vertical gradient
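“Could not open local file” usually points at the client being unable to read the source path rather than at the transfer itself. A quick sketch of the kind of pre-flight check you could run on the local file before blaming the satellite (the temp-file path in the demo is just a throwaway, not a real transfer source):

```python
import os
import tempfile

def check_local_file(path):
    """Return None if `path` is a readable regular file, otherwise a
    human-readable reason, mirroring the situations that typically
    produce a 'Could not open local file' error: missing file,
    wrong kind of path, or bad permissions."""
    if not os.path.exists(path):
        return "path does not exist"
    if not os.path.isfile(path):
        return "not a regular file"
    if not os.access(path, os.R_OK):
        return "no read permission"
    return None

# Demo with a throwaway file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello")
print(check_local_file(f.name))          # None, i.e. OK to upload
print(check_local_file(f.name + ".x"))   # path does not exist
os.unlink(f.name)
```

The gtk-module and gradient messages, by contrast, are GUI warnings and almost certainly unrelated to the failed transfer.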

I haven’t seen this issue yet. Are you blocking outgoing ports? This uses the same process as uplink.

There was a link to the FileZilla website to install FileZilla on the signup page…

If you’re blocking any outgoing ports, it will fail. This acts just like uplink, so you can’t block any outgoing ports.

This is NOT production ready.

I can now see 25 successful file transmissions. However, 1 file has not finished for HOURS, and it keeps restarting over and over again.
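A transfer that restarts over and over indefinitely is arguably worse than one that fails fast. A common pattern a client could use instead is a bounded retry with exponential backoff; this is only an illustration of that pattern, with a stand-in `flaky_upload` function, not FileZilla’s or uplink’s actual retry logic:

```python
import time

def upload_with_retries(upload, max_attempts=5, base_delay=1.0):
    """Call `upload()` until it succeeds or `max_attempts` is reached,
    sleeping base_delay * 2**attempt (capped at 30 s) between tries.
    Re-raises the last error instead of retrying forever."""
    for attempt in range(max_attempts):
        try:
            return upload()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            time.sleep(min(base_delay * 2 ** attempt, 30))

# Stand-in for a flaky transfer: fails twice, then succeeds.
attempts = {"n": 0}
def flaky_upload():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise IOError("piece upload timed out")
    return "done"

print(upload_with_retries(flaky_upload, base_delay=0.01))
```

With a cap like this, the stuck file above would have surfaced a hard error within minutes instead of silently looping for hours.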


I did notice something similar with larger files: they take much, much longer to transfer. So I just split the files with rar into 50 MB per file, and it goes much faster. I don’t think it can handle 4 GB files very well compared to small files. I don’t think it has anything to do with being production ready; these are just the limitations of the uplink on the backend…
I tested with the files split up

And the single large file

See, if you split the files, you can upload much faster than a single large file.
But if your use case is uploading a video file to stream it back, then this really isn’t the right fit.
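The rar-based splitting described above can also be done with a few lines of plain Python, if you don’t want an archiver in the loop. A sketch of chunking a file into fixed-size parts (the demo uses a tiny 10-byte chunk size so it runs instantly; in practice you’d use something like the 50 MB mentioned above):

```python
import os

def split_file(path, chunk_size):
    """Split `path` into path.part000, path.part001, ... of at most
    `chunk_size` bytes each. Returns the list of part paths."""
    parts = []
    with open(path, "rb") as src:
        i = 0
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            part = f"{path}.part{i:03d}"
            with open(part, "wb") as dst:
                dst.write(chunk)
            parts.append(part)
            i += 1
    return parts

# Demo: a 25-byte file split into 10-byte parts -> 3 parts (10 + 10 + 5).
with open("demo.bin", "wb") as f:
    f.write(os.urandom(25))
print(split_file("demo.bin", 10))
```

Reassembly is just concatenating the parts back in order (`cat demo.bin.part* > demo.bin` on Linux), which is one reason fixed-size splitting is a convenient workaround.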



I did try to upload 100 files of 1 GB each, with a very low success rate so far (it’s been running for 12 hours now).

6 files have been uploaded successfully. 10 are currently being uploaded, but with very slow progress.

I have many errors similar to the screenshot below that have affected the successful/current uploads at some point.


Hope this helps.

It was only a 700 MB file.
This should be handled very well.

No, sorry, this is not production ready if it does not work. 700 MB over FTP is never a problem.

I have already uploaded almost 1 TB. My files are 100 GB archives. Well, max speed is capped at 6 MB/s, which is pretty slow nowadays. As feedback: this has to be fixed to allow higher speeds. I take what is free, obviously :slight_smile: I am also willing to upload more for testing if wanted.

Besides the speed, everything works well!

Yes, you’re right. My mistake.

I don’t block any outgoing ports. I tried to upload a file using uplink and the given API key with the same encryption passphrase, and it works.
But I can’t see the bucket I created with FileZilla.

That can happen if you use different accounts or projects. Each of them has its own API keys.