[Testers Needed] Filezilla Onboarding Page

Perhaps it’s related to your router.
If it’s an Apple Wi-Fi router, those are known to be slow and unreliable for simultaneous connections.

:weary:
Upload seems to be stuck.

Doubt it. My router is a UniFi Dream Machine Pro. Mind you, I am using PiHole on a separate machine so I wonder if it’s being flooded with DNS requests…

Is your PiHole blocking outgoing ports?
They could be any of them.

No, PiHole is only a DNS resolver, not a router.

Is this running stable for anyone?
I have an upload hanging again, at 9.1% for half an hour now. :roll_eyes:

Speed: 0 B/s. :nauseated_face:

Another piece of feedback:
Maybe you should explain in the video how to get the latest version of FileZilla.

I am on Ubuntu and I basically ran sudo apt update && sudo apt install filezilla.
It seems that the version I got (3.46.3 instead of 3.51) doesn’t have the Tardigrade protocol implemented.

I know that the official Ubuntu repo is not in your hands, but maybe you could give some insight about it.
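For anyone hitting the same version gap, here is a quick way to check whether the packaged build predates Tardigrade support (a sketch; `3.46.3` is a stand-in for whatever version your package manager actually installed):

```shell
# Storj/Tardigrade support landed in FileZilla 3.51 (per this thread),
# so compare the packaged version against that.
required="3.51"
installed="3.46.3"   # stand-in: substitute the version you actually got
oldest=$(printf '%s\n%s\n' "$installed" "$required" | sort -V | head -n1)
if [ "$oldest" = "$installed" ] && [ "$installed" != "$required" ]; then
  echo "FileZilla $installed predates $required: download the official build instead"
fi
```

`sort -V` does a proper version-aware comparison, so this also works for versions like 3.9 vs 3.10.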

[EDIT]
I tried it again with FileZilla 3.51 on Ubuntu 20.04 (last time, I tried on Windows 10). My uploads fail each time.
What I did:
0. Downloaded FileZilla 3.51 (here) and installed it:
  - extracted the archive
  - ran bin/filezilla: it seems to work, but I got a message in the terminal, Failed to load module "canberra-gtk-module". I don’t know if it’s important. I installed the appropriate library and now it’s OK.

  1. Created a new site called “tardigrade-test”
  2. Set up the connector:
    • “europe-west-1.tardigrade.io” as the satellite
    • nothing in the Port field
    • the API key given by the site in the “API Key” field
    • a personal encryption passphrase
  3. Created a new folder: that works well.
  4. Uploaded a very tiny file (a JPG file). The only error messages I got:
    Error: Could not open local file
    Error: File transfer failed (my client is in French; the original message was “Échec du transfert du fichier”).
    I also get some messages in the terminal saying invalid source position for vertical gradient

I haven’t seen this issue yet. Are you blocking outgoing ports? This uses the same process as uplink.

There was a link to the FileZilla website on the signup page for installing FileZilla…

If you block any outgoing ports it will fail. This acts just like the uplink, so you can’t block any outgoing ports.

This is NOT production ready.

I can now see 25 successful file transmissions. However, 1 file has not finished for HOURS and it keeps restarting over and over again.

THIS IS NOT PRODUCTION READY.

I did notice something similar with larger files: they take much, much longer to transfer. So I just split the files with rar into 50 MB per file and it goes much faster. I don’t think it can handle 4 GB files very well vs. small files. I don’t think it has anything to do with being production ready; this is just a limitation of the uplink on the backend…
I tested with the files split down

And with the single large file


See, if you split the files you can upload much faster than a single large file.
But if your use case is uploading a video file to stream it back, then this really isn’t the use case it handles well.
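The split-before-upload workaround described above can be reproduced with plain coreutils instead of rar. This is a local sketch (the file names are made up), including the reassembly step you would run after downloading the parts:

```shell
# Stand-in for a real multi-GB archive
head -c 5M /dev/urandom > archive.bin
# Split into 1 MB chunks (the thread used 50 MB parts for multi-GB files)
split -b 1M -d archive.bin archive.bin.part
# ...upload archive.bin.part* instead of the single large file...
# After downloading the parts, reassemble and verify integrity
cat archive.bin.part* > restored.bin
cmp -s archive.bin restored.bin && echo "reassembled copy is identical"
```

Unlike rar, `split` does no re-archiving, so reassembly is a plain `cat` of the numbered parts in order.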


Hello,

I did try to upload 100 files 1GB each, with a very low success rate so far (it’s been running for 12 hours now).

6 files have been successfully uploaded. 10 are currently being uploaded however with very slow progress.

I have many errors similar to the screenshot below that have affected the successful/current uploads at some point.

[screenshot of the upload errors]

Hope this helps.

It was only a 700 MB file.
This should be handled very well.

No, sorry, this is not production ready if it does not work. 700 MB over FTP is never a problem.

I have already uploaded almost 1 TB. My files are 100 GB archives. However, max speed is capped at 6 MB/s, which is pretty slow nowadays. As feedback: this has to be fixed to allow higher speeds. I take what is free, obviously :slight_smile: I am also willing to upload more for testing if wanted.

Besides the speed, everything works well!

Yes, you’re right. My mistake.

I don’t block any outgoing ports. I tried to upload a file using uplink with the given API key and the same encryption passphrase, and it works.
But I can’t see the bucket I created with FileZilla.

That could happen if you use different accounts or projects. Each of them has its own API keys.

I used the same API key and the same encryption key.

Then it’s weird. I have seen this when the uplink is configured with a restrictive access grant: you have to specify the full path to see the content.
Can you try to specify a bucket?
Like

./uplink ls sj://my_hidden_bucket

Well, one thing: you will be disappointed every time you compare this to FTP. For one thing, FTP doesn’t encrypt the files and split them into pieces.
FTP is a pretty mature protocol compared to Storj.

I do agree it would be nice to be able to upload bigger files faster, but like I said before, the way the uplink works is the limiting factor; so for production it works well, not great.
I tested a 700 MB file and it worked much better than a 5 GB file. The 700 MB file took around 5 minutes to upload.


Upload speeds were disappointing for me today too. I have 5 MB/s upload bandwidth, but the last 2.7 GB file uploaded at an average of only 1.5 MB/s, which is quite lame. This seems to be due to the pauses between chunks, because during the actual transfer my bandwidth is saturated.
So it’s more or less OK if you’re aware of that problem.
A future enhancement for Storj should be to upload pieces in parallel so there is no gap between pieces. But I understand that this is rather complicated.
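Until the client uploads pieces in parallel on its own, the gaps can at least be hidden by running several whole-file uploads at once. A sketch of the client-side pattern with `xargs -P`; `upload_one` here is a hypothetical stand-in for whatever uploader you use (e.g. an `uplink cp` call — an assumption, not something from this thread), since a real transfer can’t run in a snippet:

```shell
#!/usr/bin/env bash
# Hypothetical uploader: replace the body with your real client call.
upload_one() {
  sleep 0.1          # simulate network time
  echo "uploaded $1"
}
export -f upload_one

# Run up to 4 uploads concurrently, one file per invocation
printf '%s\n' part00 part01 part02 part03 part04 |
  xargs -n1 -P4 bash -c 'upload_one "$0"'
```

With `-P4`, while one transfer sits in an inter-chunk pause, the other three keep the pipe full.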