Digital cinema as a potential Tardigrade use case?

Same here as for Roberto: a single download maxes out at 6.0–6.5 MB/s.
If I start a second download from a different satellite, the second file comes down at the same speed, so 2 × 6.0 MB/s, 12 MB/s in total. When I add a third download from a third satellite, that one maxes out at 2.4 MB/s while the others stay at 6.0 MB/s, so the total is around 14.5 MB/s, which is roughly 120 Mbit/s.
After about 20 seconds the speed starts to fluctuate between the downloads, while the total remains the same.
I did a few speedtest runs against servers in France, the UK and Germany and always got around 300 Mbit/s download speed, so that might be the external limit at my service provider.
Considering this, I believe that neither the single-thread ~6.0 MB/s limit nor the multi-thread 120 Mbit/s limit is coming from my side.
FYI, I get 145/10 Mbit/s to Anchorage… :smiley:

There's no cap per se, unless some storage node decides to impose one.

There are a few things that we know contribute to download performance:

First is part size: if you use standard S3 settings and some S3 library, the defaults are rather low. It's usually better to upload using the uplink CLI to test the best case. Small part sizes carry a larger overall overhead – unfortunately many S3 libraries default to small values.
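As a rough illustration (not official guidance), this is how you would raise the part size and concurrency above the library defaults with the AWS SDK for Go's s3manager; the gateway endpoint, region, bucket and file names below are assumptions/placeholders:

```go
// Sketch only: bump multipart part size and concurrency above the
// s3manager defaults (5 MB parts, 5 concurrent parts).
// Endpoint, region, bucket and file names are placeholders.
package main

import (
	"log"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3/s3manager"
)

func main() {
	// Credentials come from the usual AWS env vars / shared config.
	sess := session.Must(session.NewSession(&aws.Config{
		Endpoint:         aws.String("https://gateway.storjshare.io"), // assumed S3-compatible gateway
		Region:           aws.String("us-east-1"),                     // placeholder
		S3ForcePathStyle: aws.Bool(true),
	}))

	uploader := s3manager.NewUploader(sess, func(u *s3manager.Uploader) {
		u.PartSize = 64 * 1024 * 1024 // 64 MB parts instead of the 5 MB default
		u.Concurrency = 8             // upload 8 parts in parallel
	})

	f, err := os.Open("movie.mxf") // hypothetical file
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	if _, err := uploader.Upload(&s3manager.UploadInput{
		Bucket: aws.String("cinema"), // placeholder bucket
		Key:    aws.String("movie.mxf"),
		Body:   f,
	}); err != nil {
		log.Fatal(err)
	}
}
```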

Second is the handshake overhead of TCP+TLS; we're currently working on moving the communication to something UDP-based, which should improve it.

Third is a larger change to concurrent downloading, to make better use of all the established connections.

4 Likes

Thanks for trying to explain, but we are exploring a real-world use case here: something got uploaded, people are downloading it, and the results are mixed, to say the least, where we would have hoped to see much better – outstanding – performance.

We expect maximum performance without having to tweak anything or worry about whether storage nodes have imposed a cap or what default settings have been used.

2 Likes

We expect maximum performance without having to tweak anything or worry about whether storage nodes have imposed a cap or what default settings have been used.

I completely agree with that statement. We still have work to do on the performance front.

With regard to the note about tweaking S3 client settings, unfortunately we cannot do anything about it other than communicate it to people. Such software targets AWS S3 rather than our service and is outside of our control. The configuration happens entirely on the client side and we aren't able to override it.

Of course, the tweaking isn't necessary for the uplink CLI or the uplink library (or tools based on them).

For large downloads, going through linksharing adds an extra hop to the communication path, and running uplink (or a similar custom tool) is probably going to work better.
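For example, a minimal sketch of a direct download with the uplink Go library could look something like this; the bucket, object key and the ACCESS_GRANT environment variable are placeholders:

```go
// Sketch only: download an object directly via the uplink library,
// without going through linksharing.
package main

import (
	"context"
	"io"
	"log"
	"os"

	"storj.io/uplink"
)

func main() {
	ctx := context.Background()

	// Access grant assumed to be in an environment variable.
	access, err := uplink.ParseAccess(os.Getenv("ACCESS_GRANT"))
	if err != nil {
		log.Fatal(err)
	}

	project, err := uplink.OpenProject(ctx, access)
	if err != nil {
		log.Fatal(err)
	}
	defer project.Close()

	// Stream the object straight to a local file.
	download, err := project.DownloadObject(ctx, "cinema", "movie.mxf", nil)
	if err != nil {
		log.Fatal(err)
	}
	defer download.Close()

	out, err := os.Create("movie.mxf")
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()

	if _, err := io.Copy(out, download); err != nil {
		log.Fatal(err)
	}
}
```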

In case it's not clear: thank you for testing performance and inspecting things; such tests help us understand real-world problems better and find things to improve.

7 Likes

But in this case I uploaded the files with FileZilla directly onto Tardigrade. In that case I don't understand why tweaking would be required for best performance?

Yes, in that case tweaking is not necessary. The comments were more general, i.e. known things that cause slowness.

Of course, but requiring someone you want to share something with to use a CLI is not exactly workable. I know there were some rough ideas about a possible browser-based uplink implementation, but at the time that was just an idea and implementation was far off, as some challenges still needed to be tackled. Has there been progress on this? I think being able to run an uplink in the browser would be the holy grail for link sharing.

The issue with browsers is that they have an upper limit on how many connections you can make, so it's more complicated than we hoped. There are potential solutions like WebRTC, WebTransport or the Direct Sockets API that could help with it – but they require quite a bit of different internal wiring. In other words, we didn't find an easy way to make it work.

There are some other approaches I've thought about that would require less effort:

  1. write a native desktop GUI application that you can use to download, or
  2. a local "uplink proxy" that the browser can communicate with.

The native desktop program could also work similarly to torrent magnet links, where you click a link in the browser and it starts downloading locally. The local uplink proxy would require installing a browser extension and a local program; the proxy part is sketched below.
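As a rough illustration of that second option, assuming the storj.io/uplink library, such a local proxy could look something like this; the port and the /download route are made up for illustration:

```go
// Sketch only: a local HTTP proxy that the browser talks to, while the
// proxy uses the uplink library to fetch pieces from the storage nodes.
package main

import (
	"io"
	"log"
	"net/http"
	"os"

	"storj.io/uplink"
)

func main() {
	// Access grant assumed to be in an environment variable.
	access, err := uplink.ParseAccess(os.Getenv("ACCESS_GRANT"))
	if err != nil {
		log.Fatal(err)
	}

	http.HandleFunc("/download", func(w http.ResponseWriter, r *http.Request) {
		bucket := r.URL.Query().Get("bucket")
		key := r.URL.Query().Get("key")

		// Opening the project per request keeps the sketch simple;
		// a real tool would reuse it.
		project, err := uplink.OpenProject(r.Context(), access)
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		defer project.Close()

		download, err := project.DownloadObject(r.Context(), bucket, key, nil)
		if err != nil {
			http.Error(w, err.Error(), http.StatusNotFound)
			return
		}
		defer download.Close()

		// Stream the object back to the browser over a single local
		// connection, while uplink opens as many node connections as it needs.
		io.Copy(w, download)
	})

	log.Fatal(http.ListenAndServe("localhost:7777", nil))
}
```

A page would then simply point the browser at http://localhost:7777/download?bucket=…&key=…, sidestepping the browser's connection limits.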

Overall, we haven't made progress on this front since we're more focused on the internal performance improvements. That said, both of these should be implementable by third parties using the uplink library.

3 Likes

That makes sense. I appreciate the complexity of dealing with many connections in a browser. Though I know there are browser-based torrent clients that don't require any additional downloads; I would think those run into similar issues. Then again, in most cases they probably don't open as many connections as Storj would.

Desktop clients are an option of course, but they would never be as universally available, and they require people, who are increasingly resistant to it, to install something. What about phones, tablets and Chromebooks?

That said, I do understand this is a hard problem to solve. Have you looked into something like this? GitHub - gopherjs/gopherjs: A compiler from Go to JavaScript for running Go code in a browser
I'm guessing uplink has specific enough requirements that it's probably not that simple, but it may be worth a look.

Though I know there are browser-based torrent clients that don't require any additional downloads; I would think those run into similar issues.

Torrents are replicated, so the data can be fetched over fewer connections. They use WebRTC, which allows more connections than WebSocket / Direct Sockets; however, we haven't investigated it much. It might require some different servers for WebRTC capabilities.

Have you looked into something like this?

Yes, however the connection limit also applies to pure JavaScript code, so using GopherJS / WASM doesn't sidestep it. Also, we already use parts of uplink compiled to WASM on the satellite for client-side access grant generation.

The short answer is: yes, there might be a way to get it working… but we haven't yet figured out exactly how, and we don't have time to implement it in the near future.

5 Likes

Well, it's clear that this has been on your minds, despite it not being a top priority and being very complex. I appreciate you sharing some of the progress and findings though! Thanks. Keep up the great work!

2 Likes

Bye-bye use case:

Game over for this one:

1 Like

Also, I want to add that requirements are going to be completely dependent on the deliverables required by the network/studio. The equipment used for delivery varies so greatly that I'm not even going to attempt to generalize the specs of the equipment used or the time for final outputs.

1 Like

Some quick links found on the Backblaze website. It is really interesting to read how the cloud changes media production, workflows and distribution. The links could be interesting for Storj as well:

4 Likes

Great idea, and thank you for the links. A team of us is currently at the 2023 NAB Show: https://twitter.com/storj/status/1647751134675865600?s=20

11 Likes

Storj was also nominated for NAB Show Product of the Year: https://twitter.com/jggleeson/status/1647697777743523840

7 Likes

I hope it will help attract more video streaming clients.

7 Likes

I hope they were able to visit the

Bayern International c/o Bavarian Pavilion
Viewing Booth:
C6836 - 550 sq. ft. - Central Hall
Bayern International – Competence for International Business

to make some contacts in the German market.

And maybe have some :beers: :beer:

And they won?
https://twitter.com/storj/status/1648506474648465413?cxt=HHwWisDT9fH51eAtAAAA

Storj is proud to announce that we were awarded @NABShow #product of the year in the storage category! We couldn't be more thrilled to bring innovation in cloud storage to the video industry.

Congrats, that's amazing. :partying_face:

10 Likes