Error uploading file using CLI or S3 gateway

I cannot upload this particular video I took on my phone for love nor money. I tried using the S3 gateway and the CLI. I've been able to upload other files. The error is:

I've also attached the file if anyone wants to examine it. There's nothing special about it, just a video recorded on my iPhone.

Have you tried using the GUI gateway to upload your video instead?

what is the GUI gateway please?


This isn't a GUI? It's the S3 gateway? Yes, as per the title, I've tried this method and the CLI uplink.

It's more GUI than CLI, so I just call it GUI since you can upload directly. I also did some tests; it seems sluggish when trying to upload files.

The page literally has CLI all over it and there is no graphical interface anywhere, so using GUI is really confusing.

Unfortunately I don’t know what might cause this error, but I doubt using the S3 gateway will help, as it would just add an abstraction layer in between.

The web interface is a GUI to me. If I'm not using a command to upload files, then it's a GUI.

Are we looking at the same page? I see no web interface or even a mention of one and a ton of CLI commands.

Regardless I didn’t mean to derail this topic. So apologies and please ignore my previous post.

Well, I thought that when people use the S3 gateway they use the web interface. Maybe I misunderstand.

Yes, it can be used via a web interface too, but its main purpose is to let any S3-compatible client (there are a bunch of them) interact with the Tardigrade network.
It doesn’t help to solve the problem from the description.

Just wanted to rule every possibility out, i.e. whether everything gets the same error or not.

@stefanbenten I’ve done further testing and keep coming across certain files I can’t upload. Would be really grateful if someone could comment on this please. I've provided the error and the actual file in the original post above.

Hi Will,

I would like to test this out with your file myself, but your link above is redirecting to a 404. Could you make a new one?

Another question: I assume the error you get for this file is the same each time you upload:

metainfo error: value changed: ...*Client).Batch:1118*streamStore).upload:239*streamStore).Put:113*shimStore).Put:50*Group).Go.func1:57

Is the data printed after the value changed: (e.g. "ac110.../s0/testbucket/\x02(\xa9\x9a....\xf3\xd1") consistent? Knowing this could help narrow down the issue, but it’s cool if you don’t know. In any case, if I get the file from you I can try to help.

Hey @moby

Thanks for the response. I've uploaded the file again here

Not sure on the technical questions you asked though :slight_smile:


Would you mind uploading to a different file path and letting me know whether it succeeded?
So instead of running
./uplink_windows_amd64.exe cp C:/Test/IMG_1016.MP4 sj://testbucket
run something like
./uplink_windows_amd64.exe cp C:/Test/IMG_1016.MP4 sj://testbucket/phonemovie.mp4

@moby that worked…

@moby I just re-ran the original command and it fails…

Ok, the reason for this issue makes sense to me now. Basically, during the first upload, the first 64 MB segment of sj://testbucket/IMG_1016.MP4 was uploaded successfully, but the upload was either canceled or unexpectedly failed before the remaining segments could be uploaded. Normally, in the case of a failed upload, we would automatically delete the successfully uploaded segments, but there is a bug, which we are aware of and working on fixing, that causes these “zombie segments” to linger around (Design Draft: Zombie Segments Cleaner).

The reason the second upload worked is that you were using a different filepath, so there is no zombie segment for it. The original command will fail, just as re-running the new phonemovie.mp4 command will fail. The difference is that the first fails because a zombie segment exists which shouldn’t exist, since the entire file was never uploaded to begin with, while the second fails because a complete, successfully uploaded file already exists.

Unfortunately, this is not something you can fix on your end, but we are aware of the issue and have been working on a fix for it.
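If it helps to see the failure modes side by side, here is a toy model of the behavior described above. To be clear, this is not Storj/Tardigrade code; all names and logic here are purely illustrative assumptions, sketching why a retry at the same path fails while a fresh path succeeds:

```python
# Toy model of the "zombie segment" behavior described in this thread.
# NOT Storj code: ToySegmentStore and its behavior are invented solely
# to illustrate the logic of the bug.

class ToySegmentStore:
    def __init__(self):
        self.segments = {}  # (path, segment_index) -> bytes

    def upload(self, path, segments, fail_after=None):
        """Upload segments one by one; optionally fail partway through."""
        if (path, 0) in self.segments:
            # A segment already exists at this path: either a complete
            # object, or a leftover "zombie" from a failed upload.
            # Either way, the retry is rejected.
            raise ValueError(f"upload to {path} failed: value changed")
        for i, seg in enumerate(segments):
            if fail_after is not None and i >= fail_after:
                # Simulated interruption. A correct implementation would
                # delete the segments uploaded so far; the bug is that
                # this cleanup does not happen, so segment 0 lingers.
                raise ConnectionError(f"upload to {path} interrupted")
            self.segments[(path, i)] = seg

store = ToySegmentStore()
video = [b"segment0", b"segment1", b"segment2"]

# First upload is interrupted after one segment: a zombie segment remains.
try:
    store.upload("sj://testbucket/IMG_1016.MP4", video, fail_after=1)
except ConnectionError:
    pass

# Retrying the same path fails because of the zombie segment...
try:
    store.upload("sj://testbucket/IMG_1016.MP4", video)
except ValueError as e:
    print(e)

# ...but a fresh path succeeds, as happened with phonemovie.mp4 above.
store.upload("sj://testbucket/phonemovie.mp4", video)
```

In this sketch the same check rejects both a zombie segment and a genuinely complete object, which mirrors why re-running the phonemovie.mp4 command also fails, just for a legitimate reason.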


Phew! Glad it's a known issue and on your radar! Appreciate the explanation and the time taken to look into it.
