Can't upload large files

Hi, I am quite new to Storj but not to S3, so I am using s3-client, a Python library, to upload a large file (> 3 GB) that I could not upload via Uplink (the upload starts fast and then slows down until it stops completely).

This is the command I ran:

s3-client -r europe-west-1 -e https://gateway.tardigradeshare.io upload realestate -f '/home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql'

Uploading file /home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql with object name /home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql
data transferred:   0%|                                                                                          | 0.00/3.79G [00:00<?, ?B/s]
Traceback (most recent call last):
  File "/home/aureliano/.local/lib/python3.8/site-packages/boto3/s3/transfer.py", line 279, in upload_file
    future.result()
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/futures.py", line 106, in result
    return self._coordinator.result()
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/futures.py", line 265, in result
    raise self._exception
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/tasks.py", line 126, in __call__
    return self._execute_main(kwargs)
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/tasks.py", line 150, in _execute_main
    return_value = self._main(**kwargs)
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/tasks.py", line 332, in _main
    response = client.create_multipart_upload(
  File "/home/aureliano/.local/lib/python3.8/site-packages/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/home/aureliano/.local/lib/python3.8/site-packages/botocore/client.py", line 676, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (301) when calling the CreateMultipartUpload operation: Moved Permanently

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/aureliano/.local/bin/s3-client", line 8, in <module>
    sys.exit(main())
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 878, in main
    args.func(s3, args)
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 804, in cmd_upload
    upload_single_file(s3, args.bucket, args.filename, args.nokeepdir)
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 783, in upload_single_file
    s3.upload_file(bucket_name, file_name, key_name)
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 310, in wrapped_f
    result = func(*args, **kwargs)
  File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 539, in upload_file
    self.s3_resource.Bucket(bucket_name).upload_file(
  File "/home/aureliano/.local/lib/python3.8/site-packages/boto3/s3/inject.py", line 207, in bucket_upload_file
    return self.meta.client.upload_file(
  File "/home/aureliano/.local/lib/python3.8/site-packages/boto3/s3/inject.py", line 129, in upload_file
    return transfer.upload_file(
  File "/home/aureliano/.local/lib/python3.8/site-packages/boto3/s3/transfer.py", line 285, in upload_file
    raise S3UploadFailedError(
boto3.exceptions.S3UploadFailedError: Failed to upload /home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql to realestate//home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql: An error occurred (301) when calling the CreateMultipartUpload operation: Moved Permanently
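
For reference, here is a minimal boto3 sketch of the same upload pointed straight at the gateway. The path-style addressing, the signing region, the plain object key, and the multipart tuning values are guesses on my part rather than documented gateway requirements, and <access_key>/<secret_key> are placeholders:

# Minimal boto3 sketch of the same upload against the gateway
import boto3
from botocore.config import Config
from boto3.s3.transfer import TransferConfig

# Path-style addressing is an assumption; the 301 could also come from the region flag.
s3 = boto3.client(
    "s3",
    endpoint_url="https://gateway.tardigradeshare.io",
    aws_access_key_id="<access_key>",
    aws_secret_access_key="<secret_key>",
    region_name="us-east-1",  # generic signing region; a guess, not a gateway requirement
    config=Config(s3={"addressing_style": "path"}),
)

# 64 MiB parts and 4 parallel threads are illustrative tuning values, not recommendations
transfer = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=4,
)

s3.upload_file(
    "/home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql",
    "realestate",
    "dvf_departements.sql",  # plain key instead of reusing the full filesystem path
    Config=transfer,
)

If that call goes through, s3.head_object(Bucket="realestate", Key="dvf_departements.sql") should report the full 3.79 GB.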

I am quite lost in the existing discussions: has this issue been fixed in the beta release?

Are there suggestions for clients with a verified history of successful uploads?

Thanks in advance for the support.
Aureliano
DevOps at Qwant.com

Below is a simple lab I ran using the AWS CLI. I also love using rclone.

AWS CLI Demo
# setup aws cli
aws configure
# entered access key and secret key and left region and output at their defaults
# display buckets
aws s3 ls --endpoint-url=https://gateway.tardigradeshare.io
# Output
# Returns nothing but seems to work
# Making a bucket
aws s3 mb s3://testbucket --endpoint-url=https://gateway.tardigradeshare.io
# Output (It works!)
make_bucket: testbucket
# Copy a test 2 GB archive to the bucket
aws s3 cp /Users/dominickmarino/Desktop/Archive.zip s3://testbucket --endpoint-url=https://gateway.tardigradeshare.io
# Observation during upload: completed 99.8 MiB/1.9 GiB (1.1 MiB/s) with 1 file(s) remaining (around 20 Mbps)
# Output
upload: Desktop/Archive.zip to s3://testbucket/Archive.zip
# List the file we uploaded
aws s3 ls s3://testbucket --endpoint-url=https://gateway.tardigradeshare.io
# Output
2021-01-22 11:18:51 1996783599 Archive.zip
# Copy file back
aws s3 cp s3://testbucket/Archive.zip /Users/dominickmarino/Desktop/Archive2.zip --endpoint-url=https://gateway.tardigradeshare.io

Rclone Demo
# setup rclone
rclone config
# select n (New Remote)
# name
s3rctest
# select 4 (4 / Amazon S3 Compliant Storage Provider)
4
# select 13 (13 / Any other S3 compatible provider)
13
# select 1 (1 / Enter AWS credentials in the next step \ “false”)
1
# enter access key
<access_key>
# enter secret key
<secret_key>
# select 1 ( 1 / Use this if unsure. Will use v4 signatures and an empty region.\ “”)
1
# enter endpoint
gateway.tardigradeshare.io
# use default location_constraint
# use default ACL
# edit advanced config
n
# review config and select default
# quit config
q
# make bucket and path
rclone mkdir s3rctest:testpathforvideo
# list bucket:path
rclone lsf s3rctest:
# copy video over
rclone copy --progress /Users/dominickmarino/Desktop/Screen\ Recording\ 2021-03-12\ at\ 10.48.10\ AM.mov s3rctest:testpathforvideo/videos
# list file uploaded
rclone ls s3rctest:testpathforvideo
# output (40998657 videos/Screen Recording 2021-03-12 at 10.48.10 AM.mov)

Hi Dominic,
thanks for the support. Indeed, using the AWS CLI solved the problem. I've put together a tutorial that uses Storj as the storage source, for anyone interested:

https://sinaure.medium.com/use-postgis-to-store-real-estate-data-is-the-market-going-up-down-7c433a938d83