Hi, I am quite new to Storj but not to S3, so I am using s3-client, a Python library, to upload a large file (> 3 GB) that I could not upload via Uplink (the upload starts fast and then slows down until it stops completely).
This is the command I ran:
s3-client -r europe-west-1 -e https://gateway.tardigradeshare.io upload realestate -f '/home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql'
Uploading file /home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql with object name /home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql
data transferred: 0%| | 0.00/3.79G [00:00<?, ?B/s]
Traceback (most recent call last):
File "/home/aureliano/.local/lib/python3.8/site-packages/boto3/s3/transfer.py", line 279, in upload_file
future.result()
File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/futures.py", line 106, in result
return self._coordinator.result()
File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/futures.py", line 265, in result
raise self._exception
File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/tasks.py", line 126, in __call__
return self._execute_main(kwargs)
File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/tasks.py", line 150, in _execute_main
return_value = self._main(**kwargs)
File "/home/aureliano/.local/lib/python3.8/site-packages/s3transfer/tasks.py", line 332, in _main
response = client.create_multipart_upload(
File "/home/aureliano/.local/lib/python3.8/site-packages/botocore/client.py", line 357, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/aureliano/.local/lib/python3.8/site-packages/botocore/client.py", line 676, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (301) when calling the CreateMultipartUpload operation: Moved Permanently
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/aureliano/.local/bin/s3-client", line 8, in <module>
sys.exit(main())
File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 878, in main
args.func(s3, args)
File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 804, in cmd_upload
upload_single_file(s3, args.bucket, args.filename, args.nokeepdir)
File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 783, in upload_single_file
s3.upload_file(bucket_name, file_name, key_name)
File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 310, in wrapped_f
result = func(*args, **kwargs)
File "/home/aureliano/.local/lib/python3.8/site-packages/s3_client/s3_client.py", line 539, in upload_file
self.s3_resource.Bucket(bucket_name).upload_file(
File "/home/aureliano/.local/lib/python3.8/site-packages/boto3/s3/inject.py", line 207, in bucket_upload_file
return self.meta.client.upload_file(
File "/home/aureliano/.local/lib/python3.8/site-packages/boto3/s3/inject.py", line 129, in upload_file
return transfer.upload_file(
File "/home/aureliano/.local/lib/python3.8/site-packages/boto3/s3/transfer.py", line 285, in upload_file
raise S3UploadFailedError(
boto3.exceptions.S3UploadFailedError: Failed to upload /home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql to realestate//home/aureliano/eclipse-workspace/real_estate/data/DVFPlus_4-0_SQL_LAMB93_R084-ED201/1_DONNEES_LIVRAISON/dvf_departements.sql: An error occurred (301) when calling the CreateMultipartUpload operation: Moved Permanently
I am quite lost in the existing discussions: has this issue been fixed in the beta release?
Are there any suggestions for clients with a verified history of successful uploads?
Thanks in advance for the support
Aureliano
DevOps at Qwant.com