Questions tagged [s3cmd]

S3CMD QUESTIONS MUST BE PROGRAMMING RELATED. s3cmd is a command-line tool for uploading, retrieving, and managing data in Amazon S3. It is best suited for power users who are comfortable with the command line, and is also ideal for scripts, automated backups triggered from cron, etc.

See the s3cmd homepage for more information.

282 questions
0
votes
1 answer

s3cmd restore giving error "no option -a"

I'm following the command from the s3tools.org site, using the example provided for the -s parameter, but I'm getting an error… ➜ s3cmd restore "s3://xxasdx-xxasdx-saip/originals" -s -days:7 Usage: s3cmd [options] COMMAND [parameters] s3cmd: error: no such option:…
Sai Puli
  • 951
  • 8
  • 12
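A hedged note on the question above: the option the error complains about does not exist; per s3cmd's own option list, the restore duration is set with `--restore-days=NUM`. A minimal sketch building the corrected command line (the bucket name here is hypothetical, and the command is only printed, not executed):

```python
import shlex

# Sketch: s3cmd's restore command takes --restore-days=NUM, not "-s -days:7".
# "example-bucket" is a placeholder; --recursive restores everything under the prefix.
cmd = ["s3cmd", "restore", "--recursive", "--restore-days=7",
       "s3://example-bucket/originals"]

# Print the shell-quoted command for inspection before running it.
print(shlex.join(cmd))
```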
0
votes
0 answers

s3cmd continue put for big (multipart) file

Trying to upload a 20GB file to S3 with s3cmd as follows: s3cmd --progress --continue-put put 20gb.file s3://bucket Due to security policy, my AWS access/secret keys expire after a few hours and the s3cmd put operation fails. Executing the command…
Mr.
  • 9,429
  • 13
  • 58
  • 82
0
votes
0 answers

s3 write speed is too slow - is there a way to speed it up?

We are moving from NFS to S3, but I am not happy with the write performance. How can I speed it up? Are there any configurations to use while moving large files? Right now it's around ~10MB/s vs. our regular NFS filesystem's 200MB/s, i.e. 20 times slower. I am…
Gr8DBA
  • 9
  • 2
0
votes
1 answer

Gitlab: How to configure Backup when using object-store

We are running GitLab installed in our Kubernetes cluster, using the rook-ceph Rados Gateway as the S3 storage backend. We want to use the backup-utility shipped in the GitLab tools container. As the backup target we configured an external MinIO…
0
votes
1 answer

How to use ObjectSizeGreaterThan and ObjectSizeLessThan XML tags for the AWS S3 command 'PutBucketLifecycleConfiguration'?

Trying with a json input like this: { "Rules": [ { "Expiration": { "Date": "2023-01-01T00:00:00.000Z" }, "ID": "id1", "Filter": { "And": { "Tags": [ { "Key":…
aamer aamer
  • 315
  • 1
  • 3
  • 13
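A hedged sketch relating to the question above: the S3 lifecycle API names `ObjectSizeGreaterThan` and `ObjectSizeLessThan` as byte-count filter fields, and when they are combined with other conditions (tags, a prefix) they must sit inside `Filter.And`. The rule ID, prefix, and size values below are illustrative; this builds the JSON form that `aws s3api put-bucket-lifecycle-configuration` accepts, not the raw XML:

```python
import json

# Illustrative lifecycle rule: expire objects between 1 KiB and 1 MiB
# under a hypothetical "logs/" prefix. Sizes are in bytes.
lifecycle = {
    "Rules": [
        {
            "ID": "id1",
            "Status": "Enabled",
            "Expiration": {"Date": "2023-01-01T00:00:00.000Z"},
            "Filter": {
                "And": {
                    "Prefix": "logs/",
                    "ObjectSizeGreaterThan": 1024,
                    "ObjectSizeLessThan": 1048576,
                }
            },
        }
    ]
}

# Serialize for use with: aws s3api put-bucket-lifecycle-configuration
# --lifecycle-configuration file://rules.json  (command is an assumption of usage)
print(json.dumps(lifecycle, indent=2))
```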
0
votes
0 answers

How to upload files with the char ':' to s3

On Linode I have an object storage bucket that holds pacman software packages. The problem is that those packages may contain the char ':' in their version, which is part of the file name. Such packages cannot be renamed in any other way, otherwise the…
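A hedged note on the question above: S3-compatible stores generally accept ':' in object keys, but the character must be percent-encoded in request URLs, which is where some clients trip. A sketch of the encoding, with an illustrative pacman-style filename:

```python
from urllib.parse import quote

# Illustrative package key containing an epoch colon in the version.
key = "repo/foo-1:2.3-1-x86_64.pkg.tar.zst"

# quote() keeps '/' by default but percent-encodes ':' as %3A,
# which is the form the key takes in a request URL.
print(quote(key))
```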
0
votes
1 answer

Change meta-data on selected files in S3

I found a bunch of image files that have the wrong extension. Due to the way the website is made, they must have a .jpg extension, but some of them are PNG files. So I made a quick list of the fake JPEG files: ls public/assets/image/*.jpg | xargs file…
dotnetCarpenter
  • 10,019
  • 6
  • 32
  • 54
0
votes
3 answers

Unable to send data from EC2 to an S3 bucket that is configured with Object Lock

I am trying to set up s3cmd to send data to an S3 bucket that is configured with Object Lock, but I am getting the error message below: s3 error: 400 (invalidrequest): content-md5 http header is required for put object requests with object lock…
madhan
  • 13
  • 4
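A hedged sketch relating to the question above: S3 returns this 400 when a PUT to an Object Lock-enabled bucket lacks a Content-MD5 header, and the usual fix is a client (or a newer s3cmd release) that sends it. The header value is the base64-encoded binary MD5 digest of the payload, which any client can compute:

```python
import base64
import hashlib

# Content-MD5 is the base64 encoding of the raw 16-byte MD5 digest
# of the request body (not the hex digest). Payload is illustrative.
data = b"hello"
content_md5 = base64.b64encode(hashlib.md5(data).digest()).decode("ascii")
print(content_md5)  # XUFAKrxLKna5cZ2REBfFkg==
```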
0
votes
1 answer

Can two s3cmd put commands run concurrently without adverse results?

Can two s3cmd put commands[1] run concurrently to the same AWS/S3 bucket without adverse repercussions[2]? [1] Say, one runs manually, and then one from crontab joins. [2] E.g., file corruption on the S3 bucket.
boardrider
  • 5,882
  • 7
  • 49
  • 86
0
votes
1 answer

How to precede the output generated by "exec &" with a time/date in Linux Bash scripts?

I have the following script file that writes files to s3 from a local file system: #!/bin/bash CURR_DIR=`dirname $0` SCRIPT_NAME="$(basename $0)" LOG_FILE=$(echo $SCRIPT_NAME | cut -f 1 -d '.') TODAY=$(date '+%Y-%m-%d') NOW=$(date -d "$(date…
Jefledge
  • 39
  • 9
0
votes
1 answer

s3cmd performance is extremely poor when a user transfers a 10TB file

I am trying to transfer a 10TB file using s3cmd to COS (Cloud Object Storage). To transfer the file I am using the command below: python3 cloud-s3.py --upload s3cmd /data/10TB.txt pr-bucket1 --multipart-chunk-size-mb 1024 --limit-rate 100M…
Sujata Yadav
  • 11
  • 1
  • 4
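A hedged aside on the question above: if the backend enforces the S3-style limit of 10,000 multipart parts (an assumption for this COS deployment), a 1024 MiB chunk is slightly too small for a 10 TiB object. The minimum chunk size is a one-line calculation:

```python
import math

TEN_TIB = 10 * 2**40      # 10 TiB in bytes
MAX_PARTS = 10_000        # S3-style multipart part limit (assumed for COS)

# Smallest whole-MiB chunk that keeps the upload under the part limit.
min_chunk_mib = math.ceil(TEN_TIB / MAX_PARTS / 2**20)
parts_at_1024 = math.ceil(TEN_TIB / (1024 * 2**20))

print(min_chunk_mib)   # 1049
print(parts_at_1024)   # 10240 -- exceeds the 10,000-part limit
```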
0
votes
2 answers

How to list *recent* files in AWS S3 bucket with AWS CLI or Python

I have a camera that adds new files to my AWS S3 bucket every hour, except when it doesn't. For rapid troubleshooting, I'd like to be able to find (either list or view) the most recent file in the S3 folder, or list all of the files since a…
Chris Sherwood
  • 393
  • 2
  • 4
  • 11
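A minimal sketch of one approach to the question above, assuming the boto3 `list_objects_v2` response shape (each entry under `Contents` carries `Key` and `LastModified`). Sample data stands in for the real API call so the logic is visible on its own:

```python
from datetime import datetime, timezone

def most_recent_key(contents):
    """Return the key of the newest object in a list_objects_v2 'Contents' list."""
    return max(contents, key=lambda obj: obj["LastModified"])["Key"]

# With boto3 this list would come from (bucket/prefix hypothetical):
#   contents = s3.list_objects_v2(Bucket="my-bucket", Prefix="cam/").get("Contents", [])
sample = [
    {"Key": "cam/img1.jpg", "LastModified": datetime(2023, 5, 1, tzinfo=timezone.utc)},
    {"Key": "cam/img2.jpg", "LastModified": datetime(2023, 5, 2, tzinfo=timezone.utc)},
]

print(most_recent_key(sample))  # cam/img2.jpg
```

Note that `list_objects_v2` returns at most 1,000 keys per call, so a large prefix would need pagination before sorting.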
0
votes
0 answers

s3cmd configure doesn't accept a remote path

I'm trying to configure s3cmd on a client's server so we can run backup scripts on it. I can connect to and list buckets using Transmission (for Mac), but the server is Linux and I need s3cmd. In Transmission I input the following information (redacted…
0
votes
0 answers

Retrieve a file's public URL through either s3cmd or aws-sdk

I have a bucket. The bucket has files. I have a function that uploads files to buckets. I need to return the public URL for the files in the bucket so that, when it displays in my UI, I can click and download the file directly. I can do that in either…
0
votes
1 answer

How do I fix incorrect public url on s3cmd response?

I am uploading a file to my S3 instance using s3cmd. When I run s3cmd put test_py3.csv.gz s3://my.bucket/path/ --acl-public after the upload it gives the public url as http://my.bucket.my.bucket/path/test_py3.csv.gz instead of…
Further Reading
  • 233
  • 2
  • 8
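A hedged note on the question above: a doubled bucket name in the public URL that s3cmd reports typically points at the `host_bucket` template in `~/.s3cfg` embedding the bucket name literally. A sketch of the stock AWS values (endpoints shown are the AWS defaults; other providers use their own hostnames):

```ini
; ~/.s3cfg -- the %(bucket)s placeholder is expanded by s3cmd;
; hard-coding a bucket name here produces doubled hostnames in URLs.
host_base = s3.amazonaws.com
host_bucket = %(bucket)s.s3.amazonaws.com
```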