
I am trying to automate a file upload to S3 using the AWS CLI. When I run `aws s3 cp /home/abc/lampdata.json s3://lamp` manually it works fine, but from a shell script it gives me an error.

#!/bin/sh
exec bash -c 'AWS_CONFIG_FILE=/root/.aws/config aws s3 cp /home/abc/lampdata.json s3://lamp' &

upload failed: ../../abc/lampdata.json to s3://lamp/lampdata.json Unable to locate credentials

I have already configured credentials using aws configure. Is this a problem with the shell script?

siddharth
  • Why do you need the AWS_CONFIG_FILE environment variable? It seems like that is what is causing the problem. – Tamás Sallai Sep 07 '20 at 12:08
  • @TamásSallai thanks for the reply. It is not working with ```exec aws s3 cp /home/abc/lampdata.json s3://lamp &``` either – siddharth Sep 08 '20 at 01:23
  • Why do you need exec or bash -c? I believe you can just write bash in a shell script. My other hunch is that maybe you are running the script as a different user? – Tamás Sallai Sep 08 '20 at 11:01
  • @TamásSallai I was following [this](https://stackoverflow.com/a/31426381/14233155). I am using exec inside a .sh file to run the script at boot via systemd. I confirmed that I am not running the script as a different user. – siddharth Sep 09 '20 at 00:46
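Since the comments point at systemd, it is worth noting how the CLI finds credentials: environment variables are checked first, then the shared credentials file under $HOME/.aws/. A systemd service often has no HOME set, so the file lookup fails even though `aws configure` was run. A minimal sketch of that resolution logic (the function name and the simplified order are illustrative, not the CLI's actual code):

```python
import os

# Simplified sketch of the AWS CLI's credentials-file resolution.
# The real chain also covers env-var key pairs, IAM roles, etc.
def credentials_path(env):
    """Return the credentials file path the CLI would consult."""
    if 'AWS_SHARED_CREDENTIALS_FILE' in env:
        return env['AWS_SHARED_CREDENTIALS_FILE']
    home = env.get('HOME')
    if not home:
        # No HOME (common under systemd) -> "Unable to locate credentials"
        return None
    return os.path.join(home, '.aws', 'credentials')

# Interactive shell: HOME is set, so the file is found.
print(credentials_path({'HOME': '/root'}))  # /root/.aws/credentials
# Bare systemd service: HOME is missing.
print(credentials_path({}))                 # None
```

This is why the same command works from a terminal but not from the boot script: the command is identical, only the environment differs.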

2 Answers


@siddharth If the credentials are already in ~/.aws/credentials and you are able to run aws s3 cp from the terminal, there is no need to specify credentials in the script. If you configured them for your user profile and run the script as that user, the script should inherit your environment by default.

#!/bin/sh
aws s3 cp /home/abc/lampdata.json s3://lamp

This should work.
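That holds for an interactive shell, but the question mentions running the script at boot via systemd, where the service environment has no HOME (and often a minimal PATH), so the CLI cannot resolve ~/.aws/credentials. A sketch of a oneshot unit that sets the environment explicitly (the unit name, `User=root`, and the aws binary path are assumptions for illustration; check `which aws` on your system):

```
[Unit]
Description=Upload lampdata.json to S3
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
User=root
# Give the CLI an explicit home so ~/.aws/credentials resolves.
Environment=HOME=/root
ExecStart=/usr/bin/aws s3 cp /home/abc/lampdata.json s3://lamp

[Install]
WantedBy=multi-user.target
```

With HOME set in the unit, neither AWS_CONFIG_FILE nor the `exec bash -c` wrapper should be necessary.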

0

From this source:

import boto3
# `settings` here is your own configuration module holding the key pair
session = boto3.Session(
    aws_access_key_id=settings.AWS_SERVER_PUBLIC_KEY,
    aws_secret_access_key=settings.AWS_SERVER_SECRET_KEY,
)
Then use that session to get an S3 resource:

s3 = session.resource('s3')
Kevin C