
I want to enable CloudTrail logs for my account, so I need to create an S3 bucket. I wanted to automate this task using Boto3. Currently I am using the following script:

from boto3.session import Session

sess = Session(aws_access_key_id=tmp_access_key,
               aws_secret_access_key=tmp_secret_key,
               aws_session_token=security_token)
s3_conn_boto3 = sess.client(service_name='s3', region_name=region)

bucket = s3_conn_boto3.create_bucket(Bucket=access_log_bucket_name,
                                     CreateBucketConfiguration={'LocationConstraint': 'us-east-1'},
                                     ACL='authenticated-read')  # ...

I am new to Boto3, so I don't have much knowledge about the other parameters such as GrantWrite, GrantWriteACP, etc.

Please help me with a code snippet that creates an S3 bucket and enables CloudTrail logging to it.

Thanks

tom
  • What's wrong with your example? – mvelay Mar 18 '16 at 16:08
  • Was also trying to use the `create_bucket` method. Is it just me or is Boto3 bad? What kind of documentation is 'string'??? "GrantFullControl='string'" maybe the next bit will be clearer ... ;) ... "GrantFullControl (string) -- Allows grantee the read, write, read ACP, and write ACP permissions on the bucket." Anyway, sorry for the ranting ... have you found what actual strings someone needs to use with a parameter such as GrantFullControl? (note the full 'rant' in GrantFull ;) – Rastikan Jun 06 '18 at 20:01

4 Answers


Go through the following documentation

http://boto3.readthedocs.io/en/latest/guide/migrations3.html

Creating the Connection

Boto 3 has both low-level clients and higher-level resources. For Amazon S3, the higher-level resources are the most similar to Boto 2.x's s3 module:

Boto 2.x

import boto

s3_connection = boto.connect_s3()

Boto 3

import boto3

s3 = boto3.resource('s3')

Creating a Bucket

Creating a bucket in Boto 2 and Boto 3 is very similar, except that in Boto 3 all action parameters must be passed via keyword arguments and a bucket configuration must be specified manually:

Boto 2.x

s3_connection.create_bucket('mybucket')

s3_connection.create_bucket('mybucket', location=Location.USWest)

Boto 3

s3.create_bucket(Bucket='mybucket')

s3.create_bucket(Bucket='mybucket', CreateBucketConfiguration={
    'LocationConstraint': 'us-west-1'})
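
One gotcha relevant to the question: us-east-1 is the default region, and create_bucket rejects it as an explicit LocationConstraint. If the bucket should live in us-east-1, omit CreateBucketConfiguration entirely. A minimal sketch (the bucket name is just an example):

import boto3

s3 = boto3.client('s3', region_name='us-east-1')

# For us-east-1, omit CreateBucketConfiguration; passing
# {'LocationConstraint': 'us-east-1'} is rejected by S3.
s3.create_bucket(Bucket='my-example-logs-bucket')

# For any other region, the constraint is required, e.g.:
# s3.create_bucket(Bucket='my-example-logs-bucket',
#                  CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'})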

Storing Data

Storing data from a file, stream, or string is easy:

Boto 2.x

from boto.s3.key import Key

key = Key('hello.txt')

key.set_contents_from_file('/tmp/hello.txt')

Boto 3

s3.Object('mybucket', 'hello.txt').put(Body=open('/tmp/hello.txt', 'rb'))
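
If you'd rather not manage the file handle yourself, the resource API also offers upload_file(), which handles multipart uploads for larger files. A small sketch ('mybucket' and the paths are example values):

import boto3

s3 = boto3.resource('s3')

# upload_file opens the file and handles multipart uploads for you
s3.Bucket('mybucket').upload_file('/tmp/hello.txt', 'hello.txt')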
Vaibhav Walke

First, in boto3, if you set up your credentials with "aws configure", you don't need that "sess" section at all (http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.html):

# if you have already run aws configure
import boto3
s3 = boto3.client("s3")
s3.create_bucket(Bucket="mybucket", ....) 

Second, the boto3 documentation does a poor job of linking to the relevant information. The following is found in the boto3 PDF, page 2181 (https://media.readthedocs.org/pdf/boto3/latest/boto3.pdf):

Email : The value in the Grantee object is the registered email address of an AWS account.

Grantee : The AWS user or group that you want to have access to transcoded files and playlists. To identify the user or group, you can specify the canonical user ID for an AWS account, an origin access identity for a CloudFront distribution, the registered email address of an AWS account, or a predefined Amazon S3 group
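
For what it's worth, those Grant* parameters expect the S3 grant-header syntax, i.e. a string of the form id="canonical-user-id", emailAddress="account-email" or uri="predefined-group-uri". A hedged sketch (the bucket name and canonical user ID are placeholders):

import boto3

s3 = boto3.client('s3')

# Grant* values use the grant-header syntax: id=..., emailAddress=... or uri=...
# The bucket name and canonical user ID below are placeholders.
s3.create_bucket(
    Bucket='my-example-bucket',
    GrantFullControl='id="replace-with-your-64-char-canonical-user-id"',
    GrantRead='uri="http://acs.amazonaws.com/groups/global/AuthenticatedUsers"',
)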

The easier solution is to just use a bucket policy (http://support.cloudcheckr.com/getting-started-with-cloudcheckr/preparing-your-aws-account/aggregate-cloudtrail/). You can set the whole thing with put_bucket_policy() and skip GrantWrite, GrantWriteACP entirely, as in the sketch below.
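
Tying this back to the original question, here is a hedged sketch of the full flow: create the bucket, attach the standard CloudTrail bucket policy with put_bucket_policy(), then create and start the trail. The bucket name, trail name and account ID are placeholders, and the example assumes us-east-1 (so no CreateBucketConfiguration).

import json
import boto3

account_id = '123456789012'          # placeholder AWS account ID
bucket_name = 'my-cloudtrail-logs'   # placeholder bucket name
trail_name = 'my-trail'              # placeholder trail name

s3 = boto3.client('s3', region_name='us-east-1')
cloudtrail = boto3.client('cloudtrail', region_name='us-east-1')

# us-east-1: no CreateBucketConfiguration needed
s3.create_bucket(Bucket=bucket_name)

# Standard bucket policy CloudTrail needs in order to deliver log files
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AWSCloudTrailAclCheck",
            "Effect": "Allow",
            "Principal": {"Service": "cloudtrail.amazonaws.com"},
            "Action": "s3:GetBucketAcl",
            "Resource": "arn:aws:s3:::%s" % bucket_name,
        },
        {
            "Sid": "AWSCloudTrailWrite",
            "Effect": "Allow",
            "Principal": {"Service": "cloudtrail.amazonaws.com"},
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::%s/AWSLogs/%s/*" % (bucket_name, account_id),
            "Condition": {"StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}},
        },
    ],
}
s3.put_bucket_policy(Bucket=bucket_name, Policy=json.dumps(policy))

# Create the trail and start logging to the bucket
cloudtrail.create_trail(Name=trail_name, S3BucketName=bucket_name)
cloudtrail.start_logging(Name=trail_name)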

mootmoot

To create an S3 bucket using Python on AWS, you need an "aws_access_key_id_value" and an "aws_secret_access_key_value". You can store these variables in config.properties and write your code in a create-s3-bucket.py file.

Create a config.properties file and save the following in it.

aws_access_key_id_value='YOUR-ACCESS-KEY-OF-THE-AWS-ACCOUNT'
aws_secret_access_key_value='YOUR-SECRET-KEY-OF-THE-AWS-ACCOUNT'
Bucket_value='S3-BUCKET-NAME'
LocationConstraint_value='REGION-FOR-S3-BUCKET'

Create create-s3-bucket.py and save the following code in it.

import boto3

def getVarFromFile(filename):
    # config.properties contains plain Python assignments, so it can be
    # loaded as a module and its values read back as attributes of `data`.
    import imp
    f = open(filename)
    global data
    data = imp.load_source('data', '', f)
    f.close()

getVarFromFile('config.properties')

client = boto3.client(
    's3',
    aws_access_key_id=data.aws_access_key_id_value,
    aws_secret_access_key=data.aws_secret_access_key_value
)
client.create_bucket(Bucket=data.Bucket_value, CreateBucketConfiguration={'LocationConstraint': data.LocationConstraint_value})

Use the following command to execute the python code.

python create-s3-bucket.py

In the same way, you can add other parameters and customise this code. Refer to AWS's official documentation for more details.
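
One caveat: the imp module used above is deprecated (and removed in Python 3.12), so on newer Pythons a hedged alternative is to parse the same key=value file by hand:

import boto3

def load_properties(filename):
    # Read simple key=value lines (quotes stripped); a stand-in for imp.load_source
    props = {}
    with open(filename) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith('#') and '=' in line:
                key, value = line.split('=', 1)
                props[key.strip()] = value.strip().strip('\'"')
    return props

data = load_properties('config.properties')

client = boto3.client(
    's3',
    aws_access_key_id=data['aws_access_key_id_value'],
    aws_secret_access_key=data['aws_secret_access_key_value'],
)
client.create_bucket(Bucket=data['Bucket_value'],
                     CreateBucketConfiguration={'LocationConstraint': data['LocationConstraint_value']})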

Rahul
import boto3

client = boto3.client('s3')

response = client.create_bucket(
    ACL='private'|'public-read'|'public-read-write'|'authenticated-read',
    Bucket='string',
    CreateBucketConfiguration={
        'LocationConstraint': 'EU'|'eu-west-1'|'us-west-1'|'us-west-2'|'ap-south-1'|'ap-southeast-1'|'ap-southeast-2'|'ap-northeast-1'|'sa-east-1'|'cn-north-1'|'eu-central-1'
    },
    GrantFullControl='string',
    GrantRead='string',
    GrantReadACP='string',
    GrantWrite='string',
    GrantWriteACP='string',
)
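
For anyone puzzled by the 'string' placeholders above (see the comments below), here is a hedged example with concrete values filled in; the bucket name and region are just examples:

import boto3

client = boto3.client('s3')

# Concrete values in place of the 'string' placeholders
# ('my-example-bucket-name' and 'eu-west-1' are just examples)
response = client.create_bucket(
    ACL='private',
    Bucket='my-example-bucket-name',
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'},
)
print(response['Location'])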
Sahana S
  • I don't know why this is the most downvoted answer; it's actually an excerpt from Amazon's boto3 documentation. Quite helpful as well. – neo7 Jan 25 '19 at 23:38
  • @neo7 Code-only answers without any usage explanation are often downvoted. Also, if it's a copy-paste without even noting the source, that's not good. – Asclepius Oct 04 '19 at 14:45