
I'm trying to develop a way of breaking down S3 usage by user/project using CloudTrail. Does CloudTrail offer the ability to see which IAM user uploaded a particular object to a bucket?

UPDATE:

I have a CloudTrail trail turned on that monitors object-level activities (for all S3 buckets, including read and write activities). However, when I try to list "PutObject" events with the code below, it doesn't work (i.e. the list of events comes up blank).

import boto3
from datetime import datetime

ct_client = boto3.client('cloudtrail')

response = ct_client.lookup_events(
    LookupAttributes=[
        {
            'AttributeKey': 'EventName',
            'AttributeValue': 'PutObject'
        }],
    StartTime=datetime(2018, 3, 1),
    EndTime=datetime.now(),
    MaxResults=50
)

UPDATE 2:

Images of my bucket properties and CloudTrail in the console:

CloudTrail settings

Bucket Properties

claudiadast

2 Answers


Yes, you can monitor IAM users uploading objects to S3 using CloudTrail. The amount of information that CloudTrail records is extensive.

This document link will give you an intro to CloudTrail S3 logging:

Logging Amazon S3 API Calls by Using AWS CloudTrail

This document link will give you detailed information on the events logged by CloudTrail:

CloudTrail Log Event Reference

Follow this document link to enable Object-Level Logging for an S3 bucket. This is necessary to see APIs such as PutObject:

How Do I Enable Object-Level Logging for an S3 Bucket with AWS CloudTrail Data Events?
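Object-level logging can also be enabled programmatically via the CloudTrail API. The sketch below is a minimal example using boto3's `put_event_selectors`; the trail name is a placeholder you would replace with your own, and the `"arn:aws:s3"` value (which scopes the selector to all buckets) is an assumption based on the CloudTrail data-event documentation — use `"arn:aws:s3:::my-bucket/"` to scope to a single bucket.

```python
def s3_object_data_selector(resource_arn="arn:aws:s3"):
    """Build a CloudTrail event selector that captures S3 object-level
    (data) events. "arn:aws:s3" covers all buckets; a value like
    "arn:aws:s3:::my-bucket/" scopes it to one bucket."""
    return {
        "ReadWriteType": "All",            # log both read (GetObject) and write (PutObject)
        "IncludeManagementEvents": True,
        "DataResources": [
            {"Type": "AWS::S3::Object", "Values": [resource_arn]}
        ],
    }


def enable_object_logging(trail_name):
    """Apply the selector to an existing trail. `trail_name` is a
    placeholder for your trail's name."""
    import boto3  # local import so the selector builder above stays testable offline
    ct = boto3.client("cloudtrail")
    ct.put_event_selectors(
        TrailName=trail_name,
        EventSelectors=[s3_object_data_selector()],
    )
```

You can verify what a trail is currently configured to capture with the matching `get_event_selectors` call.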

CloudTrail has a Python API. However, you will want to directly process the CloudTrail logs stored in S3.

CloudTrail Python Boto3 SDK
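Processing the delivered logs directly looks roughly like the sketch below: CloudTrail writes gzipped JSON files (each containing a `"Records"` list) to the trail's S3 bucket, so you list the objects under the delivery prefix, decompress each one, and filter for `PutObject`. The bucket and prefix names are placeholders for your own trail's delivery location.

```python
import gzip
import json


def put_object_records(log_doc):
    """Given one parsed CloudTrail log document (a dict with a
    "Records" list), yield (identity_arn, bucket, key) for each
    PutObject event it contains."""
    for rec in log_doc.get("Records", []):
        if rec.get("eventName") != "PutObject":
            continue
        who = rec.get("userIdentity", {}).get("arn", "unknown")
        params = rec.get("requestParameters") or {}
        yield who, params.get("bucketName"), params.get("key")


def scan_trail_bucket(bucket, prefix):
    """Walk the gzipped log files CloudTrail delivered to S3 and yield
    PutObject uploads. `bucket` and `prefix` are placeholders, e.g.
    prefix "AWSLogs/<account-id>/CloudTrail/<region>/"."""
    import boto3  # local import so the parser above has no AWS dependency
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
            yield from put_object_records(json.loads(gzip.decompress(body)))
```

For anything beyond a handful of log files this brute-force scan gets slow, which is where Athena (below) comes in.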

I prefer to analyze CloudTrail logs using Athena, which makes this process easy.

Querying AWS CloudTrail Logs

John Hanley
  • I have a CloudTrail turned on that monitors object-level activities (for all s3 buckets, including read and write activities), however, when I try to list "PutObject" events in my update above, it doesn't work (i.e. the list of events comes up blank). – claudiadast Sep 11 '18 at 18:43
  • Did you enable bucket object logging? I updated my answer with a link. – John Hanley Sep 11 '18 at 19:01
  • In my CloudTrail, I chose the `Select all S3 buckets in your account` option. Do I still need to go physically edit the properties section of every bucket to enable logging? – claudiadast Sep 11 '18 at 20:42
  • Yes - I provided you a link on how to enable object logging which is different than bucket logging. – John Hanley Sep 11 '18 at 20:49
  • I just checked the properties for every bucket and it says that object-level logging is enabled. – claudiadast Sep 11 '18 at 20:51
  • For which object APIs? – John Hanley Sep 11 '18 at 20:52
  • Grep the CloudTrail logs in S3 for PutObject. If it is not there, then you have not enabled object logging correctly. – John Hanley Sep 11 '18 at 20:53
  • See Update 2 above. In CloudTrail, I chose to enable it for Read and Write APIs. – claudiadast Sep 11 '18 at 20:55
  • From your screenshot, I do not see a bucket enabled for the CloudTrail logs. If you want to see everything you must enable this. Take the time to read the documents so that you understand how everything is configured and works. – John Hanley Sep 11 '18 at 21:05

I don't believe the Data Events are visible in the same way as Management Events. That's certainly the case if you view the Event History in the AWS Console.

As suggested elsewhere, laying an Athena table over the S3 location where the data events are stored works well. Something like this will then tell you who/what uploaded the object:

SELECT
    useridentity
,   json_extract_scalar(requestparameters,'$.bucketName')
,   json_extract_scalar(requestparameters,'$.key')
FROM cloudtrail_logs
WHERE eventname IN ('PutObject')
AND json_extract_scalar(requestparameters,'$.bucketName') = 'xxx'
AND json_extract_scalar(requestparameters,'$.key') = 'yyy';

Where cloudtrail_logs is created in line with the docs at: https://docs.aws.amazon.com/athena/latest/ug/cloudtrail-logs.html

useridentity will not always be an IAM user; it may also be an AWS service, an external account, or an assumed role. You can use the .type element to filter as required, or simply pull all the elements.

Depending on the number of objects you have in S3 and the size of your cloudtrail_logs in S3, you may want to refine the S3 location of the cloudtrail_logs table by date, e.g.:

s3://<BUCKETNAME>/AWSLogs/<ACCOUNTNUMBER>/CloudTrail/<REGION>/2018/08/17

If you wanted, you could execute the Athena query using boto3, save the output to an S3 location, and then pull that data from S3, also using boto3.
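A sketch of that boto3-driven flow is below, assuming the Athena setup from this answer: the table name, database name, and results location are placeholders for your own configuration. It uses the real `start_query_execution` / `get_query_execution` / `get_query_results` Athena APIs; the SQL builder mirrors the query above (naive string interpolation, so only use it with trusted inputs).

```python
import time


def build_put_object_query(table, bucket, key):
    """Build the Athena SQL from the answer above for one object."""
    return (
        "SELECT useridentity, "
        "json_extract_scalar(requestparameters, '$.bucketName') AS bucket_name, "
        "json_extract_scalar(requestparameters, '$.key') AS object_key "
        f"FROM {table} "
        "WHERE eventname = 'PutObject' "
        f"AND json_extract_scalar(requestparameters, '$.bucketName') = '{bucket}' "
        f"AND json_extract_scalar(requestparameters, '$.key') = '{key}'"
    )


def run_athena_query(sql, database, output_s3):
    """Start the query, poll until it finishes, and return the result
    rows. `database` and `output_s3` (an s3:// URI for query results)
    are placeholders for your setup."""
    import boto3  # local import keeps the query builder testable offline
    athena = boto3.client("athena")
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)  # Athena queries are asynchronous; poll until done
    if state != "SUCCEEDED":
        raise RuntimeError(f"query ended in state {state}")
    return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
```

`get_query_results` paginates at 1000 rows, so for large result sets you would instead read the CSV that Athena writes to the `output_s3` location.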

9bO3av5fw5