
I want to set up the following event/notification configuration on an AWS S3 bucket: upon receipt of a file (s3:ObjectCreated:*), two targets shall be triggered:

  • SQS: to queue the file for detailed post-processing, with a retention period of a couple of days
  • Lambda: to do some quick, immediate metrics processing

When I try to set up the configuration via the AWS console, I get the following error message:

Configurations overlap. Configurations on the same bucket cannot share a common event type. : s3:ObjectCreated:*, s3:ObjectCreated:*

I've also tried to set up the configuration via the AWS SDK (Java) as proposed by the user guide, but with a similar result:

Caught an AmazonServiceException, which means your request made it to Amazon S3, but was rejected with an error response for some reason.
Error Message:    Configurations overlap. Configurations on the same bucket cannot share a common event type. (Service: Amazon S3; Status Code: 400; Error Code: InvalidArgument; Request ID: A0E8738522EA218F)
HTTP Status Code: 400
AWS Error Code:   InvalidArgument
Error Type:       Client
Request ID:       A0E8738522EA218F
Error XML:
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>InvalidArgument</Code><Message>Configurations overlap. Configurations on the same bucket cannot share a common event type.</Message><ArgumentName>Event</ArgumentName><ArgumentValue>s3:ObjectCreated:*, s3:ObjectCreated:*</ArgumentValue><RequestId>A0E8738522EA218F</RequestId><HostId>p4qYoIXi38u3Jl3p0xpI7TFWgs0ZxsqK89oDTTy8D/tbw39NnaIT99jIvHIxt4XliRFxqNWl32M=</HostId></Error>
Steffen Opel
Roland Ettinger

3 Answers


I suggest you publish the S3 notifications to an SNS topic and have your Lambda function and SQS queue subscribe to that topic.

This architecture should help you achieve what you're looking for.
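Concretely, the bucket then carries a single notification configuration pointing at the topic. A minimal sketch in the shape accepted by `aws s3api put-bucket-notification-configuration` (the topic name, region, and account ID are placeholders):

```json
{
  "TopicConfigurations": [
    {
      "Id": "uploads-to-sns",
      "TopicArn": "arn:aws:sns:us-east-1:123456789012:s3-uploads",
      "Events": ["s3:ObjectCreated:*"]
    }
  ]
}
```

With the SQS queue and the Lambda function both subscribed to the topic, each receives every s3:ObjectCreated:* event, and the bucket never holds two overlapping configurations.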

adamkonrad
    Right, SNS solved the problem. I have added "SNS:Publish" permission to the SNS topic and configured my SQS queue and my Lambda function as subscribers to the SNS topic. In the Lambda function I can unpack the S3Object containing bucket, path, etc. and then download the file from S3 and process it. – Roland Ettinger Jul 22 '15 at 11:04
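The "SNS:Publish" permission mentioned in the comment above can be sketched as a topic access policy along these lines (all ARNs and names are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowS3ToPublish",
      "Effect": "Allow",
      "Principal": { "Service": "s3.amazonaws.com" },
      "Action": "SNS:Publish",
      "Resource": "arn:aws:sns:us-east-1:123456789012:s3-uploads",
      "Condition": {
        "ArnLike": { "aws:SourceArn": "arn:aws:s3:::my-upload-bucket" }
      }
    }
  ]
}
```

Note that SNS wraps the S3 event in a notification envelope, so subscribers have to parse the S3 event out of the SNS `Message` field before extracting the bucket and key.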

This error occurs because you are trying to hook the same event type with more than one function.

Go to the bucket's Properties tab and scroll to the Events section.

For example, if a "PUT" event is already registered on that bucket, you will not be able to register a "PUT" event for another function.
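The same check can be done from the CLI, which is handy for spotting stale entries left over from deleted functions (bucket name is a placeholder):

```shell
# Show all notification configurations currently attached to the bucket;
# any entry listed here counts toward the overlap check.
aws s3api get-bucket-notification-configuration --bucket my-upload-bucket
```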

Eugen Konkov
  • This helped me! I didn't realize there was still an event tied to the bucket from a prior Lambda that had just been deleted. Not sure if my case was just a timing glitch, but going to the properties as mentioned above quickly showed me it was still there. – Ben Sep 16 '20 at 20:37

The best solution is probably to trigger an SNS notification when the file is uploaded to S3, and then use SNS's 'fan-out' capability to send multiple, simultaneous SQS messages that can be received and acted on independently.

Alternatively, if 'step 2' should run only after 'step 1' has completed successfully, you can trigger a single SQS message for 'step 1' and make the final act of 'step 1' the sending of a second SQS message for 'step 2' for further processing.
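For the fan-out variant, each subscribed queue additionally needs an access policy that lets the topic deliver into it. A minimal sketch with placeholder ARNs:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowSNSDelivery",
      "Effect": "Allow",
      "Principal": { "Service": "sns.amazonaws.com" },
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:us-east-1:123456789012:step1-queue",
      "Condition": {
        "ArnLike": { "aws:SourceArn": "arn:aws:sns:us-east-1:123456789012:s3-uploads" }
      }
    }
  ]
}
```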

E.J. Brennan