I am trying to create a Lambda with an S3 event trigger using the stack below. I want to trigger the Lambda when objects are placed in either of two directories within the same S3 bucket.

MyBucket:
  Type: AWS::S3::Bucket
  Properties:
    BucketName: !Ref S3

LambdaS3:
  Type: AWS::Serverless::Function
  Properties:
    Handler: lambda_function.lambda_handler
    Runtime: python3.7
    Timeout: 60
    FunctionName: !Sub "${Environment}somefun"
    CodeUri: ./mysource/src
    Role: !GetAtt myiam.Arn
    Description: "Test"
    Environment:
      Variables:
        KINESIS_STREAM: !Sub "${Environment}_my_kinesis"
        ENVIRONMENT: !Sub "${Environment}"
    Events:
      FileUpload:
        Type: S3
        Properties:
          Bucket: !Ref MyBucket
          Events: s3:ObjectCreated:*
          Filter:
            S3Key:
              Rules:
                - Name: prefix
                  Value: "L1/Test_1/INPUT/"
                - Name: suffix
                  Value: ".json"
                - Name: prefix
                  Value: "L2/Test_2/INPUT/"
                - Name: suffix
                  Value: ".json"

Here I am referring to two S3 locations for the event trigger:

L1/Test_1/INPUT/ and L2/Test_2/INPUT/

But after I deploy the stack, I get the following error:

Following Resources Failed to create[MyBucket]. Property rules contains duplicate value

Can anyone help with this?


1 Answer

You're getting the error because you have duplicate values in the Rules array, as the error message states. In this case, the duplicate value is

- Name: suffix
  Value: ".json"

On top of this, the AWS SAM documentation also states:

The same type of filter rule cannot be used more than once. For example, you cannot specify two prefix rules.
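Given that restriction, one workaround is to declare two separate S3 events on the same function, each with exactly one prefix rule and one suffix rule. A sketch (not tested against your full template, and the event names are my own):

```yaml
Events:
  FileUploadL1:
    Type: S3
    Properties:
      Bucket: !Ref MyBucket
      Events: s3:ObjectCreated:*
      Filter:
        S3Key:
          Rules:
            - Name: prefix
              Value: "L1/Test_1/INPUT/"
            - Name: suffix
              Value: ".json"
  FileUploadL2:
    Type: S3
    Properties:
      Bucket: !Ref MyBucket
      Events: s3:ObjectCreated:*
      Filter:
        S3Key:
          Rules:
            - Name: prefix
              Value: "L2/Test_2/INPUT/"
            - Name: suffix
              Value: ".json"
```

Because the two prefixes do not overlap, S3 should accept both notification configurations on the same bucket.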

So I assume that even removing the duplicate entry will not make the stack deploy, since two prefix rules would remain. Alternatively, you could write the filter inside your Lambda function and ignore incoming events that do not pass it. This gives you much more flexibility in the filtering, but it means the Lambda function is invoked for every upload to the bucket, which might be undesirable.
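A sketch of that in-function approach (the prefix list and the structure of the processing step are assumptions, not taken from the original template; only the `lambda_handler` entry point matches the `Handler` setting above):

```python
# Hypothetical allow-list of key prefixes; adjust to your bucket layout.
ALLOWED_PREFIXES = ("L1/Test_1/INPUT/", "L2/Test_2/INPUT/")

def matches_filter(key):
    """Return True if the object key is in one of the watched directories
    and has a .json extension."""
    # str.startswith accepts a tuple, so this checks all prefixes at once.
    return key.startswith(ALLOWED_PREFIXES) and key.endswith(".json")

def lambda_handler(event, context):
    # Keep only the records whose keys pass the filter; ignore the rest.
    matched = [
        record["s3"]["object"]["key"]
        for record in event.get("Records", [])
        if matches_filter(record["s3"]["object"]["key"])
    ]
    for key in matched:
        pass  # real processing (e.g. pushing to Kinesis) would go here
    return matched
```

Invoked with a notification containing keys under `L1/Test_1/INPUT/`, `L3/other/`, and a `.txt` key under `L2/Test_2/INPUT/`, only the first key survives the filter.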
