
I am using the EventBridge rules console to trigger a SageMaker pipeline whenever a new .csv file is uploaded to the input_data folder of my bucket. But the SageMaker pipeline is not being triggered.

Here is what I did :

  1. Created a new folder input_data in the bucket.
  2. Uploaded churn.csv.
  3. Uploaded churn.csv again, which should trigger the SageMaker pipeline.

This is my event pattern

{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["sagemaker-project-p-jsisqxxxxxx"]
    },
    "object": {
      "key": [{
        "prefix": "input_data/"
      }]
    }
  }
}
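For what it's worth, the pattern does seem to match the key I am uploading. Below is a quick local sanity check with a simplified, illustrative re-implementation of EventBridge matching (this helper is a sketch, not AWS code; it only handles literal lists and `prefix` matchers):

```python
# The rule's event pattern, copied from the console.
EVENT_PATTERN = {
    "source": ["aws.s3"],
    "detail-type": ["Object Created"],
    "detail": {
        "bucket": {"name": ["sagemaker-project-p-jsisqxxxxxx"]},
        "object": {"key": [{"prefix": "input_data/"}]},
    },
}


def matches(pattern, event):
    """Simplified EventBridge matching: lists of literals match by
    membership, {"prefix": ...} matches string prefixes, and nested
    dicts are matched recursively."""
    for field, expected in pattern.items():
        if field not in event:
            return False
        value = event[field]
        if isinstance(expected, dict):
            if not (isinstance(value, dict) and matches(expected, value)):
                return False
        else:  # a list of literals and/or matcher objects
            ok = False
            for alt in expected:
                if isinstance(alt, dict) and "prefix" in alt:
                    ok = ok or (isinstance(value, str)
                                and value.startswith(alt["prefix"]))
                else:
                    ok = ok or value == alt
            if not ok:
                return False
    return True


# A sample S3 "Object Created" event, shaped as EventBridge delivers it.
sample_event = {
    "source": "aws.s3",
    "detail-type": "Object Created",
    "detail": {
        "bucket": {"name": "sagemaker-project-p-jsisqxxxxxx"},
        "object": {"key": "input_data/churn.csv"},
    },
}

print(matches(EVENT_PATTERN, sample_event))  # prints True
```

So the pattern itself looks fine for a key like input_data/churn.csv, which makes me suspect the events are never reaching EventBridge at all.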


This is my target:

Can you please help me where I am making the mistake?
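In case it matters: the role attached to the rule's target should be allowed to start the pipeline. A minimal sketch of such an IAM policy statement (the `Resource` ARN below is a placeholder, not my actual pipeline ARN):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sagemaker:StartPipelineExecution",
      "Resource": "arn:aws:sagemaker:*:*:pipeline/*"
    }
  ]
}
```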

Rohit Kumar Singh
  • Can you check the monitoring logs? Maybe it tried to execute but failed due to a permission error. – sid8491 Nov 24 '22 at 11:51
  • You should turn on EventBridge under the properties of the S3 bucket. Have you checked that? – susan097 Jan 06 '23 at 09:44
  • Creating a rule that triggers a SageMaker pipeline when new data is available on S3 needs an extra step. S3 object-level data events are not processed by EventBridge by default. To capture them, it is necessary to configure a CloudTrail trail for those object-level S3 events. See this: https://awstip.com/how-to-automatically-trigger-a-sagemaker-pipeline-using-eventbridge-3b71829a9e5 – susan097 Jan 06 '23 at 10:06
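Following up on the comments above: enabling EventBridge delivery under the bucket's properties can also be done programmatically. A minimal sketch with boto3, assuming AWS credentials are configured and using the bucket name from the question; note that put_bucket_notification_configuration replaces the bucket's entire notification configuration, so merge in any existing notifications first:

```python
import boto3

BUCKET = "sagemaker-project-p-jsisqxxxxxx"  # bucket name from the question

# S3 does not deliver object-level events to EventBridge by default;
# an empty EventBridgeConfiguration block switches that delivery on.
NOTIFICATION_CONFIG = {"EventBridgeConfiguration": {}}


def enable_eventbridge(bucket_name: str) -> None:
    s3 = boto3.client("s3")
    s3.put_bucket_notification_configuration(
        Bucket=bucket_name,
        NotificationConfiguration=NOTIFICATION_CONFIG,
    )


if __name__ == "__main__":
    enable_eventbridge(BUCKET)
```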

0 Answers