
Objective

I'd like to pass event data from Amazon EventBridge directly into an AWS Fargate task. However, it doesn't seem like this is currently possible.

Workaround

As a workaround, I've inserted an extra resource between EventBridge and AWS Fargate. AWS Step Functions allows you to specify ContainerOverrides, where the Environment property lets you configure environment variables that are passed into the Fargate task from the EventBridge event.
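For reference, the relevant state in my state machine looks roughly like the sketch below (the ARNs, container name, and JSONPath expressions are illustrative placeholders, not my actual values; the exact paths depend on the shape of the incoming event):

"RunFargateTask": {
  "Type": "Task",
  "Resource": "arn:aws:states:::ecs:runTask.sync",
  "Parameters": {
    "Cluster": "arn:aws:ecs:us-east-1:123456789012:cluster/examplecluster",
    "TaskDefinition": "arn:aws:ecs:us-east-1:123456789012:task-definition/exampletaskdef",
    "LaunchType": "FARGATE",
    "NetworkConfiguration": {
      "AwsvpcConfiguration": {
        "Subnets": ["subnet-example"],
        "SecurityGroups": ["sg-example"]
      }
    },
    "Overrides": {
      "ContainerOverrides": [
        {
          "Name": "examplecontainer",
          "Environment": [
            { "Name": "S3_BUCKET_NAME", "Value.$": "$.detail.requestParameters.bucketName" },
            { "Name": "S3_OBJECT_KEY", "Value.$": "$.detail.requestParameters.key" }
          ]
        }
      ]
    }
  },
  "End": true
}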

Unfortunately, this workaround increases the solution complexity and cost unnecessarily.

Question: Is there a way to pass event data from EventBridge directly into an AWS Fargate (ECS) task that I am simply unaware of?

  • Can you use SQS or SNS to pass the event from EventBridge to the container? – Marcin Aug 10 '20 at 23:41
  • How would that look exactly? My original event source is S3 `PutObject` and `CompleteMultipartUpload`. An architecture diagram of what you're envisioning would help, if you're up to it. :) –  Aug 11 '20 at 00:14
  • There are many ways of doing this. Your application on Fargate could poll SQS for messages, or you could expose an HTTP endpoint on your app so that SNS can push messages to it. The details are use-case specific, so I can't provide a concrete answer beyond some ideas in the comments. – Marcin Aug 11 '20 at 00:18
  • My question is specifically surrounding a direct forwarding behavior between EventBridge and ECS (Fargate), so intermediaries are out of the question right now. I used Step Functions as a work-around, but as I mentioned in the question, I would prefer to reduce complexity of the solution. –  Aug 12 '20 at 01:06

2 Answers


To pass data from an EventBridge event to an ECS task (e.g. one with the FARGATE launch type), you can use an input transformer. For example, let's say we have an S3 bucket configured to send all event notifications to EventBridge, and an EventBridge rule that looks like this:

{
  "detail": {
    "bucket": {
      "name": ["mybucket"]
    }
  },
  "detail-type": ["Object Created"],
  "source": ["aws.s3"]
}

Now, let's say we would like to pass the bucket name, object key, and object version ID to our ECS task running on Fargate. You can create an aws_cloudwatch_event_target resource in Terraform with an input transformer, as shown below.

resource "aws_cloudwatch_event_target" "EventBridgeECSTaskTarget"{
  target_id = "EventBridgeECSTaskTarget"
  rule = aws_cloudwatch_event_rule.myeventbridgerule.name
  arn = "arn:aws:ecs:us-east-1:123456789012:cluster/myecscluster"
  role_arn = aws_iam_role.EventBridgeRuleInvokeECSTask.arn

  ecs_target {
    task_count          = 1
    task_definition_arn = "arn:aws:ecs:us-east-1:123456789012:task-definition/mytaskdefinition"
    launch_type = "FARGATE"
    
    network_configuration {
      subnets         = ["subnet-1","subnet-2","subnet-3"]
      security_groups = ["sg-group-id"]
   }
  }

  input_transformer {
    input_paths = {
      bucketname      = "$.detail.bucket.name"
      objectkey       = "$.detail.object.key"
      objectversionid = "$.detail.object.version-id"
    }
    input_template = <<EOF
{
  "containerOverrides": [
    {
      "name": "containername",
      "environment" : [
        {
          "name" : "S3_BUCKET_NAME",
          "value" : <bucketname>
        },
        {
          "name" : "S3_OBJECT_KEY",
          "value" : <objectkey>
        },
        {
          "name" : "S3_OBJ_VERSION_ID",
          "value": <objectversionid>
        }
      ]
    }
  ]
}
EOF
  }
}

Once your ECS task is running, you can access these environment variables to see which bucket the object was created in, what the object key and version are, and then do a GetObject.

For example, in Go we can do this as follows (snippets only, not including imports and setup, but you get the idea):

filename := aws.String(os.Getenv("S3_OBJECT_KEY"))
bucketname := aws.String(os.Getenv("S3_BUCKET_NAME"))
versionId := aws.String(os.Getenv("S3_OBJ_VERSION_ID"))

// You can print and verify the values in CloudWatch

// Prepare the S3 GetObjectInput
s3goi := &s3.GetObjectInput{
    Bucket:    bucketname,
    Key:       filename,
    VersionId: versionId,
}

s3goo, err := s3svc.GetObject(ctx, s3goi)
if err != nil {
    log.Fatalf("Error retrieving object: %v", err)
}

b, err := ioutil.ReadAll(s3goo.Body)
if err != nil {
    log.Fatalf("Error reading file: %v", err)
}
– Aakash

There's currently no direct invocation between EventBridge and Fargate. You can find the list of supported targets at https://docs.aws.amazon.com/eventbridge/latest/userguide/eventbridge-targets.html

The workaround is to use an intermediary that supports calling Fargate (like Step Functions), or to send the message to compute (like Lambda [the irony]) before sending it downstream.
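As a rough sketch of the Lambda route (not exact code; the cluster, task definition, subnet, security group, and container names are placeholders, and the event shape assumes S3 notifications delivered via EventBridge as in the other answer), the handler can receive the event and start the Fargate task itself via ecs:RunTask, injecting the event fields as environment variables:

// Sketch: Lambda receives the EventBridge S3 event and launches the Fargate task,
// passing the bucket name and object key through container overrides.
package main

import (
    "context"
    "log"

    "github.com/aws/aws-lambda-go/lambda"
    "github.com/aws/aws-sdk-go-v2/aws"
    "github.com/aws/aws-sdk-go-v2/config"
    "github.com/aws/aws-sdk-go-v2/service/ecs"
    "github.com/aws/aws-sdk-go-v2/service/ecs/types"
)

// s3Event captures only the fields we need from the EventBridge S3 notification.
type s3Event struct {
    Detail struct {
        Bucket struct {
            Name string `json:"name"`
        } `json:"bucket"`
        Object struct {
            Key string `json:"key"`
        } `json:"object"`
    } `json:"detail"`
}

func handler(ctx context.Context, evt s3Event) error {
    cfg, err := config.LoadDefaultConfig(ctx)
    if err != nil {
        return err
    }
    client := ecs.NewFromConfig(cfg)

    // Start the Fargate task and pass the event data as environment variables.
    _, err = client.RunTask(ctx, &ecs.RunTaskInput{
        Cluster:        aws.String("myecscluster"),      // placeholder
        TaskDefinition: aws.String("mytaskdefinition"),   // placeholder
        LaunchType:     types.LaunchTypeFargate,
        NetworkConfiguration: &types.NetworkConfiguration{
            AwsvpcConfiguration: &types.AwsVpcConfiguration{
                Subnets:        []string{"subnet-1"},     // placeholder
                SecurityGroups: []string{"sg-group-id"},  // placeholder
            },
        },
        Overrides: &types.TaskOverride{
            ContainerOverrides: []types.ContainerOverride{
                {
                    Name: aws.String("containername"),    // placeholder
                    Environment: []types.KeyValuePair{
                        {Name: aws.String("S3_BUCKET_NAME"), Value: aws.String(evt.Detail.Bucket.Name)},
                        {Name: aws.String("S3_OBJECT_KEY"), Value: aws.String(evt.Detail.Object.Key)},
                    },
                },
            },
        },
    })
    if err != nil {
        log.Printf("RunTask failed: %v", err)
    }
    return err
}

func main() {
    lambda.Start(handler)
}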

– blr

  • It seems to be possible: https://dev.to/maduranga95/run-a-scheduled-task-using-aws-fargate-with-ci-cd-3pej, https://docs.aws.amazon.com/AmazonECS/latest/userguide/cloudwatch_event_stream.html ("When using the Fargate launch type...") – The Onin Jul 29 '21 at 02:43