I want to create a separate custom metric for each ECS task that runs when Auto Scaling automatically creates more tasks. Is there a way to implement this easily with CloudFormation, or maybe something similar? For example, automatically creating a separate log group with a separate metric filter for each task, or creating only a metric filter and assigning it to the log stream of the created task.
1 Answer
If the custom metric is produced by your task, then you can just publish the metrics to CloudWatch directly: https://aws.amazon.com/premiumsupport/knowledge-center/cloudwatch-custom-metrics/
You would want to use the task identifier as a dimension. You can get it from the task metadata endpoint: https://docs.aws.amazon.com/AmazonECS/latest/developerguide/task-metadata-endpoint.html
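For example, inside the container you could read the task ARN roughly like this (just a sketch, not something I have tested here; it assumes the v4 metadata endpoint, which the ECS agent exposes through the `ECS_CONTAINER_METADATA_URI_V4` environment variable, and that `curl` and `jq` are installed in the image):

```bash
# Sketch: read this task's ARN from the ECS task metadata endpoint (v4).
# Assumes ECS_CONTAINER_METADATA_URI_V4 is set by the ECS agent and that
# curl and jq are available in the container image.
TaskARN=$(curl -s "${ECS_CONTAINER_METADATA_URI_V4}/task" | jq -r '.TaskARN')
echo "Task ARN: ${TaskARN}"
```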
Putting that together, you would end up with something like the command below. Note that I haven't tried taskArn as a dimension, which might be a concern, but AWS already publishes CloudWatch Events with this field.
aws cloudwatch put-metric-data --metric-name petro_custom_1 --dimensions taskArn=$TaskARN --namespace "Custom" --value $Value
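
Tying the two steps together, a minimal publishing loop running inside the task might look like the sketch below. The metric name `petro_custom_1`, the `Custom` namespace, and the way `$Value` is computed are placeholders you would replace, and the task role needs permission for `cloudwatch:PutMetricData`:

```bash
#!/bin/sh
# Sketch: periodically publish a per-task custom metric, using the task ARN
# as a dimension so each task gets its own metric series.
TaskARN=$(curl -s "${ECS_CONTAINER_METADATA_URI_V4}/task" | jq -r '.TaskARN')

while true; do
  # Placeholder: replace with however your application measures the value.
  Value=42

  aws cloudwatch put-metric-data \
    --metric-name petro_custom_1 \
    --dimensions taskArn="${TaskARN}" \
    --namespace "Custom" \
    --value "${Value}"

  sleep 60
done
```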

sfblackl
- Thank you for the info! But I am also concerned about [this question](https://stackoverflow.com/questions/52257298/calling-api-or-logging-into-the-console). What do you think about that? Is it more **effective** to publish data directly, or to log the data and then pick it up using metric filters? – Ruslan Plastun Sep 16 '18 at 13:40
- It depends on what you want to consume the data with. We actually feed our logs to stdout, the Docker CloudWatch log driver, Kinesis, and Logstash, and then use Elasticsearch for analysis and aggregation. Per request we have a special line with all the metadata that goes to a different index pattern, so we can retain that data longer than our raw log entries. So that option does work, but so does sending to CloudWatch metrics. Do you mind updating your question with what you are trying to solve? – sfblackl Sep 16 '18 at 23:21