
I am planning to copy AWS CloudWatch Logs to ELK and want to use a Kibana dashboard to visualise the logs.

One option is to stream the logs from CloudWatch to ELK.

https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_ES_Stream.html

But I feel this will involve extensive execution of Lambda functions and might not be a cost-effective option.

Is there any other cost-effective way to copy the logs from CloudWatch, perhaps to S3 first and then to ELK?

I am OK if the logs are not real-time; a delay of 15 minutes or even one hour is acceptable.

But I am looking for a cost-effective solution.

By the way, what is the best way to purge the CloudWatch logs periodically (say, after one week)?

Dattatray

1 Answer


Firstly, you can set a retention policy on your CloudWatch log groups. By default retention is indefinite, but you can set it to, say, 7 days as you mentioned, and CloudWatch will then purge older log events automatically.
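If you prefer to script this, a minimal sketch with boto3 looks like the following (the log group name is just a placeholder):

```python
# Set a 7-day retention policy on a CloudWatch log group; events older than
# the retention period are deleted by CloudWatch automatically.
import boto3

logs = boto3.client("logs")
logs.put_retention_policy(
    logGroupName="/aws/lambda/my-app",  # placeholder log group name
    retentionInDays=7,
)
```

The same setting is available in the console on the log group itself, or via `aws logs put-retention-policy` on the CLI.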

Secondly, in my opinion Lambda would still be a cost-effective option for ingesting the logs into your Elasticsearch domain, since the function only runs when the subscription filter delivers new log data.
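The function itself does not need to be big. Below is a minimal sketch of such a handler, assuming a placeholder endpoint and index name; request signing and error handling for a production Amazon ES domain are left out:

```python
# Sketch of a CloudWatch Logs subscription handler that bulk-indexes events
# into Elasticsearch. ES_ENDPOINT and the index name are placeholders.
import base64
import gzip
import json
import os
import urllib.request

ES_ENDPOINT = os.environ.get("ES_ENDPOINT", "https://my-es-domain.example.com")  # placeholder


def handler(event, context):
    # CloudWatch Logs delivers the payload base64-encoded and gzip-compressed.
    payload = json.loads(gzip.decompress(base64.b64decode(event["awslogs"]["data"])))

    # Build an Elasticsearch _bulk body: one action line plus one document per log event.
    lines = []
    for log_event in payload["logEvents"]:
        lines.append(json.dumps({"index": {"_index": "cwl-logs"}}))
        lines.append(json.dumps({
            "@timestamp": log_event["timestamp"],
            "message": log_event["message"],
            "log_group": payload["logGroup"],
            "log_stream": payload["logStream"],
        }))
    body = ("\n".join(lines) + "\n").encode("utf-8")

    req = urllib.request.Request(
        ES_ENDPOINT + "/_bulk",
        data=body,
        headers={"Content-Type": "application/x-ndjson"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return {"status": resp.status, "events": len(payload["logEvents"])}
```

With a subscription filter on the log group, CloudWatch only invokes the function when new log events arrive, so you pay per delivery rather than for an always-on instance.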

Other options I can think of:

  • an EC2 instance running Logstash that reads the logs from CloudWatch and ingests them into Elasticsearch (a rough sketch of the same idea follows below).
  • you could probably extend the above with AWS Batch to run it on a schedule.
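Here is a rough sketch of what such a periodic puller would do, whether it runs inside Logstash, a cron job on the EC2 instance, or an AWS Batch job: read recent events from CloudWatch Logs and bulk-index them into Elasticsearch. The log group, index name and endpoint are placeholders, and authentication details are simplified:

```python
# Pull the last N minutes of events from a CloudWatch log group and
# bulk-index them into Elasticsearch. Placeholders: LOG_GROUP, ES_ENDPOINT.
import json
import time
import urllib.request

import boto3

LOG_GROUP = "/aws/lambda/my-app"                   # placeholder
ES_ENDPOINT = "https://my-es-domain.example.com"   # placeholder


def pull_and_index(minutes=15):
    logs = boto3.client("logs")
    start = int((time.time() - minutes * 60) * 1000)  # CloudWatch uses epoch millis

    lines = []
    paginator = logs.get_paginator("filter_log_events")
    for page in paginator.paginate(logGroupName=LOG_GROUP, startTime=start):
        for event in page["events"]:
            lines.append(json.dumps({"index": {"_index": "cwl-logs"}}))
            lines.append(json.dumps({
                "@timestamp": event["timestamp"],
                "message": event["message"],
                "log_stream": event["logStreamName"],
            }))

    if not lines:
        return 0

    req = urllib.request.Request(
        ES_ENDPOINT + "/_bulk",
        data=("\n".join(lines) + "\n").encode("utf-8"),
        headers={"Content-Type": "application/x-ndjson"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()
    return len(lines) // 2
```

Since you can tolerate a 15-60 minute delay, running something like this on a schedule would also meet your requirement; the trade-off versus Lambda is paying for an instance (or Batch job) instead of per-invocation.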
Rishikesh Darandale