
I have a use-case where QLDB table contains this :

customer-id | customer-name | customer-item-count

I need to publish metrics per customer-id to CloudWatch every 5 minutes, and this data is available in the QLDB table.

Is there a way to do this?

QLDB has export jobs to export content to S3. Is there tooling to dump the contents to CloudWatch?

dashuser

1 Answer


Many customers use periodic S3 exports (or the Kinesis integration, if you signed up for the preview) to keep some sort of analytics DB up to date. For example, you might copy data into Redshift or Elasticsearch every minute. I don't have code examples to share with you right now. The tricky part is getting the data into the right shape for the destination. For example, QLDB supports nested content while Redshift does not.

Once the data is available and aggregated in the way you wish to query it, it should be a simple matter to run a report and write the results into CloudWatch.
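The last step — writing the aggregated results into CloudWatch — can be sketched with boto3's `put_metric_data`. This is a minimal sketch, not official tooling: the metric name `CustomerItemCount`, the namespace `QLDB/Customers`, and the row field names (taken from the table columns in the question) are all assumptions you would adapt to your own schema. Reading the rows out of QLDB (via the driver or an S3 export) is left out.

```python
def build_metric_data(rows):
    """Turn QLDB rows (dicts keyed by the question's column names)
    into CloudWatch MetricData entries, one datum per customer,
    dimensioned by customer-id."""
    return [
        {
            "MetricName": "CustomerItemCount",  # assumed metric name
            "Dimensions": [
                {"Name": "CustomerId", "Value": str(row["customer-id"])}
            ],
            "Value": float(row["customer-item-count"]),
            "Unit": "Count",
        }
        for row in rows
    ]


def publish(rows, namespace="QLDB/Customers"):
    """Publish one datum per customer. Run this every 5 minutes,
    e.g. from a scheduled Lambda."""
    import boto3  # imported here so build_metric_data stays testable offline

    cloudwatch = boto3.client("cloudwatch")
    data = build_metric_data(rows)
    # PutMetricData caps the batch size per call, so chunk the datums.
    for i in range(0, len(data), 20):
        cloudwatch.put_metric_data(
            Namespace=namespace, MetricData=data[i : i + 20]
        )
```

A scheduled EventBridge rule triggering a Lambda that queries QLDB and calls `publish` would cover the 5-minute cadence without waiting for the streaming feature.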

Marc
  • Oh wow! Didn't know about the Kinesis preview! That's good to know. Any idea when that will be public? – dashuser Jan 06 '20 at 23:49
  • Thanks for the reply! We don't intend to store any nested content in QLDB for now. So we are only looking to transfer the entire data that we have in QLDB every 5 minutes to CloudWatch; I assumed there might be some tooling around this particular use-case? :) – dashuser Jan 06 '20 at 23:51
  • I'm sorry, there isn't yet. There are some sample apps in the works to demonstrate how to use streaming, but nothing for CloudWatch. Those samples might be helpful though (get you most of the way there) and will be released alongside the streams GA. – Marc Jan 07 '20 at 18:43