
One alternative is to write a Lambda for this, where I can do:

def lambda_handler(event, context):
    file = scan_table_and_write_csv()   # full table Scan
    s3_client.upload_file(file, bucket, key)
    return s3_client.generate_presigned_url(...)

But this requires a full table Scan, and the other export options I found are mostly manual or non-programmatic. The Lambda code will be triggered on a button click. Is there a better way to do this?

I also have the idea of using AWS Data Pipeline for this use case, but can we get a pre-signed URL for the CSV in that case? I am not able to decide which approach I should use.

Option 1: Use Lambda to do a Scan and upload the CSV.

Option 2: Trigger Data Pipeline through code and generate a pre-signed URL from S3 once the data is uploaded.
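For reference, here is a minimal sketch of Option 1 using boto3. The table name, bucket, and key are placeholder assumptions; the Scan is paginated because a single Scan call returns at most 1 MB of data. The boto3 imports are done lazily inside the AWS-facing functions so the CSV helper can be used on its own.

```python
import csv
import io

TABLE_NAME = "my-table"         # placeholder table name
BUCKET = "my-export-bucket"     # placeholder bucket
KEY = "exports/table-dump.csv"  # placeholder object key

def items_to_csv(items):
    """Serialize a list of dict items into CSV text, using the union
    of all keys (sorted) as the header row."""
    fieldnames = sorted({k for item in items for k in item})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(items)
    return buf.getvalue()

def scan_all(table):
    """Paginate through Scan results until LastEvaluatedKey is absent."""
    items = []
    kwargs = {}
    while True:
        resp = table.scan(**kwargs)
        items.extend(resp["Items"])
        if "LastEvaluatedKey" not in resp:
            return items
        kwargs["ExclusiveStartKey"] = resp["LastEvaluatedKey"]

def lambda_handler(event, context):
    import boto3  # available by default in the Lambda runtime
    table = boto3.resource("dynamodb").Table(TABLE_NAME)
    s3 = boto3.client("s3")
    body = items_to_csv(scan_all(table))
    s3.put_object(Bucket=BUCKET, Key=KEY, Body=body.encode("utf-8"))
    # Pre-signed GET URL, valid for one hour
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": KEY},
        ExpiresIn=3600,
    )
    return {"statusCode": 200, "body": url}
```

Note that a Scan reads the entire table and consumes read capacity accordingly, which is part of the trade-off between the two options.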

Resources I have looked at:

DynamoDB export to CSV

Data Pipeline - DynamoDB export

Exporting and Importing DynamoDB Data Using AWS Data Pipeline


0 Answers