
When importing from S3 to DynamoDB, does this count towards provisioned write throughput?

I have a service that is only read from, except for batch updates from a multi-gigabyte file in S3. We don't want to pay for provisioned writes all month, and scaling from 0 writes to several million could take a while, given the AWS policy that a provisioned rate can only be doubled in a single update.

DeejUK

1 Answer


Yes. The EMR integration relies on the same API as any client application, so it is subject to the same throughput policy.

Two minor corrections on the numbers:

  • minimum throughput = 1 (not 0)
  • maximum throughput = 10,000 (not > 1,000,000)

By the way, a large scaling operation can easily be automated, provided that each step only doubles the current value; each update takes just a couple of minutes to complete (see the sketch below). You might also consider storing an incremental diff instead of the full multi-gigabyte file in S3; it would save a lot of writes.
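
A minimal sketch of that doubling automation, assuming boto3, a hypothetical table name my-table, and a target within the 10,000 cap above:

    import time
    import boto3

    dynamodb = boto3.client("dynamodb")

    def scale_writes(table, target):
        """Double provisioned write capacity step by step until `target` is reached."""
        while True:
            desc = dynamodb.describe_table(TableName=table)["Table"]
            throughput = desc["ProvisionedThroughput"]
            current = throughput["WriteCapacityUnits"]
            if current >= target:
                return
            # Each update may at most double the current provisioned value.
            next_step = min(current * 2, target)
            dynamodb.update_table(
                TableName=table,
                ProvisionedThroughput={
                    "ReadCapacityUnits": throughput["ReadCapacityUnits"],
                    "WriteCapacityUnits": next_step,
                },
            )
            # Wait for the table to return to ACTIVE before the next doubling;
            # the table_exists waiter polls until TableStatus is ACTIVE.
            dynamodb.get_waiter("table_exists").wait(TableName=table)
            time.sleep(5)

    scale_writes("my-table", 10000)  # hypothetical table name and target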

The official DynamoDB optimization guide can give you some useful hints on how to optimize your import.

yadutaf
  • Is there a rate limit on how many times a day you can increase throughput? – DeejUK Sep 07 '12 at 15:27
  • Nope. But you can only have one table in the "UPDATING" state at a given time – yadutaf Sep 07 '12 at 15:36
  • Another suggestion from an AWS architect was to create a new table with the desired high write throughput, import, change the application from the old to the new table, and then decrease write provisioning (sketched below). – DeejUK Sep 09 '12 at 11:59
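
A minimal sketch of that swap-table approach, assuming boto3, a hypothetical table my-table-v2 with a string hash key id, and illustrative capacity values:

    import boto3

    dynamodb = boto3.client("dynamodb")

    # Create the new table provisioned high for the bulk import.
    dynamodb.create_table(
        TableName="my-table-v2",  # hypothetical name
        AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
        KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
        ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 10000},
    )
    dynamodb.get_waiter("table_exists").wait(TableName="my-table-v2")

    # ... run the EMR import into my-table-v2, then repoint the application ...

    # Once traffic has moved over, drop write capacity back down.
    dynamodb.update_table(
        TableName="my-table-v2",
        ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 1},
    )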