I have a Grails application. I'm using Amazon DynamoDB for a specific requirement; the table is read, and entries are added, by a different application. Now I need to copy all the data from the DynamoDB table into a PostgreSQL table. There are over 10,000 records in DynamoDB, but the table's provisioned throughput is:
Read capacity units : 100
Write capacity units : 100
In BuildConfig.groovy I have defined the plugin
compile ":dynamodb:0.1.1"
In Config.groovy I have the following configuration:
grails {
    dynamodb {
        accessKey = '***'
        secretKey = '***'
        disableDrop = true
        dbCreate = 'create'
    }
}
The code I have looks something like this:
class book {
    Long id
    String author
    String name
    Date publishedDate

    static constraints = {
    }

    static mapWith = "dynamodb"

    static mapping = {
        table 'book'
        throughput read: 100
    }
}
When I try something like book.findAll() I get the following error
AmazonClientException: Unable to unmarshall response (Connection reset)
And when I tried to reduce the number of records by querying with book.findAllByAuthor() (which would still return over 1,000 records), I get the following error:
Caused by ProvisionedThroughputExceededException: Status Code: 400, AWS Service: AmazonDynamoDB, AWS Request ID: ***, AWS Error Code: ProvisionedThroughputExceededException, AWS Error Message: The level of configured provisioned throughput for the table was exceeded. Consider increasing your provisioning level with the UpdateTable API.
I need to fetch all the records from DynamoDB despite the throughput restriction and save them into a Postgres table. Is there a way to do this?
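One idea I've had is to read the table a page at a time and retry a page with exponential backoff whenever the throughput exception comes back. A minimal sketch of the retry part (plain Java; the `Supplier` is my own placeholder for the real scan-page call, and I'm modelling `ProvisionedThroughputExceededException` as a plain `RuntimeException` here):

```java
import java.util.function.Supplier;

public class BackoffRetry {
    // Retry one page with exponential backoff. `page` stands in for the
    // real DynamoDB scan call; any RuntimeException is treated as a
    // throttling error for the purposes of this sketch.
    public static <T> T withBackoff(Supplier<T> page, int maxRetries, long baseDelayMs) {
        long delayMs = baseDelayMs;
        for (int attempt = 0; ; attempt++) {
            try {
                return page.get();
            } catch (RuntimeException e) {
                if (attempt >= maxRetries) {
                    throw e; // give up after maxRetries failed attempts
                }
                try {
                    Thread.sleep(delayMs); // wait before retrying the page
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    throw e;
                }
                delayMs *= 2; // double the wait each time we're throttled
            }
        }
    }
}
```

Each page's items could then be saved into the Postgres-backed table before the next page is requested, so only a small batch is ever in memory. But I'm not sure whether this is the right approach.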
I'm very new to this area; thanks in advance for the help.
After some research I came across Google Guava. But even with Guava's RateLimiter, there isn't a fixed number of requests I would need to send, or a known duration the job would take, so I'm looking for a solution that fits this requirement.
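To make that concern concrete: as I understand it, a rate limiter paces by permits per second rather than by total request count, so not knowing how many requests there will be may not actually matter. Here is a minimal deterministic token bucket I sketched (plain Java, standing in for Guava's RateLimiter so no extra dependency is needed; the names and the idea of spending one permit per consumed capacity unit are my own assumptions):

```java
public class TokenBucket {
    // Refills ratePerSec permits per second; acquireMillis() reports how
    // long the caller should sleep before the next read, regardless of
    // how many requests will be made in total.
    private final double ratePerSec;
    private double tokens;
    private long lastNanos;

    public TokenBucket(double ratePerSec, long nowNanos) {
        this.ratePerSec = ratePerSec;
        this.tokens = ratePerSec; // allow one second's worth of burst
        this.lastNanos = nowNanos;
    }

    // Spend `permits` tokens (e.g. the capacity a scan page consumed)
    // and return the milliseconds to wait before the next request.
    public long acquireMillis(double permits, long nowNanos) {
        tokens += (nowNanos - lastNanos) / 1e9 * ratePerSec; // refill
        if (tokens > ratePerSec) tokens = ratePerSec;        // cap the burst
        lastNanos = nowNanos;
        tokens -= permits;
        if (tokens >= 0) return 0;                    // still under the limit
        return (long) (-tokens / ratePerSec * 1000);  // wait out the deficit
    }
}
```

In the migration loop I imagine calling acquireMillis with the capacity each scan page consumed and sleeping for the returned time before issuing the next request, so the 100-unit read limit is never exceeded. Is this the right direction, or is there a more standard way?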