Questions tagged [amazon-dynamodb-streams]

Use this tag for questions related to DynamoDB Streams, which provides a time-ordered sequence of item-level changes in any DynamoDB table. The changes are de-duplicated and stored for 24 hours. This capability enables you to extend the power of DynamoDB with cross-region replication, continuous analytics with Redshift integration, change notifications, and many other scenarios.

Useful links:

  1. Documentation
  2. Use cases
393 questions
0 votes, 1 answer

DynamoDB stream directly into Elasticsearch without another middle layer

Can we stream DynamoDB data directly to the AWS Elasticsearch Service without using Logstash, since Logstash would incur extra cost? In all the articles I have read online, this is achieved either with Logstash or with Lambda.
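For reference, a minimal sketch of the Lambda route (the non-Logstash option the articles describe). The endpoint, index name, and key attribute are hypothetical, and it assumes a domain that accepts unsigned requests; a locked-down Amazon ES domain would additionally need SigV4 signing.

    # Minimal sketch: Lambda triggered by a DynamoDB stream, indexing each
    # changed item into Elasticsearch over plain HTTP.
    # Assumptions (not from the question): ES_ENDPOINT allows unsigned access,
    # the index name is made up, and the table's key attribute is "id".
    import json
    import urllib.request

    from boto3.dynamodb.types import TypeDeserializer

    ES_ENDPOINT = "https://my-es-domain.example.com"  # hypothetical
    deserializer = TypeDeserializer()

    def handler(event, context):
        for record in event["Records"]:
            if record["eventName"] in ("INSERT", "MODIFY"):
                image = record["dynamodb"]["NewImage"]
                # Convert DynamoDB's typed attribute values into plain Python types.
                doc = {k: deserializer.deserialize(v) for k, v in image.items()}
                req = urllib.request.Request(
                    f"{ES_ENDPOINT}/items/_doc/{doc['id']}",
                    data=json.dumps(doc, default=str).encode("utf-8"),
                    headers={"Content-Type": "application/json"},
                    method="PUT",
                )
                urllib.request.urlopen(req)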
0 votes, 1 answer

Is it possible to do multiple integration requests in one API Gateway request?

The challenge: I would like to build a simple and fast (< 50 ms) API with API Gateway that works as an intelligent cache. The request to the API should batch-fetch some items from DynamoDB based on their keys. However, if one or more items are not…
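A sketch of the batch-fetch step behind such an API, using boto3; the table name ("cache-table") and key attribute ("pk") are placeholders, not from the question.

    # Sketch of the batch fetch described above, using boto3.
    import boto3

    dynamodb = boto3.client("dynamodb")

    def batch_fetch(keys):
        response = dynamodb.batch_get_item(
            RequestItems={
                "cache-table": {"Keys": [{"pk": {"S": k}} for k in keys]}
            }
        )
        found = response["Responses"].get("cache-table", [])
        # Keys DynamoDB could not return in this call (throttling/size limits)
        # come back under UnprocessedKeys and would need a retry.
        unprocessed = response.get("UnprocessedKeys", {})
        return found, unprocessed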
0 votes, 1 answer

DescribeStreamOutcome reported as "incomplete type"

I cloned aws-sdk-cpp from GitHub and managed to build it without problems (tests passed as well). I did not run "make install". I wanted to compile only the dynamodbstreams part of the SDK, so I added -DBUILD_ONLY="dynamodbstreams" to the…
0 votes, 1 answer

Deserializing Primary Key Value with Underscore: Unexpected character Expected space separating root-level values

In Java, using the Jackson ObjectMapper, I'm trying to deserialize a DynamoDB object read from a DynamoDB stream. I first call record.getDynamodb().getNewImage().get("primaryKey").getS().toString() to get the primaryKey value of "1_12345"…
Tibberzz
  • 541
  • 1
  • 10
  • 23
0 votes, 2 answers

How to delete items containing specific string matching in DynamoDB table?

I want to delete items matching a specific string in the table. For example, Table1 has Foo123Bar and Foo345Bar in the name column. I want to delete those two records.
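DynamoDB has no "delete where" operation, so the usual pattern is a filtered Scan followed by per-item deletes. A rough boto3 sketch, assuming name is the table's partition key (adjust Key= if your key schema differs):

    import boto3
    from boto3.dynamodb.conditions import Attr

    table = boto3.resource("dynamodb").Table("Table1")

    def delete_matching(substring):
        scan_kwargs = {"FilterExpression": Attr("name").contains(substring)}
        while True:
            page = table.scan(**scan_kwargs)
            for item in page["Items"]:
                table.delete_item(Key={"name": item["name"]})
            if "LastEvaluatedKey" not in page:
                break
            # Continue paging until the whole table has been scanned.
            scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]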
Nathon
  • 165
  • 1
  • 4
  • 13
0 votes, 1 answer

DynamoDB stream order of records

I am populating records in DynamoDB in the following sequence: A11, A12, A13, A14, A15, A21, A22, A23, A24, A25, A31, A32, A33, A34, A35. Records with the same prefix (Ai) have the same partition key but different sort keys. Assume that all the records…
0 votes, 2 answers

How to use Apache Streaming with DynamoDB Stream

We have a requirement wherein we log events in a DynamoDB table whenever an ad is served to the end user. There are more than 250 writes per second into this DynamoDB table. We would want to aggregate and move this data to Redshift for…
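For reference, a sketch of one common non-Spark route: a Lambda on the stream batches the new images into Kinesis Firehose, and Firehose delivers to Redshift via COPY. The delivery stream name is a placeholder and batching/error handling is minimal.

    import json
    import boto3
    from boto3.dynamodb.types import TypeDeserializer

    firehose = boto3.client("firehose")
    deserializer = TypeDeserializer()

    def handler(event, context):
        batch = []
        for record in event["Records"]:
            if record["eventName"] not in ("INSERT", "MODIFY"):
                continue
            image = record["dynamodb"]["NewImage"]
            doc = {k: deserializer.deserialize(v) for k, v in image.items()}
            # Newline-delimited JSON suits a Redshift COPY behind Firehose.
            batch.append({"Data": (json.dumps(doc, default=str) + "\n").encode()})
        if batch:
            # put_record_batch accepts up to 500 records per call.
            firehose.put_record_batch(DeliveryStreamName="ad-events", Records=batch)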
0 votes, 2 answers

How to performance-tune my DynamoDB table with DynamoDBAutoGeneratedKey as hash key, as the PutRequest gets slower with each insert

I am using DynamoDB tables to save the transactional data for my API requests. I maintain two tables: 1. schedule, with SId as hash key; 2. summary, with DynamoDBAutoGeneratedKey (UUID) as hash key and SId as an attribute. schedule…
MG_7
  • 61
  • 3
  • 10
0 votes, 0 answers

How to achieve idempotent lambda function?

I have a pipeline like this: table 1 (DynamoDB) -> AWS Lambda -> table 2 (DynamoDB). Whenever any update happens in table 1, the Lambda gets triggered. The Lambda basically batch-reads (1000 records) from table 1, then performs a batch…
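One common way to get idempotency here is a conditional write into table 2 keyed on the stream record's eventID, so replays become no-ops. A sketch with boto3; the table and attribute names are placeholders.

    import boto3
    from botocore.exceptions import ClientError

    table2 = boto3.resource("dynamodb").Table("table2")

    def write_once(item, event_id):
        try:
            table2.put_item(
                Item={**item, "last_event_id": event_id},
                # Only write if this stream event has not already been applied.
                ConditionExpression=(
                    "attribute_not_exists(last_event_id) OR last_event_id <> :e"
                ),
                ExpressionAttributeValues={":e": event_id},
            )
        except ClientError as err:
            if err.response["Error"]["Code"] != "ConditionalCheckFailedException":
                raise  # anything other than "already applied" is a real error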
0 votes, 0 answers

DynamoDB results do not match the object model

I have an object model like below. 'use strict'; var crypto = require('crypto'); var dynamoose = require('../../config/database'); var Schema = dynamoose.Schema; var NavSchema = new Schema({ client_id: { type: String, hashKey:…
0 votes, 0 answers

What is the best way to integrate DynamoDB stream with CloudSearch?

I'd like to achieve near-real-time search for a document service, and here is my idea: I plan to use DynamoDB as my primary document store; then, whenever a document is updated, an event is created in the DynamoDB stream. I'd like to ask…
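A sketch of that idea with boto3: a Lambda subscribed to the stream converts each change into a CloudSearch SDF batch and uploads it to the domain's document endpoint. The endpoint URL and the "id" key attribute are placeholders.

    import json
    import boto3
    from boto3.dynamodb.types import TypeDeserializer

    cloudsearch = boto3.client(
        "cloudsearchdomain",
        endpoint_url="https://doc-my-domain.us-east-1.cloudsearch.amazonaws.com",  # hypothetical
    )
    deserializer = TypeDeserializer()

    def handler(event, context):
        batch = []
        for record in event["Records"]:
            key = deserializer.deserialize({"M": record["dynamodb"]["Keys"]})
            if record["eventName"] == "REMOVE":
                batch.append({"type": "delete", "id": key["id"]})
            else:
                fields = deserializer.deserialize({"M": record["dynamodb"]["NewImage"]})
                batch.append({"type": "add", "id": key["id"], "fields": fields})
        # Upload one SDF batch per Lambda invocation.
        cloudsearch.upload_documents(
            documents=json.dumps(batch, default=str), contentType="application/json"
        )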
0 votes, 1 answer

Real value not recognized when sending JSON data from Kinesis Firehose to Elasticsearch

I have an issue in Kibana with a field value, explained in the following lines. I'll try to describe the situation: I'm sending DynamoDB streams to Lambda, then to Kinesis Firehose, and finally from Firehose to Elasticsearch. I'm using Kibana to…
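A frequent cause of this symptom is that numbers in a DynamoDB stream image arrive as strings ({"N": "42"}), so Elasticsearch maps the field as text unless the Lambda converts the image before handing it to Firehose. A minimal conversion sketch (an assumption, not taken from the question):

    import json
    from decimal import Decimal
    from boto3.dynamodb.types import TypeDeserializer

    deserializer = TypeDeserializer()

    def to_plain_json(new_image):
        # Turn the typed stream image into plain Python values.
        doc = {k: deserializer.deserialize(v) for k, v in new_image.items()}
        # Decimals become real JSON numbers, so the ES mapping sees them as numeric.
        return json.dumps(
            doc, default=lambda x: float(x) if isinstance(x, Decimal) else str(x)
        )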
JosepB
  • 2,205
  • 4
  • 20
  • 40
0 votes, 1 answer

Accessing a DynamoDBStreams stream with the Kinesis SDK

Is it possible to access streams defined by the DynamoDBStreams service with the Kinesis SDK? For example, using C++, is it possible to get info on a stream added by, say, $ aws dynamodb update-table --stream-specification…
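For reference, DynamoDB Streams sits behind its own endpoint, so it is normally read with the dedicated streams client (or the Kinesis Adapter in Java) rather than the plain Kinesis SDK. The question is about C++, but a boto3 sketch shows the same call pattern for walking one shard; the table name is a placeholder.

    import boto3

    streams = boto3.client("dynamodbstreams")

    # Find the stream ARN for the table, then read one shard from the start.
    stream_arn = streams.list_streams(TableName="my-table")["Streams"][0]["StreamArn"]
    shard = streams.describe_stream(StreamArn=stream_arn)["StreamDescription"]["Shards"][0]
    iterator = streams.get_shard_iterator(
        StreamArn=stream_arn,
        ShardId=shard["ShardId"],
        ShardIteratorType="TRIM_HORIZON",
    )["ShardIterator"]
    records = streams.get_records(ShardIterator=iterator)["Records"]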
0 votes, 1 answer

getApproximateCreationDateTime() API not available with DynamoDB StreamRecord object

I'm working on a POC around DynamoDB Streams and was following this documentation. The StreamRecord object I get from calling the Record.getDynamodb() method doesn't seem to have the ApproximateCreationDateTime attribute, as mentioned in the javadocs…
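For comparison, the attribute does appear in the raw stream data: in a Lambda-delivered stream event it surfaces as an epoch timestamp (seconds) under dynamodb.ApproximateCreationDateTime. A small sketch, assuming a Lambda-style record dict:

    from datetime import datetime, timezone

    def creation_time(record):
        # Epoch seconds attached by DynamoDB Streams to each record.
        ts = record["dynamodb"]["ApproximateCreationDateTime"]
        return datetime.fromtimestamp(ts, tz=timezone.utc)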
0 votes, 1 answer

How to read all the records without rewriting the segments in the code each time records are added to the database

In a table I have 292 records, and for these 292 records I wrote 63 parallel asynchronous scan segments in DynamoDB. If the record count increases again, I add more segments in the code. Without writing the code like that, to have added records to…
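A sketch of a parallel scan whose segment count is fixed (or tuned to throughput) and which pages with LastEvaluatedKey, so growing data never requires adding segments in code. The table name and segment count are placeholders.

    import boto3
    from concurrent.futures import ThreadPoolExecutor

    table = boto3.resource("dynamodb").Table("my-table")
    TOTAL_SEGMENTS = 8  # independent of the record count

    def scan_segment(segment):
        items, kwargs = [], {"Segment": segment, "TotalSegments": TOTAL_SEGMENTS}
        while True:
            page = table.scan(**kwargs)
            items.extend(page["Items"])
            if "LastEvaluatedKey" not in page:
                return items
            # Keep paging; new records are picked up automatically.
            kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]

    with ThreadPoolExecutor(max_workers=TOTAL_SEGMENTS) as pool:
        all_items = [i for seg in pool.map(scan_segment, range(TOTAL_SEGMENTS)) for i in seg]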
purushottam
  • 75
  • 1
  • 6