Questions tagged [amazon-kinesis]

Amazon Kinesis is a fully managed service for real-time processing of streaming data at massive scale.

Amazon Kinesis can collect and process hundreds of terabytes of data per hour from hundreds of thousands of sources, allowing you to easily write applications that process information in real time from sources such as website click-streams, marketing and financial information, manufacturing instrumentation, social media, and operational logs and metering data.

With Amazon Kinesis applications, you can build real-time dashboards, capture exceptions and generate alerts, drive recommendations, and make other real-time business or operational decisions. You can also easily send data to a variety of other services such as Amazon Simple Storage Service (Amazon S3), Amazon DynamoDB, or Amazon Redshift. In a few clicks and a couple of lines of code, you can start building applications which respond to changes in your data stream in seconds, at any scale, while only paying for the resources you use.
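A minimal producer sketch for the write path described above. The stream name and the `user_id` field are illustrative assumptions; the helper takes any client object exposing `put_record` with the boto3 Kinesis signature (in practice, `boto3.client("kinesis")`), so no AWS call is made by the sketch itself:

```python
import json

def put_click_event(kinesis_client, stream_name, event):
    """Serialize a click event and write it to a Kinesis stream.

    kinesis_client is anything exposing put_record() with the
    boto3 Kinesis signature; event is a JSON-serializable dict.
    """
    # Records with the same partition key land on the same shard,
    # so keying on the user id keeps one user's events ordered.
    return kinesis_client.put_record(
        StreamName=stream_name,
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["user_id"]),
    )
```

In production you would pass `boto3.client("kinesis")` and an existing stream name; everything else here is a placeholder.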


1802 questions
0 votes · 1 answer

UnrecognizedClientException while testing API Gateway with integration type Kinesis

I have created an API using API Gateway with integration type 'Kinesis'. I am trying to access the ListStreams method. I have created a role with the AmazonKinesisFullAccess policy, and the trusted identity is set to apigateway.amazonaws.com. I have provided the arn…
Pragmatic · 3,093
0 votes · 1 answer

Data flow from Nginx access log -> Rsyslog or Syslog -> Fluentd -> Kinesis

I am working on passing nginx access logs through Fluentd to AWS Kinesis and on to AWS S3 via Kinesis Firehose. The nginx logs will be pushed to AWS Glacier during log rotation. I am at the initial step, where I need to pass the nginx access logs to Fluentd via…
Jay Teli · 530
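A hedged sketch of the Fluentd side of the pipeline in the question, assuming the community `fluent-plugin-kinesis` output plugin (its `kinesis_firehose` output type) is installed and a delivery stream named `nginx-to-s3` already exists; paths, tags, and region are placeholders:

```
<source>
  @type tail
  path /var/log/nginx/access.log
  pos_file /var/log/td-agent/nginx-access.pos
  tag nginx.access
  <parse>
    @type nginx
  </parse>
</source>

<match nginx.access>
  @type kinesis_firehose
  delivery_stream_name nginx-to-s3
  region us-east-1
</match>
```

Firehose then handles the buffering and delivery into S3; the Glacier step would be an S3 lifecycle rule rather than part of this config.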
0 votes · 1 answer

How to perform face recognition on a streaming video using Amazon Rekognition?

I am streaming video to Amazon Kinesis from a Raspberry Pi (this is done). Now I want to perform face detection/recognition on that video using Amazon Rekognition. How do I do it? Please explain in detail, with links. Thanks
0 votes · 1 answer

How to transfer video from kinesis video stream to AWS Rekognition and perform data analytics on that video?

I am streaming video from a Raspberry Pi to Amazon Kinesis Video Streams (this part is done). Now I want to send the video to AWS Rekognition and perform face detection on the live video. Kindly answer in detail and with links. Thank you!
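For both of these Rekognition questions, the usual route is a Rekognition stream processor that reads the Kinesis Video stream and writes face-search results as JSON records to a Kinesis data stream. A minimal sketch, assuming a face collection named `my-faces` already exists; the processor name and collection id are illustrative, and the boto3 Rekognition client is passed in rather than created here:

```python
def create_face_search_processor(rekognition, video_stream_arn,
                                 data_stream_arn, role_arn):
    """Wire a Kinesis Video stream into Rekognition face search.

    rekognition is anything exposing the boto3 Rekognition
    create_stream_processor/start_stream_processor calls; results
    arrive as JSON records on the given Kinesis data stream.
    """
    rekognition.create_stream_processor(
        Name="pi-face-search",          # illustrative processor name
        Input={"KinesisVideoStream": {"Arn": video_stream_arn}},
        Output={"KinesisDataStream": {"Arn": data_stream_arn}},
        Settings={"FaceSearch": {"CollectionId": "my-faces"}},
        RoleArn=role_arn,
    )
    rekognition.start_stream_processor(Name="pi-face-search")
```

A separate consumer then reads the data stream to act on the face matches.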
0 votes · 1 answer

High availability for a Kinesis Data Streams consumer

I want to build the architecture below for sending data: producer --> Kinesis Data Stream --> consumer. The consumer server can be shut down, therefore I think there should be at least 2 consumers. Is that right? When there are two consumers for one data…
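With the Kinesis Client Library this failover comes for free: each shard is protected by a lease stored in DynamoDB, and a standby worker takes a lease over only when the current owner stops renewing it, so two live consumers never process the same shard's records twice. A pure-Python toy model of that lease rule (a simulation of the idea, not the KCL API):

```python
import time

class ShardLease:
    """Toy model of a KCL-style shard lease with an expiry."""

    def __init__(self, timeout=10.0):
        self.timeout = timeout
        self.owner = None
        self.renewed_at = 0.0

    def try_acquire(self, worker, now=None):
        now = time.monotonic() if now is None else now
        # The lease moves only if unowned or the owner stopped renewing.
        if self.owner is None or now - self.renewed_at > self.timeout:
            self.owner = worker
            self.renewed_at = now
        return self.owner == worker

lease = ShardLease(timeout=10.0)
assert lease.try_acquire("consumer-a", now=0.0)      # first worker wins
assert not lease.try_acquire("consumer-b", now=5.0)  # lease still fresh
assert lease.try_acquire("consumer-b", now=20.0)     # a stopped renewing: b takes over
```

So yes: run at least two KCL workers under the same application name, and the lease table arbitrates which one owns each shard.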
0 votes · 0 answers

"Watch" directory using GStreamer and decode new media files on the fly

Suppose I have a directory, /tmp/media, into which some video-producing device continually writes timestamped (in the filename) 60s MP4s, keeping an hour's worth of history, e.g. (written at 13:42, deleted at 14:42)…
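The GStreamer decode step is beyond a short sketch, but the "watch" half of the question can be a plain polling loop; one step of such a loop, stdlib only (the decode pipeline you launch per returned file is up to you):

```python
import os

def new_media_files(directory, seen, suffix=".mp4"):
    """Return media files that appeared since the last poll.

    `seen` is a mutable set carried between calls. Rotated-away
    files are forgotten so the set stays bounded at an hour's
    worth of history, matching the producer in the question.
    """
    current = {f for f in os.listdir(directory) if f.endswith(suffix)}
    fresh = sorted(current - seen)
    # Track exactly what exists now (drops deleted files, adds new ones).
    seen.intersection_update(current)
    seen.update(current)
    return fresh
```

One caveat for the real setup: a file that has just appeared may still be being written, so it is safer to decode file N only once file N+1 shows up (or after the file size stops changing).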
0 votes · 0 answers

Is there any module in Ansible for Kinesis Firehose creation? Like the kinesis_stream module for Kinesis Data Streams

I am trying to create a simple AWS Kinesis Firehose delivery stream using Ansible. I tried doing this:

- name: Create
  kinesis_firehose:
    state: present
    name: example
    stream_type: DirectPut
    dest: Elasticsearch
    role_arn:…
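As far as I know there is no `kinesis_firehose` module in Ansible core or in `community.aws` (only `kinesis_stream`), which is why the play above fails. A common workaround is a small boto3 call, e.g. from a custom module or script task. A hedged sketch with the Firehose client passed in and an S3 destination (all names illustrative; an Elasticsearch destination would use a different `*Configuration` key):

```python
def ensure_delivery_stream(firehose, name, bucket_arn, role_arn):
    """Create a DirectPut Firehose delivery stream into S3 if absent.

    firehose is anything exposing the boto3 Firehose API
    (list_delivery_streams / create_delivery_stream).
    """
    existing = firehose.list_delivery_streams()["DeliveryStreamNames"]
    if name not in existing:
        firehose.create_delivery_stream(
            DeliveryStreamName=name,
            DeliveryStreamType="DirectPut",
            S3DestinationConfiguration={
                "RoleARN": role_arn,       # role Firehose assumes to write
                "BucketARN": bucket_arn,   # destination bucket
            },
        )
    return name
```

Checking before creating keeps the step idempotent, matching Ansible's `state: present` semantics.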
0 votes · 1 answer

How to create a custom AWS resource name?

I'm creating an AWS Kinesis stream. When I create an AWS resource with Pulumi, the resource is created with a unique name, for example mystream-263c353. How can I create a custom simple name, which doesn't have to be unique, for example mystream? Here's my…
tuioku
  • 119
  • 10
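Pulumi appends that random suffix ("auto-naming") unless the resource's own name property is set explicitly. A sketch in Pulumi's Python SDK, assuming `pulumi_aws` is installed (this fragment only runs inside a Pulumi program, not standalone):

```python
import pulumi_aws as aws

# First argument is Pulumi's logical name; the `name` property pins
# the physical AWS name and disables auto-naming, so it must not
# collide with an existing stream.
stream = aws.kinesis.Stream("mystream",
                            name="mystream",
                            shard_count=1)
```

The trade-off is that replacements can no longer create-before-delete, since the old and new resource would share a name.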
0 votes · 1 answer

Continuously moving data from aws aurora mysql to aws s3

I am trying to set up a data lake and move all the data to S3. I have to move Aurora MySQL data to S3 (most probably in Parquet format). I tried an initial POC using the Database Migration Service, with which we can move all the data at once. The problem with this was…
0 votes · 1 answer

Delay on StreamListener or condition

I'm reading the docs but I'm not sure whether this is possible in spring-cloud-stream using the Kinesis binder. I want to delay consuming messages from the stream, via some configuration or a positive condition. For example, I want to wait 30 minutes after…
jagr · 309
0 votes · 1 answer

Kinesis binder default read and write capacity on DynamoDB tables

As per the documentation of the Spring AWS Kinesis binder, the default values for readCapacity and writeCapacity are 1…
0 votes · 1 answer

How to process a kinesis stream record? (multiple processors)

I'm working on a project that monitors a microservice-based system. The mock microservices I created produce data and upload it to Amazon Kinesis. I use this code here from Amazon to produce to and consume from Kinesis. But I have failed…
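One common stumbling block with multiple processors: if each processor should see every record, each one needs its own shard iterator (with the KCL, its own application name and hence its own lease table), rather than sharing one. A minimal polling-loop sketch against the low-level API, with the client injected so the AWS calls are an assumption of the caller:

```python
def read_shard(kinesis, stream_name, shard_id, handle_record, max_batches=3):
    """Drain one shard with the low-level GetRecords API.

    kinesis is anything exposing the boto3 Kinesis signatures;
    handle_record is called once per record payload. Each
    independent processor runs its own copy of this loop.
    """
    it = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard_id,
        ShardIteratorType="TRIM_HORIZON",   # start from the oldest record
    )["ShardIterator"]
    for _ in range(max_batches):
        resp = kinesis.get_records(ShardIterator=it, Limit=100)
        for record in resp["Records"]:
            handle_record(record["Data"])
        it = resp.get("NextShardIterator")
        if it is None:                      # shard is closed
            break
```

A real consumer would loop indefinitely, back off on empty batches, and checkpoint progress; the KCL automates all of that.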
0 votes · 1 answer

Set up an AWS Kinesis CloudFormation template

I am new to AWS CloudFormation and need to create a Kinesis data stream, then write records to this stream using Python code. I was able to create a data stream through a CloudFormation template but was not able to set the permissions. How will I…
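A hedged CloudFormation sketch for this question: the stream plus a managed policy granting write access, which you would attach to whatever role or user the Python producer runs under (stream name and actions are illustrative):

```yaml
Resources:
  MyStream:
    Type: AWS::Kinesis::Stream
    Properties:
      Name: my-data-stream
      ShardCount: 1

  StreamWritePolicy:
    Type: AWS::IAM::ManagedPolicy
    Properties:
      PolicyDocument:
        Version: "2012-10-17"
        Statement:
          - Effect: Allow
            Action:
              - kinesis:PutRecord
              - kinesis:PutRecords
              - kinesis:DescribeStream
            Resource: !GetAtt MyStream.Arn
```

Scoping the policy to `!GetAtt MyStream.Arn` keeps the producer limited to this one stream rather than `kinesis:*` on `*`.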
0 votes · 0 answers

Weird issue while processing events coming from Kinesis

I set up Amazon Connect on AWS, and if I make a test call it will put that call in an AWS Kinesis stream. I am trying to write a Lambda that processes these records and saves them to a database. If I make a simple call (call the number - answer - hang up) it…
DoArNa · 491
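Kinesis delivers its payloads base64-encoded inside the Lambda event, and Amazon Connect contact trace records are JSON, so a handler decodes in two steps. A minimal sketch (the database write is left as a hypothetical stub); a frequent source of "weird" behaviour is that multi-leg calls such as transfers emit several records per call, so the loop below never assumes exactly one:

```python
import base64
import json

def lambda_handler(event, context=None):
    """Decode Kinesis-wrapped Amazon Connect records from a Lambda event."""
    calls = []
    for record in event["Records"]:
        # Kinesis payloads arrive base64-encoded in the Lambda event.
        payload = base64.b64decode(record["kinesis"]["data"])
        calls.append(json.loads(payload))   # one contact trace record
    # save_to_database(calls)  # hypothetical persistence step
    return {"processed": len(calls)}
```

Deduplicating on the record's ContactId before insertion also guards against Lambda retrying a partially failed batch.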
0 votes · 1 answer

Stuck with Kinesis stagger window

I have a Kinesis Analytics application set up which takes data from a Kinesis stream with the following schema:

Column    ColumnType
--------  ----------
Level     varchar(10)
RootID    …
RobustPath004 · 57