Questions tagged [alpakka]

Alpakka is the collective name for various Akka Streams connectors, integration patterns, and data transformations.

Alpakka is a community-driven initiative that provides connectors, integration patterns, and data transformations built with Akka Streams. The toolkit is meant to be a "modern alternative to Apache Camel" (hence its name: "Alpakka" is a homophone of "alpaca," a relative of the camel, and was first used as a code name for the old akka-camel module).

From an introductory blog post:

Akka Streams already provides a lot that is useful for integrations. Defining processing pipelines is what the Akka Streams DSL is all about, and that is exactly what you need for operating on streaming data that cannot fit in memory as a whole. It handles backpressure in an efficient, non-blocking way that prevents out-of-memory errors, which is a typical problem when using unbounded buffering with producers that are faster than consumers.
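The bounded-buffer idea behind that quote can be sketched with plain JDK primitives. This is not the Akka Streams API (an Akka pipeline expresses the same thing with Source/Flow/Sink and asynchronous demand signalling); the hypothetical `BackpressureSketch` class below only illustrates why a bounded, blocking hand-off keeps memory constant when the producer outpaces the consumer:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BackpressureSketch {

    // Pushes 0..99 through a buffer of capacity 4 and returns the sum the
    // consumer saw. The producer thread blocks whenever the buffer is full,
    // which is the bounded-buffer version of backpressure: the fast side is
    // slowed to the pace of the slow side instead of buffering without limit.
    static long run() {
        BlockingQueue<Integer> buffer = new ArrayBlockingQueue<>(4);

        Thread producer = new Thread(() -> {
            for (int i = 0; i < 100; i++) {
                try {
                    buffer.put(i); // blocks while the buffer is full
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });
        producer.start();

        long sum = 0;
        try {
            for (int i = 0; i < 100; i++) {
                sum += buffer.take(); // the consumer drains at its own pace
            }
            producer.join();
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println("consumed sum = " + run()); // sum of 0..99
    }
}
```

Akka Streams achieves the same effect without blocking threads: downstream stages signal demand, and upstream stages emit only as many elements as were requested.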

Connectors:

  • AMQP
  • Apache Geode
  • AWS DynamoDB
  • AWS Kinesis
  • AWS Lambda
  • AWS S3
  • AWS SNS
  • AWS SQS
  • Azure Storage Queue
  • Cassandra
  • Elasticsearch
  • File
  • FTP
  • Google Cloud Pub/Sub
  • HBase
  • IronMQ
  • JMS
  • MongoDB
  • MQTT
  • Server-sent Events (SSE)
  • Slick (JDBC)
  • Spring Web

Integration Patterns:

  • Splitter

Data Transformations:

  • Parsing Lines
  • JSON
  • Compressing/decompressing
  • Comma-Separated Values (CSV)
  • RecordIO Framing
  • Extensible Markup Language (XML)

217 questions
0 votes, 1 answer

Inserting a document into Elastic using Alpakka

I'm trying to learn how to use Alpakka and have set up a test to write a document to Elastic. From reading the docs, including https://doc.akka.io/docs/alpakka/current/elasticsearch.html, I have written the following: import akka.actor.ActorSystem import…
blue-sky • 51,962 • 152 • 427 • 752
0 votes, 0 answers

How to set checksum validation for the alpakka s3 sink

I need to enable AWS object integrity validation when streaming a file using akka-streams. I have tried the following code, but it doesn't work: val sink = S3.multipartUploadWithHeaders( bucket = bucket, key = dst, …
Eli Golin • 373 • 1 • 16
0 votes, 1 answer

Akka streams: how to model 1 producer N consumers linear streams

I need to model a series of LINEAR Akka Streams processing streams that model a 1 producer N consumers system. As a quick reference, you can imagine having a single producer that produces messages in a Kafka topic, and N consumers that consume from…
0 votes, 2 answers

convert akka journal event column's string value to java object

I am using the AWS DynamoDB akka persistence API https://github.com/akka/akka-persistence-dynamodb, which doesn't have a read journal API (Akka Persistence Query) like Cassandra. I can write journal data to DynamoDB; the event column is in string java…
0 votes, 1 answer

alpakka jms client acknowledgement mode delivery guarantee

I have an alpakka JMS source -> kafka sink kind of a flow. I'm looking at the alpakka jms consumer documentation and trying to figure out what kind of delivery guarantees this gives me. From…
0 votes, 1 answer

Consume messages from a Kafka topic partition from an offset / from a date when auto commit is set to true

Dependency used: Alpakka Kafka 3.0. We have the consumer settings below: enable.auto.commit = true, auto.offset.reset = earliest. If we have enable.auto.commit = true, then is it possible to consume messages from a Kafka topic partition from a particular…
Akshay Jain • 73 • 2 • 8
0 votes, 2 answers

Scala: adding an element to a JsValue before convertTo[class]

I have a Kafka-consumed record that will be parsed as JsValue with spray.json in Scala, but I also have some data in the record's header, and I want to: consume the record with the Alpakka Kafka library (done); parse it as JSON of type JsValue:…
Omar AlSaghier • 340 • 4 • 12
0 votes, 1 answer

Akka stream stops processing data

When I run the stream below, it does not receive any subsequent data once the stream runs. final long HOUR = 3600000; final long PAST_HOUR = System.currentTimeMillis()-HOUR; private final static ActorSystem actorSystem =…
blue-sky • 51,962 • 152 • 427 • 752
0 votes, 1 answer

Using ElasticsearchSource to filter results in Elastic

Here I'm using this code to read data from an index where the timestamp is >= 1656055230028: Source>, NotUsed> source = ElasticsearchSource.create( …
blue-sky • 51,962 • 152 • 427 • 752
0 votes, 1 answer

Influxdb Alpakka connector not writing to database

I'm trying to write into InfluxDB (running in a Docker container with version 2.0). I'm using Scala and Reactive Streams, hence the Alpakka connector (https://doc.akka.io/docs/alpakka/current/influxdb.html), because the Scala Reactive Client…
zel873ju • 139 • 2 • 7
0 votes, 1 answer

Alpakka Kafka Java Test

I am trying to write JUnit tests for an app that uses Kafka streams to emit data from Kafka to a websocket connection. I have been able to run the application locally, publishing to the topic and seeing the data returned on a socket connection. …
mradey • 202 • 1 • 12
0 votes, 1 answer

How to retrieve messages from Alpakka Mqtt Streaming client?

I was following the document for writing an MQTT client subscriber using Alpakka: https://doc.akka.io/docs/alpakka/3.0.4/mqtt-streaming.html After the code marked in bold, I'm not sure how I could…
AAA • 1
0 votes, 1 answer

Scala generic repository class for Reactive Mongo repository (Alpakka) - needed Class, found T

I'm trying to create a generic class in Scala so I can create a repository for different collections without repeating myself. The problem is that if I do it as a generic class (as in this example), I get a problem on this line: val codecRegistry =…
0 votes, 2 answers

How to handle backpressure when streaming a file from S3 with actor interop

I am trying to download a large file from S3 and send its data to another actor that is doing an HTTP request and then persisting the response. I want to limit the number of requests sent by that actor, hence I need to handle backpressure. I tried…
igx • 4,101 • 11 • 43 • 88
0 votes, 1 answer

Akka flow Input (`In`) as Output (`Out`)

I am trying to write a piece of code which does the following: read a large CSV file from a remote source like S3; process the file record by record; send a notification to the user; write the output to a remote location. Sample record in input…
Aiden • 355 • 5 • 17