Questions tagged [alpakka]

Alpakka is the collective name for various Akka Streams connectors, integration patterns, and data transformations.

Alpakka is a community-driven initiative that provides connectors, integration patterns, and data transformations that are built with Akka Streams. This toolkit is meant to be a "modern alternative to Apache Camel" (hence its name, which is a homophone of "alpaca," a relative of the camel, and was first used as a code name for an old akka-camel module).

From an introductory blog post:

Akka Streams already has a lot that is useful for integrations. Defining processing pipelines is what the Akka Streams DSL is all about, and that is exactly what you need for operating on streaming data that cannot fit in memory as a whole. It handles backpressure in an efficient, non-blocking way that prevents out-of-memory errors, which is a typical problem when using unbounded buffering with producers that are faster than consumers.
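The demand-driven ("pull") model described above can be illustrated without any Akka dependency. Plain Scala iterators are also pull-based: the producer below does no work until the consumer asks for the next element, which is the idea Akka Streams backpressure generalizes across asynchronous boundaries. A sketch for illustration only, not the Akka Streams API itself:

```scala
// A "producer" that counts how many elements it has actually materialized.
// Nothing is computed until the downstream pulls — the essence of the
// demand-driven model that backpressure enforces across async stages.
var produced = 0
val producer: Iterator[Int] = Iterator.from(1).map { n =>
  produced += 1
  n
}

// The "consumer" demands only five elements, so only five are produced,
// even though the source is conceptually unbounded.
val consumed = producer.take(5).toList
// consumed == List(1, 2, 3, 4, 5), produced == 5
```

In Akka Streams the same demand signaling happens asynchronously between stages, so a fast producer is slowed to the consumer's rate instead of buffering without bound.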

Connectors:

  • AMQP
  • Apache Geode
  • AWS DynamoDB
  • AWS Kinesis
  • AWS Lambda
  • AWS S3
  • AWS SNS
  • AWS SQS
  • Azure Storage Queue
  • Cassandra
  • Elasticsearch
  • File
  • FTP
  • Google Cloud Pub/Sub
  • HBase
  • IronMQ
  • JMS
  • MongoDB
  • MQTT
  • Server-sent Events (SSE)
  • Slick (JDBC)
  • Spring Web

Integration Patterns:

  • Splitter

Data Transformations:

  • Parsing Lines
  • JSON
  • Compressing/decompressing
  • Comma-Separated Values (CSV)
  • RecordIO Framing
  • Extensible Markup Language (XML)
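The "Parsing Lines" transformation above splits a chunked byte stream into lines even when chunk boundaries fall mid-line. A dependency-free sketch of that framing logic, using plain String chunks as stand-ins for the ByteStrings Alpakka operates on:

```scala
// Re-frame arbitrarily chunked input into complete lines.
// `buffer` carries any partial line across chunk boundaries.
def frameLines(chunks: Seq[String]): Seq[String] = {
  val (lines, leftover) =
    chunks.foldLeft((Vector.empty[String], "")) { case ((out, buffer), chunk) =>
      val parts = (buffer + chunk).split("\n", -1) // -1 keeps a trailing empty part
      (out ++ parts.init, parts.last)
    }
  if (leftover.isEmpty) lines else lines :+ leftover
}

frameLines(Seq("foo\nba", "r\nbaz")) // Seq("foo", "bar", "baz")
```

Alpakka's framing stages do the same thing incrementally, emitting each completed line downstream as soon as its delimiter arrives.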

217 questions
1 vote, 0 answers

Map reducing Scanamo query conditions

I'll dive right in. I am using ScanamoAlpakka. I have the following: Map( 'field1 -> value1, 'field2 -> value2, and so on... ) The goal is to iterate the map and remove any empty values. Once the empty values are removed, I need to transform…
Drakken Saer
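
A possible shape for the filtering step described in this question, sketched in plain Scala and independent of Scanamo's condition types. The field names and Option values here are hypothetical, since the original Map is truncated:

```scala
// Hypothetical shape: condition fields mapped to possibly-empty values.
val conditions: Map[Symbol, Option[String]] = Map(
  Symbol("field1") -> Some("a"),
  Symbol("field2") -> None,
  Symbol("field3") -> Some("c")
)

// Drop the empty entries and unwrap the survivors in one pass, ready to
// be transformed further (e.g. folded into a query condition).
val nonEmpty: Map[Symbol, String] = conditions.collect {
  case (k, Some(v)) => k -> v
}
// nonEmpty == Map(Symbol("field1") -> "a", Symbol("field3") -> "c")
```
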
1 vote, 0 answers

Exception in Alpakka JsonReader substream not being caught by supervisor

I have a source which uses flatMapConcat to consolidate a new stream of elements, produced by parsing JSON. To do this, I am using the JsonReader provided by Alpakka…
kiambogo
1 vote, 0 answers

How is fetch size controlled in akka stream alpakka slick

The documentation of Slick cautions that for certain databases, the fetch size must be set. In general, however, I wonder how to control the fetchSize with the akka-stream-alpakka-slick integration. We don't have access to the DB Action directly,…
MaatDeamon
1 vote, 1 answer

Alpakka UDP: How can I respond to received datagrams via the already bound socket?

I'm using Alpakka's UDP.bindFlow to forward incoming UDP datagrams to a Kafka broker. The legacy application that is sending these datagrams requires a UDP response from the same port the message was sent to. I am struggling to model this behaviour…
Uwe Sommerlatt
1 vote, 0 answers

Can you append to a file using the alpakka HDFS connector?

I'm trying to use this connector to pull messages from Kafka and write them to HDFS. Works fine as long as the file doesn't already exist, but if it does then it throws a FileAlreadyExistsException. Is there a way to append to an already-existing…
csjacobs24
1 vote, 1 answer

Alpakka/Kafka - Partitions consumed faster than others

I've been using Alpakka Kafka to stream data from Kafka topics. I'm using: Consumer .committableSource(consumerSettings, Subscriptions.topics(topic)) Recently I've tried to spawn more consumers, like 3, on a topic which has 15 partitions.…
Thiago Pereira
1 vote, 1 answer

SSLHandshakeException happens during file upload to AWS S3 via Alpakka

I'm trying to set up Alpakka S3 for file uploads. Here are my configs: alpakka s3 dependency: ... "com.lightbend.akka" %% "akka-stream-alpakka-s3" % "0.20" ... Here is application.conf: akka.stream.alpakka.s3 { buffer = "memory" proxy…
Alex Fruzenshtein
1 vote, 1 answer

How to release a message back to RabbitMQ so it's available for another consumer?

I'm using the Alpakka AMQP library (https://developer.lightbend.com/docs/alpakka/current/amqp.html) to process RabbitMQ messages in a reactive stream and dump them into Kafka. We are using Avro and Schema Registry, so malformed messages fail…
flybonzai
1 vote, 1 answer

Alpakka XML content between tags

Alpakka's XML processing flow allows reading an XML file element by element. But how can I extract data between a particular StartElement and EndElement, including the StartElement data? subslice is not an option because there is no constant prefix for the needed…
St.
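
One way to express the "track StartElement/EndElement" idea this question asks about, sketched here with the JDK's blocking StAX pull parser rather than the Alpakka streaming API (the same state machine can be folded over Alpakka's parse-event stream). Nested occurrences of the target tag are not handled in this sketch:

```scala
import javax.xml.stream.{XMLInputFactory, XMLStreamConstants}
import java.io.StringReader

// Collect the character data between a given start tag and its matching
// end tag, using the JDK's StAX pull parser.
def textBetween(xml: String, tag: String): String = {
  val reader =
    XMLInputFactory.newInstance().createXMLStreamReader(new StringReader(xml))
  val sb = new StringBuilder
  var inside = false
  while (reader.hasNext) {
    reader.next() match {
      case XMLStreamConstants.START_ELEMENT if reader.getLocalName == tag => inside = true
      case XMLStreamConstants.END_ELEMENT if reader.getLocalName == tag   => inside = false
      case XMLStreamConstants.CHARACTERS if inside                        => sb ++= reader.getText
      case _                                                              => // ignore other events
    }
  }
  sb.toString
}

textBetween("<doc><keep>hello <b>world</b></keep><skip>no</skip></doc>", "keep")
// "hello world"
```
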
1 vote, 1 answer

Incompatible equality constraint while using Akka Kafka Streams

I am trying to use Akka Kafka Streams following the Akka Kafka Streams documentation. Here is the code I have: ConsumerSettings consumerSettings = ConsumerSettings .create(actorSystem, new…
Prasanth
1 vote, 0 answers

How to use Source.queue with Alpakka

I'm trying to create a producer to a JMS queue that can be used more than once; i.e., I don't want to create a connection to the queue every time I send a message. I want an actor with a connection open, and each time a message comes in, it uses…
Kingpin2k
1 vote, 1 answer

Akka Stream - How to Stream from multiple SQS Sources

This is a follow-up to Akka Stream - Select Sink based on Element in Flow. Assume I have multiple SQS queues I'd like to stream from. I'm using the AWS SQS Connector of Alpakka to create a Source. implicit val sqsClient: AmazonSQSAsync = ??? val…
gyoho
1 vote, 1 answer

Does it make sense to use Paging SQL Statement when using Slick (JDBC) Connector for Alpakka

I'm currently wondering how the Slick (JDBC) Connector for Alpakka works under the hood, and I can't really find an answer in the docs. Consider a use case where I want to process a large number of records selected from a database. Can I simply…
cokeSchlumpf
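
Independent of how Slick actually executes the query, the paging pattern this question asks about can be sketched as repeated limit/offset fetches until a page comes back empty. fetchPage below is a hypothetical stand-in for the SQL query:

```scala
// Hypothetical page fetch: in real code this would run a
// LIMIT/OFFSET (or keyset) SQL query via Slick.
val table: Vector[Int] = (1 to 10).toVector
def fetchPage(offset: Int, limit: Int): Vector[Int] =
  table.slice(offset, offset + limit)

// Pull pages lazily until one comes back empty.
def pages(limit: Int): Iterator[Vector[Int]] =
  Iterator.from(0, limit)
    .map(offset => fetchPage(offset, limit))
    .takeWhile(_.nonEmpty)

val all = pages(3).flatten.toVector
// all == Vector(1, 2, ..., 10)
```

Because the Iterator is lazy, each page is fetched only when downstream demands it — the same demand-driven shape a streaming Slick source gives you, minus the async machinery.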
1 vote, 1 answer

Akka Streams KillSwitch in alpakka jms

I have a scenario in which I am starting multiple jmsSources (for different queues) using Alpakka. I also need to detach the queues at any point in time, so I have added a KillSwitch to the JMS Akka streams as below: trait MessageListener { lazy…
Kiras
1 vote, 1 answer

Alpakka - read Kryo-serialized objects from S3

I have Kryo-serialized binary data stored on S3 (thousands of serialized objects). Alpakka allows reading the content as data: Source[ByteString, NotUsed]. But the Kryo format doesn't use delimiters, so I can't split each serialized object into a…
Tvaroh
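
Since the Kryo wire format carries no delimiters, a common workaround — when you control the writer — is to length-prefix each serialized object; Akka Streams' Framing.lengthField stage can then split such a stream. A dependency-free sketch of the layout, with stand-in payloads instead of real Kryo output:

```scala
import java.nio.ByteBuffer

// Encode: prefix each payload with its length as a 4-byte big-endian int.
def frame(payloads: Seq[Array[Byte]]): Array[Byte] =
  payloads.flatMap { p =>
    ByteBuffer.allocate(4).putInt(p.length).array() ++ p
  }.toArray

// Decode: read a length, then that many bytes, repeatedly.
def unframe(bytes: Array[Byte]): Seq[Array[Byte]] = {
  val buf = ByteBuffer.wrap(bytes)
  val out = Vector.newBuilder[Array[Byte]]
  while (buf.remaining >= 4) {
    val len = buf.getInt()
    val payload = new Array[Byte](len)
    buf.get(payload)
    out += payload
  }
  out.result()
}

val objs    = Seq("first".getBytes, "second".getBytes)
val decoded = unframe(frame(objs)).map(new String(_))
// decoded == Vector("first", "second")
```

If the existing S3 data was written without prefixes, this doesn't apply as-is; the alternative is a custom framing stage that feeds Kryo incrementally and emits an object whenever the deserializer reports a complete read.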