Questions tagged [alpakka]

Alpakka is the collective name for various Akka Streams connectors, integration patterns, and data transformations.

Alpakka is a community-driven initiative that provides connectors, integration patterns, and data transformations that are built with Akka Streams. This toolkit is meant to be a "modern alternative to Apache Camel" (hence its name, which is a homophone of "alpaca," a relative of the camel, and was first used as a code name for an old akka-camel module).

From an introductory blog post:

Akka Streams already has a lot that is useful for integrations. Defining processing pipelines is what the Akka Streams DSL is all about, and that is exactly what you need for operating on streaming data that cannot fit in memory as a whole. It handles backpressure in an efficient, non-blocking way that prevents out-of-memory errors, which is a typical problem when using unbounded buffering with producers that are faster than consumers.
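A minimal sketch of the backpressure behaviour the post describes: a fast producer feeding a deliberately slow consumer. The throttle rate and element count are arbitrary; demand flows upstream, so the source only emits as fast as the slowest stage accepts.

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}
import scala.concurrent.duration._

object BackpressureDemo extends App {
  implicit val system: ActorSystem = ActorSystem("demo")

  // A fast producer of numbers...
  Source(1 to 1000000)
    // ...feeding a consumer throttled to 10 elements per second.
    // Demand propagates upstream, so the source emits only as fast
    // as this stage can accept -- no unbounded buffering occurs.
    .throttle(10, 1.second)
    .runWith(Sink.foreach(println))
}
```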

Connectors:

  • AMQP
  • Apache Geode
  • AWS DynamoDB
  • AWS Kinesis
  • AWS Lambda
  • AWS S3
  • AWS SNS
  • AWS SQS
  • Azure Storage Queue
  • Cassandra
  • Elasticsearch
  • File
  • FTP
  • Google Cloud Pub/Sub
  • HBase
  • IronMQ
  • JMS
  • MongoDB
  • MQTT
  • Server-sent Events (SSE)
  • Slick (JDBC)
  • Spring Web

Integration Patterns:

  • Splitter

Data Transformations:

  • Parsing Lines
  • JSON
  • Compressing/decompressing
  • Comma-Separated Values (CSV)
  • RecordIO Framing
  • Extensible Markup Language (XML)
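As an illustration of one of these data transformations, here is a hedged sketch of CSV parsing with the Alpakka CSV module: raw bytes are scanned into lines, and the first line supplies the keys for per-row maps. The inline sample data is made up for the example.

```scala
import akka.actor.ActorSystem
import akka.stream.alpakka.csv.scaladsl.{CsvParsing, CsvToMap}
import akka.stream.scaladsl.{Sink, Source}
import akka.util.ByteString

object CsvParseDemo extends App {
  implicit val system: ActorSystem = ActorSystem("csv")

  Source.single(ByteString("name,age\nalice,30\nbob,25\n"))
    .via(CsvParsing.lineScanner())   // ByteString chunks -> List[ByteString] per CSV line
    .via(CsvToMap.toMap())           // the first line becomes the header keys
    .map(_.view.mapValues(_.utf8String).toMap)
    .runWith(Sink.foreach(println))  // e.g. Map(name -> alice, age -> 30), ...
}
```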

217 questions
1 vote · 1 answer

Combine prefixAndTail(1) with Sink.lazySink for SubFlow created by .splitAfter

I am currently developing an akka-stream/alpakka application with the following general logic: given a Flow, split it into SubFlows using the splitAfter method. For each of the SubFlows from step 1, use prefixAndTail(1) to create a key based on…
mdedetrich
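The shape of the question above can be sketched as follows. This is not the asker's exact Sink.lazySink wiring, just a simplified illustration of combining splitAfter with prefixAndTail(1): each substream's first element is turned into a key, and the tail is then consumed under that key. The predicate and data are invented for the example.

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}

object SplitDemo extends App {
  implicit val system: ActorSystem = ActorSystem("split")
  import system.dispatcher

  Source(1 to 10)
    .splitAfter(_ % 3 == 0)     // close the current substream after each multiple of 3
    .prefixAndTail(1)           // emits (Seq(firstElement), Source of the rest)
    .mapAsync(1) { case (prefix, tail) =>
      val key = prefix.head     // derive a key from the substream's first element
      tail.runWith(Sink.seq).map(rest => key -> rest)
    }
    .mergeSubstreams
    .runWith(Sink.foreach(println))
}
```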
1 vote · 2 answers

Can MQTT v3 client work with MQTT v5 server?

I wanted to make use of the Shared Subscription feature of MQTT v5, but I am currently using the Akka MQTT client, which doesn't support MQTT v5. Can I still use the v3 MQTT Paho client and use the Shared Subscription feature?
saumilsdk
1 vote · 0 answers

Akka streams: Alpakka not using more than one CPU core

We have created an Alpakka stream which consumes Kafka messages from a topic and then processes them. The messages are processed in parallel, using mapAsyncUnordered with a configured parallelism. The Kafka lag for the consumer increases,…
1 vote · 0 answers

Inclusion of .futureValue method in scala.concurrent.Future?

I am working through a tutorial on getting data from an S3 bucket (the full test code can be found here). When I use the code val s3names: Source[Option[(Source[ByteString, NotUsed], ObjectMetadata)], NotUsed] = S3.download(bucket,…
Ryan Deschamps
1 vote · 0 answers

Number of Messages on akka-streams Backpressure

As well explained here thanks to Sean Glover, I have been exploring how backpressure works in Alpakka. As I see it, when the mailbox size is not defined, the default mailbox is unbounded. My question is about demand requests that downstream…
Soner Guzeloglu
1 vote · 0 answers

Is it possible to set up federation from code?

Is it possible to set up federation using pure AMQP protocol/headers? Right now I'm using the REST API to set up federation and un-federate an autogenerated queue, which is a bit cumbersome. Update: I'm using RabbitMQ 3.7.8, which is AMQP version 0.9 of the…
Lukasz Lenart
1 vote · 1 answer

implicit value for evidence parameter when iterating

My task is to get tables from JDBC and put them into S3. I have generated classes for these tables with the Slick schema code generator. If I write the code manually for every table, it works perfectly, like this: Slick.source(Tables.table1.result) …
uNxe
1 vote · 1 answer

Handling large XML files in Kafka

I am using Alpakka Kafka with a Scala application. My Kafka is running inside Docker and I am trying to publish a message to the Kafka producer using my code, which is as follows: def sendMsg(xmlFile: String): Future[Done] = { futureToFutureTry { …
user8856651
1 vote · 1 answer

How to write CSV file with headers using akka stream alpakka?

I can't seem to find it, hence I turn to Slack to ask: is there a way to write a CSV file with its headers using Akka Streams Alpakka? The only thing I see is https://doc.akka.io/docs/alpakka/current/data-transformations/csv.html#csv-formatting but no…
MaatDeamon
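For the headers question above, a minimal sketch under the assumption that CsvFormatting.format() itself has no header option: simply prepend the header row as the first element of the stream before formatting. The column names, rows, and output path are invented for the example.

```scala
import akka.actor.ActorSystem
import akka.stream.alpakka.csv.scaladsl.CsvFormatting
import akka.stream.scaladsl.{FileIO, Source}
import java.nio.file.Paths

object CsvWriteDemo extends App {
  implicit val system: ActorSystem = ActorSystem("csv-out")

  val header = List("name", "age")
  val rows   = List(List("alice", "30"), List("bob", "25"))

  // Prepend the header as an ordinary first row; CsvFormatting then
  // renders every row (header included) as a CSV line.
  Source(header :: rows)
    .via(CsvFormatting.format())
    .runWith(FileIO.toPath(Paths.get("out.csv")))
}
```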
1 vote · 1 answer

akka.http.scaladsl.model.ParsingException: Unexpected end of multipart entity while uploading a large file to S3 using akka http

I am trying to upload a large file (90 MB for now) to S3 using Akka HTTP with the Alpakka S3 connector. It works fine for small files (25 MB), but when I try to upload a large file (90 MB), I get the following…
Rishi
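For context on the Alpakka S3 side of such an upload, here is a hedged sketch of streaming bytes into S3.multipartUpload, which uploads in parts without buffering the whole file. The bucket, key, and file path are placeholders; credentials and S3 settings are assumed to come from the default configuration.

```scala
import akka.actor.ActorSystem
import akka.stream.alpakka.s3.scaladsl.S3
import akka.stream.scaladsl.FileIO
import java.nio.file.Paths

object S3UploadDemo extends App {
  implicit val system: ActorSystem = ActorSystem("s3")

  // Stream the file straight into a multipart upload; each chunk is
  // uploaded as a part, so the full 90 MB never sits in memory.
  FileIO.fromPath(Paths.get("big-file.bin"))
    .runWith(S3.multipartUpload("my-bucket", "big-file.bin"))
    .foreach(result => println(result.location))(system.dispatcher)
}
```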
1 vote · 1 answer

Akka Timeout exception but messages actually sent

I am working with a Scala 2.13 stack with the following technologies: Play Framework 2.8, Akka Typed 2.6.3, Alpakka Kafka 2.0.3. An Akka Streams job reads events from Kafka, asks an actor to compute something and, based on the given response, produces…
spi-x-i
1 vote · 1 answer

TimeoutException when consuming files from S3 with akka streams

I'm trying to consume a bunch of files from S3 in a streaming manner using Akka Streams: S3.listBucket("", Some("")) .flatMapConcat { r => S3.download("", r.key) } .mapConcat(_.toList) .flatMapConcat(_._1) …
shagoon
1 vote · 1 answer

How to handle a POST request with Kafka, Alpakka Kafka, Play Framework and Websocket?

Let's say I have two Kafka topics, request_topic for my POST requests and response_topic for my responses. These are the models: case class Request(requestId: String, body: String) case class Response(responseId: String, body: String, requestId:…
acmoune
1 vote · 1 answer

Alpakka kafka consumer offset

I am using Alpakka Kafka in Scala to consume a Kafka topic. Here's my code: val kafkaConsumerSettings: ConsumerSettings[String, String] = ConsumerSettings(actorSystem, new StringDeserializer, new StringDeserializer) …
Vasily802
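For reference alongside the question above, a hedged sketch of the usual Alpakka Kafka at-least-once consumer pattern: committableSource carries the offset with each message, and the offset is committed only after processing. The bootstrap server, group id, and topic name are placeholders.

```scala
import akka.actor.ActorSystem
import akka.kafka.{CommitterSettings, ConsumerSettings, Subscriptions}
import akka.kafka.scaladsl.{Committer, Consumer}
import org.apache.kafka.common.serialization.StringDeserializer

object ConsumerDemo extends App {
  implicit val system: ActorSystem = ActorSystem("kafka")

  val settings = ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("my-group")

  // Each message carries its committable offset, so commits happen
  // only after the message has actually been processed.
  Consumer.committableSource(settings, Subscriptions.topics("my-topic"))
    .map { msg => println(msg.record.value()); msg.committableOffset }
    .runWith(Committer.sink(CommitterSettings(system)))
}
```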
1 vote · 1 answer

What value to be passed for transaction Id in Transactional Flow api Alpakka

Using Alpakka, I want to consume records using the Transactional.source API and produce them to another topic using Transactional.flow, but the documentation says that we need to pass a transactionId. How should I create the transactionId, e.g. for the following…
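A hedged sketch of a transactional consume-transform-produce stream in Alpakka Kafka, assuming the transactionalId only needs to be a stable string that is unique per logical stream instance. The topic names, servers, and id value are placeholders.

```scala
import akka.actor.ActorSystem
import akka.kafka.{ConsumerSettings, ProducerMessage, ProducerSettings, Subscriptions}
import akka.kafka.scaladsl.Transactional
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

object TransactionalDemo extends App {
  implicit val system: ActorSystem = ActorSystem("tx")

  val consumerSettings = ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
    .withBootstrapServers("localhost:9092")
    .withGroupId("tx-group")
  val producerSettings = ProducerSettings(system, new StringSerializer, new StringSerializer)
    .withBootstrapServers("localhost:9092")

  // A stable, application-specific string, unique per stream instance.
  val transactionalId = "my-app-source-topic-1"

  Transactional.source(consumerSettings, Subscriptions.topics("source-topic"))
    .map { msg =>
      // Carry the partition offset through so the sink can commit it
      // inside the same Kafka transaction as the produced record.
      ProducerMessage.single(
        new ProducerRecord[String, String]("sink-topic", msg.record.value()),
        msg.partitionOffset)
    }
    .runWith(Transactional.sink(producerSettings, transactionalId))
}
```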