Questions tagged [alpakka]

Alpakka is the collective name for various Akka Streams connectors, integration patterns, and data transformations.

Alpakka is a community-driven initiative that provides connectors, integration patterns, and data transformations that are built with Akka Streams. This toolkit is meant to be a "modern alternative to Apache Camel" (hence its name, which is a homophone of "alpaca," a relative of the camel, and was first used as a code name for an old akka-camel module).

From an introductory blog post:

Akka Streams already has a lot that is useful for integrations. Defining processing pipelines is what the Akka Streams DSL is all about, and that is exactly what you need for operating on streaming data that cannot fit in memory as a whole. It handles backpressure in an efficient non-blocking way that prevents out-of-memory errors, which is a typical problem when using unbounded buffering with producers that are faster than consumers.
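To make that concrete, here is a minimal, self-contained sketch (all names are invented for the example) of the kind of pipeline the DSL describes: the downstream stage pulls elements on demand, so the effectively unbounded producer is never buffered wholesale.

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}

object BackpressureDemo extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer() // implicit via the ActorSystem on Akka 2.6+
  import system.dispatcher

  // An effectively unbounded producer: downstream demand (backpressure)
  // controls how fast elements are pulled, so memory use stays bounded.
  Source.fromIterator(() => Iterator.from(1))
    .map(_ * 2)                     // a processing stage in the pipeline
    .take(1000)                     // bound the demo so it terminates
    .runWith(Sink.foreach(println))
    .onComplete(_ => system.terminate())
}
```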

Connectors:

  • AMQP
  • Apache Geode
  • AWS DynamoDB
  • AWS Kinesis
  • AWS Lambda
  • AWS S3
  • AWS SNS
  • AWS SQS
  • Azure Storage Queue
  • Cassandra
  • Elasticsearch
  • File
  • FTP
  • Google Cloud Pub/Sub
  • HBase
  • IronMQ
  • JMS
  • MongoDB
  • MQTT
  • Server-sent Events (SSE)
  • Slick (JDBC)
  • Spring Web

Integration Patterns:

  • Splitter

Data Transformations:

  • Parsing Lines
  • JSON
  • Compressing/decompressing
  • Comma-Separated Values (CSV)
  • RecordIO Framing
  • Extensible Markup Language (XML)

217 questions
0 votes, 1 answer

Connecting a list of case classes to a Kafka producer?

I have the below case class: case class Alpakka(id:Int,name:String,animal_type:String) I am trying to connect a list of these case classes to a producer in Kafka using the following code: def connectEntriesToProducer(seq: Seq[Alpakka]) = { …
Nespony • 1,253 • 4 • 24 • 42
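The usual shape of the answer to the question above: turn the Seq into a Source, map each case class to a ProducerRecord, and run it into Alpakka Kafka's plain producer sink. A hedged sketch — the topic name, bootstrap servers, and the hand-rolled JSON are placeholders:

```scala
import akka.actor.ActorSystem
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.Source
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer

case class Alpakka(id: Int, name: String, animal_type: String)

object CaseClassesToKafka extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val producerSettings =
    ProducerSettings(system, new StringSerializer, new StringSerializer)
      .withBootstrapServers("localhost:9092")

  def connectEntriesToProducer(seq: Seq[Alpakka]) =
    Source(seq.toList)
      // Hand-rolled JSON purely for the sketch; use circe/spray-json in practice.
      .map(a => s"""{"id":${a.id},"name":"${a.name}","animal_type":"${a.animal_type}"}""")
      .map(json => new ProducerRecord[String, String]("animals", json))
      .runWith(Producer.plainSink(producerSettings))

  connectEntriesToProducer(Seq(Alpakka(1, "Paco", "alpaca")))
}
```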
0 votes, 0 answers

Akka Streams Kafka error handling - access to element that caused the problem in Kafka Producer

I see, for example, an exception like this: org.apache.kafka.common.errors.RecordTooLargeException: The message is 10000190 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size…
Piotr Kozlowski • 899 • 1 • 13 • 25
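The producer stage fails the whole stream on such an error, and the failing element is not carried by the exception. One hedged workaround (the size limit and topic name are assumptions) is to check the serialized size yourself and divert oversized elements before they ever reach the producer, so the offending record is still in hand:

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}
import org.apache.kafka.clients.producer.ProducerRecord

object OversizedRecordGuard extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val maxRequestSize = 1000000 // assumed: keep in sync with max.request.size

  Source(List(
    new ProducerRecord[String, String]("events", "small payload"),
    new ProducerRecord[String, String]("events", "x" * 2000000)))
    // Oversized records are diverted before they can fail the producer stage,
    // so the element that would have caused the error is available for
    // logging or a dead-letter topic.
    .divertTo(
      Sink.foreach[ProducerRecord[String, String]](r =>
        println(s"dead-lettering record of ${r.value.length} chars")),
      r => r.value.getBytes("UTF-8").length > maxRequestSize)
    .runWith(Sink.foreach(r => println(s"would produce: ${r.value}")))
  // In the real stream, the final sink would be Producer.plainSink(...).
}
```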
0 votes, 1 answer

Infinite AMQP Consumer with Alpakka

I'm trying to implement a very simple service connected to an AMQP broker with Alpakka. I just want it to consume messages from its queue as a stream at the moment they are pushed on a given exchange/topic. Everything seemed to work fine in my…
Nicolas Delaforge • 1,434 • 1 • 10 • 12
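One way to keep such a consumer alive indefinitely is to wrap the Alpakka AMQP source in a RestartSource so it reconnects with backoff when the broker drops the connection. The queue name is a placeholder, and the exact AMQP settings classes below are assumptions (their names changed between connector releases):

```scala
import scala.concurrent.duration._
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.alpakka.amqp.{AmqpLocalConnectionProvider, NamedQueueSourceSettings}
import akka.stream.alpakka.amqp.scaladsl.AmqpSource
import akka.stream.scaladsl.{RestartSource, Sink}

object InfiniteAmqpConsumer extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val queueName = "my-queue" // placeholder

  // A fresh AMQP source is created on every (re)start, so a dropped broker
  // connection means a reconnect with exponential backoff, not a dead stream.
  val restarting = RestartSource.withBackoff(
    minBackoff = 1.second,
    maxBackoff = 30.seconds,
    randomFactor = 0.2
  ) { () =>
    AmqpSource.atMostOnceSource(
      NamedQueueSourceSettings(AmqpLocalConnectionProvider, queueName),
      bufferSize = 10)
  }

  restarting.runWith(Sink.foreach(msg => println(msg.bytes.utf8String)))
}
```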
0 votes, 1 answer

Indefinite long polling with Alpakka Java

I am using this library: https://doc.akka.io/docs/alpakka/current/sqs.html for working with SQS. I am trying to create SQS long polling with it; they provide a snippet for reading messages from SQS: final CompletionStage> cs = …
Anand Chokshi • 17 • 1 • 7
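The question uses the Java DSL; below is a hedged sketch of the Scala side of the same connector. Long polling is the wait time on the source settings, and `withCloseOnEmptyReceive(false)` keeps the source polling indefinitely instead of completing on an empty receive. The queue URL and client wiring are placeholders, and the settings methods match the Alpakka SQS API around 1.x:

```scala
import scala.concurrent.duration._
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.alpakka.sqs.SqsSourceSettings
import akka.stream.alpakka.sqs.scaladsl.SqsSource
import akka.stream.scaladsl.Sink
import software.amazon.awssdk.services.sqs.SqsAsyncClient

object SqsLongPolling extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()
  implicit val sqs: SqsAsyncClient = SqsAsyncClient.create() // picked up implicitly

  val queueUrl = "https://sqs.eu-west-1.amazonaws.com/123456789012/my-queue" // placeholder

  SqsSource(
    queueUrl,
    SqsSourceSettings()
      .withWaitTime(20.seconds)         // SQS long polling; 20 s is the SQS maximum
      .withCloseOnEmptyReceive(false)   // keep polling instead of completing when empty
  ).runWith(Sink.foreach(msg => println(msg.body)))
}
```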
0 votes, 1 answer

How to stream Mongo data with Alpakka without any memory issue in less time

I'm new to Alpakka. I used the following code with the Alpakka MongoDB connector to fetch and loop through 100K records: // Using Stream def getAllContacts(user_id: Int, list_id: Int): Source[ListContact, NotUsed] = { …
Sujit Baniya • 895 • 9 • 27
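The connector itself streams with backpressure, so memory problems usually come from collecting results rather than from the source. A hedged sketch (database, collection, and field names are placeholders) that keeps everything streaming end to end:

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.alpakka.mongodb.scaladsl.MongoSource
import akka.stream.scaladsl.Sink
import org.mongodb.scala.MongoClient
import org.mongodb.scala.model.Filters

object StreamContacts extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val client = MongoClient("mongodb://localhost:27017")
  val contacts = client.getDatabase("crm").getCollection("list_contacts")

  // MongoSource wraps the driver's Observable; documents are pulled on demand,
  // so 100K+ records never have to sit in memory at once.
  def getAllContacts(userId: Int, listId: Int) =
    MongoSource(contacts.find(Filters.and(
      Filters.equal("user_id", userId),
      Filters.equal("list_id", listId))))

  // Process each document as it arrives instead of collecting a Seq first.
  getAllContacts(1, 1).runWith(Sink.foreach(doc => println(doc.toJson)))
}
```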
0 votes, 1 answer

Can alpakka-xml process multiple xml files?

I'm having trouble using Alpakka's XmlParsing Flow: val files: List[String] = ... // file paths locally on disk // simple source emitting the contents of 2 XML files val documentSource = FileIO.fromPath(Paths.get(files.head)) …
CPS • 531 • 1 • 9 • 18
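The parser stage is stateful, so a single XmlParsing.parser cannot be fed the concatenation of two documents. A hedged workaround is to give each file its own parser and concatenate the resulting event streams (file paths here are placeholders):

```scala
import java.nio.file.Paths
import akka.NotUsed
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.alpakka.xml.ParseEvent
import akka.stream.alpakka.xml.scaladsl.XmlParsing
import akka.stream.scaladsl.{FileIO, Sink, Source}

object MultiFileXml extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val files: List[String] = List("a.xml", "b.xml") // placeholder paths

  // One stateful parser instance per file; feeding the raw bytes of two
  // documents through a single parser is what trips the stage up.
  val events: Source[ParseEvent, NotUsed] =
    Source(files).flatMapConcat { path =>
      FileIO.fromPath(Paths.get(path)).via(XmlParsing.parser)
    }

  events.runWith(Sink.foreach(println))
}
```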
0 votes, 1 answer

akka stream alpakka csv: skip exception and parse next rows

I'm using Alpakka for parsing CSV files, version "com.lightbend.akka" %% "akka-stream-alpakka-csv" % 0.20. I have a CSV file with an unclosed quote. email test@emample.com "test@emample.com test@emample.com test@emample.com I want to skip bad rows and…
Slavik Muz • 1,157 • 1 • 15 • 28
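One hedged option is to attach a resuming supervision strategy to the parsing stage, so a malformed-CSV error drops the offending element instead of failing the stream. Whether the scanner's internal state recovers cleanly after an unclosed quote depends on the connector version, so this is worth testing against your data:

```scala
import akka.actor.ActorSystem
import akka.stream.{ActorAttributes, ActorMaterializer, Supervision}
import akka.stream.alpakka.csv.scaladsl.CsvParsing
import akka.stream.scaladsl.{Sink, Source}
import akka.util.ByteString

object SkipBadCsvRows extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  // Sample input with an unclosed quote, as in the question.
  val raw = Source.single(ByteString(
    "test@example.com\n\"test@example.com\ntest@example.com\n"))

  raw
    .via(CsvParsing.lineScanner())
    // Resume on a parse error: the failing element is dropped and the stream
    // carries on, rather than failing at the first unclosed quote.
    .withAttributes(ActorAttributes.supervisionStrategy(Supervision.resumingDecider))
    .runWith(Sink.foreach(fields => println(fields.map(_.utf8String))))
}
```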
0 votes, 2 answers

Akka Sink never closes

I am uploading a single file to an SFTP server using Alpakka, but once the file is uploaded and I have gotten the success response, the Sink stays open; how do I drain it? I started off with this: val sink = Sftp.toPath(path, settings, false) val…
spydon • 9,372 • 6 • 33 • 63
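The sink's materialized Future[IOResult] does complete once the upload finishes; what usually keeps the process alive is the ActorSystem (and the connector's SSH threads), so terminate it on completion. A hedged sketch reusing the question's Sftp.toPath signature (host, credentials, and paths are placeholders, and settings constructors vary across Alpakka releases):

```scala
import java.net.InetAddress
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.alpakka.ftp.{FtpCredentials, SftpSettings}
import akka.stream.alpakka.ftp.scaladsl.Sftp
import akka.stream.scaladsl.Source
import akka.util.ByteString

object UploadOnce extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()
  import system.dispatcher

  val settings = SftpSettings(InetAddress.getByName("localhost")) // placeholder host
    .withCredentials(FtpCredentials.create("user", "password"))

  val done = Source.single(ByteString("hello"))
    .runWith(Sftp.toPath("/upload/hello.txt", settings, append = false))

  done.onComplete { result =>
    println(s"upload finished: $result") // the materialized Future does complete...
    system.terminate()                   // ...but the ActorSystem keeps the JVM alive
  }
}
```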
0 votes, 1 answer

How can I reduce Kafka log file size with Alpakka?

I am doing data replication in Alpakka using Consumer.committableSource. But the size of the Kafka log file increases very quickly; it reaches 5 GB in a day. As a solution to this problem, I want to delete processed data immediately. I am using…
Ömer Çelik • 77 • 2 • 11
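Consuming (or committing) a record never deletes it from the broker; log size is governed by topic retention. Rather than deleting per message, one hedged approach is to lower the topic's retention through the Kafka AdminClient (topic name and values are placeholders; alterConfigs is deprecated in newer clients in favor of incrementalAlterConfigs):

```scala
import java.util.Properties
import scala.collection.JavaConverters._
import org.apache.kafka.clients.admin.{AdminClient, AdminClientConfig, Config, ConfigEntry}
import org.apache.kafka.common.config.ConfigResource

object ShrinkRetention extends App {
  val props = new Properties()
  props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
  val admin = AdminClient.create(props)

  val topic = new ConfigResource(ConfigResource.Type.TOPIC, "replication-topic")
  val retention = new Config(List(
    new ConfigEntry("retention.ms", "3600000"),      // keep data for one hour
    new ConfigEntry("retention.bytes", "1073741824") // and cap at ~1 GiB per partition
  ).asJava)

  // Broker-side log cleanup then removes old segments on its own schedule.
  admin.alterConfigs(Map(topic -> retention).asJava).all().get()
  admin.close()
}
```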
0 votes, 1 answer

Is there Alpakka SOAP support?

I have an application written in Scala/Akka and am trying to add SOAP support. Has anyone done it with Alpakka? Or what is the best way to do it? So far I think Camel is the best solution.
karjan • 936 • 1 • 7 • 17
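There is no dedicated Alpakka SOAP connector; since SOAP is XML over HTTP, a plain Akka HTTP call is often enough. A minimal hedged sketch (endpoint and envelope are placeholders):

```scala
import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model._
import akka.stream.ActorMaterializer

object SoapCall extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()
  import system.dispatcher

  // Placeholder SOAP 1.2 envelope; a real client would marshal from the WSDL types.
  val envelope =
    """<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
      |  <soap:Body><!-- request payload --></soap:Body>
      |</soap:Envelope>""".stripMargin

  val request = HttpRequest(
    method = HttpMethods.POST,
    uri = "https://example.com/service", // placeholder endpoint
    entity = HttpEntity(ContentTypes.`text/xml(UTF-8)`, envelope))

  Http().singleRequest(request).foreach(resp => println(resp.status))
}
```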
0 votes, 1 answer

Connect 1 input to n outputs with Alpakka

I'm trying some variation of connecting a producer to a consumer with the special case that sometimes I'd need to produce 1 extra message per message (e.g. 1 to the output topic and 1 message to a different topic) while keeping guarantees on that.…
fd8s0 • 1,897 • 1 • 15 • 29
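Alpakka Kafka's ProducerMessage envelope supports exactly this: a multi-message wraps several ProducerRecords that are all produced before the attached consumer offset is committed. A hedged sketch (topic names and settings are placeholders; the API names match akka-stream-kafka around 1.0):

```scala
import akka.actor.ActorSystem
import akka.kafka.{ConsumerSettings, ProducerMessage, ProducerSettings, Subscriptions}
import akka.kafka.scaladsl.{Consumer, Producer}
import akka.stream.ActorMaterializer
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}

object FanOutProducer extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val consumerSettings =
    ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
      .withBootstrapServers("localhost:9092")
      .withGroupId("fanout")
  val producerSettings =
    ProducerSettings(system, new StringSerializer, new StringSerializer)
      .withBootstrapServers("localhost:9092")

  Consumer.committableSource(consumerSettings, Subscriptions.topics("in"))
    .map { msg =>
      // Both records are produced before the attached offset is committed,
      // giving at-least-once semantics for the pair as a whole.
      ProducerMessage.multi(
        List(
          new ProducerRecord[String, String]("out", msg.record.value),
          new ProducerRecord[String, String]("audit", msg.record.value)),
        msg.committableOffset)
    }
    .runWith(Producer.committableSink(producerSettings))
}
```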
0 votes, 0 answers

Akka file streaming throws error: akka.http.scaladsl.model.EntityStreamException: Entity stream truncation

We are streaming a file from S3 and processing it; after processing completes, we upload the file back to S3 as an Error/Archive file. While streaming the file from S3, it streams data and then in between stops processing with an error as…
Learner • 45 • 1 • 6
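Entity stream truncation usually means the underlying HTTP connection was closed mid-download (idle timeouts, flaky network). A hedged, generic mitigation is to restart the download with backoff; note that `s3ObjectSource` below is a hypothetical stand-in for however the S3 byte stream is obtained, and the naive restart re-reads the object from the beginning:

```scala
import scala.concurrent.duration._
import akka.NotUsed
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{RestartSource, Sink, Source}
import akka.util.ByteString

object ResilientDownload extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  // Hypothetical: produce the S3 object's bytes (e.g. via Alpakka's S3 download).
  def s3ObjectSource(): Source[ByteString, NotUsed] = ???

  val resilient = RestartSource.onFailuresWithBackoff(
    minBackoff = 1.second,
    maxBackoff = 30.seconds,
    randomFactor = 0.2,
    maxRestarts = 5          // give up eventually rather than looping forever
  )(() => s3ObjectSource())  // a fresh download attempt per restart

  // For large files, resuming via ranged requests would be needed instead
  // of re-reading from byte zero on every failure.
  resilient.runWith(Sink.ignore)
}
```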
0 votes, 1 answer

Alpakka CassandraSource: read data from Cassandra continuously

We are doing a POC to read a Cassandra table continuously using Alpakka CassandraSource. The following is the sample code: final Statement stmt = new SimpleStatement("SELECT * FROM testdb.emp1").setFetchSize(20); final CompletionStage> rows = …
Uday Matta • 56 • 2
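CassandraSource completes once the result set is exhausted; it does not tail the table. A hedged way to poll continuously is to re-run the finite query on a schedule (the interval, contact point, and the older driver/CassandraSource API below are assumptions):

```scala
import scala.concurrent.duration._
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.alpakka.cassandra.scaladsl.CassandraSource
import akka.stream.scaladsl.{Sink, Source}
import com.datastax.driver.core.{Cluster, Session, SimpleStatement}

object PollingCassandra extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()
  implicit val session: Session =
    Cluster.builder.addContactPoint("127.0.0.1").build.connect()

  val stmt = new SimpleStatement("SELECT * FROM testdb.emp1").setFetchSize(20)

  // Re-issue the finite query every 10 seconds; ticks are backpressured,
  // so a slow downstream simply delays the next poll.
  Source.tick(0.seconds, 10.seconds, ())
    .flatMapConcat(_ => CassandraSource(stmt))
    .runWith(Sink.foreach(row => println(row)))
}
```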
0 votes, 2 answers

Alpakka s3 `multipartUpload` doesn't upload files

I have a question regarding the alpakka_kafka + alpakka_s3 integration. Alpakka S3 multipartUpload doesn't seem to upload files when I use Alpakka Kafka sources. kafkaSource ~> kafkaSubscriber.serializer.deserializeFlow ~> bcast.in bcast.out(0)…
Yuan Zhao • 479 • 4 • 6
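A multipart upload is only finalized when the stream feeding it completes; with an infinite Kafka source, the sink keeps buffering parts (minimum part size around 5 MB) and never finishes. A hedged sketch that bounds the stream so the materialized Future can complete (bucket, key, and topic are placeholders; the S3 API shown is the Alpakka 1.x scaladsl):

```scala
import akka.actor.ActorSystem
import akka.kafka.{ConsumerSettings, Subscriptions}
import akka.kafka.scaladsl.Consumer
import akka.stream.ActorMaterializer
import akka.stream.alpakka.s3.scaladsl.S3
import akka.util.ByteString
import org.apache.kafka.common.serialization.StringDeserializer

object KafkaToS3 extends App {
  implicit val system: ActorSystem = ActorSystem("demo")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val consumerSettings =
    ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
      .withBootstrapServers("localhost:9092")
      .withGroupId("s3-upload")

  val upload = Consumer.plainSource(consumerSettings, Subscriptions.topics("events"))
    .map(record => ByteString(record.value + "\n"))
    .take(100000) // bound the stream: the upload only completes on stream completion
    .runWith(S3.multipartUpload("my-bucket", "dump/events.txt"))

  upload.foreach(result => println(s"uploaded to ${result.location}"))(system.dispatcher)
}
```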
0 votes, 0 answers

Slick result set size default limit

I wonder if there is a default result-set size when performing a DBAction with the Slick Akka Streams integration (I am using akka-stream slick). When I write the following query: Slick.source(sql"""select * FROM…
MaatDeamon • 9,532 • 9 • 60 • 127