Questions tagged [alpakka]

Alpakka is the collective name for various Akka Streams connectors, integration patterns, and data transformations.

Alpakka is a community-driven initiative that provides connectors, integration patterns, and data transformations that are built with Akka Streams. This toolkit is meant to be a "modern alternative to Apache Camel" (hence its name, which is a homophone of "alpaca," a relative of the camel, and was first used as a code name for an old akka-camel module).

From an introductory blog post:

Akka Streams already has a lot that is useful for integrations. Defining processing pipelines is what the Akka Streams DSL is all about, and that is exactly what you need for operating on streaming data that cannot fit in memory as a whole. It handles backpressure in an efficient, non-blocking way that prevents the out-of-memory errors typical of unbounded buffering with producers that are faster than consumers.
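
The demand-driven ("pull") idea behind backpressure can be sketched in plain Scala, with no Akka dependency (all names below are illustrative, not Alpakka API): a producer that could run arbitrarily fast only does work when the consumer asks for the next element, so buffering stays bounded.

```scala
// Conceptual sketch only -- lazy, pull-based production, not the Akka Streams API.
var produced = 0

// An effectively unbounded producer; Iterator is lazy, so nothing runs yet.
val producer = Iterator.from(1).map { n => produced += 1; n }

// The consumer pulls exactly five elements; the producer never runs ahead,
// so memory use stays bounded no matter how fast the producer could be.
val consumed = producer.take(5).toList

println(consumed) // List(1, 2, 3, 4, 5)
println(produced) // 5 -- only what was demanded was produced
```

Akka Streams generalizes this idea across asynchronous boundaries: demand is signaled upstream in a non-blocking way rather than through a synchronous pull, which is what makes the connectors listed below safe for fast producers and slow consumers.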

Connectors:

  • AMQP
  • Apache Geode
  • AWS DynamoDB
  • AWS Kinesis
  • AWS Lambda
  • AWS S3
  • AWS SNS
  • AWS SQS
  • Azure Storage Queue
  • Cassandra
  • Elasticsearch
  • File
  • FTP
  • Google Cloud Pub/Sub
  • HBase
  • IronMQ
  • JMS
  • MongoDB
  • MQTT
  • Server-sent Events (SSE)
  • Slick (JDBC)
  • Spring Web

Integration Patterns:

  • Splitter

Data Transformations:

  • Parsing Lines
  • JSON
  • Compressing/decompressing
  • Comma-Separated Values (CSV)
  • RecordIO Framing
  • Extensible Markup Language (XML)

217 questions
0
votes
1 answer

Actor is getting killed before processing the message

I am using Akka Streams, and I have one actor with this functionality. Both messages were sometimes processed in order, but due to an async call (I think) I am getting dead letters with this: actor.tell(message,ActorRef.noSender());…
Mayank
  • 21
  • 2
0
votes
1 answer

Extra bytes with KafkaAvroSerializer

My setup is as follows: I'm retrieving XML files from an FTP server, unmarshalling them into POJOs, mapping those into an Avro-generated class, and then forwarding them into Alpakka's Producer Sink like so: Ftp.ls("/", ftpSettings) .filter(FtpFile::isFile) …
0
votes
1 answer

Streaming download multiple files from S3 as zip through Akka HTTP or Play

I have an S3 structure that's the result of a Spark job that writes partitioned CSV files like below. bucketA output cleaned-data1 part000....csv part001....csv part002....csv cleaned-data2 ..... What I need is to…
0
votes
1 answer

Can't connect to vsftpd server with alpakka (akka-streams)

I'm trying to recreate the traversal example for the Alpakka FTP-Source connector with a vsftpd server in a Docker image, but I can't seem to connect. Any pointers on how to adjust the code would be very welcome: FtpSettings ftpSettings = FtpSettings …
styps
  • 279
  • 2
  • 14
0
votes
1 answer

Force alpakka kafka consumer show error message on deserialization error

The Alpakka Kafka consumer processes records until it encounters one that it fails to deserialize, then silently dies without leaving an error message. How can I force it to report an error message?
Jeriho
  • 7,129
  • 9
  • 41
  • 57
0
votes
1 answer

not found: value fromRegistries

I am trying to use https://doc.akka.io/docs/alpakka/current/mongodb.html as follows: import akka.actor.ActorSystem import akka.stream.ActorMaterializer import cats.data.Chain import com.mongodb.reactivestreams.client.MongoClients import…
softshipper
  • 32,463
  • 51
  • 192
  • 400
0
votes
1 answer

MongoDB take the right reference with pattern matching

I am trying to insert records into MongoDB depending on their datatype, using the https://doc.akka.io/docs/alpakka/current/mongodb.html API. Let's first take a look at the datatype: sealed trait MsgDoc { } final case class MsgPreFailure(raw: String,…
softshipper
  • 32,463
  • 51
  • 192
  • 400
0
votes
1 answer

Why Logging is Not Working for Akka Stream

I am using Alpakka and have the toy example below: val system = ActorSystem("system") implicit val materializer: ActorMaterializer = ActorMaterializer.create(system) implicit val adapter: LoggingAdapter = Logging(system, "customLogger") implicit val…
Mojo
  • 1,152
  • 1
  • 8
  • 16
0
votes
1 answer

Commit to kafka consumer after response from MongoSink - alpakka mongo connector

I am using the Alpakka connector to consume packets from Kafka and insert them into MongoDB. I was trying to commit the offset after getting a response from MongoDB, but couldn't find anything about this. How can I make sure that the offset will be…
0
votes
1 answer

Multiple Consumer threads using Alpakka connector

I am using the Alpakka Kafka connector to consume packets from Kafka, with the consumer as a CommittableSource. I would like to create multiple consumer threads on a single machine and use them as a single source. How can I achieve that? Currently, I…
0
votes
1 answer

Saving messages from different topics to different files in Alpakka

I'm trying to figure out how to pass messages from a Kafka consumer subscribed to multiple topics to a processing stage based on a topic (e.g. save them to a specific file or a database or whatever). There is a Consumer.externalCommittableSource but…
synapse
  • 5,588
  • 6
  • 35
  • 65
0
votes
1 answer

What is the purpose of createDrainingControl?

I was reading through the documentation of Alpakka. While reading through the Kafka consumer API, I came across createDrainingControl(), and I was wondering what the use of this function is. I understand that it is used to drain and stop the stream, but…
Saksham
  • 127
  • 3
  • 9
0
votes
1 answer

Possible encoding issue with Google PubSub

When running a subscription source from the Alpakka PubSub library, I receive what appears to be encoded data. @Singleton class Consumer @Inject()(config: Configuration, credentialsService: google.creds.Service)(implicit actorSystem: ActorSystem) { …
Titan Chase
  • 101
  • 1
  • 6
0
votes
1 answer

Alpakka JMS request/response with temporary queues - possible out of the box?

Thinking of migrating some JMS-based legacy code to Alpakka: one of the widely used patterns in the code is request/response with a temporary queue (JMSReplyTo). Is this possible with Alpakka out of the box?
bobah
  • 18,364
  • 2
  • 37
  • 70
0
votes
2 answers

How to create several partitions by Alpakka

I'm trying to create a simple producer which creates a topic with some partitions provided by configuration. According to the Alpakka Producer Settings documentation, any property from org.apache.kafka.clients.producer.ProducerConfig can be set in kafka-clients…
Sergio Rodríguez Calvo
  • 1,183
  • 2
  • 16
  • 32