Questions tagged [alpakka]

Alpakka is the collective name for various Akka Streams connectors, integration patterns, and data transformations.

Alpakka is a community-driven initiative that provides connectors, integration patterns, and data transformations that are built with Akka Streams. This toolkit is meant to be a "modern alternative to Apache Camel" (hence its name, which is a homophone of "alpaca," a relative of the camel, and was first used as a code name for an old akka-camel module).

From an introductory blog post:

Akka Streams already provides a lot of functionality that is useful for integrations. Defining processing pipelines is what the Akka Streams DSL is all about, and that is exactly what you need for operating on streaming data that cannot fit in memory as a whole. It handles backpressure in an efficient, non-blocking way that prevents out-of-memory errors, which is a typical problem when using unbounded buffering with producers that are faster than consumers.
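
As a minimal illustration of that backpressure behavior (a sketch, not taken from the blog post; the throttle stands in for any slow consumer, and Akka 2.6-style materialization from an implicit ActorSystem is assumed):

    import akka.actor.ActorSystem
    import akka.stream.scaladsl.{Sink, Source}
    import scala.concurrent.duration._

    object BackpressureDemo extends App {
      implicit val system: ActorSystem = ActorSystem("demo")

      // Demand flows upstream from the sink, so the source only emits as
      // fast as the throttled consumer can accept -- no unbounded buffer,
      // no OutOfMemoryError, even for a very large source.
      Source(1 to 1000000)
        .map(_ * 2)
        .throttle(10, 1.second) // simulate a slow consumer
        .runWith(Sink.foreach(println))
    }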

Connectors:

  • AMQP
  • Apache Geode
  • AWS DynamoDB
  • AWS Kinesis
  • AWS Lambda
  • AWS S3
  • AWS SNS
  • AWS SQS
  • Azure Storage Queue
  • Cassandra
  • Elasticsearch
  • File
  • FTP
  • Google Cloud Pub/Sub
  • HBase
  • IronMq
  • JMS
  • MongoDB
  • MQTT
  • Server-sent Events (SSE)
  • Slick (JDBC)
  • Spring Web

Integration Patterns:

  • Splitter

Data Transformations:

  • Parsing Lines
  • JSON
  • Compressing/decompressing
  • Comma-Separated Values (CSV)
  • RecordIO Framing
  • Extensible Markup Language (XML)

217 questions
0 votes, 2 answers

Unable to find akka Configurations with Spark Submit

I built a fat jar and am trying to run it with spark-submit on EMR or locally. Here is the command: spark-submit \ --deploy-mode client \ --class com.stash.data.omni.source.Runner myJar.jar \ I keep getting an error related to…
asked by collarblind
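
A frequent cause of this particular error is Akka's per-module reference.conf files being lost when the fat jar is built. A hedged sketch of the usual fix, assuming the sbt-assembly plugin with sbt 1.x slash syntax:

    // build.sbt -- Akka ships defaults in per-module reference.conf files;
    // they must be concatenated, not deduplicated, in a fat jar.
    assembly / assemblyMergeStrategy := {
      case "reference.conf" => MergeStrategy.concat
      case x =>
        val oldStrategy = (assembly / assemblyMergeStrategy).value
        oldStrategy(x)
    }

Recent sbt-assembly versions already concatenate reference.conf by default, so the error often indicates a custom merge strategy (e.g. MergeStrategy.first for everything) that silently drops Akka's configuration.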
0 votes, 0 answers

SFTP Uploading File Success But Can't Terminate

I am uploading a file to an SFTP server with Alpakka. The code below uploads successfully, but I am unable to shut down the system. Even though the stream completes successfully, this error occurs: [ERROR] [03/04/2020 12:40:26.996]…
asked by collarblind
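
The stream completing does not stop the ActorSystem, whose non-daemon threads keep the JVM alive. A minimal shutdown sketch; the host, credentials, and paths are hypothetical:

    import java.net.InetAddress
    import java.nio.file.Paths
    import akka.actor.ActorSystem
    import akka.stream.IOResult
    import akka.stream.alpakka.ftp.{FtpCredentials, SftpSettings}
    import akka.stream.alpakka.ftp.scaladsl.Sftp
    import akka.stream.scaladsl.FileIO
    import scala.concurrent.Future
    import scala.util.{Failure, Success}

    implicit val system: ActorSystem = ActorSystem("sftp-upload")
    import system.dispatcher

    // Hypothetical connection settings for illustration.
    val settings = SftpSettings(InetAddress.getByName("sftp.example.com"))
      .withCredentials(FtpCredentials.create("user", "secret"))

    val upload: Future[IOResult] =
      FileIO.fromPath(Paths.get("local.txt"))
        .runWith(Sftp.toPath("/remote/local.txt", settings))

    // Terminate the actor system once the upload future completes;
    // otherwise the process never exits.
    upload.onComplete { result =>
      result match {
        case Success(_) => println("upload done")
        case Failure(e) => println(s"upload failed: $e")
      }
      system.terminate()
    }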
0 votes, 1 answer

Download pdf file from s3 using akka-stream-alpakka and store it as an array of bytes

I am trying to download a PDF file from S3 using the akka-stream-alpakka connector. I have the S3 path and try to download the PDF using a wrapper method over the Alpakka s3Client. def getSource(s3Path: String): Source[ByteString, NotUsed] = { val…
asked by Chaitanya
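
A hedged sketch of one way to do this, assuming the classic S3.download API (which emits a single optional inner source) and credentials configured via application.conf; bucket and key are illustrative:

    import akka.actor.ActorSystem
    import akka.stream.alpakka.s3.scaladsl.S3
    import akka.stream.scaladsl.Sink
    import akka.util.ByteString
    import scala.concurrent.Future

    implicit val system: ActorSystem = ActorSystem("s3-download")
    import system.dispatcher

    // Fold the streamed chunks into one ByteString, then convert to
    // Array[Byte]; the future fails if the object does not exist.
    def downloadAsBytes(bucket: String, key: String): Future[Array[Byte]] =
      S3.download(bucket, key)
        .runWith(Sink.head)
        .flatMap {
          case Some((data, _)) =>
            data.runFold(ByteString.empty)(_ ++ _).map(_.toArray)
          case None =>
            Future.failed(new NoSuchElementException(s"s3://$bucket/$key not found"))
        }

Folding the whole object into memory only makes sense for small files; for large ones, keep the data streaming instead.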
0 votes, 2 answers

Download pdf file from s3 using akka-stream-alpakka

I am trying to download a PDF file from S3 using the akka-stream-alpakka connector. I have the S3 path and try to download the PDF using a wrapper method over the Alpakka s3Client. def getSource(s3Path: String): Source[ByteString, NotUsed] = { val…
asked by Chaitanya
0 votes, 3 answers

Slick plain sql query with pagination

I have something like this, using Akka and Alpakka + Slick: Slick.source( sql"""select #${onlyTheseColumns.mkString(",")} from #${dbSource.table}""" .as[Map[String, String]] .withStatementParameters(rsType = ResultSetType.ForwardOnly,…
asked by s.illes79
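
With plain SQL, the usual approach to pagination is to splice LIMIT/OFFSET literals with #$, the same unchecked interpolation the question already uses for columns and table. A sketch, assuming a SlickSession config named slick-postgres and a hypothetical two-column row:

    import akka.NotUsed
    import akka.stream.alpakka.slick.scaladsl.{Slick, SlickSession}
    import akka.stream.scaladsl.Source
    import slick.jdbc.GetResult

    implicit val session: SlickSession = SlickSession.forConfig("slick-postgres")
    import session.profile.api._

    // Hypothetical row type and plain-SQL reader for illustration.
    case class Row(id: Long, name: String)
    implicit val getRow: GetResult[Row] = GetResult(r => Row(r.<<, r.<<))

    // Each call produces one page; #$ splices unchecked SQL literals.
    def page(table: String, limit: Int, offset: Long): Source[Row, NotUsed] =
      Slick.source(
        sql"select id, name from #$table limit #$limit offset #$offset".as[Row])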
0 votes, 1 answer

Akka Alpakka SqsSource weirdly enough can work with queueUrl and also with queueName

I'm using Akka Streams, and also alpakka.sqs.scaladsl, to read messages from an SQS queue. I've done this many times, but now I uploaded a version that puts the queue name in the source instead of the queue URL, which is how I did it all the time. This is…
asked by JohnBigs
0 votes, 1 answer

How to access metrics of Alpakka CommittableSource with back off?

Accessing the metrics of an Alpakka PlainSource seems fairly straightforward, but how can I do the same thing with a CommittableSource? I currently have a simple consumer, something like this: class Consumer(implicit val ma: ActorMaterializer,…
asked by tjarvstrand
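
The Control materialized by a committable source exposes the same metrics as a plain one; the complication with backoff is that wrapping the source in a RestartSource hides the materialized Control. A sketch of the unwrapped case (broker address, group, and topic are illustrative):

    import akka.actor.ActorSystem
    import akka.kafka.scaladsl.Consumer
    import akka.kafka.{ConsumerSettings, Subscriptions}
    import akka.stream.scaladsl.Sink
    import org.apache.kafka.common.serialization.StringDeserializer

    implicit val system: ActorSystem = ActorSystem("consumer")
    import system.dispatcher

    val settings =
      ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
        .withBootstrapServers("localhost:9092")
        .withGroupId("my-group")

    // The materialized Control exposes the underlying Kafka consumer's
    // metrics for committable sources just as for plain ones.
    val control: Consumer.Control =
      Consumer
        .committableSource(settings, Subscriptions.topics("my-topic"))
        .map(_.record.value)
        .to(Sink.ignore)
        .run()

    control.metrics.foreach(_.foreach { case (name, metric) =>
      println(s"${name.name}: ${metric.metricValue}")
    })

When a RestartSource provides the backoff, one common workaround is to capture each incarnation's Control from inside the wrapper via mapMaterializedValue into an AtomicReference.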
0 votes, 1 answer

Fan-in operator that accumulates on one inlet

I would like to have an akka-stream operator with two inlets: on one inlet it receives metadata about messages, and on the second the messages themselves. The problem is that while metadata is received for one message at a time, messages are…
asked by gurghet
0 votes, 1 answer

Alpakka S3 download file from bucket, save to file, and have filename available for next part in flow

I am trying to build code that consumes S3 keys, downloads the corresponding files from S3, saves the data to a file on disk under the key name (required for a process further along the flow), and returns the key/filename as output. What I have till so…
asked by Aktaeon
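
A hedged sketch of the shape this usually takes, with S3.download's optional inner source flattened into a FileIO sink and the key re-emitted once the write finishes; the bucket and the key-to-filename mapping are illustrative:

    import java.nio.file.{Path, Paths}
    import akka.NotUsed
    import akka.actor.ActorSystem
    import akka.stream.alpakka.s3.scaladsl.S3
    import akka.stream.scaladsl.{FileIO, Flow}

    implicit val system: ActorSystem = ActorSystem("s3-to-disk")
    import system.dispatcher

    // For each key: download the object, write it to a file named after
    // the key, and emit the key downstream once the write completes.
    def downloadToDisk(bucket: String): Flow[String, String, NotUsed] =
      Flow[String].mapAsync(parallelism = 4) { key =>
        val target: Path = Paths.get(key.replace('/', '_'))
        S3.download(bucket, key)
          .collect { case Some((data, _)) => data }
          .flatMapConcat(identity)
          .runWith(FileIO.toPath(target))
          .map(_ => key)
      }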
0 votes, 1 answer

Not able to consume messages from Kafka Consumer using Alpakka

I am trying to consume messages from Kafka using Alpakka. I don't get any error from the Akka actors saying the consumer has stopped, but it fails to consume any messages. Below is my code: val consumerSettings =…
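
The most common culprit for a consumer that starts cleanly but never receives anything is a new consumer group starting at the latest offset. A minimal sketch with that setting made explicit (addresses and names are illustrative):

    import akka.actor.ActorSystem
    import akka.kafka.scaladsl.Consumer
    import akka.kafka.{ConsumerSettings, Subscriptions}
    import akka.stream.scaladsl.Sink
    import org.apache.kafka.clients.consumer.ConsumerConfig
    import org.apache.kafka.common.serialization.StringDeserializer

    implicit val system: ActorSystem = ActorSystem("kafka-consumer")

    val consumerSettings =
      ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
        .withBootstrapServers("localhost:9092")
        .withGroupId("my-group")
        // A new group otherwise starts at the latest offset and silently
        // sees nothing until fresh records arrive.
        .withProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")

    Consumer
      .plainSource(consumerSettings, Subscriptions.topics("my-topic"))
      .runWith(Sink.foreach(record => println(record.value)))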
0 votes, 1 answer

Streaming events from Kafka to Couchbase using Akka Stream and Kafka offset committing

I'm trying to design an Akka Stream using Alpakka to read events from a Kafka topic and put them into Couchbase. So far I have the following code, and it seems to work somehow: Consumer .committableSource(consumerSettings,…
asked by Alex Sergeenko
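
The usual shape for at-least-once delivery here is to commit an offset only after the corresponding write succeeds, using Committer.sink. A sketch, with the Couchbase write reduced to a hypothetical writeToCouchbase function:

    import akka.actor.ActorSystem
    import akka.kafka.scaladsl.{Committer, Consumer}
    import akka.kafka.{CommitterSettings, ConsumerSettings, Subscriptions}
    import akka.stream.scaladsl.Keep
    import org.apache.kafka.common.serialization.StringDeserializer
    import scala.concurrent.Future

    implicit val system: ActorSystem = ActorSystem("kafka-to-couchbase")
    import system.dispatcher

    // Hypothetical stand-in for the actual Couchbase upsert.
    def writeToCouchbase(json: String): Future[Unit] = Future.unit

    val consumerSettings =
      ConsumerSettings(system, new StringDeserializer, new StringDeserializer)
        .withBootstrapServers("localhost:9092")
        .withGroupId("couchbase-sink")

    // Offsets are committed only after the write succeeds, giving
    // at-least-once semantics into Couchbase.
    val control =
      Consumer
        .committableSource(consumerSettings, Subscriptions.topics("events"))
        .mapAsync(4) { msg =>
          writeToCouchbase(msg.record.value).map(_ => msg.committableOffset)
        }
        .toMat(Committer.sink(CommitterSettings(system)))(Keep.both)
        .mapMaterializedValue(Consumer.DrainingControl.apply)
        .run()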
0 votes, 0 answers

Configuring Akka Alpakka kafka to investigate hanging consumers

I want to log a Kafka stream properly and efficiently for investigating my use case, which is presented below. The problem that I am trying to solve is reported on GitHub (https://github.com/akka/alpakka-kafka/issues/899) as a bug, but it might be that…
asked by MaatDeamon
0 votes, 1 answer

How can I use a value in an Akka Stream to instantiate a GooglePubSub Flow?

I'm attempting to create a Flow to be used with a Source queue. I would like this to work with the Alpakka Google PubSub connector: https://doc.akka.io/docs/alpakka/current/google-cloud-pub-sub.html In order to use this connector, I need to create a…
asked by Jacob Goodwin
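
In general, a flow that depends on a per-element value cannot be swapped in mid-stream, but each element can be routed through a freshly built flow with flatMapConcat. A generic sketch with a hypothetical makeFlow factory standing in for the PubSub publish flow:

    import akka.NotUsed
    import akka.stream.scaladsl.{Flow, Source}

    // Hypothetical factory standing in for code that builds a publish
    // flow from a value such as a topic name.
    def makeFlow(topic: String): Flow[String, String, NotUsed] =
      Flow[String].map(msg => s"[$topic] $msg")

    // Route each (topic, message) pair through a flow built for it.
    val perElement: Flow[(String, String), String, NotUsed] =
      Flow[(String, String)].flatMapConcat { case (topic, msg) =>
        Source.single(msg).via(makeFlow(topic))
      }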
0 votes, 1 answer

How to handle `MongoException`?

I don't know how to detect and handle MongoException in the case of a MongoDB disconnect: val pingCmd: Publisher[bson.Document] = mongoCollFactory.db.runCommand(BsonDocument.parse("""{"ping": 1}""")) I want to detect the MongoException which does not connect to the…
asked by funnyDev
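
One way to surface the failure, as a sketch: wrap the publisher in an Akka Streams source, where a MongoException fails the stream and can be handled with recover:

    import akka.actor.ActorSystem
    import akka.stream.scaladsl.{Sink, Source}
    import com.mongodb.MongoException
    import org.reactivestreams.Publisher

    implicit val system: ActorSystem = ActorSystem("mongo-ping")

    // A MongoException raised while the ping runs fails the stream,
    // which `recover` converts into a value instead of a crash.
    def checkConnection(pingCmd: Publisher[org.bson.Document]): Unit =
      Source.fromPublisher(pingCmd)
        .map(_ => "connected")
        .recover { case _: MongoException => "disconnected" }
        .runWith(Sink.foreach(println))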
0 votes, 1 answer

akka stream alpakka csv: Stream is failing when the wrong number of columns read from the CSV file

I am reading a CSV file from a remote location (FTP) and the file has an invalid number of columns. The stream stops progressing when such rows are encountered in the file. I need to skip them with an error message and proceed. Here is what I have tried,…
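
One hedged way to get skip-and-log behavior: frame the bytes into lines first, then parse and validate each line in its own small substream, so a bad row fails only that substream and recover turns it into a loggable value. The expected column count and the sample input are illustrative:

    import akka.actor.ActorSystem
    import akka.stream.alpakka.csv.scaladsl.CsvParsing
    import akka.stream.scaladsl.{Framing, Sink, Source}
    import akka.util.ByteString

    implicit val system: ActorSystem = ActorSystem("csv")

    val expectedColumns = 3

    // Each line is parsed in isolation; a malformed row fails only its
    // own substream, which `recover` converts into a Left to be logged.
    val rows =
      Source.single(ByteString("a,b,c\nbad\nx,y,z\n")) // stand-in for the FTP source
        .via(Framing.delimiter(ByteString("\n"), maximumFrameLength = 65536, allowTruncation = true))
        .flatMapConcat { line =>
          Source.single(line)
            .via(CsvParsing.lineScanner())
            .map { cols =>
              if (cols.size != expectedColumns)
                throw new IllegalArgumentException(
                  s"expected $expectedColumns columns, got ${cols.size}")
              Right(cols.map(_.utf8String))
            }
            .recover { case e => Left(s"skipped '${line.utf8String}': ${e.getMessage}") }
        }

    rows.runWith(Sink.foreach {
      case Right(cols) => println(cols)
      case Left(err)   => system.log.warning(err)
    })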