Questions tagged [alpakka]

Alpakka is the collective name for various Akka Streams connectors, integration patterns, and data transformations.

Alpakka is a community-driven initiative that provides connectors, integration patterns, and data transformations that are built with Akka Streams. This toolkit is meant to be a "modern alternative to Apache Camel" (hence its name, which is a homophone of "alpaca," a relative of the camel, and was first used as a code name for an old akka-camel module).

From an introductory blog post:

Akka Streams already has a lot of functionality that is useful for integrations. Defining processing pipelines is what the Akka Streams DSL is all about, and that is exactly what you need for operating on streaming data that cannot fit in memory as a whole. It handles backpressure in an efficient, non-blocking way that prevents out-of-memory errors, which is a typical problem when using unbounded buffering with producers that are faster than consumers.
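The backpressure idea in the quote does not require Akka to demonstrate. A minimal plain-Scala sketch using a bounded queue from the Java standard library (not Alpakka API — names and sizes are illustrative) shows how a bounded buffer forces a fast producer to wait for a slow consumer instead of buffering without limit:

```scala
import java.util.concurrent.ArrayBlockingQueue

// Not Alpakka API -- a plain-Scala sketch of bounded buffering.
object BackpressureSketch {
  def run(): List[Int] = {
    // put() blocks when the queue is full: the fast producer is
    // slowed to the consumer's pace instead of buffering without
    // bound and eventually exhausting memory.
    val queue = new ArrayBlockingQueue[Int](4)

    val producer = new Thread(() => {
      for (i <- 1 to 10) queue.put(i) // blocks while 4 elements wait
      queue.put(-1)                   // end-of-stream marker
    })

    val results = scala.collection.mutable.ListBuffer.empty[Int]
    val consumer = new Thread(() => {
      var next = queue.take()
      while (next != -1) {
        Thread.sleep(5) // simulate a slow consumer
        results += next
        next = queue.take()
      }
    })

    producer.start(); consumer.start()
    producer.join(); consumer.join()
    results.toList
  }

  def main(args: Array[String]): Unit =
    println(run().mkString(",")) // 1,2,3,...,10, in order
}
```

Akka Streams applies the same principle automatically across every stage of a pipeline, propagating demand from consumers back to producers.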

Connectors:

  • AMQP
  • Apache Geode
  • AWS DynamoDB
  • AWS Kinesis
  • AWS Lambda
  • AWS S3
  • AWS SNS
  • AWS SQS
  • Azure Storage Queue
  • Cassandra
  • Elasticsearch
  • File
  • FTP
  • Google Cloud Pub/Sub
  • HBase
  • IronMQ
  • JMS
  • MongoDB
  • MQTT
  • Server-sent Events (SSE)
  • Slick (JDBC)
  • Spring Web

Integration Patterns:

  • Splitter

Data Transformations:

  • Parsing Lines
  • JSON
  • Compressing/decompressing
  • Comma-Separated Values (CSV)
  • RecordIO Framing
  • Extensible Markup Language (XML)
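The CSV transformation above, for example, splits an incoming byte stream into records and fields. The field-splitting step can be sketched in plain Scala (a simplified illustration, not the Alpakka CSV API; it ignores escaped quotes and multi-line fields):

```scala
object CsvSketch {
  // Split one CSV line into fields, honoring double quotes.
  // Simplified illustration of the kind of parsing the Alpakka CSV
  // module performs on streaming input; escaped quotes ("") and
  // fields spanning multiple lines are not handled here.
  def parseLine(line: String): Vector[String] = {
    val fields  = Vector.newBuilder[String]
    val current = new StringBuilder
    var inQuotes = false
    for (c <- line) c match {
      case '"'              => inQuotes = !inQuotes
      case ',' if !inQuotes => fields += current.result(); current.clear()
      case other            => current += other
    }
    fields += current.result()
    fields.result()
  }

  def main(args: Array[String]): Unit =
    println(parseLine("""1,"hello, world",3""")) // Vector(1, hello, world, 3)
}
```

The Alpakka version does this incrementally over chunks of bytes, emitting each record downstream as soon as it is complete, so the whole file never has to be in memory.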

217 questions
0 votes, 1 answer

Reactive akka stream: How to delay the graph shutdown until the source has dried out?

I am trying to feed data (say 10 strings) to an akka-stream graph from a Kafka broker. Here's the graph in a unit test: val consumer = AlpakkaConsumer(KafkaBootstrapServer(kafkaURL).value, "porteur-id") val drainingControl = Consumer …
0 votes, 1 answer

Can't get Akka Streams / Alpakka S3 to work in a simple case

I'm doing what I think is a very simple thing to check that Alpakka is working: val awsCreds = AwsBasicCredentials.create("xxx", "xxx") val credentialsProvider = StaticCredentialsProvider.create(awsCreds) implicit val staticCreds =…

user7654493
0 votes, 1 answer

Alpakka Scala S3 connector hangs when trying to put data

I'm trying to put a simple string into an AWS S3 bucket. I couldn't do this with Alpakka (Scala), but I can process the same request using the AWS Java SDK. Using Alpakka, my thread just hangs, not processing anything; Future.onComplete not…

yevhensh
0 votes, 1 answer

Akka Streams buffer on SubFlows based on parent Flow

I am using akka-streams and I hit an exception because of maxing out the Http Pool on akka-http. There is a Source of list-elements, which get split and thus transformed to SubFlows. The SubFlows issue http requests. Although I put a buffer on the…

gkatzioura
0 votes, 1 answer

How to get a Map[String, String] returned by a Kafka Consumer (Alpakka)?

I should get a Map[String, String] back from a Kafka Consumer, but I don't really know how. I managed to configure the consumer, it works fine, but I don't understand how I could get the Map. implicit val system: ActorSystem = ActorSystem() …
0 votes, 1 answer

How can I configure Alpakka Slick to enable streaming from my database in Java?

I am trying to stream results from a query using Slick within a Java application, using Akka Streams and Postgres: Source mySource = Slick.source( slickSession, "SELECT * from message where started_instant is null", …

Noremac
0 votes, 0 answers

Getting Kafka topic names using Scala

I am working on a Scala application in which I am using Kafka. I want to retrieve topic names from my code. My code is as follows: def getTopic = { metadataClient.listTopics().map(x => logger.info(s"topic - $x")) } And I am calling this…

user8856651
0 votes, 0 answers

Alpakka: stream objects from an S3 bucket

Is there a way to read all objects from a given S3 bucket as an S3 Source with Alpakka? I need something like this (pseudocode): S3.readObjects(bucket) .map(object => parseJson(object.getContent)) ... There are two sources provided by alpakka s3…

Normal
0 votes, 0 answers

akka-streams dependency issue with alpakka-mongodb

I am trying to use alpakka-mongodb based on the docs. My dependencies look like this: object Versions { val alpakkaMongo = "2.0.0" val akkaVersion = "2.5.31" } lazy val deps = Seq( "com.lightbend.akka" %%…

irrelevantUser
0 votes, 2 answers

Alpakka Kafka stream never getting terminated

We are using Alpakka Kafka streams for consuming events from Kafka. Here is how the stream is defined: ConsumerSettings consumerSettings = ConsumerSettings .create(actorSystem, new…

Prasanth
0 votes, 1 answer

Alpakka AMQP: How to detect a declaration exception?

I have an AMQP Source and AMQP Sink with declarations: List declarations = new ArrayList() {{ add(QueueDeclaration.create(sourceExchangeName)); add(BindingDeclaration.create(sourceExchangeName,…

Jerald Baker
0 votes, 1 answer

Akka Streams Kafka consumer processing in parallel

I'm working on a Kafka consumer application using the Akka Kafka connector. I would like the consumer to process messages in parallel. Which consumer group should I choose? How can I configure parallelism on the consumer side?

vkt
0 votes, 1 answer

Not able to produce messages to a Kafka topic using Transactional.Sink in Alpakka, even though the idempotent producer is enabled

I was trying to use the Producer API as shown in the Alpakka documentation. I'm able to consume records using a Transactional source, and the producer is created, but I'm not able to put a message on the topic using Transactional.Sink in…
0 votes, 2 answers

Alpakka and S3 truncating downloaded files

I have a simple piece of code, based on alpakka examples, which should download some file from S3 for further processing: S3.download(bucket, file) .runWith(Sink.head) .flatMap { case Some((data, _)) => …

bolo
0 votes, 1 answer

Get a reference to the original element after an Akka Streams Sink?

I am trying to use the AMQP Alpakka connector as a source and sink: AMQP_SOURCE -> processAndGetResponse -> AMQP_SINK. I want to acknowledge the message…

Jerald Baker