I have the code below, where I use a mutable ListBuffer to store files received from a Kafka consumer, and then insert them into Cassandra once the list size reaches 15. Is there any way to do the same thing using an immutable list?

  val filesList = ListBuffer[SystemTextFile]()
  storeservSparkService.configFilesTopicInBatch.subscribe.atLeastOnce(Flow[SystemTextFile].mapAsync(4) { file: SystemTextFile =>
    filesList += file                 // accumulate files in the mutable buffer
    if (filesList.size == 15) {       // flush to Cassandra once 15 files have arrived
      storeServSystemRepository.config.insertFileInBatch(filesList.toList)
      filesList.clear()
    }
    Future(Done)
  })
Piyush_Rana
  • What does `storeServSystemRepository.config.insertFileInBatch` do? Is that a synchronous operation or asynchronous? What is the method signature? – Tim Moore Apr 20 '17 at 08:47
  • This inserts the data in batches of 15; we are using the Lagom Cassandra session and a batch statement to insert the data. Yes, it returns Future[Done]. – Piyush_Rana Apr 21 '17 at 05:10
  • If `insertFileInBatch` returns `Future[Done]` then you should be returning that future from the block passed to `mapAsync`, rather than creating a new, independent future. – Tim Moore Apr 24 '17 at 00:17
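
A minimal sketch of what that last comment suggests, applied to the question's code — hypothetical, assuming `insertFileInBatch` returns `Future[Done]` as stated above:

    Flow[SystemTextFile].mapAsync(4) { file: SystemTextFile =>
      filesList += file
      if (filesList.size == 15) {
        val batch = filesList.toList
        filesList.clear()
        // return the insert's own future instead of an unrelated Future(Done),
        // so completion is tied to the actual Cassandra write
        storeServSystemRepository.config.insertFileInBatch(batch)
      } else Future.successful(Done)
    }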

2 Answers

Something along these lines?

Flow[SystemTextFile]
  .grouped(15)        // let the stream accumulate immutable batches of 15
  .mapAsync(4) { files =>
    // return the insert's Future[Done] directly from the flow
    storeServSystemRepository.config.insertFileInBatch(files.toList)
  }
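
If partial batches should also be flushed on a timer instead of waiting for a full group of 15 (something the comments below experiment with), `groupedWithin` is a possible variant — a sketch under the same assumptions about `insertFileInBatch`:

    import scala.concurrent.duration._

    Flow[SystemTextFile]
      .groupedWithin(15, 20.seconds) // emit up to 15 files, or whatever arrived within 20 seconds
      .mapAsync(4) { files =>
        storeServSystemRepository.config.insertFileInBatch(files.toList)
      }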
Jasper-M
  • When I add the grouped keyword, it does not consume the messages. – Piyush_Rana Apr 18 '17 at 07:54
  • Also, if I try groupedWithin(15, Duration(20, "seconds")), it only consumes 1 file every 20 seconds, while without the grouping it consumes all the files (70) in milliseconds. – Piyush_Rana Apr 18 '17 at 08:17
  • According to @Piyush_Rana's comment above, `insertFileInBatch` returns a `Future[Done]`, so that should be returned directly rather than wrapping another `Future` around it. Would you like to update your answer or should I write a new one? – Tim Moore Apr 24 '17 at 00:18
  • @Piyush_Rana does this updated version work for you? – Tim Moore Apr 26 '17 at 03:02

Have you tried using Vector?

      var filesList = Vector[SystemTextFile]()   // immutable Vector behind a mutable reference
      storeservSparkService.configFilesTopicInBatch.subscribe.
          atLeastOnce(Flow[SystemTextFile].mapAsync(4) { file: SystemTextFile =>
            filesList = filesList :+ file
            if (filesList.length == 15) {
              storeServSystemRepository.config.insertFileInBatch(filesList.toList)
              filesList = Vector.empty           // reset the accumulator after flushing
            }
            Future(Done)
          })
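
Note that `filesList` has to be a `var` here: the `Vector` itself is immutable, but the reference is reassigned on every element, and the accumulator needs to be reset after each flush or the length check will never match again (both fixed in the snippet above). If the goal is to avoid shared mutable state altogether, the `grouped` approach in the other answer does the accumulation inside the stream itself.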