Questions tagged [scala-streams]

37 questions
1
vote
1 answer

Do Kotlin Sequences cache intermediate results?

When operating on Kotlin sequences with functional APIs such as map, flatMap, +, etc., are the computed intermediate results cached, so that there is no recomputation on a second evaluation? If not, replacing Lists with Sequences could in some situations cause…
Shreck Ye
  • 1,591
  • 2
  • 16
  • 32
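For the Scala side of the comparison (this tag's topic), a minimal sketch of what cached intermediate results look like, assuming Scala 2.13: LazyList (the successor to Stream) memoizes each computed element, so a second traversal triggers no recomputation.

    // The "computing" side effect fires at most once per element, even across two traversals.
    val cached: LazyList[Int] = LazyList(1, 2, 3).map { i => println(s"computing $i"); i * 10 }

    println(cached.sum)  // computing 1, computing 2, computing 3, then 60
    println(cached.sum)  // only 60: the mapped values were memoized on the first pass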
1
vote
1 answer

How to emulate a Sink in Akka Streams?

I have a simple "save" function that uses akka-stream-alpakka's multipartUpload; it looks like this: def save(fileName: String): Future[AWSLocation] = { val uuid: String = s"${UUID.randomUUID()}" val s3Sink: Sink[ByteString,…
jack miao
  • 1,398
  • 1
  • 16
  • 32
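The excerpt is cut off, but a common way to emulate such a Sink in tests is to build a stub with the same materialized value type and swap it in; a minimal sketch using only generic Akka Streams combinators, where AWSLocation stands in for the question's own type:

    import akka.actor.ActorSystem
    import akka.stream.scaladsl.{Sink, Source}
    import akka.util.ByteString
    import scala.concurrent.Future

    object FakeSinkSketch extends App {
      implicit val system: ActorSystem = ActorSystem("sink-emulation")

      final case class AWSLocation(bucket: String, key: String)  // placeholder for the question's type

      // Stand-in for S3.multipartUpload: consumes the bytes and materializes a canned result.
      def fakeS3Sink(key: String): Sink[ByteString, Future[AWSLocation]] =
        Sink.ignore.mapMaterializedValue(_ => Future.successful(AWSLocation("test-bucket", key)))

      val result: Future[AWSLocation] =
        Source.single(ByteString("payload")).runWith(fakeS3Sink("some/key"))

      result.foreach(loc => println(loc))(system.dispatcher)
    }

Because the stub has the same Sink[ByteString, Future[AWSLocation]] shape, the surrounding save function does not need to change.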
1
vote
3 answers

How to find two successive identical values in a Stream?

How can I find two successive, identical values in a Stream and return that duplicated value: def succ: Stream[Int] => Int = str => ... For instance, Stream(1, 2, 4, 3, 5, 5, 2, 2) would result in 5. How would one do that?
appcodix
  • 342
  • 2
  • 15
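One lazy way to express this (a sketch; it assumes, as the signature implies, that a duplicate pair exists) is to zip the stream with its own tail and take the first pair whose elements match:

    def succ: Stream[Int] => Int = str =>
      str.zip(str.tail)
        .collectFirst { case (a, b) if a == b => a }
        .get  // the Stream[Int] => Int signature promises a result, so a missing pair would throw

    succ(Stream(1, 2, 4, 3, 5, 5, 2, 2))  // 5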
1
vote
0 answers

Reading a large file using a cats stream in Scala

I am reading a large text file (40M+ lines), doing some operations on this list, and writing the output to a new file. Ex: call a web service and use the response to do a union or intersection with this list (repeat this process a few hundred times). What…
vkt
  • 1,401
  • 2
  • 20
  • 46
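A sketch of the usual shape for this, assuming a recent fs2 3.x on cats-effect 3 (the "cats stream" in the question); the enrich call and the file names are placeholders for the question's web-service step:

    import cats.effect.{IO, IOApp}
    import fs2.text
    import fs2.io.file.{Files, Path}

    object LargeFilePipeline extends IOApp.Simple {
      // Hypothetical enrichment call; the real one would hit the question's web service.
      def enrich(line: String): IO[String] = IO.pure(line)

      def run: IO[Unit] =
        Files[IO].readAll(Path("input.txt"))          // illustrative file names
          .through(text.utf8.decode)
          .through(text.lines)
          .parEvalMap(maxConcurrent = 8)(enrich)      // bounded concurrency keeps memory flat
          .intersperse("\n")
          .through(text.utf8.encode)
          .through(Files[IO].writeAll(Path("output.txt")))
          .compile
          .drain
    }

Because the file is streamed chunk by chunk, only the in-flight lines are held in memory regardless of the 40M+ total.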
1
vote
2 answers

Creating a Stream out of a couple of elements lazily in Scala

Just for testing purposes, I wanted to lazily compute 2 elements: Stream( { Thread.sleep(2000); 1 }, { Thread.sleep(2000); 2 }, Stream.empty[Int] ).foreach(println) But running this code does not yield the desired result. The values…
devoured elysium
  • 101,373
  • 131
  • 340
  • 557
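The varargs Stream.apply (and LazyList.apply) evaluates its arguments before the collection even exists, which is why both sleeps happen up front; a sketch of the lazy alternative, assuming Scala 2.13's LazyList, whose #:: defers both head and tail (the older Stream's cons still evaluates its head eagerly):

    def slow(i: Int): Int = { Thread.sleep(2000); i }

    // Each slow(...) call runs only when foreach reaches that element, roughly 2s apart.
    val xs: LazyList[Int] = slow(1) #:: slow(2) #:: LazyList.empty[Int]
    xs.foreach(println)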
1
vote
1 answer

Scala stream iterate and memory management

I have this code: val res = Stream // launch the real computation, which alternates E and M steps, updating the computation state .iterate(initCompState)(Base.emIteration) .take(nIteration) .last The idea is to provide an initial state…
vkubicki
  • 1,104
  • 1
  • 11
  • 26
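Because Stream memoizes every cell it produces, the whole chain of intermediate EM states stays reachable; the usual workaround is Iterator.iterate, which keeps only the current state. A self-contained sketch with placeholder state and step standing in for the question's initCompState and Base.emIteration:

    final case class CompState(iteration: Int)                    // placeholder state
    def emIteration(s: CompState): CompState = s.copy(iteration = s.iteration + 1)

    val nIteration = 1000
    val initCompState = CompState(0)

    // Iterator.iterate does not memoize previous states, unlike Stream.iterate.
    val res: CompState = Iterator
      .iterate(initCompState)(emIteration)
      .drop(nIteration - 1)
      .next()   // same value as Stream.iterate(...).take(nIteration).last

    println(res)  // CompState(999)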
1
vote
1 answer

How to convert an Akka Source to a Scala Stream

I already have a Source[T], but I need to pass it to a function that requires a Stream[T]. I could .run the source, materialize everything to a list, and then call .toStream on the result, but that removes the lazy/stream aspect that I want to…
JBY
  • 238
  • 2
  • 10
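One way to keep the pull-based laziness is to materialize the Source into a java.util.stream.Stream with Akka's StreamConverters and wrap its iterator; a minimal sketch assuming Scala 2.13, with Int elements for illustration:

    import akka.actor.ActorSystem
    import akka.stream.scaladsl.{Source, StreamConverters}
    import scala.jdk.CollectionConverters._

    object SourceToScalaStream extends App {
      implicit val system: ActorSystem = ActorSystem("convert")

      val source: Source[Int, _] = Source(1 to 1000000)

      // Materialize into a java.util.stream.Stream and wrap its iterator: elements are
      // pulled from the running source on demand (blocking the caller) instead of being
      // buffered into a List first.
      val asScalaStream: Stream[Int] =
        source.runWith(StreamConverters.asJavaStream[Int]()).iterator().asScala.toStream

      println(asScalaStream.take(5).toList)  // List(1, 2, 3, 4, 5)
      system.terminate()
    }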
0
votes
0 answers

FS2 stream group into chunks using a predicate

I'm looking for a "chunkBy"-style operator that provides the functionality below: test("chunkBy") { val s = Stream(1, 2, 3, 4, 5, 6, 7, 8).covary[IO] val range = 4 val actual: List[(Int, Chunk[Int])] = s.chunkBy(element => element %…
Jeet Banerjee
  • 194
  • 2
  • 2
  • 12
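fs2 already ships an operator with this shape: groupAdjacentBy emits (key, Chunk) pairs for runs of consecutive elements that share a key. A sketch assuming fs2 3.x / cats-effect 3, with an illustrative grouping key since the question's predicate is cut off:

    import cats.effect.IO
    import cats.effect.unsafe.implicits.global  // cats-effect 3 runtime for unsafeRunSync
    import fs2.{Chunk, Stream}

    object GroupAdjacentSketch extends App {
      val s = Stream(1, 2, 3, 4, 5, 6, 7, 8).covary[IO]
      val range = 4

      // Runs of consecutive elements mapping to the same key are emitted as one Chunk, tagged with the key.
      val actual: List[(Int, Chunk[Int])] =
        s.groupAdjacentBy(element => element / range)  // illustrative key; the question's predicate is truncated
          .compile
          .toList
          .unsafeRunSync()

      println(actual)  // three groups: key 0 -> 1,2,3; key 1 -> 4,5,6,7; key 2 -> 8
    }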
0
votes
0 answers

FS2 stream inside a stream

I have an FS2 stream that reads from a queue and, for each received message, reads a file as a stream. How do I model such a stream inside a stream in FS2? I can achieve this using a flatMap operation such as Stream(1,2,3).flatMap(i =>…
Jeet Banerjee
  • 194
  • 2
  • 2
  • 12
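flatMap is indeed the standard way to sequence the inner streams; a sketch assuming fs2 3.x, with the queue replaced by a fixed stream of file names for illustration, plus the parJoin variant for reading files concurrently:

    import cats.effect.IO
    import fs2.{Stream, text}
    import fs2.io.file.{Files, Path}

    object InnerStreamsSketch {
      // Stand-in for the queue; in the real code this would be built from the queue's dequeue stream.
      def messages: Stream[IO, String] = Stream("a.txt", "b.txt").covary[IO]

      def fileLines(name: String): Stream[IO, String] =
        Files[IO].readAll(Path(name))
          .through(text.utf8.decode)
          .through(text.lines)

      // flatMap concatenates the inner file streams one after another, per message.
      val sequential: Stream[IO, String] = messages.flatMap(fileLines)

      // To read several files concurrently, keep the nesting explicit and flatten with parJoin.
      val concurrent: Stream[IO, String] = messages.map(fileLines).parJoin(maxOpen = 4)
    }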
0
votes
1 answer

Why does Source.fromInputStream work with GZIPInputStream but not with ZipInputStream?

I am trying to stream a zip file. The following chunk of code prints line by line as expected: val inputStream = new GZIPInputStream(new FileInputStream("/some/path")) val source = Source.fromInputStream(inputStream) for(line <- source.getLines) { …
Alexandre Annic
  • 9,942
  • 5
  • 36
  • 50
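The difference is that a .gz file is a single compressed byte stream, while a .zip is an archive of entries: ZipInputStream returns no bytes until it has been positioned on an entry with getNextEntry. A sketch of the entry-by-entry loop (the path is illustrative):

    import java.io.FileInputStream
    import java.util.zip.ZipInputStream
    import scala.io.Source

    object ZipEntriesSketch extends App {
      val zis = new ZipInputStream(new FileInputStream("/some/path.zip"))  // illustrative path
      try {
        // getNextEntry positions the stream at the start of the next entry's bytes;
        // without it, read() returns -1 immediately, which is why getLines yields nothing.
        Iterator.continually(zis.getNextEntry).takeWhile(_ != null).foreach { entry =>
          println(s"--- ${entry.getName} ---")
          Source.fromInputStream(zis).getLines().foreach(println)  // reads until the current entry ends
        }
      } finally zis.close()
    }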
0
votes
1 answer

Is it a library bug in a functional language when a function with the same name but for different collections produces different side effects?

I'm using Scala 2.13.1 and evaluate my examples in a worksheet. First, I define two functions that return the range a to (z-1), as a Stream and as a LazyList respectively. def streamRange(a: Int, z: Int): Stream[Int] = { print(a + " ") if (a…
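The excerpt cuts off the function bodies, but reconstructing them along the usual course-example lines (an assumption) shows the kind of divergence being asked about in a worksheet: Stream's cons cell is strict in its head, so operators like filter must evaluate elements until they find the first match, while LazyList defers everything until the result is consumed.

    // Reconstructed along the lines of the question's definitions (an assumption).
    def streamRange(a: Int, z: Int): Stream[Int] = {
      print(a + " ")
      if (a >= z) Stream.empty else a #:: streamRange(a + 1, z)
    }

    def lazyRange(a: Int, z: Int): LazyList[Int] = {
      print(a + " ")
      if (a >= z) LazyList.empty else a #:: lazyRange(a + 1, z)
    }

    streamRange(1, 10).filter(_ > 5)  // prints "1 2 3 4 5 6 ": evaluated until the first match
    println()
    lazyRange(1, 10).filter(_ > 5)    // prints "1 ": only the initial call ran; filtering is deferred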
0
votes
2 answers

How to consume a REST API, passing a Flink stream as a parameter, and return the stream transformed

I'm new to Apache Flink. I have a Flink Scala project that consumes data from a Kafka cluster, and I need to pass the resulting stream as a parameter to a REST API that returns this stream transformed. Here is my code: class Testing { def main(args:…
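A minimal sketch, assuming Flink's Scala DataStream API: the stream is a lazy transformation graph, so "passing it to the REST API" really means mapping each element through a call to that API inside an operator (callRestApi and the element values are placeholders):

    import org.apache.flink.streaming.api.scala._

    object RestEnrichmentJob {
      // Hypothetical REST call; the real job would use its own HTTP client here.
      def callRestApi(payload: String): String = payload.toUpperCase

      def main(args: Array[String]): Unit = {
        val env = StreamExecutionEnvironment.getExecutionEnvironment

        // Stand-in for the Kafka source from the question.
        val input: DataStream[String] = env.fromElements("a", "b", "c")

        // "Passing the stream to the API" becomes mapping each element through the call.
        val transformed: DataStream[String] = input.map(callRestApi _)

        transformed.print()
        env.execute("rest-enrichment")
      }
    }

For non-blocking calls, Flink's async I/O operator (AsyncDataStream) is the usual next step.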
0
votes
2 answers

Polymorphic type error during recursion - how to solve?

I'm new to Scala and I'm following the book "FP in Scala". Right now I am writing an unfold function for the Stream datatype, which I am recreating. The problem is that the type checker tells me the polymorphic type seems to be wrong for the…
Daniel.T
  • 23
  • 3
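A frequent cause is declaring the type parameters in the wrong place: unfold needs its own A (element) and S (state) parameters on the method. A sketch against the standard-library Stream (the book's recreated Stream would use its cons/empty smart constructors instead):

    def unfold[A, S](z: S)(f: S => Option[(A, S)]): Stream[A] =
      f(z) match {
        case Some((a, s)) => a #:: unfold(s)(f)  // emit a, continue from the new state s
        case None         => Stream.empty        // terminate the stream
      }

    // e.g. the Fibonacci numbers as an infinite stream
    val fibs: Stream[Int] = unfold((0, 1)) { case (a, b) => Some((a, (b, a + b))) }
    println(fibs.take(8).toList)  // List(0, 1, 1, 2, 3, 5, 8, 13)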
0
votes
2 answers

Spark Kafka Streaming multi partition CommitAsync issue

I am reading messages from a Kafka topic which has multiple partitions. Reading the messages is no issue, but while committing the offset range to Kafka, I am getting an error. I have tried my level best and am not able to resolve this issue. Code: object…
Gnana
  • 2,130
  • 5
  • 26
  • 57
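The documented commit pattern for spark-streaming-kafka-0-10 is to take the offset ranges from the RDD exactly as it comes out of createDirectStream (before any shuffle or repartition, which loses the HasOffsetRanges cast) and to call commitAsync on the original stream reference; a sketch with illustrative broker, group and topic names:

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.{CanCommitOffsets, HasOffsetRanges, KafkaUtils}
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

    object CommitAsyncSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("commit-async-sketch").setMaster("local[*]")
        val ssc  = new StreamingContext(conf, Seconds(10))

        val kafkaParams = Map[String, Object](
          "bootstrap.servers"  -> "localhost:9092",            // illustrative values
          "key.deserializer"   -> classOf[StringDeserializer],
          "value.deserializer" -> classOf[StringDeserializer],
          "group.id"           -> "example-group",
          "enable.auto.commit" -> (false: java.lang.Boolean)
        )

        val stream = KafkaUtils.createDirectStream[String, String](
          ssc, PreferConsistent, Subscribe[String, String](Seq("example-topic"), kafkaParams))

        stream.foreachRDD { rdd =>
          // Take the ranges from the RDD created by createDirectStream; only that RDD
          // (not a repartitioned or mapped one) can be cast to HasOffsetRanges.
          val offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges

          rdd.map(_.value()).foreach(_ => ())  // business logic goes here

          // Commit against the original DStream reference, not a transformed DStream.
          stream.asInstanceOf[CanCommitOffsets].commitAsync(offsetRanges)
        }

        ssc.start()
        ssc.awaitTermination()
      }
    }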
0
votes
1 answer

Spark Kafka Streaming CommitAsync Error

I am new to Scala and the RDD concept. I am reading messages from Kafka using the Kafka stream API in Spark and trying to commit after the business work, but I am getting an error. Note: using repartition for parallel work. How to read offsets from the stream API and commit…
Gnana
  • 2,130
  • 5
  • 26
  • 57
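This question adds repartitioning into the mix; reusing the setup from the sketch above (the stream and import names come from that sketch, not from this question's code), the key point is to capture the offset ranges before the repartition and still commit on the original stream:

    stream.foreachRDD { rdd =>
      // Capture offsets first: after repartition the RDD no longer carries HasOffsetRanges.
      val offsetRanges = rdd.asInstanceOf[HasOffsetRanges].offsetRanges

      rdd.repartition(8).map(_.value()).foreach(_ => ())  // parallel business work

      stream.asInstanceOf[CanCommitOffsets].commitAsync(offsetRanges)
    }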