Questions tagged [scalaz-stream]

scalaz-stream is a streaming I/O library. The design goals are compositionality, expressiveness, resource safety, and speed. The design is meant to supersede or replace older iteratee or iteratee-style libraries.

The library supports a number of other interesting use cases:

  • Zipping and merging of streams: A streaming computation may read from multiple sources in a streaming fashion, zipping or merging their elements using an arbitrary Tee. In general, clients have a great deal of flexibility in what sort of topologies they can define; sources, sinks, and effectful channels are all first-class concepts in the library (see the sketch after this list).
  • Dynamic resource allocation: A streaming computation may allocate resources dynamically (for instance, reading a list of files to process from a stream built off a network socket), and the library will ensure these resources get released in the event of normal termination or when errors occur.
  • Nondeterministic and concurrent processing: A computation may read from multiple input streams simultaneously, using whichever result comes back first, and a pipeline of transformations can allow for nondeterminism and queueing at each stage.
  • Streaming parsing (UPCOMING): A separate layer handles constructing streaming parsers, for instance, for streaming JSON, XML, or binary parsing. See the roadmap for more information on this and other upcoming work.
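
As a small illustration of the zipping/merging point above, here is a minimal sketch (assuming scalaz-stream 0.7.x-style imports) that pairs two sources with the built-in tee.zip; any other Tee could be substituted:

    import scalaz.concurrent.Task
    import scalaz.stream._

    // Two sources (pure values lifted into Task for the example).
    val names: Process[Task, String]  = Process("fahrenheit", "celsius").toSource
    val values: Process[Task, Double] = Process(98.6, 37.0).toSource

    // tee.zip pairs elements from both sources; any Tee works here.
    val zipped: Process[Task, (String, Double)] = names.tee(values)(tee.zip)

    // Run the stream and collect the results.
    val result = zipped.runLog.run
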
83 questions
1
vote
1 answer

How to stop ScalaZ Process created by time.awakeEvery?

I learned that with scalaz.stream.time.awakeEvery(1.second) I can create a process that emits an event every second. Quite obvious. I can then map that process to accomplish some task every second. So far so good. What if I want to stop this…
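
A hedged sketch of two common ways to bound or stop such a process; the Signal plus wye.interrupt combination is an assumption about the scalaz-stream version in use:

    import scala.concurrent.duration._
    import scalaz.concurrent.Strategy
    import scalaz.stream._

    // Default strategy and scheduler; your version may already provide these implicitly.
    implicit val S = Strategy.DefaultStrategy
    implicit val scheduler = DefaultScheduler

    // Simplest: bound the stream up front, e.g. stop after five ticks.
    val fiveTicks = time.awakeEvery(1.second).take(5)

    // Or stop it from the outside: a Boolean signal plus wye.interrupt.
    val stop = async.signalOf(false)
    val untilStopped = stop.discrete.wye(time.awakeEvery(1.second))(wye.interrupt)
    // ... later, from anywhere: stop.set(true).run
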
amorfis
  • 15,390
  • 15
  • 77
  • 125
1
vote
0 answers

Is there a way to stream data received from an http endpoint directly into kafka using http4s?

http4s uses scalaz-stream, and there is a scalaz-stream implementation for Kafka. Can we stream data received at an http endpoint directly into Kafka, with the http endpoint as the source and Kafka as the sink?…
Bharath
  • 9
  • 1
1
vote
1 answer

Monad transformers with scalaz-streams

In this snippet y.run doesn't typecheck. object Test { type StateStringTask[A] = StateStringT[Task, A] type StateStringT[M[_], A] = StateT[M, String, A] val x: Process[Task, Unit] = ??? val y: Process[StateStringTask, Unit] = ??? x.run…
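
One possible way to make the effect types line up, sketched under the assumption that lifting each Task into the state monad transformer is acceptable, is Process.translate with a natural transformation:

    import scalaz._
    import scalaz.concurrent.Task
    import scalaz.stream._

    type StateStringT[M[_], A] = StateT[M, String, A]
    type StateStringTask[A]    = StateStringT[Task, A]

    // Lift a plain Task into StateT[Task, String, ?], leaving the state untouched.
    val liftTask: Task ~> StateStringTask = new (Task ~> StateStringTask) {
      def apply[A](t: Task[A]): StateStringTask[A] = StateT(s => t.map(a => (s, a)))
    }

    // translate changes the effect type, so the lifted x can be combined with y.
    def liftProcess(x: Process[Task, Unit]): Process[StateStringTask, Unit] =
      x.translate(liftTask)
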
Danny Navarro
  • 2,733
  • 1
  • 18
  • 22
1
vote
0 answers

"Accumulating" scalaz-stream channel

I'm trying to implement a scalaz-stream channel that accumulates statistics about the events it receives and, once complete, emits the final statistics. To give a concrete, simplified example: imagine that you have a Process[Task, String] where each…
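
If emitting only the final value is acceptable, process1.fold does exactly that; a minimal sketch with a made-up Stats type:

    import scalaz.concurrent.Task
    import scalaz.stream._

    // Hypothetical statistics: number of strings and total characters seen.
    case class Stats(count: Int, chars: Int)

    val collectStats: Process1[String, Stats] =
      process1.fold(Stats(0, 0))((acc, s) => Stats(acc.count + 1, acc.chars + s.length))

    // Pipe any Process[Task, String] through it; only the final Stats is emitted.
    def summarize(p: Process[Task, String]): Process[Task, Stats] =
      p |> collectStats
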
Nicolas Rinaudo
  • 6,068
  • 28
  • 41
1
vote
1 answer

Why awakeEvery was removed from scalaz-stream

I found that there is no longer an awakeEvery inside scalaz.stream.Process in modern scalaz-stream. How do I run something periodically then?
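
It was moved rather than removed; in recent versions the timer combinators live in the time module (sketch, assuming the default strategy and scheduler):

    import scala.concurrent.duration._
    import scalaz.concurrent.Strategy
    import scalaz.stream._

    implicit val S = Strategy.DefaultStrategy
    implicit val scheduler = DefaultScheduler

    // Formerly Process.awakeEvery; now under scalaz.stream.time.
    val ticks = time.awakeEvery(1.second)
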
dk14
  • 22,206
  • 4
  • 51
  • 88
1
vote
1 answer

nondeterminism.njoin: maxQueued and prefetching

Why does njoin prefetch the data before processing? It seems like an unnecessary complication, unless it has something to do with how Processes of Processes are merged? I have a stream that runs effects whenever a new element is generated. I'd…
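
For reference, a hedged usage sketch of njoin with an explicit maxQueued (the exact buffering behaviour depends on the library version):

    import scalaz.concurrent.{Strategy, Task}
    import scalaz.stream._

    implicit val S = Strategy.DefaultStrategy

    // Merge the inner processes, running at most 4 of them concurrently
    // and bounding the internal output buffer at 1 element.
    def merged[A](sources: Process[Task, Process[Task, A]]): Process[Task, A] =
      nondeterminism.njoin(4, 1)(sources) // maxOpen = 4, maxQueued = 1
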
Pyetras
  • 1,492
  • 16
  • 21
1
vote
2 answers

Scalaz-stream chunking UP to N

Given a queue like so: val queue: Queue[Int] = async.boundedQueue[Int](1000) I want to pull off this queue and stream it into a downstream Sink, in chunks of UP to 100. queue.dequeue.chunk(100).to(downstreamConsumer) works sort of, but it will…
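
A possible approach, assuming a scalaz-stream version whose Queue exposes dequeueBatch (that availability is an assumption):

    import scalaz.concurrent.{Strategy, Task}
    import scalaz.stream._

    implicit val S = Strategy.DefaultStrategy

    val queue: async.mutable.Queue[Int] = async.boundedQueue[Int](1000)

    // chunk(100) waits until 100 elements arrive; dequeueBatch(100) emits
    // whatever is currently available, up to 100 elements at a time.
    val upTo100: Process[Task, Seq[Int]] = queue.dequeueBatch(100)
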
Dominic Bou-Samra
  • 14,799
  • 26
  • 100
  • 156
1
vote
1 answer

Asynchronous "node" in a scalaz-stream

I have a Process[Task, A], and I need to run a function A => B, whose run time ranges from instantaneous to really long, on each A of the stream to yield a Process[Task, B]. The catch is that I'd like to process each A as soon as possible in an…
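
One way this is often approached, sketched under the assumption that output may arrive in completion order rather than input order, is merge.mergeN over per-element Tasks:

    import scalaz.concurrent.{Strategy, Task}
    import scalaz.stream._

    implicit val S = Strategy.DefaultStrategy

    // Run up to `parallelism` invocations of f concurrently; results are
    // emitted as they complete, so a slow element does not block a fast one.
    def asyncNode[A, B](p: Process[Task, A], parallelism: Int)(f: A => B): Process[Task, B] =
      merge.mergeN(parallelism)(p.map(a => Process.eval(Task.delay(f(a)))))
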
Nicolas Rinaudo
  • 6,068
  • 28
  • 41
1
vote
0 answers

How can I implement wye.mergeLeftBiased in scalaz-stream

I am having trouble implementing the following function with scalaz-stream: /** * Same as `merge` but is biased on the left side. Both `merge` and `mergeLeftBias` are * non-deterministic, but `merge` tries to balance results from both to the…
jedesah
  • 2,983
  • 2
  • 17
  • 29
1
vote
0 answers

Memory efficiency of scalaz-stream

Let's consider the first example from the README on its GitHub page: val converter: Task[Unit] = io.linesR("testdata/fahrenheit.txt") .filter(s => !s.trim.isEmpty && !s.startsWith("//")) .map(line =>…
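
For reference, the README example in question looks roughly like this (reproduced from memory, with the fahrenheitToCelsius helper inlined):

    import scalaz.concurrent.Task
    import scalaz.stream._

    def fahrenheitToCelsius(f: Double): Double = (f - 32.0) * (5.0 / 9.0)

    // Read lines, skip blanks and comments, convert each temperature,
    // and write the results out, using constant memory.
    val converter: Task[Unit] =
      io.linesR("testdata/fahrenheit.txt")
        .filter(s => !s.trim.isEmpty && !s.startsWith("//"))
        .map(line => fahrenheitToCelsius(line.toDouble).toString)
        .intersperse("\n")
        .pipe(text.utf8Encode)
        .to(io.fileChunkW("testdata/celsius.txt"))
        .run
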
lolski
  • 16,231
  • 7
  • 34
  • 49
1
vote
0 answers

Equivalent of collection.groupBy in scalaz-streams

I have a folder which contains multiple files with names such as filetype1_ddMMyyyy_hhmm, filetype2_ddMMyyyy_hhmm. For each day, there could be multiple files with different hours, and I would need to parse only the one with the highest hour. In a…
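
Since the set of file names is finite and small, one hedged option is simply to fold the stream into a Map; the Entry type below is made up for illustration:

    import scalaz.concurrent.Task
    import scalaz.stream._

    // Hypothetical parsed file name: filetypeN_ddMMyyyy_hhmm.
    case class Entry(day: String, hour: String, name: String)

    // Keep, for each day, only the entry with the highest hour.
    def latestPerDay(files: Process[Task, Entry]): Task[Map[String, Entry]] =
      files
        .pipe(process1.fold(Map.empty[String, Entry]) { (acc, e) =>
          acc.get(e.day) match {
            case Some(prev) if prev.hour >= e.hour => acc
            case _                                 => acc + (e.day -> e)
          }
        })
        .runLast
        .map(_.getOrElse(Map.empty))
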
Edmondo
  • 19,559
  • 13
  • 62
  • 115
1
vote
1 answer

What is a good way to implement a "delayed" list stream in Scalaz

I tried doing this, but that did not work: Process("Hello", "Salut", "Bye", "Ciao").interleave(time.sleep(0.5.seconds).repeat)
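
A sketch of one arrangement that does work, assuming the default strategy and scheduler: zip the pure stream with a timer and keep only the left elements:

    import scala.concurrent.duration._
    import scalaz.concurrent.Strategy
    import scalaz.stream._

    implicit val S = Strategy.DefaultStrategy
    implicit val scheduler = DefaultScheduler

    // Each element is paired with a timer tick, so one value is emitted
    // roughly every 500 ms; the tick itself is then discarded.
    val delayed = Process("Hello", "Salut", "Bye", "Ciao")
      .toSource
      .zip(time.awakeEvery(0.5.seconds))
      .map(_._1)
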
jedesah
  • 2,983
  • 2
  • 17
  • 29
1
vote
0 answers

Gather result from scalaz stream Process

Recently I started using scalaz-stream in Scala/Akka. I'm fetching records from a NoSQL database. I want to map the records to message items (via translateItem: Item) and group them into Packages (1 Package = 100 Items). E.g. there are 500 records. val…
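
A hedged sketch of the grouping step, with made-up Item and Package types standing in for the ones in the question:

    import scalaz.concurrent.Task
    import scalaz.stream._

    // Hypothetical message types.
    case class Item(record: String)
    case class Package(items: Vector[Item])

    // Translate each record to an Item and group them 100 at a time;
    // the trailing group may contain fewer than 100 items.
    def packaged(records: Process[Task, String]): Process[Task, Package] =
      records
        .map(Item(_))
        .chunk(100)
        .map(Package(_))
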
Marco Mayer
  • 197
  • 2
  • 10
1
vote
1 answer

scalaz-stream: how to chunk with concatenation?

Is there an idiomatic way to chunk and concatenate? The ways that I've found (examples for bytes): 1. import scodec.bits.ByteVector def byteChunk(n: Int): Process1[ByteVector, ByteVector] = process1.chunk(n).map(_.reduce(_ ++ _)) But…
Vasiliy Levykin
  • 302
  • 4
  • 6
1
vote
0 answers

This process feels like a scan, but I'm not sure it is... is it?

I created a process to help me parse log files. The log files need to be tagged with a string tag. Not every record can provide this string tag, so I need to maintain some state across log events so that each event will have a tag. If an event is…
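
It does look like a scan; a minimal sketch with a made-up LogEvent type, carrying the last seen tag forward:

    import scalaz.concurrent.Task
    import scalaz.stream._

    // Hypothetical log event: some events carry a tag, most do not.
    case class LogEvent(tag: Option[String], line: String)

    // scan threads the last seen tag across events; events arriving before
    // any tag fall back to "untagged".
    def tagged(events: Process[Task, LogEvent]): Process[Task, (String, LogEvent)] =
      events
        .pipe(process1.scan(("untagged", Option.empty[LogEvent])) { (acc, e) =>
          (e.tag.getOrElse(acc._1), Some(e): Option[LogEvent])
        })
        .flatMap {
          case (tag, Some(e)) => Process.emit((tag, e))
          case _              => Process.halt
        }
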
user1763729
  • 167
  • 1
  • 11