
It must be damn simple, but for some reason I cannot make it work.

  • If I do io.linesR(...), I get a stream of the file's lines; that works.
  • If I do Process.emitAll(...), I get a stream of pre-defined values; that also works (sketch just below).
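
For reference, both working variants look like this (the file name is just an example):

import scalaz.stream.{Process, io}

val fromFile = io.linesR("input.txt")           // Process[Task, String]: the file's lines
val fromVals = Process.emitAll(Seq("a", "b"))   // Process0[String]: pre-defined values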

But what I actually need is to produce values for scalaz-stream asynchronously (specifically, from an Akka actor).

I have tried:

  • async.unboundedQueue[String]
  • async.signal[String]

Then I called queue.enqueueOne(...).run or signal.set(...).run on one side and listened to queue.dequeue or signal.discrete on the other, with just .map and .to, in a pipeline that was already proven to work with the other kinds of streams (either Process.emitAll or lines from a file).
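
Concretely, the queue variant of my attempt looked roughly like this (simplified):

import scalaz.stream.{async, io}

val queue = async.unboundedQueue[String]

// consumer side: the same .map / .to pipeline, started asynchronously
queue.dequeue.map(_.toUpperCase).to(io.stdOutLines).run.runAsync(_ => ())

// producer side, called from another context (the actor):
queue.enqueueOne("hello").run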

What is the secret? What is the preferred way to create a channel that can be streamed later, and how do I feed it with values from another context?

Thanks!

Dmitry Kurinskiy
  • Did you try the approach described here: https://github.com/scalaz/scalaz-stream/blob/master/src/test/scala/scalaz/stream/examples/CreatingStreams.scala#L85? Paul also provides this gist on the subject: https://gist.github.com/pchiusano/8087426 – Eric Nov 20 '14 at 05:27

1 Answer


If the values are produced asynchronously but in a way that can be driven from the stream, I've found it easiest to use the "primitive" await method and construct the process "by hand". You need an indirectly recursive function:

// assumes akka.pattern.ask with an implicit Timeout and ExecutionContext in
// scope, and that myActor replies to NextValuePlease() with an Int (hence the mapTo)
def processStep(v: Int): Process[Future, Int] =
  Process.emit(v) ++ Process.await((myActor ? NextValuePlease()).mapTo[Int])(processStep)
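
(One caveat: actually running a Process[Future, Int] needs Monad and Catchable instances for Future in scope; the library's own combinators are written against Task, so in practice it may be simpler to wrap the ask in a Task.)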

But if you need a truly async process, driven from elsewhere, I've never done that.
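
One general trick for the truly async case (and, if I remember right, roughly what the gist linked in the comments demonstrates) is to adapt a callback-style API with Task.async and drive it with Process.repeatEval; fetchNext below is a hypothetical stand-in for such an API:

import scalaz.\/
import scalaz.concurrent.Task
import scalaz.stream.Process

// hypothetical callback-registering API: delivers one value (or error) per call
def fetchNext(callback: (Throwable \/ String) => Unit): Unit = ???

// each evaluation of the Task registers the callback and waits for one value
val source: Process[Task, String] = Process.repeatEval(Task.async(fetchNext))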

lmm
  • I really need a truly async process. I have a master/workers actor topology with unpredictable delays for requests (up to hours), and I need to write the events to a file line by line. I don't know how to employ your snippet there :( but I'll take a look in this direction, thanks – Dmitry Kurinskiy Nov 20 '14 at 10:57
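
For completeness, a sketch of how the queue approach from the comments could be wired for that line-by-line file case (untested; the actor and message shape are illustrative):

import akka.actor.Actor
import scalaz.stream.{async, io, text}

val queue = async.unboundedQueue[String]

// writer side, started once: append newlines, encode, write chunks to the file
queue.dequeue
  .map(_ + "\n")
  .pipe(text.utf8Encode)
  .to(io.fileChunkW("events.log"))
  .run.runAsync(_ => ())

// actor side: every incoming event line is pushed into the queue
class EventWriter extends Actor {
  def receive = { case line: String => queue.enqueueOne(line).run }
}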