
I have a Scalatra application where the ScalatraServlet takes care of decoding the request, performing authentication and so on, and then hands the task off to an actor for processing, along the lines of:

// class mixes in FutureSupport; an implicit akka.util.Timeout must be in scope for `?` (ask)
get("/status/:id") {
  val id = params("id")
  // (other validation tasks)
  myActor ? StatusRequest(id) // returns a Future that Scalatra renders once it completes
}

This computes the response inside myActor, and once the answer is available it is sent to the client in one piece.
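
For reference, the actor side of this first variant might look roughly like the following sketch; StatusActor, the Status case class and lookupStatus are placeholder names I'm making up for illustration, not part of my actual code:

import akka.actor.Actor

case class StatusRequest(id: String)
case class Status(id: String, state: String)

class StatusActor extends Actor {
  def receive = {
    case StatusRequest(id) =>
      // build the full answer in memory, then reply to the asker once
      val result: Status = lookupStatus(id)
      sender() ! result
  }

  // placeholder for the actual lookup logic
  private def lookupStatus(id: String): Status = Status(id, "done")
}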

Now in some cases I want to send a streaming response, for example when the whole response doesn't fit into memory. One way I could do this is the following:

get("/file/:id") {
  val id = params("id")
  // (other validation tasks)
  myActor ? FileRequest(id, response.outputStream)
  // myActor writes *directly to the stream* and sends back
  // to the sender only a Unit response when done
}
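
To make the question concrete, the actor receiving that message would do something like the following sketch (FileRequest and readChunks are placeholders of my own):

import java.io.OutputStream
import akka.actor.Actor

case class FileRequest(id: String, out: OutputStream)

class FileActor extends Actor {
  def receive = {
    case FileRequest(id, out) =>
      // write each chunk directly to the servlet's output stream
      readChunks(id).foreach(chunk => out.write(chunk))
      out.flush()
      sender() ! (()) // signal completion with a Unit reply
  }

  // placeholder for the actual chunked data source
  private def readChunks(id: String): Iterator[Array[Byte]] = Iterator.empty
}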

Now I was wondering whether this is a sane thing to do. My understanding is that in Akka many things should not be passed around between actors, in particular mutable state, and an OutputStream is exactly that. The above approach has worked fine for me, but are there any issues? Another thing I considered was to use Akka Streams, as in:

get("/file/:id") {
  val id = params("id")
  // (other validation tasks)
  val stream = response.outputStream // don't close over `response`
  val sink = StreamConverters.fromOutputStream(stream)
  myActor ? FileRequest(id, sink)
  // myActor writes to the sink and sends back
  // to the sender only a Unit response when done
}
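
The actor side for this variant would then materialize the Sink it received, e.g. along these lines (StreamingFileActor and fileSource are again placeholder names; this assumes a pre-2.6 Akka Streams setup with ActorMaterializer):

import akka.actor.Actor
import akka.pattern.pipe
import akka.stream.{ActorMaterializer, IOResult}
import akka.stream.scaladsl.{Sink, Source}
import akka.util.ByteString
import scala.concurrent.Future

case class FileRequest(id: String, sink: Sink[ByteString, Future[IOResult]])

class StreamingFileActor extends Actor {
  implicit val mat = ActorMaterializer() // materializer tied to this actor's context
  import context.dispatcher              // ExecutionContext for pipeTo

  def receive = {
    case FileRequest(id, sink) =>
      // run the source for this id into the sink that came with the message
      val done: Future[IOResult] = fileSource(id).runWith(sink)
      done.pipeTo(sender()) // reply to the asker when the stream has completed
  }

  // stand-in for the real data source (e.g. FileIO.fromPath)
  private def fileSource(id: String): Source[ByteString, _] =
    Source.single(ByteString("placeholder"))
}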

Is this a better approach? Can I send a Sink to an actor, or is that even worse than sending an OutputStream? Are there better approaches to the problem?

tgpfeiffer