I've set up a spark job-server (see https://github.com/spark-jobserver/spark-jobserver/tree/jobserver-0.6.2-spark-1.6.1) in standalone mode.
I've set up some jobs in Scala, and every job uses the same shared context, but I can't figure out how to persist the jobs' (or the context's) logs.
Currently I'm using:

```scala
import org.slf4j.LoggerFactory

val logger = LoggerFactory.getLogger(getClass)
```
but the output doesn't end up anywhere I can find. I haven't found any docs or examples covering this. Is it even possible?
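My guess is that a standard log4j file appender would be enough once it is wired in correctly. Here is a sketch of what I have in mind (the file path and pattern are placeholders, and I'm assuming slf4j is routed through log4j 1.x, as in stock Spark):

```properties
# Sketch of a log4j.properties — path and layout are placeholders
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/var/log/spark-jobserver/jobs.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c: %m%n
```

But I don't know where (or whether) job-server picks up such a file in standalone mode.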
Thank you