I used to run Spark Job Server from server_start.sh. That setup writes log files for the default context to the location specified in the log4j configuration.
However, after I created a context with the following command:
curl -d "" 'myhost.com:8090/contexts/my-context?num-cpu-cores=4&mem-per-node=512m'
I no longer see any log files for the context I created manually. May I know where to find its log file, or how to specify, when creating the context, where its log files should be written?
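For reference, what I am hoping for is to point the new context's logs at a fixed path through the job server's log4j properties, roughly like the sketch below (the property file name and the path are only my guesses, not confirmed spark-jobserver settings):

    # guessed file: log4j-server.properties (not sure the manually created context reads this)
    log4j.rootLogger=INFO, LOGFILE
    log4j.appender.LOGFILE=org.apache.log4j.FileAppender
    log4j.appender.LOGFILE.File=/var/log/job-server/my-context.log
    log4j.appender.LOGFILE.layout=org.apache.log4j.PatternLayout
    log4j.appender.LOGFILE.layout.ConversionPattern=%d [%t] %-5p %c - %m%n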
After I set &context-per-jvm=true, I did find a log file for each job. However, they are deleted very quickly. May I know if there is a way to keep each job's log file for a specific period of time?
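Ideally the per-job logs would be rotated and retained rather than deleted, along the lines of the snippet below. This is only a sketch of what I would like: I don't know whether the per-job JVMs read this log4j configuration at all, and as far as I know log4j 1.x's RollingFileAppender can only cap retention by file count and size, not by age:

    # guessed appender settings; keeps up to 10 rotated files of 20MB each
    log4j.appender.LOGFILE=org.apache.log4j.RollingFileAppender
    log4j.appender.LOGFILE.MaxFileSize=20MB
    log4j.appender.LOGFILE.MaxBackupIndex=10

If there is an equivalent setting on the spark-jobserver side (in the context creation parameters or its config file), that would work for me as well.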