
I normally run Spark Job Server from server_start.sh; started this way, it writes log files for the default context at the location configured in log4j.

However, after I created a context with the following command:

curl -d "" 'myhost.com:8090/contexts/my-context?num-cpu-cores=4&mem-per-node=512m'
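
The context itself is created successfully and shows up when I list the contexts:

curl 'myhost.com:8090/contexts'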

I no longer see log files for the context I created manually. Where can I find them, or how can I specify, when creating the context, where its log files should be written?
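
For reference, this is roughly how I checked where the server is configured to write logs. The settings.sh path and the LOG_DIR value below come from the stock deployment template and are only examples, not necessarily the right paths for every install:

# settings.sh lives wherever job server was deployed (example path)
grep LOG_DIR /opt/job-server/settings.sh
# the deployment template defaults to LOG_DIR=/var/log/job-server
ls -l /var/log/job-server/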

After setting context-per-jvm=true I do get a log file for each job, but the files are deleted very soon afterwards. Is there a way to keep the log file for each job for a specific period of time?
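
In the meantime I am considering a workaround along these lines: a small script, run from cron, that copies the per-job logs out of the job server's log directory before they are removed and prunes old copies. LOG_DIR, the archive path, and the 14-day retention are all placeholders for illustration:

#!/bin/sh
# Workaround sketch: archive the per-context/per-job logs before they are
# removed, then prune archived copies older than 14 days.
# All paths below are placeholders; adjust to the actual deployment.
LOG_DIR=/var/log/job-server
ARCHIVE_DIR=/var/log/job-server-archive

mkdir -p "$ARCHIVE_DIR"
# -a preserves timestamps and directory structure; --ignore-existing avoids
# re-copying files that were already archived on a previous run
rsync -a --ignore-existing "$LOG_DIR/" "$ARCHIVE_DIR/"
# delete archived log files older than 14 days
find "$ARCHIVE_DIR" -type f -mtime +14 -delete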

Tom
  • Not sure why you are saying "it will be deleted very soon.", do you mean they are automatically deleted by SJS? – noorul Apr 23 '17 at 16:54
  • Yes. The log file, and the directory named after the context that holds it, disappear after a few seconds. – Tom Apr 24 '17 at 21:51

0 Answers