I am trying to change the logging level for the stderr console output in Spark 2.2 under Mesos, using Scala, sbt, and spark-submit in cluster mode.
Background
Recently our Spark/Mesos cluster was reinstalled with a new Spark version, 2.2.0 instead of the previous 2.0.1. All my code works except the logging: the src/main/resources/log4j.properties file that sets the log level no longer seems to be picked up at startup. It worked with Spark 2.0.1.
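For context, the file is an ordinary log4j 1.x properties file bundled into my assembly jar. The exact appender names and levels below are illustrative rather than a verbatim copy of my file, but it is along these lines:

# src/main/resources/log4j.properties (contents shown here are an example)
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n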
I added this option to the spark-submit script to check what log4j is doing:
--conf "spark.driver.extraJavaOptions=-Dlog4j.debug"
This is what I get with Spark 2.2:
log4j: Trying to find [log4j.xml] using context classloader sun.misc.Launcher$AppClassLoader@50134894.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader@50134894 class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader sun.misc.Launcher$AppClassLoader@50134894.
log4j: Using URL [file:/usr/local/spark/conf/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL file:/usr/local/spark/conf/log4j.properties
This is what I got with the old version, Spark 2.0.1:
log4j: Trying to find [log4j.xml] using context classloader org.apache.spark.util.MutableURLClassLoader@10b48321.
log4j: Trying to find [log4j.xml] using sun.misc.Launcher$AppClassLoader@61e717c2 class loader.
log4j: Trying to find [log4j.xml] using ClassLoader.getSystemResource().
log4j: Trying to find [log4j.properties] using context classloader org.apache.spark.util.MutableURLClassLoader@10b48321.
log4j: Using URL [jar:file:/var/lib/mesos/...path.../myProject-assembly-0.1.0-SNAPSHOT.jar!/log4j.properties] for automatic log4j configuration.
log4j: Reading configuration from URL jar:file:/var/lib/mesos/...path.../myProject-assembly-0.1.0-SNAPSHOT.jar!/log4j.properties
Questions
- What Spark cluster settings or version-related changes could make my resources folder invisible? Why is the bundled file no longer on the classpath in Spark 2.2, so that the class loader falls back to the default configuration file in /usr/local/spark/conf?
- How else could I change the log level? (See the sketch below for one fallback I am considering.)
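One fallback I am aware of is setting the level programmatically once the SparkSession exists. This is only a sketch of what I mean, not something I have deployed, and it cannot affect the startup messages printed before my main method runs:

import org.apache.spark.sql.SparkSession

object myScript {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("myScript").getOrCreate()

    // Overrides whatever log4j configuration was loaded at startup;
    // valid levels include ALL, DEBUG, INFO, WARN, ERROR, OFF.
    spark.sparkContext.setLogLevel("WARN")

    // ... rest of the job ...

    spark.stop()
  }
}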
Update: Here is an example spark-submit command I use:
spark-submit \
--class myScript \
--master mesos://masterIP:7077 \
--total-executor-cores 30 \
--driver-memory 30g \
--deploy-mode cluster \
--name myScript \
--conf "spark.driver.extraJavaOptions=-Dlog4j.debug" \
--verbose \
http://192.168.75.41/~xxx/myProject-assembly-0.1.0-SNAPSHOT.jar
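If the bundled file really is ignored in 2.2, one variant I plan to try is shipping log4j.properties alongside the job and pointing log4j at it explicitly. The flags below are standard spark-submit and log4j options, but the local path is a placeholder and I have not yet confirmed this behaves correctly in Mesos cluster mode, so treat it as an untested sketch:

spark-submit \
--class myScript \
--master mesos://masterIP:7077 \
--deploy-mode cluster \
--files /local/path/to/log4j.properties \
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties -Dlog4j.debug" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
http://192.168.75.41/~xxx/myProject-assembly-0.1.0-SNAPSHOT.jar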