I've tried renaming log4j.properties.template to log4j.properties inside hadoop-home/conf, but Spark still does not pick it up. I've also tried setting
sparkConf.set("log4j.configuration", ".\\config\\log4j.properties");
but that doesn't work either. I also tried adding
-Dlog4j.configuration=.\config\log4j.properties
to the VM arguments of my Eclipse run configuration, but that doesn't work either. Spark still uses its default configuration during startup:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
I also set the SPARK_CONF_DIR environment variable to point to the spark/conf directory, but that doesn't seem to work either.
I am running this in standalone mode on Windows from Eclipse:
SparkConf sparkConf = new SparkConf().setAppName("Test").setMaster("local[1]")
.set("log4j.configuration", ".\\config\\log4j.properties");
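If it matters: as far as I understand, log4j.configuration is an ordinary JVM system property (not a SparkConf setting), so I'd expect setting it programmatically before any Spark/log4j class loads to be equivalent to the -D VM argument. A minimal sketch of that idea (the file: prefix and the path are my guesses from the log4j docs, since log4j apparently expects a URL here and treats a bare value as a classpath resource):

```java
// Minimal sketch: set the log4j system property before Spark (and log4j)
// initialize, as an alternative to passing -Dlog4j.configuration in Eclipse.
public class LogConfigCheck {
    public static void main(String[] args) {
        // Hypothetical path; log4j expects a URL, so a "file:" prefix is used
        // instead of the bare relative path ".\config\log4j.properties".
        System.setProperty("log4j.configuration", "file:config/log4j.properties");

        // The property must be in place before the first logger is created,
        // i.e. before new SparkConf()/SparkContext in the real program.
        System.out.println(System.getProperty("log4j.configuration"));
    }
}
```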