I am running my code via spark-shell on an EMR cluster. For example:
[hadoop@<IP> ~]$ spark-shell --jars <JAR_LIST> --num-executors 72 --executor-cores 5 --executor-memory 16g --conf spark.default.parallelism=360
...
scala> val args = Array(...)
scala> org.abc.MainClass(args)
... start ... execution
Now I have code like this (the logger is declared inside the closure so it is created on the executor, since Logger is not serializable):
dataFrame.foreachPartition { dataSetPartition =>
  val localLogger: Logger = Logger.getLogger("PartitionLogger")
  localLogger.info("INFO")
  ...
  localLogger.error("TEST")
  ...
}
The problem is that I am not able to see the log messages written inside the partition closure. How can I view and analyze them?
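For reference, the per-partition logging pattern can be exercised locally with a minimal sketch. This is an assumption-laden stand-in, not the real job: it uses `java.util.logging` instead of log4j and plain collections instead of a DataFrame so it is self-contained, and `processPartitions` is a hypothetical helper name.

```scala
import java.util.logging.Logger

// Mimic dataFrame.foreachPartition over local collections; returns the
// number of records processed so the behaviour is checkable.
def processPartitions(partitions: Seq[Seq[Int]]): Int = {
  var processed = 0
  partitions.foreach { partition =>
    // Instantiate the logger INSIDE the per-partition closure: on a real
    // cluster this closure runs in an executor JVM, and Logger instances
    // are not serializable, so they cannot be captured from the driver.
    val localLogger: Logger = Logger.getLogger("PartitionLogger")
    partition.foreach { record =>
      localLogger.info(s"processing record $record")
      processed += 1
    }
  }
  processed
}

println(processPartitions(Seq(Seq(1, 2), Seq(3, 4))))  // → 4
```

Note that on a cluster these statements are written by the executor JVMs, not the driver, so they appear in each executor's stderr/stdout rather than in the spark-shell console.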
Version:
Spark: 2.2.1
Scala: 2.11