
I am running my code with spark-shell on an EMR cluster. A sample session:

[hadoop@<IP> ~]$ spark-shell --jars <JAR_LIST>  --num-executors 72 --executor-cores 5 --executor-memory 16g --conf spark.default.parallelism=360 
...
scala> val args = Array(...)
scala> org.abc.MainClass(args)
... start ... execution

Now I have code like this:

import org.apache.log4j.Logger

dataFrame.foreachPartition { dataSetPartition =>
  // Created inside the closure, so the logger is instantiated on the executor
  val localLogger: Logger = Logger.getLogger("PartitionLogger")
  localLogger.info("INFO")
  ...
  localLogger.error("TEST")
  ...
}

The problem is that I am not able to see the logs written inside the partitions. How can I access and analyze them?

Version:

Spark: 2.2.1
Scala: 2.11

1 Answer


Look at the Spark UI under the Executors tab; you will see a Logs column with stderr and stdout links for each executor. Code inside foreachPartition runs on the executors, not the driver, so its log output shows up there rather than in your spark-shell console.
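
If YARN log aggregation is enabled (it is on EMR by default), you can also pull the same executor logs from the master node's command line once the application has finished. A minimal sketch; the application ID below is a placeholder you would look up first, and the grep assumes the default log4j layout prints the logger name:

[hadoop@<IP> ~]$ yarn application -list -appStates ALL    # find your application ID
[hadoop@<IP> ~]$ yarn logs -applicationId application_1234567890123_0001 > app_logs.txt
[hadoop@<IP> ~]$ grep "PartitionLogger" app_logs.txt

This collects stdout and stderr from every container into one stream, which is often easier to search than clicking through each executor's log links in the UI.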
