
[screenshot of the error message]

My Spark application is failing with the above error.

My Spark program writes its logs to that directory. Both stderr and stdout are being written on all the workers.

My program used to work fine earlier. Yesterday I changed the folder pointed to by SPARK_WORKER_DIR, but today I put the old setting back and restarted Spark.

Can anyone give me a clue as to why I am getting this error?

AKC

1 Answer


In my case the problem was caused by enabling `SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"` in spark-env.sh. That option is supposed to remove old application/driver data directories, but it seems to be buggy and also removes the data of running apps.

Just comment out that line and see if it helps.
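For reference, a sketch of what the change looks like in spark-env.sh (assuming the cleanup option was set as shown above; restart the worker afterwards):

```shell
# conf/spark-env.sh

# Comment out the worker cleanup option so the worker no longer
# deletes application/driver work directories on its own:
# export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"
```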

Dumitru
  • Did you raise this bug to Spark? Or did you confirm if this is a bug? It seems like we are having the same problem in Spark 2.1 – Ross Brigoli Mar 08 '18 at 08:56
  • @RossBrigoli We've run into this exception when unix user running the spark application the SPARK_WORKER_DIR doesn't have write permissions to the directory (or to specific files in that directory if it's trying to do cleanup). – Derek Kaknes Jun 08 '18 at 19:50
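Following up on the permissions point in the comment above, a quick way to check whether the user running the worker can write to the work directory (the fallback path `/tmp/spark-work` below is just an example; substitute your actual SPARK_WORKER_DIR):

```shell
#!/bin/sh
# Check write access to the Spark worker work directory as the current user.
WORK_DIR="${SPARK_WORKER_DIR:-/tmp/spark-work}"
if [ -w "$WORK_DIR" ]; then
  echo "writable"
else
  echo "not writable"
fi
```

If this prints "not writable" for the user that launches the worker, the cleanup thread (or the app itself) will fail when it tries to create or delete files there.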