
My Spark Streaming application runs in standalone mode, and executors that have already finished are still holding on to their jar files on disk.

After a couple of days the application starts failing because the worker nodes run out of space. How can I delete the directories left behind by these completed executors?

1 Answer


Add the property below to spark-env.sh:

export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=300 -Dspark.worker.cleanup.appDataTtl=60"
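For context, here is the same setting with each property annotated. The behaviour described in the comments follows the Spark standalone-mode documentation (cleanup runs only on standalone workers and only touches directories of stopped applications); the interval and TTL values are the ones from the answer above and are aggressive, so tune them for your own workload:

```shell
# spark-env.sh -- periodic cleanup of application work dirs on a
# standalone-mode worker. All time values are in seconds.
#
#   spark.worker.cleanup.enabled    enable the periodic sweep; it only
#                                   removes dirs of *stopped* applications
#   spark.worker.cleanup.interval   how often the worker sweeps
#                                   (default 1800 = 30 minutes)
#   spark.worker.cleanup.appDataTtl how long an application's work dir
#                                   (jars, logs) is kept after it stops
#                                   (default 604800 = 7 days)
#
# NOTE: 300s interval / 60s TTL are the example values from this answer,
# not recommended defaults. Restart the worker for the change to apply.
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=300 \
  -Dspark.worker.cleanup.appDataTtl=60"
```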
Nikhil Suthar