I have a Spark YARN client submitting jobs, and each submission creates a directory under my "spark.local.dir" containing files like:
__spark_conf__8681611713144350374.zip
__spark_libs__4985837356751625488.zip
Is there a way these can be cleaned up automatically? Every time I submit a Spark job, new entries like these appear in the same folder, and the directory keeps filling up. What should I set to have them cleared automatically?
I have looked at a couple of links online, including on SO, but couldn't find a solution to this problem. All I found was how to specify the directory path via "spark.local.dir".
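For context, this is roughly how I submit jobs (the app name and paths below are placeholders, not my real setup):

```shell
# Hypothetical submission command; only spark.local.dir is the setting in question
spark-submit \
  --master yarn \
  --deploy-mode client \
  --conf spark.local.dir=/data/spark-local \
  my_job.py

# After a few submissions the directory accumulates files like:
#   /data/spark-local/__spark_conf__<id>.zip
#   /data/spark-local/__spark_libs__<id>.zip
```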