
I'm using the code in this example to run a Scala program with Spark. The program executes fine, but when the StreamingContext tries to stop I get this error:

java.io.IOException: Failed to delete: ..\AppData\Local\Temp\spark-53b87fb3-1154-4f0b-a258-8dbeab6601ab
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1010)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1951)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)

I have changed nothing in the code. I just cloned it to my local file system, ran the sbt assembly command to generate the .jar file, and then ran the program using spark-submit.
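Roughly, the steps were something like this (the jar name and main class below are placeholders, not the actual names from the example):

    sbt assembly
    spark-submit --class example.StreamingJob --master local[2] target\scala-2.11\streaming-example-assembly.jar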

Also, I'm running the Windows command prompt as Administrator, so I don't think it is a privileges issue.

Any clues on what's causing this error?

Thanks for the help!

1 Answer


I think the Spark app creates a temporary staging directory on your local file system (probably when checkpointing is used), and when the context is stopped it tries to clean those temporary files up but is not able to delete them. There are two likely causes: either the files were already deleted, or the process has no privileges to delete them.
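If you want to control where that scratch space goes, one thing to try (a rough sketch, not tested on your setup; the directory paths and batch interval are just examples) is to point spark.local.dir and the checkpoint directory at an explicit location the JVM can definitely write to and delete from:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Point Spark's scratch space at an explicit, writable directory instead of
    // the default under %TEMP% (the path here is only an example).
    val conf = new SparkConf()
      .setAppName("StreamingExample")
      .setMaster("local[2]")
      .set("spark.local.dir", "C:/spark-temp")

    val ssc = new StreamingContext(conf, Seconds(10))

    // If the job uses checkpointing, give it an explicit directory as well
    // (also just an example path).
    ssc.checkpoint("C:/spark-checkpoint")

Note that even then the IOException may still show up: as your stack trace shows, it is thrown from a shutdown hook, so it happens after the job itself has already completed.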

FaigB
  • Hi @FaigB. I believe it is not a privileges issue because I run the command prompt in administrator mode. I'm not sure, though, whether Spark also needs some privileges of its own in order to delete the temporary files it generates. – Mohammad Zammam Mar 09 '17 at 14:11
  • I don't think it is a privilege issue either. I have the same issue when I exit spark-shell, or when I run any of the examples. I tried changing the working dir with --conf spark.local.dir, but the problem wasn't solved. If anyone has a solution, please share. – Germán Aquino May 27 '17 at 13:09