
Hi, I am running a SparkR program through a shell script. When I point the input file to the local filesystem it works fine, but when I point it to HDFS it throws this error:

Exception in thread "delete Spark local dirs" java.lang.NullPointerException
at  org.apache.spark.storage.DiskBlockManager.org$apache$spark$storage$DiskBlockManager$$doStop(DiskBlockManager.scala:161)
at  org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply$mcV$sp(DiskBlockManager.scala:141)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1617)
at org.apache.spark.storage.DiskBlockManager$$anon$1.run(DiskBlockManager.scala:139)

Any help will be appreciated.


1 Answer


I faced the same issue with a Scala script. The problem was the hard-coded master URL, so I stopped setting it in code and let it be supplied at submit time instead.

Previously:

val conf = new org.apache.spark.SparkConf().setMaster(masterURL).set("spark.ui.port", port).setAppName("TestScalaApp")

Fixed code:

val conf = new org.apache.spark.SparkConf().setAppName("TestScalaApp")
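With `setMaster` removed from the code, the master URL has to come from somewhere else, typically the `--master` flag of `spark-submit`. A rough sketch of what the submit command might look like (the script name, master value, and HDFS path here are placeholders, not taken from the question):

```shell
# Supply the master at submit time instead of hard-coding it in the app.
# "yarn" is just one common choice; use whatever matches your cluster.
spark-submit \
  --master yarn \
  --deploy-mode client \
  your_script.R \
  hdfs:///path/to/input
```

Note that for the HDFS path to resolve, the cluster's default filesystem must be configured (e.g. `fs.defaultFS` in `core-site.xml`), or you can spell out the full URI such as `hdfs://namenode:8020/path/to/input`.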