
I am running Spark jobs locally for debugging purposes. I have imported the spark-core jar using sbt, and my code uses HiveContext. It throws the following error: `The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-`
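For reference, this is roughly what the code looks like (a minimal sketch of the setup described, using the Spark 1.x API; the object and app names are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object DebugJob {
  def main(args: Array[String]): Unit = {
    // Run entirely in-process, without a cluster or a Spark/Hadoop install
    val conf = new SparkConf().setAppName("debug").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // Initializing HiveContext is where the /tmp/hive permission error is thrown
    val hiveContext = new HiveContext(sc)
    hiveContext.sql("SHOW TABLES").show()
  }
}
```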

Once again, I have not installed Spark or Hadoop on my local machine; I import all the jars using sbt and run everything in IntelliJ. Is there any way I can fix this?
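My build.sbt dependencies look roughly like this (versions are illustrative); note that HiveContext lives in the spark-hive artifact, so spark-core alone is not sufficient:

```scala
// build.sbt — sketch of the dependencies used; version numbers are illustrative
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.1",
  "org.apache.spark" %% "spark-sql"  % "1.6.1",
  "org.apache.spark" %% "spark-hive" % "1.6.1"  // required for HiveContext
)
```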

Thanks,

hp2326
