
I am getting an error when executing a simple Spark job from Eclipse. Can someone please help? This is my code:

import org.apache.spark.sql.SparkSession

// Build a SparkSession that submits to YARN, pointing Spark at a local HDFS instance
val sc = SparkSession.builder()
  .master("yarn")
  .appName("hive Test")
  .config("spark.hadoop.fs.defaultFS", "hdfs://localhost:9000")
  .config("spark.yarn.jars", "hdfs://localhost:9000/sparkjars/*.jar")
  .getOrCreate()
println("Testing")

Error:

'/tmp/hadoop-Aditya' is not recognized as an internal or external command

  • Remove the .config calls and execute the code again? (A sketch of this is shown after the comments.) – Srinivas May 03 '20 at 11:11
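
For reference, a minimal sketch of what Srinivas's suggestion amounts to: the same builder with the two .config lines removed (renaming the variable to spark is my own choice here; fs.defaultFS and spark.yarn.jars would then fall back to whatever the client-side cluster configuration provides):

import org.apache.spark.sql.SparkSession

// Same builder with no extra .config entries; Hadoop/YARN settings
// are then taken from the configuration files on the classpath instead.
val spark = SparkSession.builder()
  .master("yarn")
  .appName("hive Test")
  .getOrCreate()
println("Testing")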

0 Answers