I am getting an error when executing a simple Spark job from Eclipse. Can someone please help? This is my code:
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("yarn")
  .appName("Hive Test")
  .config("spark.hadoop.fs.defaultFS", "hdfs://localhost:9000")
  .config("spark.yarn.jars", "hdfs://localhost:9000/sparkjars/*.jar")
  .getOrCreate()
println("Testing")
Error:
'/tmp/hadoop-Aditya' is not recognized as an internal or external command