
I am running a Spark word count program through Livy (running it without Livy via spark-submit works fine). On submitting the job with a Livy REST request, it returns an id for the job as below:

curl -X POST --data '{"file": "/home/ubuntu/SparkWordCount/target/Sparkwc.jar", "className": "org.learningspark.simple.WordCount", "files": ["hdfs://sparkmasterip:8020/tmp/input-file"]}' -H "Content-Type: application/json" http://sparkmasterip:8998/batches

Response:

{"id":12,"state":"starting","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":[]}

If I check the status of the job with id 12 using the command below, the response says the job is dead:

curl  http://sparkmasterip:8998/batches/12

Response:

{"id":12,"state":"dead","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null}

Thanks

Saurabh Rana

1 Answer


I faced the same issue when I used cluster mode, i.e. livy.spark.master = yarn-cluster. It worked fine with livy.spark.master = yarn-client.
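As a sketch, assuming a standard install where the config lives under the Livy conf directory, the change would look like this (restart the Livy server after editing; newer Livy versions instead split this into livy.spark.master = yarn plus livy.spark.deploy-mode = client):

# conf/livy.conf
livy.spark.master = yarn-client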

Sneha