
I have to run multiple Spark jobs one by one in a sequence, so I am writing a shell script. One way I can do this is to check for a success file in the output folder to get the job status, but I want to know whether there is any other way to check the status of a spark-submit job from the Unix script where I am running my jobs.
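For context, a minimal sketch of what I have in mind, assuming the jobs run in a mode where spark-submit blocks until completion (client mode, or YARN cluster mode with spark.yarn.submit.waitAppCompletion left at its default of true); the class names and jar paths are placeholders:

#!/bin/bash
set -e   # abort the chain as soon as one job fails

# Each spark-submit returns a nonzero exit code if its job fails,
# so with set -e the jobs effectively run strictly in sequence.
spark-submit --class com.example.Job1 job1.jar
spark-submit --class com.example.Job2 job2.jar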

Kumar Harsh

1 Answer


You can use the command:

yarn application -status <APPLICATION ID>

where <APPLICATION ID> is your application ID, and check for a line like:

State : RUNNING

This will give you the status of your application.
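For example, a shell script could poll this output until the application reaches a terminal state. This is only a sketch: the application ID below is a placeholder, and the grep patterns assume the usual layout of the yarn application -status report:

APP_ID=application_1234567890123_0001   # placeholder application ID

# Poll YARN until the application leaves RUNNING/ACCEPTED
while true; do
    STATE=$(yarn application -status "$APP_ID" \
        | grep -E '^[[:space:]]*State :' | awk '{print $3}')
    echo "Current state: $STATE"
    case "$STATE" in
        FINISHED|FAILED|KILLED) break ;;   # terminal YARN states
    esac
    sleep 30   # avoid hammering the ResourceManager
done

# FINISHED only means YARN is done; check the final status as well
FINAL=$(yarn application -status "$APP_ID" \
    | grep -E '^[[:space:]]*Final-State :' | awk '{print $3}')
[ "$FINAL" = "SUCCEEDED" ] || exit 1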

To list the applications run via YARN, you can use the command:

yarn application -list

You can also add -appTypes to limit the listing by application type.
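For instance, to list only Spark applications and pull out the ID of one submitted under a known name (the job name below is a placeholder):

# List only Spark applications
yarn application -list -appTypes SPARK

# Grab the application ID of a job by its name;
# the ID is the first column of the listing
APP_ID=$(yarn application -list -appTypes SPARK \
    | grep 'My Spark Job' | awk '{print $1}')
echo "$APP_ID"

Note that by default -list shows only applications that have not yet finished; you can pass -appStates to include other states in the listing.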

Romeo Ninov