I have to run multiple Spark jobs one by one in a sequence, so I am writing a shell script. One way I can do this is to check for a success file in the output folder to determine job status, but I want to know whether there is any other way to check the status of a spark-submit job from the Unix script that runs my jobs.
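For reference, a minimal sketch of the sequencing approach (class names and the jar path are placeholders; in YARN cluster mode, the default spark.yarn.submit.waitAppCompletion=true makes spark-submit itself exit non-zero when the application fails):

#!/bin/bash
# Run the jobs one by one; stop the sequence as soon as one fails.
for klass in com.example.JobOne com.example.JobTwo; do  # hypothetical class names
    spark-submit --master yarn --deploy-mode cluster \
        --class "$klass" /path/to/jobs.jar              # hypothetical jar path
    if [ $? -ne 0 ]; then
        echo "$klass failed, stopping the sequence" >&2
        exit 1
    fi
done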
-
Is a submitted job a success when the job is running, or when it has finished with exit code 0? – Walter A Dec 28 '18 at 21:24
1 Answer
You can use the command
yarn application -status <APPLICATION ID>
where <APPLICATION ID> is your application ID, and check for a line like:
State : RUNNING
This will give you the status of your application.
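A minimal sketch of how this could drive a shell script (assuming YARN cluster mode, where spark-submit prints the application id in its output, and that the yarn CLI report contains a "State :" line as above; the grep patterns are assumptions about those formats):

# Submit in the background and capture spark-submit's output
spark-submit --master yarn --deploy-mode cluster \
    --class com.example.MyJob /path/to/job.jar > submit.log 2>&1 &

# Wait until the application id shows up in the log
app_id=""
while [ -z "$app_id" ]; do
    sleep 2
    app_id=$(grep -o 'application_[0-9_]*' submit.log | head -1)
done

# Poll YARN until the application reaches a terminal state
state=""
while true; do
    state=$(yarn application -status "$app_id" 2>/dev/null \
        | grep -m1 'State :' | awk '{print $NF}')
    case "$state" in
        FINISHED|FAILED|KILLED) break ;;
        *) sleep 10 ;;
    esac
done
echo "Application $app_id ended in state $state"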
To list the applications run via YARN, you can use the command
yarn application -list
You can also add -appTypes to limit the listing by application type.
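For example, to list only Spark applications and extract just their ids (the awk field assumes the default tabular output of the yarn CLI, where the first column is the Application-Id):

yarn application -list -appTypes SPARK 2>/dev/null \
    | grep '^application_' | awk '{print $1}'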

Romeo Ninov
-
I have to use it in a shell script, so I don't have the job ID at that time – Kumar Harsh Dec 28 '18 at 11:04