I have been looking for a way to get the percentage of completion for a job with a given job ID.
Right now, the Spark JobServer UI shows the following status for a running job:
{
"duration": "Job not done yet",
"classPath": "jobserver.spark.sql.SparkJobServerClient",
"startTime": "2017-11-13T11:22:46.030+05:30",
"context": "corresponding_context_name",
"status": "RUNNING",
"jobId": "ef16374c-f370-442c-9cea-25aa1b427a0a"
}
And once the job completes, the status looks like:
{
"duration": "5.909 secs",
"classPath": "jobserver.spark.sql.SparkJobServerClient",
"startTime": "2017-11-13T11:22:46.030+05:30",
"context": "corresponding_context_name",
"result": "2017-10-24-00-00-00,3120,9958,25.74,23.61,2.7,7195,4.31,4.54,8.84,13.41,9.96,8.11,6.77,5.59,4.68,3.96,3.39,2.94,15.5,4.94,2.61,0.45,1,4.6146717E7\n",
"status": "FINISHED",
"jobId": "ef16374c-f370-442c-9cea-25aa1b427a0a"
}
What I would like is the percentage of the job completed while it is still in the processing stage, so that it can be shown on the frontend.
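For context, this is the kind of computation I have in mind. Spark's own monitoring REST API (GET /api/v1/applications/&lt;app-id&gt;/jobs) reports numTasks and numCompletedTasks per job, so a percentage could plausibly be derived from that; the helper function below is just a sketch of that idea (the function name and aggregation strategy are my own, not part of any Spark JobServer API):

```python
def progress_percent(jobs):
    """Rough overall progress for a Spark application, as a percentage.

    `jobs` is the parsed JSON list returned by Spark's monitoring REST
    API (GET /api/v1/applications/<app-id>/jobs); each entry includes
    `numTasks` and `numCompletedTasks`. This simply aggregates completed
    tasks over total tasks across all jobs of the application.
    """
    total = sum(j["numTasks"] for j in jobs)
    done = sum(j["numCompletedTasks"] for j in jobs)
    return 100.0 * done / total if total else 0.0


# Example with a hypothetical two-job response:
sample = [
    {"jobId": 0, "numTasks": 10, "numCompletedTasks": 10},
    {"jobId": 1, "numTasks": 10, "numCompletedTasks": 5},
]
print(progress_percent(sample))  # 75.0
```

The open question is how to tie this back to the Spark JobServer jobId shown above, since the JobServer status endpoint itself only reports RUNNING/FINISHED and not task counts.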
If anybody could help me with this, it would be much appreciated.
P.S. This is my first post here; apologies for any formatting mistakes. Thanks.