We are using DSE Analytics, and I am trying to schedule a Spark job via spark-submit from crontab. The job should run every night, and each time a new run is submitted, the existing application should be killed first. I am having trouble finding a way to do this.
The problem is that I cannot find the application ID or the driver ID of the submitted job, so I have no handle to shut it down gracefully.
I understand that the submission ID can be found in the Spark Master web UI, but since this will run from cron, reading the ID from the UI is not an option. Is there a proper way to do this programmatically? We are running DSE 6.7 with Analytics in a dedicated DC. Any help would be appreciated.
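For reference, the kind of cron wrapper I have in mind looks roughly like this. It is only a sketch: the master URL, class name, jar path, and state-file location are placeholders, and it assumes --deploy-mode cluster on the standalone master, where spark-submit prints a driver ID (e.g. driver-20190101120000-0001) that can later be passed to spark-submit --kill:

```shell
#!/bin/sh
# Nightly cron wrapper (sketch; master URL, class, and paths are placeholders).

STATE_FILE=/var/run/nightly-job.driver-id
MASTER=spark://dse-analytics-node:7077   # assumed Spark Master URL for the Analytics DC

# Pull the standalone-cluster driver ID (a token like
# driver-20190101120000-0001) out of spark-submit's console output.
extract_driver_id() {
  grep -o 'driver-[0-9]*-[0-9]*' | head -n 1
}

# Kill the previous night's driver, if one was recorded.
if [ -s "$STATE_FILE" ]; then
  dse spark-submit --master "$MASTER" --kill "$(cat "$STATE_FILE")" || true
fi

# Resubmit in cluster mode and record the new driver ID for tomorrow's run.
dse spark-submit --master "$MASTER" --deploy-mode cluster \
    --class com.example.NightlyJob /path/to/nightly-job.jar 2>&1 \
  | tee /tmp/nightly-submit.log
extract_driver_id < /tmp/nightly-submit.log > "$STATE_FILE"
```

The part I am unsure about is whether capturing the driver ID from the submit output like this is reliable, or whether there is a supported way to look it up afterwards.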