I am using the Databricks REST API to run Spark jobs. I am using the following commands:
curl -X POST -H "Authorization: XXXX" 'url/api/2.0/jobs/create' -d '
{
  "name": "jobname",
  "existing_cluster_id": "0725-095337-jello70",
  "libraries": [{"jar": "dbfs:/mnt/pathjar/name-9edeec0f.jar"}],
  "email_notifications": {},
  "timeout_seconds": 0,
  "spark_jar_task": {"main_class_name": "com.company.DngApp"}
}'

curl -X POST -H "Authorization: XXXX" 'url/api/2.0/jobs/run-now' -d '
{
  "job_id": 25854,
  "jar_params": ["--param", "value"]
}'
Here `--param value` is just an input argument passed to the main class, but I want to override Spark driver properties instead. With spark-submit I would usually do:

--driver-java-options='-Dparam=value'

But I am looking for the equivalent on the Databricks REST API side.
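For context, the closest thing I have found in the API reference is the `spark_conf` field on a cluster spec, which only seems to be settable when the job creates its own cluster (`new_cluster`) rather than when it attaches to an `existing_cluster_id`. A sketch of the job payload I have in mind — the use of `spark.driver.extraJavaOptions` here is my assumption, not something I have confirmed works for this case:

```json
{
  "name": "jobname",
  "new_cluster": {
    "spark_version": "5.3.x-scala2.11",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2,
    "spark_conf": {
      "spark.driver.extraJavaOptions": "-Dparam=value"
    }
  },
  "libraries": [{"jar": "dbfs:/mnt/pathjar/name-9edeec0f.jar"}],
  "spark_jar_task": {"main_class_name": "com.company.DngApp"}
}
```

If there is a way to pass such driver properties per run (e.g. through `jobs/run-now`) against an existing cluster, that is really what I am after.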