I have a jar and an associated properties file. In order to run the jar, this is what I do in Databricks on Azure:
I click on:
+Create Job
Task: com.xxx.sparkmex.core.ModelExecution in my.jar - Edit / Upload JAR / Remove
Parameters: Edit
Main Class: com.xxx.sparkmex.core.ModelExecution
Arguments: ["-file","/dbfs/mnt/mypath/myPropertyFile.properties","-distributed"]
Cluster: MyCluster
and then I click Run Now.
I am trying to achieve the same thing using the Databricks CLI.
This is what I am doing/want to do:
1) upload the properties file
dbfs cp myPropertyFile.properties dbfs:/mnt/mypath/myPropertyFile.properties
2) Create a job: databricks jobs create
When I do this, it asks for a --json-file. Where do I get the JSON file from?
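My guess is that the JSON file is just the job settings written out by hand, so that step 2 becomes something like this (the file name create-job.json, the job name, and the jar path in DBFS are placeholders I made up; the main class, arguments, and cluster ID come from my existing job):

databricks jobs create --json-file create-job.json

with create-job.json along these lines:

{
  "name": "ModelExecution",
  "existing_cluster_id": "MYCLUSTERID",
  "libraries": [
    { "jar": "dbfs:/mnt/mypath/my.jar" }
  ],
  "spark_jar_task": {
    "main_class_name": "com.xxx.sparkmex.core.ModelExecution",
    "parameters": ["-file", "/dbfs/mnt/mypath/myPropertyFile.properties", "-distributed"]
  }
}

Is that roughly the right shape?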
3) Upload the jar file: how do I upload the jar file?
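I assume this is the same dbfs cp I used for the properties file, just pointing at the jar (the DBFS path is my guess):

dbfs cp my.jar dbfs:/mnt/mypath/my.jar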
4) Upload the properties file: how do I upload the properties file?
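Since I will be replacing the same file on every run, I assume this is just the dbfs cp from step 1 again, possibly with an overwrite flag (I think dbfs cp has --overwrite, but I am not sure):

dbfs cp --overwrite myPropertyFile.properties dbfs:/mnt/mypath/myPropertyFile.properties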
5) restart the cluster: databricks clusters restart --cluster-id MYCLUSTERID
6) Run the job
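For step 6, I assume there is a run-now command that takes the job id returned by jobs create, something like:

databricks jobs run-now --job-id <the job id returned in step 2>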
Then I want to repeat the whole cycle, because each run uses a new properties file with different settings. I do not know how to do steps 2 through 4 and step 5.