Is it possible to submit/configure a Spark Python script (.py file) as a Databricks job?
My development happens in the PyCharm IDE, and I push/commit the code to our GitLab repository. My requirement is to create a new job on the Databricks cluster whenever a Python script is merged to the GitLab master branch.
I would like suggestions on whether it is possible to create a Databricks job from a Python script using a .gitlab-ci.yml pipeline.
In the Databricks Jobs UI, I can see that a Spark JAR or a notebook can be used, but I am wondering whether a plain Python file can be provided instead.
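For context, here is a rough sketch of the kind of pipeline stage I have in mind. This is only an illustration, not working code: it assumes a `spark_python_task` job type exists in the Databricks Jobs REST API, that `DATABRICKS_HOST` and `DATABRICKS_TOKEN` are configured as GitLab CI/CD variables, and the file names, DBFS path, and cluster settings are all placeholders I made up.

```yaml
# .gitlab-ci.yml (hypothetical sketch, not tested)
deploy_databricks_job:
  stage: deploy
  image: python:3.10
  only:
    - master
  script:
    # DATABRICKS_HOST and DATABRICKS_TOKEN would be GitLab CI/CD variables
    - pip install databricks-cli
    # Upload the script to DBFS (example path)
    - databricks fs cp my_script.py dbfs:/scripts/my_script.py --overwrite
    # Create a job whose task points at the .py file
    - |
      curl -s -X POST "$DATABRICKS_HOST/api/2.1/jobs/create" \
        -H "Authorization: Bearer $DATABRICKS_TOKEN" \
        -H "Content-Type: application/json" \
        -d '{
          "name": "my-python-job",
          "tasks": [{
            "task_key": "main",
            "spark_python_task": {"python_file": "dbfs:/scripts/my_script.py"},
            "new_cluster": {
              "spark_version": "13.3.x-scala2.12",
              "node_type_id": "i3.xlarge",
              "num_workers": 1
            }
          }]
        }'
```

Does something along these lines make sense, or is there a better-supported way to do this?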
Thanks,
Yuva