I am using a Google Dataproc cluster to run a Spark job; the script is written in Python.
When there is only one script (test.py, for example), I can submit the job with the following command:
gcloud dataproc jobs submit pyspark --cluster analyse ./test.py
But now test.py imports modules from other scripts I wrote myself. How can I specify the dependency in the command?
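For example, suppose test.py imports a helper module from a local file utils.py (a hypothetical name). I am looking for an invocation along these lines, where --py-files is my guess at the flag for passing the extra files:

gcloud dataproc jobs submit pyspark --cluster analyse --py-files ./utils.py ./test.py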