I am running some jobs using:
- dbx version 0.7.4
- pyspark 3.2.2
- delta-spark 2.0.0
- Python 3.8.1
I am following the guidelines from https://dbx.readthedocs.io/en/latest/features/assets/?h=dbx+launch+assets
I run the following commands:
dbx deploy <my-workflow> --assets-only
dbx launch <my-workflow> --from-assets
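For reference, I have not yet checked whether the same workflow goes through cleanly with the plain (non-assets) flow, i.e.:

dbx deploy <my-workflow>
dbx launch <my-workflow>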
I get the following error:
TypeError: submit_run() got an unexpected keyword argument 'permissions'
In my deployment.yml I have included this:
custom:
  basic-cluster-props: &basic-cluster-props
    spark_version: "10.4.x-scala2.12"
    node_type_id: "Standard_D3_v2"

  basic-settings: &basic-settings
    libraries:
      - pypi:
          package: "pyyaml"
    permissions:
      access_control_list:
        - user_name: "userid"
          permission_level: "IS_OWNER"
        - group_name: "admins"
          permission_level: "CAN_MANAGE"
        - group_name: "rolename"
          permission_level: "CAN_MANAGE"

  basic-static-cluster: &basic-static-cluster
    new_cluster:
      <<: *basic-cluster-props
      num_workers: 1
    <<: *basic-settings
environments:
  default:
    strict_path_adjustment_policy: true
    workflows:
      - name: "current-integration-test"
        <<: *basic-static-cluster
        spark_python_task:
          python_file: "file://tests/integration/cp/silver/test_myjob_job.py"
          parameters: ["--conf-file", "file:fuse://conf/int/cp.yml", "--cov=dlite"]
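Since the unexpected keyword is literally permissions, my working theory is that dbx forwards the merged workflow definition to the Jobs Runs Submit API call, and submit_run() has no permissions argument. As a next step I plan to test a stripped-down variant with the permissions block taken out of the merged settings (just a sketch to isolate the problem, not a confirmed fix):

custom:
  basic-cluster-props: &basic-cluster-props
    spark_version: "10.4.x-scala2.12"
    node_type_id: "Standard_D3_v2"

environments:
  default:
    strict_path_adjustment_policy: true
    workflows:
      - name: "current-integration-test"
        new_cluster:
          <<: *basic-cluster-props
          num_workers: 1
        libraries:
          - pypi:
              package: "pyyaml"
        # permissions block removed to check whether submit_run() stops
        # receiving the unexpected keyword
        spark_python_task:
          python_file: "file://tests/integration/cp/silver/test_myjob_job.py"
          parameters: ["--conf-file", "file:fuse://conf/int/cp.yml", "--cov=dlite"]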
What am I missing here?