I would like to use the same workflow and task definitions for our dev, staging, and prod environments.

Option 1) I was thinking of capturing an "environment_key" as a parameter for my Python job ("dev" in this case). Is there a built-in placeholder, like "task_key", that I can use here? I know I could use environment variables for this, but a placeholder would be simpler.

environments:
  dev:
    strict_path_adjustment_policy: true
    workflows:
      - name: "test"
        tasks:
          - task_key: "test1"
            <<: *cluster-dev
            max_retries: 0
            spark_python_task:
              python_file: "file://jobs/main.py"
              parameters: ["--env={{environment_key}}","--run=test2","--task={{task_key}}"]

Option 2) Another way would be to capture a parameter from the command line, like:

dbx deploy --environment=dev
dbx launch --environment=dev

Is there a way to accomplish that?

Thanks.

André Salvati

1 Answer

You can achieve this by enabling Jinja support in dbx:

https://dbx.readthedocs.io/en/latest/features/jinja_support/
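
In short: give your deployment file a .j2 extension (for example, conf/deployment.yml.j2) and dbx will render it as a Jinja template before deploying. Inside the template you can read environment variables through env[...]. A minimal sketch of your Option 1, assuming you export a variable named ENVIRONMENT yourself (the name is your choice; dbx does not set it for you):

environments:
  dev:
    strict_path_adjustment_policy: true
    workflows:
      - name: "test"
        tasks:
          - task_key: "test1"
            <<: *cluster-dev
            max_retries: 0
            spark_python_task:
              python_file: "file://jobs/main.py"
              # Rendered by Jinja at deploy time from the ENVIRONMENT
              # variable of the shell that runs dbx deploy.
              parameters: ["--env={{ env['ENVIRONMENT'] }}", "--run=test2", "--task=test1"]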

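Then set the variable when you select the environment on the command line, which also covers your Option 2 (in older dbx versions the workflow is passed as --job=test instead of positionally):

ENVIRONMENT=dev dbx deploy --environment=dev
ENVIRONMENT=dev dbx launch test --environment=dev

As far as I can tell from the docs, there is no built-in {{environment_key}} or {{task_key}} placeholder; the --environment flag only selects which environments: section of the file to use, so the value has to reach Jinja through an environment variable (or, in recent dbx versions, a variables file passed with --jinja-variables-file and read as {{ var['...'] }}).
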
rupaj