I would go a different route - instead of requiring each job & cluster to specify the necessary values, you can use the templatefile function to substitute the values into the script, like this:
locals {
  script_path = "${path.module}/datadog-install-driver-workers.sh"
  params = {
    DD_ENV     = "dev"
    DD_API_KEY = "aaaaa"
  }
}
resource "databricks_global_init_script" "init" {
  name            = "datadog script"
  content_base64  = base64encode(templatefile(local.script_path, local.params))
}
with the script template as follows:
#!/bin/bash
#
DD_ENV="${DD_ENV}"
DD_API_KEY="${DD_API_KEY}"
echo "Some code that outputs $${DD_ENV}"
and this will generate the script correctly.
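For illustration, the rendered script should look roughly like this, assuming the example values from local.params above (note how the escaped $${DD_ENV} became a literal shell reference):

```shell
#!/bin/bash
#
# Rendered result: templatefile replaced the template variables with the
# values from local.params, while the escaped $${DD_ENV} was emitted as
# a literal shell substitution.
DD_ENV="dev"
DD_API_KEY="aaaaa"
echo "Some code that outputs ${DD_ENV}"
```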
The only thing you need to take into account is that you may need to escape shell variable substitutions that use the same syntax as Terraform: write $${var} instead of ${var} - see the templatefile documentation.