Questions tagged [terraform-provider-databricks]

For questions about the Databricks Terraform Provider

The Databricks Terraform Provider is used to deploy resources inside Databricks workspaces. On AWS and GCP it can also be used to deploy the workspaces themselves; on Azure, use the AzureRM provider to create the workspace.

71 questions
3 votes · 1 answer

Error while starting the Azure Databricks cluster

I set up a Databricks instance on Azure using Terraform. The deployment seems to be good, but I am getting the following error when creating/starting a new cluster: Message: Cluster terminated. Reason: Cloud provider launch failure. Help: A cloud…
3 votes · 1 answer

Terraform unable to list provider

I am trying to create an Azure Databricks cluster, but when I run terraform init I see the following error. How can I rectify this? Basically, how do I use a third-party provider in Terraform? The Terraform version is v0.14.5. When I run…
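In Terraform 0.13+ a third-party provider must be declared in required_providers before terraform init can download it. A minimal sketch, assuming the current registry namespace (releases from the v0.14 era lived under databrickslabs/databricks):

terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"   # older releases: "databrickslabs/databricks"
      version = ">= 1.0.0"
    }
  }
}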
3 votes · 2 answers

Terraform Databricks Labs provider

I have an issue with the Terraform Databricks Labs provider: the code below gives me the error "status 400: err Response from server {"error_code":"INVALID_PARAMETER_VALUE","message":"Path must be absolute: databricks"}". There is nothing in the documentation…
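The "Path must be absolute" message usually means a workspace path such as "databricks" was given where a path starting with "/" is required. A minimal sketch with a hypothetical notebook resource:

resource "databricks_notebook" "example" {
  path           = "/Shared/databricks/example"   # workspace paths must begin with "/"
  language       = "PYTHON"
  content_base64 = base64encode("print(1)")
}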
2 votes · 2 answers

Terraform Databricks cannot configure default credentials

We are running Terraform through an Azure pipeline to create a Databricks workspace and related resources. However, when the apply stage of Terraform reaches the point where it grabs the latest version of Spark, the process throws an…
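A common cause is the databricks provider being evaluated before the workspace exists, so data sources such as databricks_spark_version have no credentials to use. A minimal sketch, assuming the workspace resource is named azurerm_databricks_workspace.this:

provider "databricks" {
  azure_workspace_resource_id = azurerm_databricks_workspace.this.id
}

data "databricks_spark_version" "latest" {
  # force evaluation after the workspace is up
  depends_on = [azurerm_databricks_workspace.this]
}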
2 votes · 1 answer

How to specify Databricks Cluster Policy Family from Terraform?

When I create a Databricks Cluster Policy from the UI, I can choose a Policy Family. This Policy Family lets us have a base template JSON, which we can then override by passing custom JSON tags of our own. It works fine. When I create a similar…
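The provider exposes policy families on databricks_cluster_policy through policy_family_id plus JSON overrides. A minimal sketch; the family id and override values here are illustrative:

resource "databricks_cluster_policy" "this" {
  name             = "my-policy"
  policy_family_id = "personal-vm"   # hypothetical family id; list ids via the policy families REST API
  policy_family_definition_overrides = jsonencode({
    spark_version = {
      type  = "fixed"
      value = "13.3.x-scala2.12"
    }
  })
}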
2 votes · 1 answer

Error: cannot create metastore: Only account admin can create metastores

I'm trying to create a databricks metastore: resource "databricks_metastore" "this" { name = "primary" storage_root = format("abfss://%s@%s.dfs.core.windows.net/", azurerm_storage_container.unity_catalog.name, …
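The error is about the identity, not the HCL: whoever Terraform authenticates as must be a Databricks account admin (assigned in the account console), not just a workspace admin. A minimal sketch of authenticating as such a service principal; variable names are hypothetical:

provider "databricks" {
  host                = azurerm_databricks_workspace.this.workspace_url
  azure_client_id     = var.account_admin_client_id
  azure_client_secret = var.account_admin_client_secret
  azure_tenant_id     = var.tenant_id
}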
2 votes · 1 answer

Mounting ADLS gen2 with AAD passthrough in Azure Databricks with Terraform

I am trying to mount my ADLS gen2 storage containers into DBFS, with Azure Active Directory passthrough, using the Databricks Terraform provider. I'm following the instructions here and here, but I'm getting the following error when Terraform…
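Passthrough mounts have to be created by a cluster that itself has AAD passthrough enabled, which databricks_mount supports via cluster_id. A minimal sketch (names, runtime version, and node type are assumptions; the full set of passthrough extra_configs is in the Azure credential-passthrough docs):

resource "databricks_cluster" "passthrough" {
  cluster_name  = "aad-passthrough"
  spark_version = "11.3.x-scala2.12"
  node_type_id  = "Standard_DS3_v2"
  num_workers   = 1
  spark_conf = {
    "spark.databricks.passthrough.enabled" = "true"
  }
}

resource "databricks_mount" "gen2" {
  cluster_id = databricks_cluster.passthrough.id
  uri        = "abfss://container@account.dfs.core.windows.net"
  extra_configs = {
    "fs.azure.account.auth.type" = "CustomAccessToken"
  }
}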
2 votes · 3 answers

Using a Databricks workspace in the same configuration as the Databricks provider

I'm having some trouble getting the azurerm and databricks providers to work together. With the azurerm provider, I set up my workspace: resource "azurerm_databricks_workspace" "ws" { name = var.workspace_name resource_group_name…
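The usual wiring is to point the databricks provider at the workspace the azurerm provider creates. A minimal sketch against the ws resource from the question:

provider "databricks" {
  host = azurerm_databricks_workspace.ws.workspace_url
}

Because provider configuration is resolved early, this only works cleanly when the workspace attributes are knowable at plan time; a two-step apply, or a separate root module for the workspace contents, is a common workaround.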
1 vote · 2 answers

How to dynamically change variables in a Databricks notebook based on which environment it was deployed to?

I want to move data from an S3 bucket to Databricks. On both platforms I have separate environments for DEV, QA, and PROD. I use a Databricks notebook which I deploy to Databricks using Terraform. Within the notebook there are some hardcoded variables,…
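One approach is to stop hardcoding and render the notebook source per environment with templatefile() before uploading it. A minimal sketch; the template path and variable are hypothetical:

resource "databricks_notebook" "etl" {
  path     = "/Shared/etl"
  language = "PYTHON"
  content_base64 = base64encode(templatefile("${path.module}/etl.py.tpl", {
    s3_bucket = var.s3_bucket   # set differently for DEV / QA / PROD
  }))
}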
1 vote · 1 answer

Databricks PAT token and secret creation

We are trying to create a Databricks workspace using Terraform in Azure. As configured in the Terraform script, Databricks is created, but we continuously get errors on token creation and secret creation. We are using an SP to authenticate to…
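For reference, the token and secret resources themselves are small; the usual failure point is the provider not yet being authenticated against the freshly created workspace. A minimal sketch of the resources, assuming a provider that already reaches the workspace:

resource "databricks_token" "pat" {
  comment          = "terraform-managed"
  lifetime_seconds = 86400
}

resource "databricks_secret_scope" "app" {
  name = "app"
}

resource "databricks_secret" "pat" {
  scope        = databricks_secret_scope.app.name
  key          = "pat"
  string_value = databricks_token.pat.token_value
}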
1 vote · 1 answer

Terraform variable type error: Attribute must be a whole number

The following code throws the error Error: Attribute must be a whole number, got 1.123456781234567e+15 on main.tf line 56, in resource "databricks_mws_permission_assignment" "ws_usergp": │ 56: workspace_id =…
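The e+15 notation shows the workspace id reached the provider as a float rather than an integer, losing precision along the way. Keeping the id as a string until the argument that needs it is one way to avoid that; a sketch under that assumption, with a hypothetical group:

variable "workspace_id" {
  type = string   # e.g. "1123456781234567" (hypothetical)
}

resource "databricks_mws_permission_assignment" "ws_usergp" {
  workspace_id = tonumber(var.workspace_id)
  principal_id = databricks_group.ws_usergp.id   # hypothetical group resource
  permissions  = ["USER"]
}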
1 vote · 2 answers

How to pass argument values to the databricks_global_init_script resource

I am trying to call datadog-install-driver-workers.sh using the Terraform resource databricks_global_init_script, and this script requires two input values, DD_API_KEY and DD_ENV. How do I pass these values along with the source script…
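Global init scripts do not take arguments, but the script body can be rendered with the values baked in via templatefile(). A minimal sketch; the .tpl file (the Datadog script with ${dd_api_key} / ${dd_env} placeholders) is an assumption:

resource "databricks_global_init_script" "datadog" {
  name = "datadog-install-driver-workers"
  content_base64 = base64encode(templatefile("${path.module}/datadog-install-driver-workers.sh.tpl", {
    dd_api_key = var.dd_api_key
    dd_env     = var.dd_env
  }))
  enabled = true
}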
1 vote · 1 answer

Terraform Databricks automatic installation of libraries

I have a few clusters created and I want to install libraries on them. The catch, though, is that I want these libraries to be installed automatically whenever I create another cluster. Is there a way to do this? What I am currently doing is this ->…
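databricks_library attaches a library to one cluster, so a for_each over the cluster ids makes the attachment follow every cluster you add. A minimal sketch; the variable and package are illustrative:

resource "databricks_library" "dd" {
  for_each   = toset(var.cluster_ids)   # hypothetical list of cluster ids
  cluster_id = each.value
  pypi {
    package = "datadog"
  }
}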
1 vote · 1 answer

Datadog integration with a Databricks cluster created using Terraform

I am creating a Databricks cluster using Terraform and would like to set up Datadog on it, so that whenever a new cluster (master/worker nodes) gets created, logs are pushed into Datadog. How do we push logs to Datadog? I was trying the below but am not sure how to get…
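The usual pattern is to attach the install script and the Datadog settings to the cluster definition itself, so every cluster built from the same module ships logs. A minimal sketch (paths, runtime version, and variables are assumptions):

resource "databricks_cluster" "this" {
  cluster_name  = "with-datadog"
  spark_version = "11.3.x-scala2.12"
  node_type_id  = "Standard_DS3_v2"
  num_workers   = 1
  spark_env_vars = {
    DD_API_KEY = var.dd_api_key
    DD_ENV     = var.dd_env
  }
  init_scripts {
    dbfs {
      destination = "dbfs:/scripts/datadog-install-driver-workers.sh"
    }
  }
}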
1 vote · 1 answer

Error: cannot read mws workspaces: RESOURCE_DOES_NOT_EXIST: workspace 96783599 does not exist

When I run terraform apply, my workspace gets created, but I am getting the following error. I have looked for "workspace 96783599" but am unable to find any resource with that number. Error: cannot read mws workspaces:…
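databricks_mws_* resources talk to the account console, so a read failure like this often means the provider instance is pointed at a workspace host or at the wrong account. A minimal sketch of an account-level provider on AWS; variables are hypothetical:

provider "databricks" {
  alias      = "mws"
  host       = "https://accounts.cloud.databricks.com"
  account_id = var.databricks_account_id
  username   = var.account_username
  password   = var.account_password
}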