I'm a dev on QHub, which is a collection of templated Terraform scripts that deploy a data science platform on various cloud providers (AWS, GCP, Azure, Digital Ocean). It uses S3 buckets to store the Terraform state, and as part of deployment we need to import that state if it is available. When the infrastructure is first deployed, or when it is deployed via git-ops, the state files don't exist locally. This leads to a requirement for "import if exists, otherwise continue" functionality, which I'm having difficulty finding.

Previously we would run terraform import, and if the bucket didn't exist the command would fail silently and we would continue. Since upgrading to Terraform v1.0.5, the import command hangs until Terraform times out, which takes too long.

My current workaround is to use the GNU coreutils timeout command to force the import to fail, like so: timeout 10 terraform import module.terraform-state.module.gcs.google_storage_bucket.static-site test-dev-terraform-state. If this fails, I then run terraform apply to create the resource.
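As a shell sketch, the whole workaround looks roughly like this (the resource address and bucket name are from the command above; the 10-second timeout is arbitrary, and the -auto-approve flag is my assumption for non-interactive git-ops runs):

```bash
#!/usr/bin/env bash
set -u

# Names taken from the example command above.
RESOURCE="module.terraform-state.module.gcs.google_storage_bucket.static-site"
BUCKET="test-dev-terraform-state"

# Attempt the import, but kill it after 10 seconds, since terraform
# v1.0.5 hangs instead of failing fast when the bucket doesn't exist.
if timeout 10 terraform import "$RESOURCE" "$BUCKET"; then
  echo "Existing state bucket imported."
else
  # Import failed or timed out: assume the bucket doesn't exist yet
  # and let terraform create it (-auto-approve assumed for automation).
  terraform apply -auto-approve
fi
```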

There is an additional requirement: we want to avoid using cloud-specific CLI tools to determine whether the bucket exists.

There has to be a better way to do this. Any thoughts?
