
Currently I am using the "Mongey/kafka" provider, and now I have to switch to the "confluentinc/confluent" provider in my existing Terraform pipeline. How can I do this?

Steps I am currently following to switch the provider:

Changing the provider in the main.tf file and running the following command to replace the provider:

terraform state replace-provider Mongey/kafka confluentinc/confluent
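For reference, the change in main.tf looks roughly like this; the source addresses are the relevant part, and the version constraints shown are only placeholders from my setup:

  # Before: required_providers pointing at the Mongey provider
  terraform {
    required_providers {
      kafka = {
        source  = "Mongey/kafka"
        version = "~> 0.5"   # illustrative version constraint
      }
    }
  }

  # After: switched to the Confluent provider
  terraform {
    required_providers {
      confluent = {
        source  = "confluentinc/confluent"
        version = "~> 1.0"   # illustrative version constraint
      }
    }
  }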

After that I run terraform init to install the new provider. But then, when I run

terraform plan

it gives the error "no schema available for module.iddn_news_cms_kafka_topics.kafka_acl.topic_writer[13] while reading state; this is a bug in terraform and should be reported".

Is there any way to change the Terraform provider without disturbing the existing resources created by the Terraform pipeline?

ravvi

1 Answer


The terraform state replace-provider command is intended for switching between providers that are in some way equivalent to one another, such as the hashicorp/google and hashicorp/google-beta providers, or when someone forks a provider into their own namespace but remains compatible with the original provider.
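As a point of comparison, this is the kind of in-place swap the command is designed for, where the new provider is a drop-in replacement for the old one; the namespaces below are only illustrative:

  # Syntax: terraform state replace-provider FROM_PROVIDER_FQN TO_PROVIDER_FQN
  # Example: moving from the official provider to a compatible fork in a
  # private registry (placeholder namespace)
  terraform state replace-provider hashicorp/aws registry.example.com/acme/aws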

Mongey/kafka and confluentinc/confluent do both have resource types that seem to represent the same concepts in the remote system:

  Mongey/kafka      confluentinc/confluent
  ------------      ----------------------------
  kafka_acl         confluent_kafka_acl
  kafka_quota       confluent_kafka_client_quota
  kafka_topic       confluent_kafka_topic

However, despite representing the same concepts in the remote system these resource types have different names and incompatible schemas, so there is no way to migrate directly between them. Terraform has no way to understand which resource types in one provider match with resource types in another, or to understand how to map attributes from one of the resource types onto corresponding attributes of the other.
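To illustrate the mismatch, an ACL declared with each provider looks roughly like the following. The attribute names are approximations based on each provider's documentation and may differ between versions, so treat this only as a sketch of why a direct state rewrite cannot work:

  # Mongey/kafka style (attribute names approximate)
  resource "kafka_acl" "topic_writer" {
    resource_name       = "news-topic"
    resource_type       = "Topic"
    acl_principal       = "User:writer"
    acl_host            = "*"
    acl_operation       = "Write"
    acl_permission_type = "Allow"
  }

  # confluentinc/confluent style (attribute names approximate; the provider
  # also needs cluster, REST endpoint, and credential configuration not shown)
  resource "confluent_kafka_acl" "topic_writer" {
    resource_type = "TOPIC"
    resource_name = "news-topic"
    pattern_type  = "LITERAL"
    principal     = "User:sa-abc123"
    host          = "*"
    operation     = "WRITE"
    permission    = "ALLOW"
  }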

Instead, I think the best thing to do here would be to ask Terraform to "forget" the objects and then re-import them into the new resource types (a combined command sketch follows the list):

  1. terraform state rm kafka_acl.example to ask Terraform to forget about the remote object associated with kafka_acl.example. There is no undo for this action.
  2. terraform import confluent_kafka_acl.example OBJECT-ID to bind the OBJECT-ID (as described in the documentation) to confluent_kafka_acl.example.
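Here is a minimal sketch of both steps for one of the resources from your error message, assuming you have renamed the resource inside the module to the new type with the same index, and using OBJECT-ID as a placeholder for whatever import ID format the confluentinc/confluent provider documents for confluent_kafka_acl:

  # Forget the object under its old address (no undo; back up your state first)
  terraform state rm 'module.iddn_news_cms_kafka_topics.kafka_acl.topic_writer[13]'

  # Re-import it under the new resource type and address
  terraform import 'module.iddn_news_cms_kafka_topics.confluent_kafka_acl.topic_writer[13]' OBJECT-ID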

I suggest practicing this in a non-production environment first so that you can be confident about the behavior of each of these commands, and learn how to translate from whatever ID format the Mongey/kafka provider uses into whatever import ID format the confluentinc/confluent provider uses to describe the same objects.
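Since you appear to have many indexed resources inside a module, it may also help to enumerate every address that needs migrating before you start, for example:

  # List all resource addresses currently tracked in state, filtered to the
  # Kafka resource types that need to be migrated
  terraform state list | grep -E 'kafka_(acl|topic|quota)'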

Martin Atkins