My requirement is to create dynamic connector resources in a Confluent Kafka cluster. Below is my "connector.tf" file:
resource "confluent_connector" "source" {
  environment {
    id = confluent_environment.staging.id
  }
  kafka_cluster {
    id = confluent_kafka_cluster.dedicated.id
  }
  config_sensitive = {
    "salesforce.password" : var.source_salesforce_password,
    "salesforce.password.token" : var.source_salesforce_password_token,
    "salesforce.consumer.key" : var.source_salesforce_consumer_key,
    "salesforce.consumer.secret" : var.source_salesforce_consumer_secret
  }
  config_nonsensitive = {
    "connector.class" : "SalesforceCdcSource",
    "kafka.auth.mode" : "KAFKA_API_KEY",
    "salesforce.cdc.name" : "AccountChangeEvent",
    "kafka.api.key" : confluent_api_key.app-manager-kafka-api-key.id,
    "kafka.api.secret" : confluent_api_key.app-manager-kafka-api-key.secret,
    "salesforce.instance" : var.source_salesforce_url,
    "salesforce.username" : var.source_salesforce_username,
    for_each = { for s in var.source_salesforce_connector_name : s.source_salesforce_connector_name => s },
    "name" : each.value["source_salesforce_connector_name"],
    "kafka.topic" : each.value["source_salesforce_topic_name"],
    "output.data.format" : each.value["source_salesforce_data_format"],
    "tasks.max" : each.value["source_salesforce_max_task"]
  }
  depends_on = [
    confluent_kafka_topic.topic
  ]
  lifecycle {
    prevent_destroy = false
  }
}
The variable is declared as below in the "variable.tf" file:
variable "source_salesforce_connector_name" {
  type = list(map(string))
  default = [{
    "source_salesforce_connector_name" = "SalesforceCdcSourceConnector_0_TF"
  }]
}
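As an aside, not required for the fix: declaring the variable as list(object({...})) instead of list(map(string)) makes the expected keys explicit, so Terraform can validate entries at plan time. A sketch, assuming the four attributes used in the .tfvars file:

```hcl
# Stricter alternative to list(map(string)): Terraform rejects
# entries that are missing any of these attributes.
variable "source_salesforce_connector_name" {
  type = list(object({
    source_salesforce_connector_name = string
    source_salesforce_topic_name     = string
    source_salesforce_data_format    = string
    source_salesforce_max_task       = string
  }))
  default = []
}
```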
And I am passing the values from a .tfvars file:
source_salesforce_connector_name = [
  {
    source_salesforce_connector_name = "SalesforceCdcSourceConnector_1_TF"
    source_salesforce_topic_name     = "json-topic-1"
    source_salesforce_data_format    = "JSON"
    source_salesforce_max_task       = "1"
  },
]
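For reference, the for expression used in for_each converts this list into a map keyed by connector name. With the single .tfvars entry above, it should evaluate to roughly:

```hcl
# Approximate result of:
# { for s in var.source_salesforce_connector_name : s.source_salesforce_connector_name => s }
{
  "SalesforceCdcSourceConnector_1_TF" = {
    source_salesforce_connector_name = "SalesforceCdcSourceConnector_1_TF"
    source_salesforce_topic_name     = "json-topic-1"
    source_salesforce_data_format    = "JSON"
    source_salesforce_max_task       = "1"
  }
}
```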
I get the error below when I run the plan. How can I pass the for_each values into the connector configuration as highlighted above?
terraform plan -var-file="DEV/DEV.tfvars"

Error: each.value cannot be used in this context

  on modules\confluent_kafka_cluster_dedicated\source_connector_salesforce_cdc.tf line 27, in resource "confluent_connector" "source":
  27: "name" : each.value["source_salesforce_connector_name"],
  28: "kafka.topic" : each.value["source_salesforce_topic_name"],
  29: "output.data.format" : each.value["source_salesforce_data_format"],
  30: "tasks.max" : each.value["source_salesforce_max_task"]

A reference to "each.value" has been used in a context in which it unavailable, such as when the configuration no longer contains the value in its "for_each" expression. Remove this reference to each.value in your configuration to work around this error.
CodePudding user response:
If you want multiple confluent_connector resources based on var.source_salesforce_connector_name, then your for_each should be a resource-level argument, outside of config_nonsensitive:
resource "confluent_connector" "source" {
  for_each = { for s in var.source_salesforce_connector_name : s.source_salesforce_connector_name => s }

  environment {
    id = confluent_environment.staging.id
  }
  kafka_cluster {
    id = confluent_kafka_cluster.dedicated.id
  }
  config_sensitive = {
    "salesforce.password" : var.source_salesforce_password,
    "salesforce.password.token" : var.source_salesforce_password_token,
    "salesforce.consumer.key" : var.source_salesforce_consumer_key,
    "salesforce.consumer.secret" : var.source_salesforce_consumer_secret
  }
  config_nonsensitive = {
    "connector.class" : "SalesforceCdcSource",
    "kafka.auth.mode" : "KAFKA_API_KEY",
    "salesforce.cdc.name" : "AccountChangeEvent",
    "kafka.api.key" : confluent_api_key.app-manager-kafka-api-key.id,
    "kafka.api.secret" : confluent_api_key.app-manager-kafka-api-key.secret,
    "salesforce.instance" : var.source_salesforce_url,
    "salesforce.username" : var.source_salesforce_username,
    "name" : each.value["source_salesforce_connector_name"],
    "kafka.topic" : each.value["source_salesforce_topic_name"],
    "output.data.format" : each.value["source_salesforce_data_format"],
    "tasks.max" : each.value["source_salesforce_max_task"]
  }
  depends_on = [
    confluent_kafka_topic.topic
  ]
  lifecycle {
    prevent_destroy = false
  }
}
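With for_each at the resource level, each entry in the variable becomes a separate connector instance, addressed by its map key (the connector name). A sketch of how the instances can then be referenced; the output name here is illustrative, not part of the original configuration:

```hcl
# Reference a single instance by its map key, e.g. in another resource:
#   confluent_connector.source["SalesforceCdcSourceConnector_1_TF"].id

# Illustrative output collecting the IDs of all created connectors:
output "connector_ids" {
  value = { for k, c in confluent_connector.source : k => c.id }
}
```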