
I want to publish data from multiple tables to the same Kafka topic using the connector config below, but I am seeing the following exception:

Exception

Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema being registered is incompatible with an earlier schema; error code: 409

The connector seems to ignore the subject-strategy properties I set and keeps using the default `${topic}-key` and `${topic}-value` subjects.

[2019-04-25 22:43:45,590] INFO AvroConverterConfig values: 
    schema.registry.url = [http://schema-registry:8081]
    basic.auth.user.info = [hidden]
    auto.register.schemas = true
    max.schemas.per.subject = 1000
    basic.auth.credentials.source = URL
    schema.registry.basic.auth.user.info = [hidden]
    value.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy
    key.subject.name.strategy = class io.confluent.kafka.serializers.subject.TopicNameStrategy

Connector configuration

    curl -i -X POST -H "Accept:application/json" -H "Content-Type:application/json" localhost:8083/connectors/ -d '{
      "name": "two-in-one-connector",
      "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "tasks.max": "1",
        "database.hostname": "xxxxxxx",
        "database.port": "3306",
        "database.user": "xxxxxxx",
        "database.password": "xxxxxxxxx",
        "database.server.id": "18405457",
        "database.server.name": "xxxxxxxxxx",
        "table.whitelist": "customers,phone_book",
        "database.history.kafka.bootstrap.servers": "broker:9092",
        "database.history.kafka.topic": "dbhistory.customer",
        "transforms": "dropPrefix",
        "transforms.dropPrefix.type":"org.apache.kafka.connect.transforms.RegexRouter",
        "transforms.dropPrefix.regex":"(.*)",
        "transforms.dropPrefix.replacement":"customer",
        "key.converter.key.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy",
        "value.converter.value.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy"
      }
    }'
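As an aside on why both tables collide: the `RegexRouter` transform above matches every topic name with `(.*)` and rewrites it to the literal `customer`, so records from both `customers` and `phone_book` land on one topic (and, under `TopicNameStrategy`, one subject). A rough sketch of that routing in Python (the topic names here are hypothetical, since the real `database.server.name` is redacted; Kafka's `RegexRouter` is Java, this just mimics its full-match-then-replace behavior):

```python
import re

pattern = re.compile(r"(.*)")   # transforms.dropPrefix.regex
replacement = "customer"        # transforms.dropPrefix.replacement

# Hypothetical Debezium topic names of the form <server>.<db>.<table>
for topic in ["dbserver.mydb.customers", "dbserver.mydb.phone_book"]:
    m = pattern.fullmatch(topic)              # RegexRouter only rewrites full matches
    routed = m.expand(replacement) if m else topic
    print(f"{topic} -> {routed}")             # both route to "customer"
```

Since both table schemas are then registered under the same `customer-value` subject, the second registration fails the compatibility check with the 409 above.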
OneCricketeer
Suraj
  • Related https://stackoverflow.com/questions/55801680/kafka-connector-and-schema-registry-error-retrieving-avro-schema-subject-not – OneCricketeer Apr 28 '19 at 02:29

1 Answer


Try setting the strategy classes with the following parameters in your connector configuration (JSON) instead of `key.converter.key.subject.name.strategy` and `value.converter.value.subject.name.strategy`:

`key.subject.name.strategy` and `value.subject.name.strategy`
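For illustration, the suggestion would look like this in the connector config (a sketch only; per the comments below, it is disputed whether Connect honors these un-prefixed keys, and the other connection properties are elided):

```json
{
  "name": "two-in-one-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "key.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy",
    "value.subject.name.strategy": "io.confluent.kafka.serializers.subject.TopicRecordNameStrategy"
  }
}
```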

Gokul Potluri
  • Have you tested these? `AvroConverterConfig` takes the prefixes of `key.converter` and `value.converter` first... Much like `value.converter.schema.registry.url`, it should be `value.converter.value.subject.name.strategy` – OneCricketeer Apr 28 '19 at 01:36
  • @cricket_007 I am already using "value.converter.value.subject.name.strategy" in my connector config. Still doesn't work. – Suraj Apr 28 '19 at 17:59
  • @Suraj As I answered in the other post I linked to, I don't think these properties were tested within the Connect API, and they're not explicitly documented anywhere outside the PR that added them – OneCricketeer Apr 28 '19 at 18:53