
I am using Debezium as a CDC tool to stream data from MySQL. After installing the Debezium MySQL connector to a Confluent OSS cluster, I am trying to capture MySQL bin_log changes in a Kafka topic. When I create the connector, after it takes the snapshot of the database, I am left with a continuous series of errors.

I checked that the MySQL bin_log is ON and tried restarting the schema registry and the connectors with different serializers, but I keep getting the same errors.

Error logs show:

[2019-06-21 13:56:14,885] INFO Step 8: - Completed scanning a total of 955 rows from table 'mydb.test' after 00:00:00.086 (io.debezium.connector.mysql.SnapshotReader:565)
[2019-06-21 13:56:14,886] INFO Step 8: scanned 1758 rows in 2 tables in 00:00:00.383 (io.debezium.connector.mysql.SnapshotReader:601)
[2019-06-21 13:56:14,886] INFO Step 9: committing transaction (io.debezium.connector.mysql.SnapshotReader:635)
[2019-06-21 13:56:14,887] INFO Completed snapshot in 00:00:01.055 (io.debezium.connector.mysql.SnapshotReader:701)
[2019-06-21 13:56:14,965] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 11 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION} (org.apache.kafka.clients.NetworkClient:968)
[2019-06-21 13:56:15,066] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 12 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION} (org.apache.kafka.clients.NetworkClient:968)
[2019-06-21 13:56:15,168] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 13 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION} (org.apache.kafka.clients.NetworkClient:968)
[2019-06-21 13:56:15,269] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 14 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION} (org.apache.kafka.clients.NetworkClient:968)
[2019-06-21 13:56:15,370] WARN [Producer clientId=producer-5] Error while fetching metadata with correlation id 15 : {kbserver=UNKNOWN_TOPIC_OR_PARTITION}

The connector payload that I am sending is as follows:

{
  "name": "debezium-connector",
  "config": {
    "connector.class": "io.debezium.connector.mysql.MySqlConnector",
    "tasks.max": "1",
    "key.serializer": "io.confluent.connect.avro.AvroConverter",
    "value.serializer": "io.confluent.connect.avro.AvroConverter",
    "database.hostname": "localhost",
    "database.port": "3306",
    "database.user": "test",
    "database.password": "test@123",
    "database.whitelist": "mydb",
    "table.whitelist": "mydb.test",
    "database.server.id": "1",
    "database.server.name": "kbserver",
    "database.history.kafka.bootstrap.servers": "kafka:9092",
    "database.history.kafka.topic": "db-schema.mydb",
    "include.schema.changes": "true"
  }
}

Does anyone know why this is happening or how I can fix it?

Rahul Gupta
    Maybe you need to set [auto.create.topics.enable](https://docs.confluent.io/current/installation/configuration/broker-configs.html) – dmkvl Jun 21 '19 at 15:37
  • I don't see any `ERROR`, just `WARN`. Does the connector actually fail, or are you just not getting data when you think you should be? – Robin Moffatt Jun 21 '19 at 16:04
  • @RobinMoffatt No, the connector is not failing; it cannot automatically create topics for each table to produce messages to. – Rahul Gupta Jun 24 '19 at 07:54
  • @dmkvl Enabling `auto.create.topics.enable` into broker config actually worked. Thank you so much for your suggestions. – Rahul Gupta Jun 24 '19 at 11:04
  • Did you solve your problem? I am having the same issue, so I'm checking for any updates since I landed here. – user404 Feb 10 '20 at 07:50

4 Answers


Please look at database.whitelist and table.whitelist. They are inconsistent.

It should be either mydb and mydb.test or db and db.test, depending on the name of the database.
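For example, assuming the database were actually named db, the matching pair in the connector payload would be:

    "database.whitelist": "db",
    "table.whitelist": "db.test"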

Jiri Pechanec

I had the same error... In my case, I had not created the "schema change topic" ahead of time. See: https://debezium.io/documentation/reference/stable/connectors/sqlserver.html#about-the-debezium-sqlserver-connector-schema-change-topic

Once I created that topic, the error went away and data began streaming.
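In this question the topic named in the warnings is kbserver, which matches database.server.name, so pre-creating it should have the same effect. A sketch with the stock CLI (the partition and replication counts are assumptions; the script is called kafka-topics.sh in plain Apache Kafka, and --bootstrap-server requires Kafka 2.2+):

    kafka-topics --bootstrap-server kafka:9092 --create \
      --topic kbserver --partitions 1 --replication-factor 1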

mike d

The solution was hidden in the comments but worked for me: it is sufficient to enable auto-creation of topics on the Kafka broker.

If you use Strimzi, it is sufficient to add the following property to the config section:

 auto.create.topics.enable: 'true'
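In a Strimzi Kafka custom resource that property lives under spec.kafka.config; a minimal sketch, assuming the v1beta2 API (my-cluster is a placeholder name):

    apiVersion: kafka.strimzi.io/v1beta2
    kind: Kafka
    metadata:
      name: my-cluster        # placeholder cluster name
    spec:
      kafka:
        config:
          auto.create.topics.enable: 'true'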

The same configuration can be set in Kafka Connect; it is a design choice where to set it.

If you set it on the broker, it is taken as a default and applies to every request to create a new Kafka topic. If you set it on Connect, it applies only to topics created through Connect.
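On a broker managed directly rather than through Strimzi, the broker-side option is one line in each broker's server.properties (the file path varies by installation):

    # server.properties on each broker
    auto.create.topics.enable=true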

However, I would avoid both of those solutions and create the topics manually during setup. From the Debezium documentation:

the topic you need to create needs to have the same name as the serverName, where serverName is the logical server name that is specified in database.server.name

Cr4zyTun4

Configure your connector to auto-create topics: https://debezium.io/documentation/reference/stable/configuration/topic-auto-create-config.html

"topic.creation.enable": "true",
"topic.creation.default.replication.factor": "1",
"topic.creation.default.partitions": "1",
shem