I am using Docker and I am new to Kafka Connect. My use case is as follows: I have a Postgres database, and I need to capture every change event (INSERT, UPDATE, DELETE) on a Kafka topic and process it further. But I am stuck at capturing the change events. I am following the guide below:

https://medium.com/high-alpha/data-stream-processing-for-newbies-with-kafka-ksql-and-postgres-c30309cfaaf8

I created the connector with the configuration below:

{"name": "postgres-source",
  "config": {"connector.class":"io.debezium.connector.postgresql.PostgresConnector",
    "tasks.max":"1",
    "database.hostname": "postgres",
    "database.port": "5432",
    "database.user": "postgres",
    "database.password": "postgres",
    "database.dbname" : "students",
    "database.server.name": "dbserver15",
    "database.whitelist": "students",
    "database.history.kafka.bootstrap.servers": "kafka:9092",
    "database.history.kafka.topic": "schema-changes.students",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.storage.StringConverter",
    "key.converter.schemas.enable": "false",
    "value.converter.schemas.enable": "true",
    "value.converter.schema.registry.url": "http://schema-registry:8081",
    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.UnwrapFromEnvelope"
  }
}
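
For completeness, this is how I register the connector (a minimal sketch; I am assuming the Kafka Connect REST API is exposed on localhost:8083 and the JSON above is saved as postgres-source.json):

# Create the connector via the Kafka Connect REST API
curl -X POST -H "Content-Type: application/json" \
     --data @postgres-source.json \
     http://localhost:8083/connectors

# Check that the connector and its task are RUNNING
curl http://localhost:8083/connectors/postgres-source/status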

Just after the connector is created, I use the command below to read the snapshot/change events from the topic:

kafka-console-consumer --bootstrap-server localhost:9092 --from-beginning --topic dbserver15.public.admission

It shows the data in this format: Struct{student_id=1,gre=337,toefl=118}...
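
To trigger a live change event, I run a plain INSERT against the table (a minimal sketch; the container name postgres and the column values are assumptions based on database.hostname and the Struct output above):

# Insert a row into the admission table to produce a change event
docker exec postgres psql -U postgres -d students \
  -c "INSERT INTO admission (student_id, gre, toefl) VALUES (2, 320, 110);"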

But as soon as I perform any such INSERT, UPDATE, or DELETE on this table, the Kafka Connect worker throws the error below:

org.apache.kafka.connect.errors.ConnectException: An exception ocurred in the change event producer. This connector will be stopped.
    at io.debezium.connector.base.ChangeEventQueue.throwProducerFailureIfPresent(ChangeEventQueue.java:170)
    at io.debezium.connector.base.ChangeEventQueue.poll(ChangeEventQueue.java:151)
    at io.debezium.connector.postgresql.PostgresConnectorTask.poll(PostgresConnectorTask.java:156)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.poll(WorkerSourceTask.java:244)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:220)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:175)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:219)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Invalid identifier: 
    at io.debezium.relational.TableIdParser$TableIdTokenizer.tokenize(TableIdParser.java:68)
    at io.debezium.text.TokenStream.start(TokenStream.java:445)
    at io.debezium.relational.TableIdParser.parse(TableIdParser.java:28)
    at io.debezium.relational.TableId.parse(TableId.java:39)
    at io.debezium.connector.postgresql.PostgresSchema.parse(PostgresSchema.java:218)
    at io.debezium.connector.postgresql.RecordsStreamProducer.process(RecordsStreamProducer.java:238)
    at io.debezium.connector.postgresql.RecordsStreamProducer.lambda$streamChanges$1(RecordsStreamProducer.java:131)
    at io.debezium.connector.postgresql.connection.pgproto.PgProtoMessageDecoder.processMessage(PgProtoMessageDecoder.java:48)
    at io.debezium.connector.postgresql.connection.PostgresReplicationConnection$1.deserializeMessages(PostgresReplicationConnection.java:265)
    at io.debezium.connector.postgresql.connection.PostgresReplicationConnection$1.read(PostgresReplicationConnection.java:250)
    at io.debezium.connector.postgresql.RecordsStreamProducer.streamChanges(RecordsStreamProducer.java:131)
    at io.debezium.connector.postgresql.RecordsStreamProducer.lambda$start$0(RecordsStreamProducer.java:117)
    ... 5 more

Below are the solutions I have looked into:

org.apache.kafka.connect.errors.ConnectException: An exception occurred in the change event producer. This connector will be stopped

https://gitter.im/debezium/user?at=5e1f2846be66165ecbd4e0fe

https://debezium.io/documentation/reference/connectors/postgresql.html#postgresql-when-things-go-wrong
which says that setting snapshot.mode to exported allows the connector to perform a lock-free snapshot.
But when I add "snapshot.mode": "exported", the following error is thrown:

The 'snapshot.mode' value 'exported' is invalid: Value must be one of always, never, initial_only, initial, custom
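
To rule out a version mismatch, the installed connector version can be read from the Connect REST API (a sketch, again assuming the worker is on localhost:8083); if the reported Debezium version predates support for the exported mode, that would explain the message above:

# List every installed connector plugin with its class, type, and version
curl -s http://localhost:8083/connector-plugins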

Can somebody elaborate a little and explain what I am missing? I guess it is something related to the configuration.

  • Could you please share the names of the tables you are capturing? – Jiri Pechanec Aug 31 '20 at 04:19
  • @Jiri Pechanec the table name is: **admission** – Zeeshan Kareem Sep 01 '20 at 03:11
  • OK, please take a look at https://gitter.im/debezium/user?at=5e3bf0206f9d3d34982430c0 and make sure you are using up-to-date Debezium and Postgres images – Jiri Pechanec Sep 02 '20 at 04:18
  • 1
    @ZeeshanKareem did you managed to get to the bottom of this? I'm using "debezium/debezium-connector-postgresql:1.4.0" and started to see those error after adding "snapshot.mode": "exported" to my connector config – shlomiLan May 04 '21 at 07:53
