I am trying to push messages from a Kafka topic (data is in Avro format) to a Postgres table. I have all privileges to create/insert/update/delete databases and tables. The first time I ran the sink connector, it created the table automatically and loaded all the data, but when I stopped the connector and then tried to load new data into the existing table, it failed with this error:

Caused by: java.sql.SQLException: java.sql.BatchUpdateException: Batch entry 0 INSERT INTO "testing" ("EMPID","TS","EMPNAME","EMPSALARY") VALUES ('abc123','2019:01:23','john',10) ON CONFLICT ("EMPID") DO UPDATE SET "TS"=EXCLUDED."TS","EMPNAME"=EXCLUDED."EMPNAME","EMPSALARY"=EXCLUDED."HITS" was aborted.  Call getNextException to see the cause.
org.postgresql.util.PSQLException: ERROR: relation "testing" does not exist
Position: 13

Here is my sink connector config:

{
  "name": "test",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "tasks.max": "1",
    "topics": "emp_data",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://localhost:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://localhost:8081",
    "connection.url": "jdbc:postgresql://localhost:5432/temp",
    "connection.user": "root",
    "connection.password": "pwd",
    "compact.map.entries": "false",
    "insert.mode": "upsert",
    "batch.size": "1",
    "table.name.format": "testing",
    "pk.mode": "record_value",
    "pk.fields": "EmpID",
    "fields.whitelist": "timestamp,empid,empname,empsalary",
    "key.ignore": "true",
    "auto.create": "false",
    "auto.evolve": "true",
    "type.connect": "kafka-connect"
  }
}

I also created the table myself in the database and then tried to push data into it, but nothing happened.
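Since the error says the relation does not exist even though the table was created, one thing I checked (sketched below, not a definitive fix) is whether it is a database/schema mismatch or a case-sensitivity issue: Postgres folds unquoted identifiers to lower case, while the connector quotes them, so a table created as "TESTING" would not match "testing". A quick diagnostic in psql, connected to the same temp database as connection.url:

```sql
-- List any table whose name matches "testing" in any case, in any schema:
SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_name ILIKE 'testing';

-- Show which schemas the connecting user searches by default;
-- the table must be in one of these for an unqualified name to resolve:
SHOW search_path;
```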

Any help would be highly appreciated!
