I have a Kafka topic containing messages with an Avro-serialized key and Avro-serialized value.
I am trying to set up a sink connector to land these records in a table in a Postgres database (AWS RDS in this case).
I have tried a number of variations on the topic, the messages, and the sink config itself, but taking the following example, if someone could point out where I am going wrong, that would be great! :)
My topic has the following schema (in a schema registry)...
Key Schema
{
  "type": "record",
  "name": "TestTopicKey",
  "namespace": "test.messaging.avro",
  "doc": "Test key schema.",
  "fields": [
    {
      "name": "unitId",
      "type": "int"
    }
  ]
}
Value Schema
{
  "type": "record",
  "name": "TestTopicValues",
  "namespace": "test.messaging.avro",
  "doc": "Test value schema.",
  "fields": [
    {
      "name": "unitPrice",
      "type": "int",
      "doc": "Price in AUD excluding GST."
    },
    {
      "name": "unitDescription",
      "type": "string"
    }
  ]
}
I am manually producing records to the topic using the "kafka-avro-console-producer" as follows:
/bin/kafka-avro-console-producer \
  --broker-list kafka-box-one:9092 \
  --topic test.units \
  --property parse.key=true \
  --property "key.separator=|" \
  --property "schema.registry.url=http://kafka-box-one:8081" \
  --property key.schema='{"type":"record","name":"TestTopicKey","namespace":"test.messaging.avro","doc":"Test key schema.","fields":[{"name":"unitId","type":"int"}]}' \
  --property value.schema='{"type":"record","name":"TestTopicValues","namespace":"test.messaging.avro","doc":"Test value schema.","fields":[{"name":"unitPrice","type":"int","doc":"Price in AUD excluding GST."},{"name":"unitDescription","type":"string"}]}'
Once the producer starts, I can successfully add records to the topic as follows:
{"unitId":111}|{"unitPrice":15600,"unitDescription":"A large widget thingy."}
NB: I can also successfully consume with kafka-avro-console-consumer as expected.
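For reference, the consumer invocation looks roughly like this (print.key=true is just there so the key is visible):

/bin/kafka-avro-console-consumer \
  --bootstrap-server kafka-box-one:9092 \
  --topic test.units \
  --from-beginning \
  --property print.key=true \
  --property "schema.registry.url=http://kafka-box-one:8081"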
The Postgres table I am trying to sink into looks like this:
CREATE TABLE test_area.unit_prices (
unitId int4 NOT NULL,
unitPrice int4 NULL,
unitDescription text NULL,
CONSTRAINT unit_prices_unitid_pk PRIMARY KEY (unitId)
);
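One thing I should flag: the column identifiers in that CREATE TABLE are unquoted, so Postgres folds them to lowercase (unitid, unitprice, unitdescription). The actual column names can be double-checked from the Connect box with psql, using the same credentials as the sink connector:

psql "postgresql://KafkaSinkUser:KafkaSinkPassword@unit-catalogue.abcdefghij.my-region-1.rds.amazonaws.com:5432/unit_sales" \
  -c '\d test_area.unit_prices'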
My sink connector looks like this:
{
  "name": "test.area.unit.prices.v01",
  "config": {
    "connector.class": "JdbcSinkConnector",
    "topics": "test.units",
    "group.id": "test.area.unit.prices.v01",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://kafka-box-one:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://kafka-box-one:8081",
    "connection.user": "KafkaSinkUser",
    "connection.password": "KafkaSinkPassword",
    "connection.url": "jdbc:postgresql://unit-catalogue.abcdefghij.my-region-1.rds.amazonaws.com:5432/unit_sales?currentSchema=test_area",
    "table.name.format": "unit_prices",
    "auto.create": false,
    "auto.evolve": false
  }
}
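I am publishing the connector to the Connect worker via its REST API in the usual way (assuming here the default REST port 8083, with the JSON above saved as unit-prices-sink.json):

curl -X POST -H "Content-Type: application/json" \
  --data @unit-prices-sink.json \
  http://kafka-box-one:8083/connectors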
My expectation is that records would appear in the Postgres table shortly after the sink connector is shown as RUNNING. However, no records are appearing in the table.
Additional notes:
- From the Kafka Connect box on which this sink connector is being published, I can connect to the Postgres RDS instance via usql and write to the table, using the same credentials as in the sink connector config.
- The sink connector status is RUNNING, which suggests to me that there are no errors in the sink config syntax; a task-level status check is shown below.
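Since I gather a connector can report RUNNING while its individual tasks have FAILED, the task-level status (including any stack trace) can also be pulled from the Connect REST API, again assuming the default port 8083:

curl http://kafka-box-one:8083/connectors/test.area.unit.prices.v01/status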