I would like to send a JSON message to a Kafka topic using Postman.
When I send the following JSON in Postman, everything works (Kafka returns no error, and when I consume the topic I can see the proper values):
{
  "records": [
    {
      "key": "some_key",
      "value": "test"
    }
  ]
}
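For reference, this is roughly the raw request Postman produces; the host, port, and topic name (jsontest) are placeholders for my setup, with the Confluent REST Proxy on its default port 8082:

POST /topics/jsontest HTTP/1.1
Host: localhost:8082
Content-Type: application/vnd.kafka.json.v2+json

{"records":[{"key":"some_key","value":"test"}]}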
However, when I send the following JSON with a schema and payload embedded:
{
  "schema": {
    "type": "struct",
    "fields": [
      {
        "type": "string",
        "optional": false,
        "field": "userid"
      },
      {
        "type": "string",
        "optional": false,
        "field": "regionid"
      },
      {
        "type": "string",
        "optional": false,
        "field": "gender"
      }
    ],
    "optional": false,
    "name": "ksql.users"
  },
  "payload": {
    "userid": "User_1",
    "regionid": "Region_5",
    "gender": "MALE"
  }
}
I get the following response:
{
  "error_code": 422,
  "message": "Unrecognized field: schema"
}
The following sources advise embedding a schema, since the JDBC sink connector requires one:
https://github.com/confluentinc/kafka-connect-jdbc/issues/609
https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained/
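From my reading of those sources, the schema/payload envelope describes a single message value, so my expectation is that it would still have to sit inside the REST Proxy's records wrapper as each record's value, roughly like this (trimmed to one field for brevity):

{
  "records": [
    {
      "value": {
        "schema": {
          "type": "struct",
          "fields": [
            {
              "type": "string",
              "optional": false,
              "field": "userid"
            }
          ],
          "optional": false,
          "name": "ksql.users"
        },
        "payload": {
          "userid": "User_1"
        }
      }
    }
  ]
}

I am not sure whether that combination is correct, which is part of my question.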
In my sink-postgresql.properties I set:

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
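For context, the rest of the file looks roughly like the sketch below; the connector name, topic, and PostgreSQL connection settings are placeholders rather than my real values:

# sketch of sink-postgresql.properties (placeholders only)
name=sink-postgresql
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
# topic the sink reads from (placeholder)
topics=jsontest
# PostgreSQL connection details (placeholders)
connection.url=jdbc:postgresql://localhost:5432/mydb
connection.user=postgres
connection.password=********
# create the target table automatically from the message schema
auto.create=true

plus the four converter lines shown above.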
I don't know why this isn't working; any help would be appreciated.
System: Ubuntu 18.04, Confluent Platform, Postman 7.14