
I would like to send JSON to a Kafka topic using Postman.

In Postman, when I send the following JSON, everything works (Kafka returns no error, and when I consume the topic I can see the proper values):

{
  "records": [
    {
      "key": "some_key",
      "value": "test"
    }
  ]
}
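For reference, the same working request can be reproduced outside Postman with curl. This is a sketch, assuming the REST Proxy is listening on localhost:8082 and the topic is named test-topic (both placeholders, adjust to your setup):

```shell
# Request body that works in Postman, as a single JSON document.
BODY='{"records":[{"key":"some_key","value":"test"}]}'

# Sanity-check that the body is valid JSON before sending it.
echo "$BODY" | python3 -m json.tool > /dev/null && echo "body is valid JSON"

# Produce to the topic via the REST Proxy (v2 embedded-JSON format).
# This only succeeds if the proxy is actually running on port 8082.
curl -s -X POST "http://localhost:8082/topics/test-topic" \
  -H "Content-Type: application/vnd.kafka.json.v2+json" \
  -H "Accept: application/vnd.kafka.v2+json" \
  -d "$BODY" || echo "request failed (is the REST Proxy running?)"
```

The Content-Type header matters: the REST Proxy rejects requests whose content type does not match one of its supported embedded formats.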

However, when I send the following JSON, with a schema and payload embedded:

{
  "schema": {
    "type": "struct",
    "fields": [
      {
        "type": "string",
        "optional": false,
        "field": "userid"
      },
      {
        "type": "string",
        "optional": false,
        "field": "regionid"
      },
      {
        "type": "string",
        "optional": false,
        "field": "gender"
      }
    ],
    "optional": false,
    "name": "ksql.users"
  },
  "payload": {
    "userid": "User_1",
    "regionid": "Region_5",
    "gender": "MALE"
  }
}
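For comparison — a sketch, not a confirmed fix — the REST Proxy's request body only defines its own top-level fields (records, with key/value per record), so the Connect schema/payload envelope would presumably need to be nested inside a record's value rather than sent as the request body itself:

```json
{
  "records": [
    {
      "value": {
        "schema": { ... },
        "payload": {
          "userid": "User_1",
          "regionid": "Region_5",
          "gender": "MALE"
        }
      }
    }
  ]
}
```

(Here "schema": { ... } stands for the same schema object as above.)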

I get the following response:

{
    "error_code": 422,
    "message": "Unrecognized field: schema"
}

The following sources advise embedding a schema, since the JDBC sink requires one:

https://github.com/confluentinc/kafka-connect-jdbc/issues/609
https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained/

In my sink-postgresql.properties I set:

value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.shemas.enable=true

key.converter.shemas.enable=true
key.converter=org.apache.kafka.connect.json.JsonConverter
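For comparison, a complete JsonConverter sink configuration might look like the sketch below (the connector name, topic, and connection details are placeholders, not taken from my setup):

```
name=sink-postgresql
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=users
connection.url=jdbc:postgresql://localhost:5432/mydb
connection.user=postgres
connection.password=postgres
auto.create=true

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
```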

I don't know why it's not working; any help would be appreciated.

System: Ubuntu 18.04, Confluent Platform, Postman 7.14

  • Post the code showing how you're doing this, otherwise there is no way to help you with that. – cojack Jan 13 '20 at 16:50
  • Hi @cojack, the main code is actually the Postman request shown above; I'm not sure what I can add... – ChisActi Jan 13 '20 at 19:58
  • 1
    Hi @chrisacti, welcome to StackOverflow :) Is there a reason you're not just using Avro for this? It would make your life much easier. – Robin Moffatt Jan 13 '20 at 20:57
  • Is there a requirement to use the Rest Proxy? – OneCricketeer Jan 14 '20 at 02:45
  • Hi @RobinMoffatt, there is no particular reason we are not using Avro, except that we would need to modify some of our code when sending messages to topics. I didn't expect that using the JSON format would be an issue. – ChisActi Jan 14 '20 at 07:42
  • Hi @cricket_007, yes, I need to use the REST Proxy, as I'd like to build a microservice architecture where Kafka is not necessarily installed locally. – ChisActi Jan 14 '20 at 07:46
  • You don't need Kafka installed locally to use a producer or consumer. Just like you don't need Google servers running locally to do a search – OneCricketeer Jan 14 '20 at 09:20
  • You generally need to concern yourself with schemas at some point (especially if you're interfacing with RDBMS), and JSON is generally not a great way to do it (inefficient, etc) - Avro is. Especially if you're at the early stages of your journey, take the opportunity to use Avro from the outset. BTW head to http://cnfl.io/slack for more discussion and help with Confluent Platform. – Robin Moffatt Jan 14 '20 at 09:30
  • Thanks for the tip @RobinMoffatt, I'll then try with Avro and see if it's working – ChisActi Jan 14 '20 at 09:46

0 Answers