
I've been writing a custom Kafka Connect connector plugin. For this, I have to create a schema dynamically from a metadata file, then read each record in the data file and publish the individual records to a topic. Below is sample code to create the schema:

Create individual field

JsonObject fileNameJSONObj = new JsonObject();
fileNameJSONObj.addProperty("name", "Update_Timestamp");
fileNameJSONObj.addProperty("type", "string");
fileNameJSONObj.addProperty("logicalType", "timestamp-millis");
fileNameJSONObj.addProperty("default", "");
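As an aside, the Avro specification places `logicalType` inside the type, not at the field level, and `timestamp-millis` annotates `long`, not `string`. A minimal sketch of what a spec-compliant field declaration would serialize to, built with plain strings for illustration (the class and method names here are hypothetical, not from the original code):

```java
public class AvroFieldSketch {
    // Builds a field whose type is a long annotated with timestamp-millis,
    // i.e. {"name":...,"type":{"type":"long","logicalType":"timestamp-millis"},"default":0}.
    // Note the logicalType lives inside the nested type object, and the
    // default is a long (0), matching the underlying type.
    static String timestampField(String name) {
        return "{\"name\":\"" + name + "\","
             + "\"type\":{\"type\":\"long\",\"logicalType\":\"timestamp-millis\"},"
             + "\"default\":0}";
    }

    public static void main(String[] args) {
        System.out.println(timestampField("Update_Timestamp"));
    }
}
```

The same nesting can be produced with Gson by adding the `type`/`logicalType` pair to an inner `JsonObject` and attaching that inner object as the field's `type`.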

Create record

JsonObject jsonObject = new JsonObject();
jsonObject.addProperty("type", "record");
jsonObject.addProperty("name", "myrecord");
jsonObject.addProperty("namespace", "abc");
jsonObject.add("fields", array); // array is the JsonArray of field objects built above
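For reference, the Gson objects above serialize to a record envelope of the following shape; this stdlib-only sketch (hypothetical class and method names, with `myrecord`/`abc` mirroring the snippet) shows the JSON being assembled:

```java
public class AvroRecordSketch {
    // Wraps an already-serialized, comma-separated list of field JSON objects
    // in the Avro record envelope: {"type":"record","namespace":...,"name":...,"fields":[...]}.
    static String recordSchema(String namespace, String name, String fieldsJson) {
        return "{\"type\":\"record\",\"namespace\":\"" + namespace + "\","
             + "\"name\":\"" + name + "\",\"fields\":[" + fieldsJson + "]}";
    }

    public static void main(String[] args) {
        String field = "{\"name\":\"Update_Timestamp\","
                     + "\"type\":{\"type\":\"long\",\"logicalType\":\"timestamp-millis\"}}";
        System.out.println(recordSchema("abc", "myrecord", field));
    }
}
```

The resulting string can be sanity-checked by feeding it to Avro's `Schema.Parser` before the connector ever touches the registry, which surfaces structural mistakes early.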

I set a logicalType for all the fields, and I write these fields to an .avsc file too. In the file, a field shows up as:

{"name":"Update_TimeStamp","type":"string","logicalType":"timestamp-millis","default":""},

However, when I invoke the schema registry endpoint at http://localhost:8081/subjects/myvalue/versions/1, it shows up as below:

{\"name\":\"Update_TimeStamp\",\"type\":{\"type\":\"string\",\"connect.default\":\"\"},\"default\":\"\"}

I've debugged this quite a bit but am unable to make any headway. What could I be doing wrong here?

    Timestamps should ideally be longs if it's just millis. How did you register the schema? – OneCricketeer Feb 10 '18 at 00:48
  • @cricket_007, the schema gets registered as part of publishing the records, and something must be goofy here, which is why it doesn't complain about the timestamp field not being long and also doesn't show the logical types. Can you shed some light on this? – user123 Feb 11 '18 at 02:35
  • Connect has its own metadata wrappers. I've not seen any reference to the fact that logicalTypes are preserved. I also only have experience using the avro maven plugin to generate Java objects with Avro schemas in them, then producing those. As demonstrated by https://github.com/confluentinc/examples/tree/3.3.0-post/kafka-clients/specific-avro-producer – OneCricketeer Feb 11 '18 at 02:41

0 Answers