I've been writing a custom Kafka connector plugin. For this, I have to create a schema dynamically from a metadata file, then read each record in the data file and publish the individual records to a topic. Below is sample code to create the schema:
Create an individual field:
JsonObject fileNameJSONObj = new JsonObject();
fileNameJSONObj.addProperty("name", "Update_Timestamp");
fileNameJSONObj.addProperty("type", "string");
fileNameJSONObj.addProperty("logicalType", "timestamp-millis");
fileNameJSONObj.addProperty("default", "");
Create the record:
JsonObject jsonObject = new JsonObject();
jsonObject.addProperty("type", "record");
jsonObject.addProperty("name", "myrecord");
jsonObject.addProperty("namespace", "abc");
jsonObject.add("fields", array);
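For context, here is a simplified, self-contained version of how the pieces fit together, including a parse with Avro's Schema.Parser to confirm the generated JSON is a valid Avro schema (the class and variable names here are illustrative, not my exact code):

import com.google.gson.JsonArray;
import com.google.gson.JsonObject;
import org.apache.avro.Schema;

public class SchemaBuilderSketch {
    public static void main(String[] args) {
        // Build one field entry; logicalType is attached at the field level,
        // mirroring the snippet above.
        JsonObject field = new JsonObject();
        field.addProperty("name", "Update_Timestamp");
        field.addProperty("type", "string");
        field.addProperty("logicalType", "timestamp-millis");
        field.addProperty("default", "");

        // Collect all field objects into the fields array
        // (this is what "array" refers to above).
        JsonArray fields = new JsonArray();
        fields.add(field);

        // Wrap the fields in a record definition.
        JsonObject record = new JsonObject();
        record.addProperty("type", "record");
        record.addProperty("name", "myrecord");
        record.addProperty("namespace", "abc");
        record.add("fields", fields);

        // Parse the generated JSON with Avro to confirm it is a valid schema,
        // independent of what the converter later registers.
        Schema schema = new Schema.Parser().parse(record.toString());
        System.out.println(schema.toString(true));
    }
}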
I set logicalType on all the fields and also write the schema to an .avsc file. In that file, the field shows up as:
{"name":"Update_TimeStamp","type":"string","logicalType":"timestamp-millis","default":""},
However, when I invoke the Schema Registry endpoint at http://localhost:8081/subjects/myvalue/versions/1, the same field shows up as follows:
{\"name\":\"Update_TimeStamp\",\"type\":{\"type\":\"string\",\"connect.default\":\"\"},\"default\":\"\"}
I've debugged this quite a bit but am unable to make any headway. What could I be doing wrong here?