
We are using Kafka with Avro schemas and the Schema Registry set to FULL compatibility. Our schemas use logicalType fields, for example:

{
  "name": "MyRecord",
  "type": "record",
  "fields": [
    {
      "name": "created_at",
      "type": [
        "null",
        {
          "type": "long",
          "logicalType": "timestamp-millis"
        }
      ],
      "default": null
    }
  ]
}
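
For context, records are produced with something like the following; this is only a minimal sketch assuming the legacy confluent_kafka.avro API (the code path that serializes via avro-python3), with hypothetical broker address, registry URL, topic name, and payload:

from confluent_kafka import avro
from confluent_kafka.avro import AvroProducer

# The schema above, with the nullable timestamp-millis field.
value_schema = avro.loads("""
{
  "name": "MyRecord",
  "type": "record",
  "fields": [
    {
      "name": "created_at",
      "type": ["null", {"type": "long", "logicalType": "timestamp-millis"}],
      "default": null
    }
  ]
}
""")

producer = AvroProducer(
    {
        "bootstrap.servers": "localhost:9092",            # hypothetical broker
        "schema.registry.url": "http://localhost:8081",   # hypothetical registry
    },
    default_value_schema=value_schema,
)

# created_at carries epoch milliseconds; serializing this record is where
# the TypeError below shows up once avro-python3 1.10 is involved.
producer.produce(topic="my-topic", value={"created_at": 1612345678901})
producer.flush()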

This works fine with the fairly old version of confluent-kafka we are using, which depends on avro-python3 1.8. However, recent confluent-kafka depends on avro-python3 1.10, and message serialization fails with TypeError: unhashable type: 'mappingproxy'.

I've opened a PR to fix the issue but it's not getting much attention.

Assuming it will not be merged, what other options do I have to upgrade to a recent confluent-kafka?

The only solution I see is getting rid of the logicalType, but that would be an incompatible schema change, so I would either have to give up on FULL compatibility or use a different topic bound to a different schema.

And even if that works, I would then have to manually convert millis to timestamps, which is quite a change across our codebase.
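
For illustration, dropping the logicalType would mean hand-rolling conversions like the following wherever the field is read or written (a minimal sketch in plain Python; the variable names are hypothetical):

from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Reading: turn the raw epoch-milliseconds long into an aware datetime.
created_at = EPOCH + timedelta(milliseconds=1612345678901)

# Writing: turn the datetime back into epoch milliseconds.
created_at_millis = (created_at - EPOCH) // timedelta(milliseconds=1)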

Federico Fissore

1 Answer


Your schema isn't valid Avro. The documentation states that “when a default value is specified for a record field whose type is a union, the type of the default value must match the first element of the union”. For your field created_at, “null” is the only valid default value; if you want a non-null default, you'll have to define your schema as

...
    {
      "name": "created_at",
      "type": [
        {
          "type": "long",
          "logicalType": "timestamp-millis"
        },
        "null"
      ],
      "default": 1
    }
...

This would give a default of January 1st, 1970 (actually 1 millisecond after midnight, UTC), which might not be what you want.
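
As a quick sanity check of that date (plain Python, nothing specific to Avro):

from datetime import datetime, timezone

# A timestamp-millis default of 1 is one millisecond past the Unix epoch.
print(datetime.fromtimestamp(1 / 1000, tz=timezone.utc))
# 1970-01-01 00:00:00.001000+00:00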

eik