Questions tagged [confluent-schema-registry]

Schema Registry provides a store, backed by Kafka, for storing and retrieving Avro schemas. It also provides Avro serializers for Kafka clients.

Schema Registry is part of the Confluent Open Source stream processing platform.

It provides a serving layer for your metadata: a RESTful interface for storing and retrieving Avro schemas. It stores a versioned history of all schemas, supports multiple compatibility settings, and allows schemas to evolve according to the configured compatibility setting. It also provides serializers that plug into Kafka clients and transparently handle schema storage and retrieval for Kafka messages sent in the Avro format.
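The RESTful interface mentioned above registers schemas under "subjects" (by default, `<topic>-key` and `<topic>-value`). A minimal sketch of building a register-schema request with the Python standard library — the registry URL, subject, and schema here are illustrative assumptions, not values from any question below:

```python
import json

# Hypothetical registry location; adjust to your deployment.
REGISTRY_URL = "http://localhost:8081"

# An Avro schema is registered under a "subject" (by default, "<topic>-value").
avro_schema = {
    "type": "record",
    "name": "User",
    "fields": [{"name": "name", "type": "string"}],
}

# The REST API expects the schema as a JSON-escaped string inside a JSON body:
#   POST /subjects/{subject}/versions   {"schema": "<escaped schema>"}
subject = "users-value"
endpoint = f"{REGISTRY_URL}/subjects/{subject}/versions"
body = json.dumps({"schema": json.dumps(avro_schema)})

print(endpoint)
print(body)
```

Note the double `json.dumps`: the schema travels as a string field inside the request body, not as nested JSON.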

1076 questions
2
votes
1 answer

How to extract a nested field from an Envelope-type schema in an S3 sink connector

Avro schema : { "type": "record", "name": "Envelope", "namespace": "test", "fields": [ { "name": "before", "type": [ "null", { "type": "record", "name": "Value", "fields": [ …
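The `Envelope` record with `before`/`after` fields in this excerpt is the standard Debezium change-event wrapper. The usual way to hand a sink connector only the nested row state is Debezium's `ExtractNewRecordState` single message transform; a hedged sketch of an S3 sink config using it (connector name, topic, and omitted S3 settings are illustrative):

```json
{
  "name": "s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "topics": "test.topic",
    "transforms": "unwrap",
    "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
    "transforms.unwrap.drop.tombstones": "false"
  }
}
```

The SMT replaces the whole envelope with the contents of `after` (or `before` for deletes, depending on its settings), so the sink sees a flat record.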
2
votes
0 answers

Auto-resolve schema by Confluent magic bytes using ABRiS for Spark

Is there a way to automatically resolve the schema by the leading magic byte for each message, which contains the schema id for that message? As we know, Confluent AVRO prepends the schema id to the message. So, each message has its own schema id…
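The framing the question describes is the Confluent wire format: one magic byte (`0x00`), a 4-byte big-endian schema id, then the serialized payload. A minimal sketch of pulling the id out of a raw message (the id `42` and payload bytes below are made up for illustration):

```python
import struct

def parse_confluent_header(message: bytes):
    """Split a Confluent-framed message into (schema_id, payload).

    Wire format: [magic byte 0x00][4-byte big-endian schema id][payload].
    """
    if len(message) < 5 or message[0] != 0:
        raise ValueError("not a Confluent-framed message")
    (schema_id,) = struct.unpack(">I", message[1:5])
    return schema_id, message[5:]

# Example with a made-up schema id of 42 and a dummy payload.
framed = b"\x00" + struct.pack(">I", 42) + b"avro-bytes"
schema_id, payload = parse_confluent_header(framed)
print(schema_id, payload)  # 42 b'avro-bytes'
```

Once the id is known, the writer schema can be fetched from the registry (`GET /schemas/ids/{id}`) and used to decode the payload — which is what per-message auto-resolution amounts to.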
2
votes
1 answer

Kafka Connect and subject name strategies for Key Converter

I am trying to set up a Debezium MySQL source connector. My goal is to have one topic per database, so I am investigating the possibility of leveraging subjects, in such a way that a topic can contain different message types and their schemas can…
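Mixing record types in one topic is usually done by switching the subject name strategy away from the default `TopicNameStrategy`, so that subjects are derived from the record name rather than only the topic. A hedged worker-config fragment (property names as exposed by Confluent's Avro converter; URL and choice of strategy are illustrative):

```properties
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
key.converter.key.subject.name.strategy=io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.TopicRecordNameStrategy
```

With `TopicRecordNameStrategy`, the subject becomes `<topic>-<record name>`, so several record types can each evolve under their own subject within the same topic.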
2
votes
0 answers

Schema Registry Issue - Protobuf Unions

We are currently working on a POC that deals with using a 'oneof' to have multiple events into the same topic. However, we seem to be getting a serialization exception when publishing to the union Kafka topics. We are creating a union protobuf…
2
votes
2 answers

Custom avro message deserialization with Flink

The Flink consumer application I am developing reads from multiple Kafka topics. The messages published in the different topics adhere to the same schema (formatted as Avro). For schema management, I am using the Confluent Schema Registry. I have…
2
votes
1 answer

auto.register.schemas set to false doesn't work as intended

auto.register.schemas=false doesn't work as I expect. From the documentation, it is supposed to stop the producer from registering new…
Omegaspard
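What `auto.register.schemas=false` changes: the serializer only looks the schema up and fails if it was not registered beforehand, instead of registering it itself. A toy sketch of that decision (the `registry` dict here merely stands in for the real service; function and exception names are made up for illustration):

```python
class SchemaNotFound(Exception):
    pass

def resolve_schema_id(registry, subject, schema, auto_register=True):
    """Return the id for `schema` under `subject`.

    With auto_register=True (the default), unknown schemas are registered.
    With auto_register=False, an unknown schema is an error: someone must
    have registered it out of band (CI pipeline, REST call, ...).
    """
    ids = registry.setdefault(subject, {})
    if schema in ids:
        return ids[schema]
    if not auto_register:
        raise SchemaNotFound(f"{subject}: schema not registered")
    ids[schema] = len(ids) + 1
    return ids[schema]

registry = {}
# Producer with auto-registration: first use registers the schema.
print(resolve_schema_id(registry, "orders-value", '{"type":"string"}'))  # 1
# Producer with auto.register.schemas=false: unknown schema fails fast.
try:
    resolve_schema_id(registry, "orders-value", '{"type":"long"}', auto_register=False)
except SchemaNotFound as exc:
    print("rejected:", exc)
```

So the setting does not silence registration errors — it surfaces them, which is typically what you want in production where schemas are registered via CI rather than by producers.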
2
votes
0 answers

How to configure JSON Schema to work with LocalDateTime in a Java POJO?

I'm having trouble using LocalDateTime in a Java POJO and sending it through Kafka + Schema Registry using JsonSchema. JsonSchema maps the LocalDateTime to an array of integers, but the actual message sent through the stream contains a String…
2
votes
1 answer

How can I read and decode AVRO messages from Kafka, along with their associated Kafka key, using Benthos?

I am using Benthos to read AVRO-encoded messages from Kafka which have the kafka_key metadata field set to also contain an AVRO-encoded payload. The schemas of these AVRO-encoded payloads are stored in Schema Registry and Benthos has a…
Mihai Todor
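Benthos ships a `schema_registry_decode` processor for exactly this; decoding the AVRO-encoded key as well typically takes a `branch` that routes the `kafka_key` metadata through a second decode. A sketch under those assumptions (addresses, topic, and metadata field name are illustrative, and the Bloblang mappings are from memory of the Benthos docs, so verify against your version):

```yaml
input:
  kafka:
    addresses: [ "localhost:9092" ]
    topics: [ "my_topic" ]
    consumer_group: "benthos"

pipeline:
  processors:
    # Decode the AVRO-encoded message value in place.
    - schema_registry_decode:
        url: "http://localhost:8081"
    # Decode the AVRO-encoded key separately and stash the result as metadata.
    - branch:
        request_map: 'root = meta("kafka_key")'
        processors:
          - schema_registry_decode:
              url: "http://localhost:8081"
        result_map: 'meta kafka_key_decoded = this'
```

The `branch` leaves the main payload untouched while the key takes a detour through its own decode step.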
2
votes
1 answer

Kafka connect | Failed to deserialize data for topic | Error retrieving Avro key / value schema version for id | Subject not found error code: 40401

First of all, thanks to @OneCricketeer for your support so far. I have tried so many configurations by now that I don't know what else I could try. Using Confluent connect-standalone with worker.properties and sink.properties to access an external…
2
votes
2 answers

Using Confluent Schema Registry with MSK

Is it possible to integrate Confluent Schema Registry with AWS MSK? If you have done this before, can you please provide some pointers / blogs you followed to achieve it?
guru
2
votes
1 answer

How can Confluent Schema Registry help ensure read (projection) Avro schema evolution?

Schema Registry helps with sharing the write Avro schema, used to encode a message, with the consumers that need that write schema to decode the received message. Another important feature is assisting with schema evolution. Let's say a…
Stanislav
2
votes
1 answer

Send Record with JSON Schema to Kafka using Spring-Kafka and Confluent schema registry

I cannot find any information on the internet about how to send a record with a JSON schema to Kafka using Spring Kafka. How can I do that?
2
votes
3 answers

Subject does not have subject-level compatibility configured

We use Kafka, Kafka Connect and Schema Registry in our stack. Version is 2.8.1 (Confluent 6.2.1). We use Kafka Connect's converter configs (key.converter and value.converter) with the value io.confluent.connect.avro.AvroConverter. It registers a new schema for…
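The error in this title means the subject has no compatibility override of its own, so the registry's global setting applies; a subject-level setting is created with `PUT /config/{subject}`. A sketch that builds (but does not send) such a request with the standard library — the host, subject, and chosen compatibility level are illustrative:

```python
import json
import urllib.request

# Hypothetical registry host and subject; nothing is sent here --
# we only construct the request object.
subject = "my-topic-value"
req = urllib.request.Request(
    url=f"http://localhost:8081/config/{subject}",
    data=json.dumps({"compatibility": "BACKWARD"}).encode(),
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    method="PUT",
)
print(req.method, req.full_url)
# To actually apply it: urllib.request.urlopen(req)
```

After the PUT, `GET /config/{subject}` returns the subject-level setting instead of the "not configured" error.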
2
votes
0 answers

Debezium: RestClientException: Leader not known; error code: 50004

I am getting exceptions when I try to deploy a Debezium connector to Kafka Connect. As a result, snapshots are not created and CDC streaming is also blocked. The problem is that I am not able to find where the issue is, i.e. is it in Kafka Connect,…
2
votes
1 answer

How to enforce Kafka consumer to use certain version of schema?

We have multiple versions of the schema of a topic, and messages are in AVRO format. From what I understand, when a consumer receives an AVRO message, the message itself contains an id which is used to retrieve the schema from the Schema Registry, and…
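A consumer cannot override which writer schema is used for decoding — the id embedded in each message decides that — but it can supply a fixed reader schema to the deserializer and fetch any registered version over REST. A sketch of the relevant endpoint paths (the host, subject, id, and version numbers are illustrative assumptions):

```python
BASE = "http://localhost:8081"  # hypothetical registry host
subject = "my-topic-value"

# The writer schema is always fetched by the id embedded in the message:
by_id = f"{BASE}/schemas/ids/42"
# A specific registered version of a subject (e.g. your desired reader schema):
by_version = f"{BASE}/subjects/{subject}/versions/3"
# The newest version of a subject:
latest = f"{BASE}/subjects/{subject}/versions/latest"

print(by_id)
print(by_version)
print(latest)
```

Pinning a consumer to "a certain version" therefore means configuring its reader schema; the writer schema per message still comes from the id in the wire format.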