Questions tagged [confluent-schema-registry]

Schema Registry provides a Kafka-backed store for storing and retrieving Avro schemas, and supplies Avro serializers for Kafka clients.

Schema Registry is part of the Confluent Open Source stream processing platform.

It provides a serving layer for your metadata: a RESTful interface for storing and retrieving Avro schemas. It stores a versioned history of all schemas, offers multiple compatibility settings, and allows schemas to evolve according to the configured compatibility setting. It also provides serializers that plug into Kafka clients and handle schema storage and retrieval for Kafka messages sent in Avro format.
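A minimal sketch of that producer-side flow, assuming a broker on localhost:9092, a registry on http://localhost:8081, and a made-up "users" topic and User schema (none of these come from the questions below):

    import java.util.Properties;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class AvroProducerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");                 // assumed broker
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("schema.registry.url", "http://localhost:8081");        // assumed registry

            // The serializer registers this schema (if new) and embeds its id in each message.
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"User\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}");
            GenericRecord user = new GenericData.Record(schema);
            user.put("name", "alice");

            try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("users", "key1", user));
            }
        }
    }

On the wire, each serialized value is prefixed with a magic byte and the 4-byte schema id, which is how consumers locate the matching schema in the registry.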

1076 questions
4 votes, 1 answer

Kafka Connect: how can I use the AvroConverter without schema registry?

In our distributed worker props, we have: value.converter=io.confluent.connect.avro.AvroConverter value.converter.schema.registry.url= We've noticed that the performance is bad when we use the schema registry. Can I use the AvroConverter…
4 votes, 3 answers

How to generate a POJO from a Kafka Avro record?

I have a User class and I am serializing it as Avro (using the Confluent Avro serializer and schema registry) and publishing it to a Kafka topic. I made a consumer to print the data to the console and it works fine. What I am trying now is to create the original…
Alfred • 21,058 • 61 • 167 • 249
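For the POJO question above, the Confluent deserializer can return generated SpecificRecord classes instead of GenericRecord when specific.avro.reader is enabled. A rough sketch, assuming a class com.example.User generated from the registered schema (the class name, topic, group id, and URLs are all placeholders):

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class SpecificAvroConsumerSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");                  // assumed broker
            props.put("group.id", "user-printer");                             // assumed group id
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
            props.put("schema.registry.url", "http://localhost:8081");         // assumed registry
            props.put("specific.avro.reader", "true");                         // return generated classes, not GenericRecord

            // com.example.User is a hypothetical class generated by the Avro Maven/Gradle plugin
            // from the same .avsc that is registered in the registry.
            try (KafkaConsumer<String, com.example.User> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("users"));
                ConsumerRecords<String, com.example.User> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, com.example.User> record : records) {
                    System.out.println(record.value().getName());              // typed accessor from the generated POJO
                }
            }
        }
    }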
4 votes, 2 answers

How to produce a Kafka Avro record exactly like one produced using the Avro console producer?

I am using Confluent 3.3.0. My intention is to use Kafka Connect to insert values from a Kafka topic into an Oracle table. My connector works fine with the Avro record I produced using the Avro console producer, like below: ./kafka-avro-console-producer…
Alfred • 21,058 • 61 • 167 • 249
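On the question above: kafka-avro-console-producer parses each JSON line against the schema passed on the command line (--property value.schema='...') and sends it through KafkaAvroSerializer, so a Java producer that uses the same serializer and the same schema yields records in the same wire format. A short sketch of just the record construction (the Employee schema and field names are invented), to be sent with a producer configured like the first sketch above:

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericRecord;

    public class ConsoleEquivalentRecordSketch {
        // Same schema string that would be passed to kafka-avro-console-producer
        // via --property value.schema='...'
        static final String SCHEMA_JSON =
            "{\"type\":\"record\",\"name\":\"Employee\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"int\"},{\"name\":\"name\",\"type\":\"string\"}]}";

        public static GenericRecord buildRecord() {
            Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
            GenericRecord employee = new GenericData.Record(schema);
            employee.put("id", 1);
            employee.put("name", "alice");
            // Sent through KafkaAvroSerializer, this produces the same wire format the
            // console producer writes: magic byte + 4-byte schema id + Avro-encoded payload.
            return employee;
        }
    }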
4 votes, 1 answer

How the local cache is managed by Confluent Schema Registry

We plan to push millions of messages through Kafka, using the Confluent Schema Registry (SR) so that we don't send the full schema every time. The architecture is shown in this picture: The fact is we want to avoid having one call to the SR all the…
user954156 • 468 • 2 • 6 • 17
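On the caching question above: the Confluent serializers keep a client-side identity map of schema-to-id, so only the first use of a given schema triggers an HTTP call to the registry; later messages reuse the cached id. A sketch using the client directly, assuming the older Avro-only client API and a placeholder registry URL and subject:

    import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
    import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
    import org.apache.avro.Schema;

    public class RegistryCacheSketch {
        public static void main(String[] args) throws Exception {
            // Second argument is the identity-map capacity: an upper bound on schemas cached locally.
            SchemaRegistryClient client =
                new CachedSchemaRegistryClient("http://localhost:8081", 100);   // assumed registry URL

            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Event\",\"fields\":[{\"name\":\"id\",\"type\":\"long\"}]}");

            // First call goes over HTTP and registers (or looks up) the schema...
            int id1 = client.register("events-value", schema);
            // ...a repeat call for the same schema is served from the local cache.
            int id2 = client.register("events-value", schema);

            System.out.println(id1 == id2);   // true: same schema, same id, no second HTTP round trip
        }
    }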
4 votes, 2 answers

Flink with Confluent Kafka schema registry

I am trying to write to Confluent Kafka with the schema registry from Flink, using FlinkKafkaProducer10. The error below is produced. I created a custom schema serializer; see the ConfluentAvroSerializationSchema class. The code compiles but produces a runtime error.…
dejan • 196 • 2 • 11
4 votes, 1 answer

Confluent Schema Registry Persistence

I would like to be able to keep a schema with a fixed id even if the server is restarted. Is it possible to persist the schemas in the Schema Registry in order to have them with the same id after the server crashes? Otherwise, is it possible to…
Adrian Olosutean • 179 • 1 • 3 • 11
3 votes, 1 answer

AvroConverter fails to serialize Nested Structures using latest schema version

I'm trying to set up a Debezium connector to capture the changes on a collection and stream those changes to a Kafka topic. Everything works great (inserts, updates, deletes/tombstones) until I introduced the schema registry and Avro schemas to the…
3 votes, 1 answer

Why use a schema registry?

I just started working with Kafka, I use Protocol Buffers for the message format, and I just learned about the schema registry. To give some context, we are a small team with a dozen web services; we use Kafka to communicate between them and we…
3 votes, 0 answers

ClassCastException while deserializing the Avro message in Spring Cloud Kafka

I have been trying to integrate Kafka with Spring Boot, but it doesn't seem to work. I am getting an exception while consuming the message. I think the publisher works fine, but the consumer fails to deserialize the message. Caused by:…
3 votes, 1 answer

Unable to find Databricks spark sql avro shaded jars in any public maven repository

We are trying to create an Avro record with the Confluent Schema Registry. We want to publish the same record to a Kafka cluster. To attach the schema id to each record (magic bytes) we need to use to_avro(Column data, Column subject, String…
Snigdhajyoti • 1,327 • 10 • 26
3 votes, 3 answers

How to add an enum value to an Avro schema in a FULL-compatible way?

I have an enum in an Avro schema like this: { "type": "record", "name": "MySchema", "namespace": "com.company", "fields": [ { "name": "color", "type": { "type": "enum", …
singe3 • 2,065 • 4 • 30 • 48
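On the enum-evolution question above: Avro 1.9+ lets an enum declare a default symbol, and once the already-registered schema carries such a default, later versions can generally add new symbols and still pass the registry's FULL compatibility checks, because the older reader schema resolves unknown symbols to its default. A sketch building such a schema programmatically (the names are made up, and the exact compatibility outcome should be verified against your registry version):

    import java.util.Arrays;
    import org.apache.avro.Schema;

    public class EnumDefaultSketch {
        public static void main(String[] args) {
            // Enum with an explicit default symbol (supported since Avro 1.9).
            // Readers that don't know a newly added symbol resolve it to "UNKNOWN".
            Schema colorV2 = Schema.createEnum(
                "Color",                                            // name
                null,                                               // doc
                "com.company",                                      // namespace
                Arrays.asList("UNKNOWN", "RED", "GREEN", "BLUE"),   // BLUE is the newly added symbol
                "UNKNOWN");                                         // enum default

            // Prints the JSON schema; with Avro 1.9+ it includes "default": "UNKNOWN".
            System.out.println(colorV2.toString(true));
        }
    }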
3 votes, 0 answers

Cannot Build Confluent Common repo

I'm trying to build the Confluent Schema Registry, which requires common and rest-utils to be built. But I'm unable to build common from source and am getting the following Maven error: [ERROR] Failed to execute goal…
guru • 409 • 4 • 21
3 votes, 1 answer

Spring Kafka-ConfigException: Invalid value TopicNameStrategy for configuration key.subject.name.strategy: Class TopicNameStrategy could not be found

We just deployed a Kafka producer to prod and are facing a weird issue that didn't pop up in non-prod. The service is a Spring Boot microservice that receives a REST HTTP request and uses Spring Kafka to publish an event onto a topic. The microservice is…
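On the error above: the *.subject.name.strategy settings expect a fully qualified class name, and passing just the short name TopicNameStrategy typically produces exactly this ConfigException. A hedged sketch of the relevant producer properties (broker and registry URLs are placeholders):

    import java.util.Properties;

    public class SubjectNameStrategyConfigSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");            // assumed broker
            props.put("key.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
            props.put("schema.registry.url", "http://localhost:8081");   // assumed registry

            // Use the fully qualified class name, not just "TopicNameStrategy";
            // the short name cannot be resolved on the classpath and fails config validation.
            props.put("key.subject.name.strategy",
                      "io.confluent.kafka.serializers.subject.TopicNameStrategy");
            props.put("value.subject.name.strategy",
                      "io.confluent.kafka.serializers.subject.TopicNameStrategy");
            // A producer (or Spring KafkaTemplate) built from these props then derives
            // subjects as <topic>-key and <topic>-value.
        }
    }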
3 votes, 1 answer

Basic Authentication for Kafka Connect to Access Schema Registry

We have set our Schema Registry and Kafka Connect to use basic authentication. Some of the connectors seem to be running, but some of them give an error: "io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unauthorized;…
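On the basic-auth question above: the Schema Registry client used by the serializers (and, with a value.converter./key.converter. prefix, by Kafka Connect converters) reads credentials from basic.auth.* settings. A rough sketch of the client-side properties (the URL and credentials are placeholders, and exact key names can differ between Confluent versions):

    import java.util.Properties;

    public class RegistryBasicAuthSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("schema.registry.url", "https://schema-registry.example.com");  // assumed registry
            // Tell the registry client to take credentials from basic.auth.user.info.
            props.put("basic.auth.credentials.source", "USER_INFO");
            props.put("basic.auth.user.info", "connect-user:connect-password");       // placeholder credentials

            // In a Kafka Connect worker or connector config, the same keys are prefixed by the
            // converter, e.g. value.converter.basic.auth.credentials.source and
            // value.converter.basic.auth.user.info; an unauthorized error usually means these
            // converter-level keys are missing even though the worker itself is authenticated.
        }
    }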