Questions tagged [confluent-schema-registry]

Schema Registry provides a store, backed by Kafka, for storing and retrieving Avro schemas. It also provides Avro serializers for Kafka clients.

Schema Registry is part of the Confluent Open Source stream processing platform.

It provides a serving layer for your metadata: a RESTful interface for storing and retrieving Avro schemas. It stores a versioned history of all schemas, supports multiple compatibility settings, and allows schemas to evolve according to the configured compatibility setting. It also provides serializers that plug into Kafka clients and handle schema storage and retrieval for Kafka messages sent in Avro format.
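For example, the serializers plug into a standard Kafka producer purely through configuration; a minimal sketch in Java, where the broker address, registry URL, and topic are placeholders:

```java
import java.util.Properties;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;

public class AvroProducerSketch {
    public static void main(String[] args) {
        // Minimal sketch: KafkaAvroSerializer registers/looks up schemas in the
        // Schema Registry on each send; only the configuration below is needed.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");            // placeholder broker
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");   // placeholder registry URL

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // producer.send(new ProducerRecord<>("my-topic", "key", someGenericRecord));
        }
    }
}
```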

1076 questions
0
votes
0 answers

Not able to write an Avro object to another topic using Kafka Streams

I have a Kafka Streams app which reads an Avro message, does some transformation on it, and then writes it to another topic. I can read the Avro message without any issues, but while writing to another topic I get the following error. This…
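A common cause of this kind of failure is that the Serde used on the write path was never given the registry URL. A minimal sketch, assuming a Confluent GenericAvroSerde and placeholder URL and topic names:

```java
import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;
import java.util.Collections;
import java.util.Map;

public class OutputSerdeSketch {
    public static void main(String[] args) {
        // Minimal sketch: the value Serde used when writing to the output topic
        // also needs schema.registry.url, or serialization on the write fails.
        Map<String, String> serdeConfig =
            Collections.singletonMap("schema.registry.url", "http://localhost:8081"); // placeholder
        GenericAvroSerde valueSerde = new GenericAvroSerde();
        valueSerde.configure(serdeConfig, /* isKey = */ false);

        // Then pass it explicitly when writing, e.g.:
        // stream.to("output-topic", Produced.with(Serdes.String(), valueSerde));  // hypothetical topic
    }
}
```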
0
votes
1 answer

How to deserialize avro data using Apache Beam (KafkaIO)

I've only seen one thread with information about the topic I've mentioned, which is: How to Deserialising Kafka AVRO messages using Apache Beam. However, after trying a few variations of Kafka serializers I still cannot deserialize Kafka…
0
votes
1 answer

Delete schema from schema registry

I know the Confluent documentation says that schemas cannot be deleted, in case of some clueless producer. BUT I would still like to know if there is any way of deleting a schema from the schema registry, not just deleting a specific…
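For reference, current Schema Registry versions do expose DELETE endpoints on the REST API, both per subject and per subject version. A minimal sketch using Java's built-in HTTP client, with a placeholder registry URL and subject name:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DeleteSubjectSketch {
    public static void main(String[] args) throws Exception {
        // Minimal sketch: soft-delete every version registered under a subject.
        // "my-topic-value" and the registry URL are placeholders.
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8081/subjects/my-topic-value"))
            .DELETE()
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // e.g. a JSON array of the deleted version numbers
    }
}
```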
0
votes
2 answers

Not able to deserialize Avro specific record as a generic record in Spring Cloud Stream

I would like to have a generic consumer with Spring Cloud Stream where I don't need to specify the schema of an Avro message at compile time. Since the schema id is included in the message, it should be possible to use the schema id to…
0
votes
0 answers

Schema registry build failed, Failed to execute goal org.apache.maven.plugins:maven-plugin-plugin

The schema-registry build failed at the maven-plugin step: [ERROR] Failed to execute goal org.apache.maven.plugins:maven-plugin-plugin:3.2:descriptor (default-descriptor) on project kafka-schema-registry-maven-plugin: Execution default-descriptor of…
0
votes
1 answer

Deserialization of a primitive Avro key with the Kafka lib

I'm currently unable to deserialize an Avro primitive key in a KStream app. The key is encoded with an Avro schema (registered in the schema registry). When I use the kafka-avro-console-consumer, I can see that the key is correctly…
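One way to approach this, sketched below with a placeholder registry URL, is to build a key Serde from the Confluent serializer/deserializer pair and configure it with isKey = true:

```java
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import java.util.Collections;
import java.util.Map;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;

public class AvroKeySerdeSketch {
    public static void main(String[] args) {
        // Minimal sketch: a Serde for Avro-encoded keys (including primitives),
        // built from the Confluent serializer/deserializer and configured as a key serde.
        Map<String, Object> config =
            Collections.singletonMap("schema.registry.url", "http://localhost:8081"); // placeholder
        KafkaAvroSerializer serializer = new KafkaAvroSerializer();
        KafkaAvroDeserializer deserializer = new KafkaAvroDeserializer();
        serializer.configure(config, /* isKey = */ true);
        deserializer.configure(config, /* isKey = */ true);

        Serde<Object> keySerde = Serdes.serdeFrom(serializer, deserializer);
        // Pass keySerde (e.g. via Consumed.with(keySerde, valueSerde)) when building the KStream.
    }
}
```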
0
votes
0 answers

How to change kafkastore.timeout.ms either in code or in YAML file

Does anyone have sample code showing how to change kafkastore.timeout.ms, either in code or in a YAML configuration? Thanks, Austin
0
votes
2 answers

org.apache.kafka.connect.runtime.rest.errors.BadRequestException

I am trying to write a Kafka connector to move data that is in a Kafka topic into MongoDB (sink). For this I have added the required configuration in the connect-json-standalone.properties file and also in the connect-mongo-sink.properties file in the Kafka folder. In…
0
votes
1 answer

AvroSpecificRecord : Json format

I am getting values in a Kafka topic (serialized as Avro): "full_address":{"string":"SJS6-L - Alviso, Alviso, California, United States"}. Why is "string" included in this? Shouldn't it be plain JSON?
0
votes
0 answers

How to convert Avro Generic Record to bytes?

Hi, I am working with Confluent Kafka using .NET. I have a consumer which returns a generic record, and I want to de-serialize the data returned from the consumer. Below is my consumer implementation. public ConsumeResult
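For the "GenericRecord to bytes" part of the title, the usual approach (sketched here in Java rather than .NET, with hypothetical schema and record arguments) is a GenericDatumWriter plus a binary encoder:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.EncoderFactory;

public class GenericRecordToBytesSketch {
    // Minimal sketch: encode a GenericRecord to raw Avro bytes
    // (no Schema Registry framing, just the binary Avro body).
    static byte[] toBytes(Schema schema, GenericRecord record) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        writer.write(record, encoder);
        encoder.flush();
        return out.toByteArray();
    }
}
```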
0
votes
1 answer

How to handle the result received from Confluent Kafka consumer?

Hi, I am working with Confluent Kafka. I have a consumer which returns a ConsumeResult. Below is my implementation of the consumer. public ConsumeResult Consume(string topic) { consumer.Subscribe(topic); …
0
votes
0 answers

Confluent Schema Registry dies unexpectedly

Schema Registry does not start immediately after the restart (controlled shutdown) of my Confluent Kafka (v3.3.0) cluster, which is running on a VM with SSL enabled. But if I try to restart it after some time (at least 2 hours after the Kafka cluster…
0
votes
1 answer

Extra bytes with KafkaAvroSerializer

My setup is as follows: I'm retrieving XML files from an FTP server, unmarshalling them into a POJO, mapping that into an Avro-generated class, and then forwarding it into Alpakka's Producer Sink like so: Ftp.ls("/", ftpSettings) .filter(FtpFile::isFile) …
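The "extra bytes" are usually the Confluent wire format: KafkaAvroSerializer prepends a magic byte and a 4-byte schema ID before the Avro payload. A minimal sketch of peeling that framing off a raw message (names are hypothetical):

```java
import java.nio.ByteBuffer;

public class WireFormatSketch {
    // Minimal sketch: split a Confluent-framed Avro message into its parts.
    // Layout: byte 0 = magic byte (0), bytes 1-4 = schema ID (big-endian int),
    // the remainder is the Avro-encoded record itself.
    static byte[] stripFraming(byte[] messageBytes) {
        ByteBuffer buffer = ByteBuffer.wrap(messageBytes);
        byte magicByte = buffer.get();   // expected to be 0
        int schemaId = buffer.getInt();  // ID under which the schema is registered
        byte[] avroPayload = new byte[buffer.remaining()];
        buffer.get(avroPayload);
        return avroPayload;
    }
}
```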
0
votes
1 answer

Empty column when deserializing avro from apache kafka with pyspark

I'm doing a proof of concept with Kafka, Spark, and Jupyter notebooks, and I'm having a weird issue. I'm trying to read Avro records from Kafka into pyspark. I'm using the Confluent Schema Registry to get the schema to deserialize the Avro…
0
votes
1 answer

Confluent and Spring schema registry for kafka

I'm setting up a schema registry server for Kafka. I've used the Confluent Schema Registry and all was well, but then I saw that you can, with less hassle, set up a default Spring one. So I did, but I was a bit surprised; it seems harder to control the…