Questions tagged [confluent-schema-registry]

Schema Registry provides a store, backed by Kafka, for storing and retrieving Avro schemas. It also provides Avro serializers for Kafka clients.

Schema Registry is part of the Confluent open source stream processing platform.

It provides a serving layer for your metadata. It provides a RESTful interface for storing and retrieving Avro schemas. It stores a versioned history of all schemas, provides multiple compatibility settings and allows evolution of schemas according to the configured compatibility setting. It provides serializers that plug into Kafka clients that handle schema storage and retrieval for Kafka messages that are sent in the Avro format.
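The serializers described above prepend a small header to each Kafka message so that consumers can look up the writer's schema in the registry. As a rough illustration (assuming the standard Confluent wire format: a zero magic byte followed by a 4-byte big-endian schema ID, then the Avro-encoded payload), the framing can be sketched in Python:

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format starts every message with a zero byte

def frame(schema_id: int, payload: bytes) -> bytes:
    """Prepend the wire-format header: magic byte 0 + 4-byte big-endian schema ID."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + payload

def unframe(message: bytes) -> tuple:
    """Split a framed message into (schema_id, payload); reject a bad magic byte."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("cannot parse invalid wire-format data")
    return schema_id, message[5:]

# The payload here is a stand-in; real messages carry Avro-encoded bytes.
framed = frame(42, b"\x02hi")
sid, payload = unframe(framed)
print(sid, payload)  # 42 b'\x02hi'
```

A consumer that is not schema-registry-aware (or that expects a different framing) will choke on this 5-byte header, which is a common source of "invalid wire-format" errors when mixing clients across languages.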

1076 questions
-1
votes
1 answer

Cross-language (de)serialization

I am trying to serialize in Python and unmarshal in Go, but I am getting an error. Error message: "cannot parse invalid wire-format data". Code configuration: Python code: schema_registry_client = SchemaRegistryClient({'url':…
-1
votes
1 answer

Deleted schema for Kafka topic

I've permanently deleted the schema for a Kafka topic. Now I'm unable to deserialize messages from the topic. What should I do? Stack trace: Internal Server Error A 500 error has occurred: Request processing failed; nested exception is…
-1
votes
1 answer

How to JSON serialize message and post it to Kafka Topic with JSON Schema

The confluent-schema-registry JavaScript package can be used to serialize and deserialize messages posted to a Kafka topic. Unfortunately, it only supports the AVRO format. Is there a similar package that supports JSON serialization?
alphanumeric • 17,967 • 64 • 244 • 392
-1
votes
1 answer

Java process builder issue with schema

I am using Java ProcessBuilder to execute a curl command that publishes a schema to Schema Registry, but I am getting an error. The problem is with the schema format; I'm not sure how to pass it as an argument to ProcessBuilder. Please provide any suggestions. enter code…
-1
votes
1 answer

Does Confluent schema registry only support AVRO?

Can we use Schema Registry for JSON messages and JSON schemas? Or do we have to use Avro serialization for the value serialization of messages?
-1
votes
1 answer

How to let Kafka Avro consumer read payload with the specific schema id / version but without generating the Avro schema java class?

I am using Avro and schema registry. I am wondering how to let the consumer read the data by specifying the schema id in Java project without having a Java class for the Avro schema. From my understanding, we could not specify the schema version…
Shan • 177 • 3 • 12
-1
votes
1 answer

Connection refused error while producing record into MSK topic using schema registry

So I have been able to register a schema and to produce and consume from the command line. This is how I have done it: bin/kafka-avro-console-producer --broker-list…
-1
votes
1 answer

pyspark: avro deserialize function on dataframe fails as it expects list

The Avro deserialize function expects bytes in a list and fails when applied to a DataFrame. It only works with collect(), but then the driver/master runs out of memory. Using Spark version 2.3.3 with Python 3.6.8; the DataFrame is being created from a Hive table…
-2
votes
1 answer

How to serialize protobuf message with schema id

I'm looking for a library or algorithm that serializes a message in protobuf format together with the schema version retrieved from Confluent Schema Registry. I use the php-rdkafka extension to send messages to a Kafka topic.
-2
votes
1 answer

Avro DataFileWriter API with confluent schema registry

Can I use avro DataFileWriter with schema registry?
-2
votes
1 answer

Kafka Streams JoinWindows auto-creates an Avro schema

When I use Kafka Streams joined windows, an Avro schema like "KSTREAM-JOINTHIS-0000000125-store-changelog-value" is created automatically. I want to know why this creates an Avro schema. Here is my code: Serde