Questions tagged [confluent-schema-registry]

Schema Registry provides a store, backed by Kafka, for storing and retrieving Avro schemas. It also provides Avro serializers for Kafka clients.

Schema Registry is part of the Confluent Open Source stream processing platform.

It provides a serving layer for your metadata: a RESTful interface for storing and retrieving Avro schemas. It stores a versioned history of all schemas, supports multiple compatibility settings, and allows schemas to evolve according to the configured compatibility setting. It also provides serializers that plug into Kafka clients and handle schema storage and retrieval for Kafka messages sent in the Avro format.
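The RESTful store-and-retrieve flow described above can be sketched as follows. This is a minimal illustration, not the official client: the registry URL, subject name, and `Payment` schema are assumptions, and the actual HTTP call is left as a comment.

```python
import json

def registration_request(base_url, subject, schema_dict):
    """Build the URL and JSON body for registering a schema under a subject."""
    url = f"{base_url}/subjects/{subject}/versions"
    # The REST API expects the Avro schema as an escaped JSON string
    # inside a {"schema": ...} wrapper.
    body = json.dumps({"schema": json.dumps(schema_dict)})
    return url, body

# Hypothetical value schema for a "payments" topic.
value_schema = {
    "type": "record",
    "name": "Payment",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
}

url, body = registration_request("http://localhost:8081", "payments-value", value_schema)
print(url)
# POST `body` to `url` with Content-Type: application/vnd.schemaregistry.v1+json,
# e.g. requests.post(url, data=body, headers=...); the response contains the schema ID.
```

Retrieval works the same way in reverse: a GET on `{base_url}/subjects/{subject}/versions/latest` returns the latest registered schema for that subject.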

1076 questions
0
votes
1 answer

Exception Occurred Subject not found error code - Confluent

I can see an error in my logs that the subject with name A.Abc-key is not present. I listed all the subjects and verified that A.Abc-key is not present but A.Abc-value is. On checking the property key for the same topic, I get the following error:…
0
votes
1 answer

How to create schema from a Map and register to Schema Registry

Is there a way to create a Schema from a Map? I have a map with key-value pairs and want to create a Schema from it. I have seen the org.apache.avro.Schema class (from avro-tools-1.8.2.jar), and there are APIs like the ones below to read JSON and create a Schema from…
tryingSpark
  • 143
  • 2
  • 15
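One way to sketch the map-to-schema question above (the asker works with org.apache.avro.Schema in Java, but the idea carries over): infer a flat Avro record schema from a dict by mapping each value's type to an Avro primitive. The type table and function name here are illustrative, not a library API.

```python
import json

# Illustrative Python-to-Avro primitive mapping; extend for your value types.
PY_TO_AVRO = {bool: "boolean", int: "long", float: "double", str: "string"}

def schema_from_map(name, mapping):
    """Infer a flat Avro record schema from a dict of key -> sample value."""
    fields = [{"name": k, "type": PY_TO_AVRO[type(v)]} for k, v in mapping.items()]
    return {"type": "record", "name": name, "fields": fields}

schema = schema_from_map("UserEvent", {"id": 1, "name": "john", "score": 9.5})
print(json.dumps(schema, indent=2))
```

The resulting JSON string is a valid Avro schema declaration, so it can be fed to an Avro schema parser or registered with the Schema Registry as-is; nested maps would need a recursive version of the same idea.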
0
votes
0 answers

How to discard default value fields of AVRO data at the Kafka AVRO consumer end?

I have defined a schema with 10 fields in the Schema Registry, using Confluent 3.3.0 and Kafka 0.10. {"schema":…
tryingSpark
  • 143
  • 2
  • 15
0
votes
0 answers

SpecificAvro Serdes without registry

Is there a way to configure Avro Serdes without the HTTP calls to the Schema Registry? Basically, we'd like to limit registry access to our CI/CD when building or releasing components. The schema-registry Maven plugin would be used to check backward…
0
votes
0 answers

Unable to stream Avro-formatted data coming from the Kafka Debezium connector

I'm streaming MongoDB oplog data through Kafka, using the Debezium CDC Kafka connector to tail the oplog. The Schema Registry uses the AvroConverter converter for serialising keys and values. bootstrap.servers=localhost:9092 Kafka…
0
votes
2 answers

Differentiating between binary encoded Avro and JSON messages

I'm using Python to read messages coming from various topics. Some topics have their messages encoded in plain JSON, while others use Avro binary serialization with the Confluent Schema Registry. When I receive a message I need to know if it…
0x26res
  • 11,925
  • 11
  • 54
  • 108
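For the message-format question above: messages serialized with the Confluent Avro serializer are framed in the Confluent wire format — a zero magic byte followed by a 4-byte big-endian schema ID, then the Avro binary body — while plain JSON messages typically start with `{` or `[`. A minimal heuristic sketch (the sample payloads are made up):

```python
import struct

def looks_like_confluent_avro(payload: bytes) -> bool:
    """Check for the Confluent wire format: magic byte 0x00, then a
    4-byte big-endian schema ID, then the Avro binary body."""
    return len(payload) >= 5 and payload[0] == 0

def schema_id(payload: bytes) -> int:
    """Extract the registry schema ID from a Confluent-framed payload."""
    return struct.unpack(">I", payload[1:5])[0]

avro_msg = b"\x00\x00\x00\x00\x2a" + b"avro-binary-body"
json_msg = b'{"id": 1}'
print(looks_like_confluent_avro(avro_msg))  # True
print(looks_like_confluent_avro(json_msg))  # False ('{' is 0x7b, not 0x00)
print(schema_id(avro_msg))  # 42
```

This is only a heuristic: a JSON message could in principle begin with a zero byte, so the robust check is to look up the extracted schema ID against the registry and fall back to JSON parsing if the lookup fails.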
0
votes
1 answer

How to create Transform in Memsql when source is Kafka Avro Format

I am able to push data from Kafka to MemSQL and am trying to push it using a Transform. I have created a Kafka consumer in Python that consumes data from a Kafka topic and converts it to JSON format, but I don't know how to use this as a Transform in MemSQL.…
0
votes
2 answers

Testing Kafka Processor API that uses SpecificAvroSerde

I'm trying to write a unit test for a custom stream processor and got stuck serializing the message I need to send for the test. I followed this example from Kafka: https://kafka.apache.org/11/documentation/streams/developer-guide/testing.html. I…
0
votes
1 answer

log4j:WARN No appenders could be found for logger (org.apache.commons.beanutils.converters.BooleanConverter)

I know this question comes up all the time, but I am totally stumped. I am trying to write a Kafka Streams application that logs with Log4j2, and I am getting a Log4j (version 1) warning about a missing appender: log4j:WARN No appenders could be found for…
DVS
  • 783
  • 7
  • 25
0
votes
1 answer

How to find which Schema Registry pod/node is currently the master

I have deployed 3 Schema Registry pods on a k8s cluster and am trying to find which pod is currently acting as the master. I have figured out one way to find the master through the Schema Registry logs, but that means I have to check all the pods' logs…
0
votes
2 answers

Get Avro record as binary array with kafka-avro-console-consumer

I have a Kafka Producer which sends Avro records to a "test" topic. I also have a Schema Registry in which each record's schema is stored. After that, I use the command kafka-avro-console-consumer --topic test --zookeeper localhost:2181 …
0
votes
2 answers

org.apache.kafka.connect.errors.DataException: Invalid JSON for record default value: null

I have a Kafka Avro Topic generated using KafkaAvroSerializer. My standalone properties are as below. I am using Confluent 4.0.0 to run Kafka…
0
votes
1 answer

How to produce avro format data onto topic in kstreams

KStream left = builder.stream("source1"); left.to("source2") — I want to serialize the records before sending them to source2
0
votes
1 answer

Joins on Avro format data using lambda in kStreams

I have two streams: Stream1: [KSTREAM-MAP-0000000004]: 1, {"id": 1, "name": "john", "age": 26} [KSTREAM-MAP-0000000004]: 2, {"id": 2, "name": "jane", "age": 24} [KSTREAM-MAP-0000000004]: 3, {"id": 3, "name": "julia", "age":…
0
votes
1 answer

Reading message from Kafka with java.util.List in avro schema

I am trying to read a message from Kafka using a consumer with the following…