Questions tagged [confluent-schema-registry]

Schema Registry provides a store, backed by Kafka, for storing and retrieving Avro schemas. It also provides Avro serializers for Kafka clients.

Schema Registry is part of the Confluent Open Source stream processing platform.

It provides a serving layer for your metadata: a RESTful interface for storing and retrieving Avro schemas. It stores a versioned history of all schemas, offers multiple compatibility settings, and allows schemas to evolve according to the configured compatibility setting. It also provides serializers that plug into Kafka clients and handle schema storage and retrieval for Kafka messages sent in the Avro format.
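The RESTful interface mentioned above can be exercised directly. A minimal sketch of the request shape, assuming a registry at the default local address and a hypothetical `payments` subject and `Payment` schema (both made up for illustration) — the key detail is that the Avro schema travels as a JSON-escaped *string* under a `"schema"` key, not as a nested JSON object:

```python
import json

# Assumed registry location (default port for a local Confluent install).
SCHEMA_REGISTRY_URL = "http://localhost:8081"

# Hypothetical value schema for a "payments" topic.
value_schema = {
    "type": "record",
    "name": "Payment",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "amount", "type": "double"},
    ],
}

# The registry's REST API expects the Avro schema itself to be a
# JSON-escaped string inside the request body:
register_body = json.dumps({"schema": json.dumps(value_schema)})

# POST this body to /subjects/<subject>-value/versions to register the
# schema; GET /schemas/ids/<id> later retrieves it by its global id.
register_url = f"{SCHEMA_REGISTRY_URL}/subjects/payments-value/versions"

print(register_url)
```

Forgetting the inner `json.dumps` (sending the schema as a raw JSON object) is a common cause of 4xx responses from the registry.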

1076 questions
0
votes
1 answer

Incompatible Avro messages between Spring Cloud Stream Kafka Stream and native Kafka Stream applications and producers

Sample applications to verify this can be found at https://github.com/codependent/event-carried-state-transfer/tree/avro (kafka-xxx: native applications; spring-boot-xxx: Spring Cloud Stream applications). The problem is that Avro messages produced by a…
0
votes
1 answer

Spring Cloud Stream Kafka application not generating messages with the correct Avro schema

I have an application (spring-boot-shipping-service) with a KStream that gets OrderCreatedEvent messages generated by an external producer (spring-boot-order-service). This producer uses the following schema: order-created-event.avsc { "namespace"…
0
votes
1 answer

Cannot produce into Kafka with a dockerized producer

Did anyone encounter this error? io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')\n at [Source:…
0
votes
1 answer

kafka topic name change without changing consumers

We are planning to remove versions from Kafka topic names. Currently the schema version of the relevant message forms part of the topic name, but in future we will have a large number of small variations of messages and we don't want to create too many topics.…
Sammy Pawar
0
votes
1 answer

SchemaRegistryClient cache not working; unnecessary Schema Registry GET requests are the problem

I am using Kafka to deserialize Avro messages. For that, the program should pull the corresponding schema from the Schema Registry. The streaming app is implemented as a NiFi processor, which works in itself. The problem is that after every flow, a…
0
votes
1 answer

Numeric values not displaying in Kafka consumer CLI

Writing data from an Avro file to topics running Confluent 5.1.0. When I run the Kafka consumer command, numeric values are not displayed. kafka_2.12-2.1.0 root$ bin/kafka-console-consumer.sh --bootstrap-server 127.0.0.1:9092 --topic Initiate_scans I…
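A likely explanation (not confirmed by the question alone): `kafka-console-consumer.sh` prints raw bytes, and Avro encodes integers as zigzag varints rather than printable digits, so numeric fields show up as control characters or nothing at all, while `kafka-avro-console-consumer` decodes them first. A minimal sketch of the encoding, per the Avro spec:

```python
def zigzag_varint(n: int) -> bytes:
    """Encode a signed int the way Avro does: zigzag, then varint."""
    # Zigzag maps signed ints to unsigned: 0, -1, 1, -2, 2 -> 0, 1, 2, 3, 4
    z = (n << 1) ^ (n >> 63)
    out = bytearray()
    # Varint: 7 bits per byte, high bit set on every byte but the last.
    while True:
        if z < 0x80:
            out.append(z)
            return bytes(out)
        out.append((z & 0x7F) | 0x80)
        z >>= 7

# The integer 1234 becomes two non-printable bytes, not the text "1234":
encoded = zigzag_varint(1234)
print(encoded)  # b'\xa4\x13'
```

This is why the console consumer appears to "lose" numeric values: they are there, just not in a text encoding.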
0
votes
1 answer

Exception while processing Schema Registry deserialiser in Kafka

I'm following the Confluent link to post Kafka messages on changes to a MySQL table. When I try to consume this message from a Spring Boot application, I'm getting the below exception. How can I fix this so that I can read the message? Sometimes I am…
0
votes
0 answers

Using Confluent KafkaAvroSerializer library

I am wondering how I am supposed to use the KafkaAvroSerializer. I have installed the confluent-schema-registry as a Docker container. Downloading the sources of the schema registry and building it locally does not work because all the dependencies…
Chris
0
votes
1 answer

Messages saved to Kafka topic not saving correctly via Kafka Connector

So I have a Confluent Kafka JDBC connector set up. First I start up a schema registry, such as: ./bin/schema-registry-start ./etc/schema-registry/schema-registry.properties This is the schema-registry.properties…
0
votes
0 answers

Java code for reading Kafka Avro messages in Spark 2.1.1 Structured Streaming

I'm looking for how to read Avro messages which have a complex structure from Kafka using Spark Structured Streaming. I then want to parse these messages and compare them with HBase reference values, and then save the outcome into HDFS or another HBase table. I…
0
votes
1 answer

Kafka topic with different format of data

I have written some Avro data to the topic “test-avro” using kafka-avro-console-producer. Then I wrote some plain-text data to the same topic “test-avro” using kafka-console-producer. After this, all the data in the topic got corrupted. Can…
RaAm
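The corruption described above is consistent with the Confluent wire format: Avro-serialized records start with a magic byte 0 followed by a 4-byte big-endian schema id, so plain-text records on the same topic fail that framing check and break consumers that expect it. A sketch of the framing (the schema id 42 and payload are made up):

```python
import struct

MAGIC_BYTE = 0

def frame(schema_id: int, avro_payload: bytes) -> bytes:
    """Prefix an Avro payload with the Confluent wire-format header."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_payload

def parse_header(message: bytes) -> int:
    """Return the schema id, or raise if the framing is wrong."""
    if len(message) < 5 or message[0] != MAGIC_BYTE:
        raise ValueError("unknown magic byte: not Confluent-Avro framed")
    return struct.unpack(">I", message[1:5])[0]

framed = frame(42, b"\x02hi")  # hypothetical Avro body
print(parse_header(framed))    # 42

try:
    parse_header(b"plain text record")
except ValueError as exc:
    # A plain-text record on the same topic fails the framing check,
    # which is how one stray message can "corrupt" the whole topic
    # for Avro consumers.
    print(exc)
```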
0
votes
1 answer

Kafka: Confluent Schema Registry - consumer process

As per my understanding, the Avro schema will be cached locally and the consumer will use the local cache for the deserialization process. In this process, if an exception scenario occurs, like the Avro schema not being cached locally, what will happen? Will it…
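The caching behaviour described above amounts to a memoized lookup: on a cache miss the deserializer fetches the schema from the registry by id, stores it, and only then deserializes; only that first miss costs a network round trip. A toy sketch of the idea (the fetch callable is a stand-in, not the real client API):

```python
class SchemaCache:
    """Toy version of the client-side cache a deserializer keeps."""

    def __init__(self, fetch):
        self._fetch = fetch   # stand-in for a registry HTTP call
        self._cache = {}

    def get(self, schema_id: int) -> str:
        # Hit: no network round trip at all.
        if schema_id in self._cache:
            return self._cache[schema_id]
        # Miss: one registry request, then cached for the process lifetime.
        schema = self._fetch(schema_id)
        self._cache[schema_id] = schema
        return schema

calls = []
def fake_fetch(schema_id):
    calls.append(schema_id)
    return '{"type": "string"}'

cache = SchemaCache(fake_fetch)
cache.get(7)
cache.get(7)
print(len(calls))  # 1 -- the second lookup never touched the "registry"
```

If the registry is unreachable on that first miss, the fetch fails and so does deserialization, which is the exception scenario the question asks about.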
0
votes
1 answer

Deserializing exception with data generated by the ksql-datagen utility

Generated a sample stream from the ksql-datagen utility with the following schema: { "type": "record", "name": "users", "namespace": "com.example", "fields": [ { "name": "registertime", "type": { …
0
votes
1 answer

Why do I need to create a Kafka Consumer to connect to Schema Registry?

Previous note: I am fairly new to Kafka. I am trying to get all schemas from the Schema Registry, but I am not able to do so with only a schema registry client. It only works if, prior to that, I instantiate a KafkaConsumer. Can't understand…
LeYAUable
0
votes
1 answer

MongoDb Sink Connector :: JsonParseException: JSON reader was expecting a value but found 'dist'

I am trying to establish a data flow wherein a mosquitto publisher will send data to the Kafka broker via the MQTT Source Connector, and the Kafka broker will forward the input data to a MongoDb database via the MongoDb Sink Connector. The MQTT Source…