Questions tagged [confluent-schema-registry]

Schema Registry provides a store, backed by Kafka, for storing and retrieving Avro schemas. It also provides Avro serializers for Kafka clients.

Schema Registry is shipped as part of the Confluent Platform.

It provides a serving layer for your metadata: a RESTful interface for storing and retrieving Avro schemas, a versioned history of all schemas, and multiple compatibility settings so that schemas can evolve according to the configured setting. It also provides serializers that plug into Kafka clients and handle schema storage and retrieval for Kafka messages sent in the Avro format.
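For orientation, here is a minimal hedged sketch of the serializer side using the confluent-kafka Python client. The broker and registry addresses, topic name, and schema are placeholders, not part of the tag description.

```python
# Minimal sketch: produce an Avro-encoded message whose schema is stored in
# Schema Registry. Addresses, topic, and schema are illustrative placeholders.
from confluent_kafka import SerializingProducer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer

schema_str = """
{
  "type": "record",
  "name": "User",
  "fields": [{"name": "name", "type": "string"}]
}
"""

sr_client = SchemaRegistryClient({"url": "http://localhost:8081"})
avro_serializer = AvroSerializer(sr_client, schema_str)

producer = SerializingProducer({
    "bootstrap.servers": "localhost:9092",
    "value.serializer": avro_serializer,
})

# The serializer registers/looks up the schema in the registry and prefixes the
# payload with the schema ID before the record is sent.
producer.produce(topic="users", value={"name": "alice"})
producer.flush()
```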

1076 questions
3
votes
0 answers

Confluent/Kafka support for the upcoming Java 17 LTS

Has Confluent shared any plans for supporting the upcoming Java 17 LTS release? I am mostly interested in the Java client libraries. The official documentation does not mention any version after Java 11 (the current latest LTS release). Apache Kafka…
3
votes
1 answer

Failed to connect to Confluent Platform Schema Registry - Apache Flink SQL Confluent Avro Format

I am using a Confluent-managed Kafka cluster and Schema Registry service and am trying to process Debezium messages in a Flink job. The job is configured to use the Table & SQL Connectors and the Confluent Avro format. However, the job is not able to connect to…
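As a hedged illustration only: a Kafka source table declared through PyFlink with the avro-confluent format might look like the sketch below. The option names vary between Flink versions (check the avro-confluent format documentation for your release), and the topic, columns, endpoints, and credentials are placeholders; the Kafka SQL connector and avro-confluent format jars must also be on the classpath.

```python
# Hedged sketch (PyFlink): a Kafka source table using the Confluent Avro format.
# Option keys differ across Flink versions; all values below are placeholders.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

t_env.execute_sql("""
    CREATE TABLE debezium_source (
        id BIGINT,
        name STRING
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'my.debezium.topic',
        'properties.bootstrap.servers' = '<broker>:9092',
        'properties.group.id' = 'flink-consumer',
        'format' = 'avro-confluent',
        'avro-confluent.url' = 'https://<schema-registry-endpoint>',
        'avro-confluent.basic-auth.credentials-source' = 'USER_INFO',
        'avro-confluent.basic-auth.user-info' = '<sr-api-key>:<sr-api-secret>'
    )
""")
```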
3
votes
1 answer

Not able to create a topic with default schema validation set to true | Confluent Platform

I am trying to create a topic using the command below: kafka-topics --create --bootstrap-server confluent-platform-cp-kafka:9092 --replication-factor 1 --partitions 1 --topic push.rdes.portfolios --config…
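The truncated --config flag presumably sets confluent.value.schema.validation; that assumption, plus the broker address, carries over into this hedged sketch of the same operation via the Python AdminClient. Note that schema validation is a Confluent Server feature and is rejected by plain Apache Kafka brokers.

```python
# Hedged sketch: create a topic with value schema validation enabled.
# 'confluent.value.schema.validation' requires Confluent Server; a plain
# Apache Kafka broker will reject it. Broker address is a placeholder.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "confluent-platform-cp-kafka:9092"})

topic = NewTopic(
    "push.rdes.portfolios",
    num_partitions=1,
    replication_factor=1,
    config={"confluent.value.schema.validation": "true"},
)

futures = admin.create_topics([topic])
for name, fut in futures.items():
    try:
        fut.result()  # raises on failure (e.g. unknown config on non-Confluent brokers)
        print(f"created {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")
```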
3
votes
1 answer

MockSchemaRegistryClient not registering avro schema: Cannot get schema from schema registry

I am writing a Spring Boot integration test using spring-kafka-test 2.6.3, EmbeddedKafka, and JUnit 5 for a topology that consumes Avro messages. In the test I am using MockSchemaRegistryClient; I am registering the mock schema client and configuring…
3
votes
0 answers

How to disable automatic schema registration in Apache Flink's Avro serializer?

Using the Avro serializers provided by the Confluent platform, it is possible to disable schema registration during serialization; that is, when a schema is not already registered as a schema version for a subject, an exception is raised (by default,…
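The question targets Flink's serializer; as a point of comparison only (not the Flink API), the Confluent Python serializer exposes the same switch through auto.register.schemas. The registry URL and schema in this hedged sketch are placeholders.

```python
# Hedged sketch: with auto.register.schemas disabled, serialization fails if
# the schema is not already registered under the subject, instead of the
# serializer registering it automatically. URL and schema are placeholders.
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer

schema_str = """
{"type": "record", "name": "Event", "fields": [{"name": "id", "type": "string"}]}
"""

sr_client = SchemaRegistryClient({"url": "http://localhost:8081"})

serializer = AvroSerializer(
    sr_client,
    schema_str,
    conf={"auto.register.schemas": False},  # only use schemas already in the registry
)
```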
3
votes
1 answer

Kafka Sink: ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectStandalone:130)

I am trying to stream data from one file to another. It was working earlier, but it suddenly fails with ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectStandalone:130). I have restarted ZooKeeper,…
3
votes
1 answer

Confluent Schema Registry: TopicRecordNameStrategy

I am new to Confluent Schema Registry and am trying to understand the core concepts first. I am a little fuzzy on TopicRecordNameStrategy: "Derives the subject name from topic and record name, as a way to group logically…"
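As a plain illustration of the naming convention (not a library API): TopicRecordNameStrategy builds the subject from the topic name plus the record's fully qualified name, so two record types on the same topic get separate subjects.

```python
# Illustration only (not a library API): how the three common strategies
# derive the subject under which a value schema is registered.
def topic_name_strategy(topic: str, record_fullname: str) -> str:
    return f"{topic}-value"              # one subject per topic

def record_name_strategy(topic: str, record_fullname: str) -> str:
    return record_fullname               # one subject per record type, topic-independent

def topic_record_name_strategy(topic: str, record_fullname: str) -> str:
    return f"{topic}-{record_fullname}"  # groups record types per topic

print(topic_record_name_strategy("orders", "com.example.OrderCreated"))
# -> "orders-com.example.OrderCreated"
```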
3
votes
2 answers

Using 'SchemaRegistryClient' to deserialize AVRO message in Python

We are trying to consume Avro messages coming from other systems. I am able to read an Avro message when I specify the schema as a file (.avsc) using the code below: import avro.schema from avro.io import DatumReader, BinaryDecoder ... schema =…
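A hedged sketch of letting the registry supply the writer schema instead of a local .avsc file: in recent confluent-kafka releases the reader schema argument of AvroDeserializer is optional, so the writer schema referenced by the schema ID in each message is fetched from the registry. Broker, registry URL, topic, and group are placeholders.

```python
# Hedged sketch: deserialize Confluent-framed Avro messages by letting
# AvroDeserializer fetch the writer schema from Schema Registry via the
# schema ID embedded in each message. Addresses and topic are placeholders.
from confluent_kafka import Consumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import SerializationContext, MessageField

sr_client = SchemaRegistryClient({"url": "http://localhost:8081"})
avro_deserializer = AvroDeserializer(sr_client)  # no local .avsc needed

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "avro-consumer",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["my-topic"])

while True:  # simple polling loop for the sketch
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    value = avro_deserializer(
        msg.value(), SerializationContext(msg.topic(), MessageField.VALUE)
    )
    print(value)  # a dict built from the writer schema fetched from the registry
```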
3
votes
4 answers

Kafka Schema Registry getting error Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false')

I am trying to integrate a Spring Boot application with Kafka Schema Registry. I have created a KafkaProducer which sends a message to a Kafka topic after validating it against the Schema Registry: public class Producer { @Value("${topic.name}") private…
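That parse error usually means the configured schema.registry.url is answering with HTML (a proxy page, login page, or the wrong port) rather than JSON. A hedged sanity check against the registry's REST API, with the URL as a placeholder:

```python
# Hedged sanity check: the "'<' (code 60)" error typically means the configured
# registry URL returned HTML instead of JSON. Hitting the REST API directly
# shows what the client actually receives. URL is a placeholder.
import requests

url = "http://localhost:8081"  # value of schema.registry.url
resp = requests.get(f"{url}/subjects", timeout=5)

print(resp.status_code, resp.headers.get("Content-Type"))
print(resp.text[:200])  # should be a JSON array of subjects, not an HTML page
```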
3
votes
1 answer

Produce called with an IAsyncSerializer value serializer configured but an ISerializer is required when using Avro Serializer

I am working with a Kafka cluster and using a transactional producer for atomic streaming (read-process-write). // Init Transactions _transactionalProducer.InitTransactions(DefaultTimeout); // Begin the…
3
votes
1 answer

spring-cloud-stream-binder-kafka configuration for Confluent Cloud Schema Registry Unauthorized error

I'm having trouble configuring a connection to Confluent when using spring-cloud-stream-binder-kafka. Perhaps somebody can see what is wrong? When I use the example from…
3
votes
1 answer

How to programmatically update subject schema and compatibility in Confluent Schema Registry

I have a schema already registered in the Schema Registry, which I was able to do using register() like this: from schema_registry.client import SchemaRegistryClient, schema subject_name = "new-schema" schema_url = "https://{{ schemaRegistry }}:8081"…
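The excerpt uses the third-party python-schema-registry-client; rather than guessing that library's method names, here is a hedged sketch that calls the underlying Schema Registry REST endpoints directly. URL, subject name, and schema are placeholders.

```python
# Hedged sketch: update a subject's compatibility level and register a new
# schema version via the Schema Registry REST API. Values are placeholders.
import json
import requests

base_url = "https://localhost:8081"
subject = "new-schema"
headers = {"Content-Type": "application/vnd.schemaregistry.v1+json"}

# 1. Set per-subject compatibility (PUT /config/<subject>)
requests.put(
    f"{base_url}/config/{subject}",
    headers=headers,
    data=json.dumps({"compatibility": "BACKWARD"}),
).raise_for_status()

# 2. Register a new schema version (POST /subjects/<subject>/versions)
new_schema = {
    "type": "record",
    "name": "myrecord",
    "fields": [{"name": "id", "type": "string"}],
}
resp = requests.post(
    f"{base_url}/subjects/{subject}/versions",
    headers=headers,
    data=json.dumps({"schema": json.dumps(new_schema)}),
)
resp.raise_for_status()
print(resp.json())  # {"id": <global schema id>}
```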
3
votes
2 answers

Confluent Schema Registry: how to HTTP POST a json-schema

Confluent 5.5.0 understands not just Avro schemas, but also JSON Schema and Protobuf. I have a valid JSON schema that I'm trying to curl to the Schema Registry server, but I keep getting the response curl -X POST -H "Content-Type:…
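The usual gotcha here is that non-Avro schemas need an explicit "schemaType" field, and the schema itself must be embedded as a string inside the request body. A hedged sketch using requests; the URL, subject, and schema are placeholders.

```python
# Hedged sketch: register a JSON Schema. The payload must carry
# "schemaType": "JSON" and the schema itself as an escaped string in "schema".
# URL, subject, and schema are placeholders.
import json
import requests

json_schema = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "properties": {"id": {"type": "string"}},
    "required": ["id"],
}

payload = {
    "schemaType": "JSON",          # defaults to Avro if omitted
    "schema": json.dumps(json_schema),
}

resp = requests.post(
    "http://localhost:8081/subjects/my-json-subject/versions",
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    data=json.dumps(payload),
)
resp.raise_for_status()
print(resp.json())  # {"id": <global schema id>}
```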
3
votes
1 answer

How to programmatically register an Avro schema in Kafka Schema Registry using Python

I am publishing data and its schema to Kafka and the Schema Registry with Python. from confluent_kafka import avro from confluent_kafka.avro import AvroProducer value_schema_str = """ { "type":"record", "name":"myrecord", "fields":[ …
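The excerpt uses the older confluent_kafka.avro AvroProducer; a hedged sketch of registering a schema explicitly (without producing anything) with the newer SchemaRegistryClient follows. The URL, subject name, and schema are placeholders.

```python
# Hedged sketch: register an Avro schema explicitly using the newer
# confluent_kafka.schema_registry client. Values are placeholders.
from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

value_schema_str = """
{
  "type": "record",
  "name": "myrecord",
  "fields": [{"name": "id", "type": "string"}]
}
"""

sr_client = SchemaRegistryClient({"url": "http://localhost:8081"})

schema_id = sr_client.register_schema(
    subject_name="mytopic-value",
    schema=Schema(value_schema_str, schema_type="AVRO"),
)
print(f"registered with global id {schema_id}")
```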
3
votes
1 answer

How to connect to the Confluent Cloud Schema Registry?

I am using a Confluent-managed Kafka cluster and the Schema Registry service. I can connect to the Confluent Cloud Kafka cluster by adding the following properties to the producer config…
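A hedged sketch of the corresponding Python client configuration for Confluent Cloud; the endpoint and credentials are placeholders, and the key/secret pair must be a Schema Registry API key, which is separate from the Kafka cluster API key.

```python
# Hedged sketch: connect to a Confluent Cloud Schema Registry endpoint.
# URL and credentials are placeholders; use a Schema Registry API key,
# not the Kafka cluster API key.
from confluent_kafka.schema_registry import SchemaRegistryClient

sr_client = SchemaRegistryClient({
    "url": "https://<sr-endpoint>.confluent.cloud",
    "basic.auth.user.info": "<SR_API_KEY>:<SR_API_SECRET>",
})

print(sr_client.get_subjects())  # simple connectivity check
```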