Questions tagged [confluent-schema-registry]

Schema Registry provides a store, backed by Kafka, for storing and retrieving Avro schemas. It also provides Avro serializers for Kafka clients.

Schema Registry is part of the Confluent Open Source stream processing platform.

It provides a serving layer for your metadata: a RESTful interface for storing and retrieving Avro schemas, a versioned history of all schemas, and multiple compatibility settings that govern how schemas may evolve. It also supplies serializers that plug into Kafka clients and handle schema storage and retrieval for messages sent in Avro format.
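
As a rough illustration of that REST interface, the sketch below registers an Avro schema and fetches it back using Java's built-in HTTP client. It assumes a registry listening on http://localhost:8081 and uses a made-up subject name, my-topic-value; it is not an excerpt from the official documentation.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SchemaRegistryRestDemo {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Register an Avro schema under the subject "my-topic-value".
        // The schema itself is passed as an escaped JSON string in the "schema" field.
        String body = "{\"schema\": \"{\\\"type\\\": \\\"string\\\"}\"}";
        HttpRequest register = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8081/subjects/my-topic-value/versions"))
                .header("Content-Type", "application/vnd.schemaregistry.v1+json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        // Expected response shape: {"id": <schema id>}
        System.out.println(client.send(register, HttpResponse.BodyHandlers.ofString()).body());

        // Fetch the latest registered version of the same subject.
        HttpRequest fetch = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8081/subjects/my-topic-value/versions/latest"))
                .GET()
                .build();
        System.out.println(client.send(fetch, HttpResponse.BodyHandlers.ofString()).body());
    }
}
```

In everyday client code this registration normally happens implicitly: the Confluent serializers register or look up the schema on first use.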

1076 questions
6 votes, 1 answer

Using Schema Registry from Confluent with Avro and Kafka in Spring Boot Applications

First of all, I must say I'm not familiar with Confluent. I was following this tutorial: https://www.confluent.io/blog/schema-registry-avro-in-spring-boot-application-tutorial/ and I got stuck. I couldn't create the consumer for Kafka because I've…
6 votes, 1 answer

How to find the schema id from the schema registry used for Avro records when reading from a Kafka consumer

We use the schema registry for storing schemas, and messages are serialised to Avro and pushed to Kafka topics. I want to know, when reading data from the consumer, how to find the schema id with which the Avro record was serialised. We require this schema…
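
One commonly cited detail that this question touches on: the Confluent Avro serializer writes values in a small wire format, a magic byte 0x0 followed by a 4-byte big-endian schema id and then the Avro payload. A minimal sketch of extracting the id from raw record bytes (assuming the consumer is configured with a ByteArrayDeserializer; the class name here is made up):

```java
import java.nio.ByteBuffer;

public class SchemaIdExtractor {
    /**
     * Extracts the schema id from a value serialized with the Confluent Avro
     * serializer. The wire format is: magic byte 0x0, then a 4-byte big-endian
     * schema id, then the Avro-encoded payload.
     */
    public static int schemaIdOf(byte[] value) {
        ByteBuffer buffer = ByteBuffer.wrap(value);
        byte magic = buffer.get();
        if (magic != 0x0) {
            throw new IllegalArgumentException("Not in Confluent wire format, magic byte = " + magic);
        }
        return buffer.getInt();
    }
}
```

When the KafkaAvroDeserializer is used instead, the schema lookup happens transparently and the id is not normally needed in application code.
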
6 votes, 1 answer

How to pass Basic Authentication to the Confluent Schema Registry?

I want to read data from a Confluent Cloud topic and then write to another topic. On localhost I haven't had any major problems, but the Confluent Cloud schema registry requires passing some authentication data that I don't know how to enter…
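
For reference, Confluent Cloud's Schema Registry accepts an API key and secret as HTTP basic auth, configured on the client through basic.auth.credentials.source and basic.auth.user.info. A hedged sketch of the producer configuration; the hostnames and the <...> placeholders are stand-ins, not real values:

```java
import java.util.Properties;

public class ConfluentCloudSchemaRegistryConfig {
    public static Properties producerProps() {
        Properties props = new Properties();
        // Broker-side auth (Confluent Cloud API key/secret for the cluster).
        props.put("bootstrap.servers", "pkc-xxxxx.region.provider.confluent.cloud:9092");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"<CLUSTER_API_KEY>\" password=\"<CLUSTER_API_SECRET>\";");

        // Schema Registry auth: a separate API key/secret passed as basic auth.
        props.put("schema.registry.url", "https://psrc-xxxxx.region.provider.confluent.cloud");
        props.put("basic.auth.credentials.source", "USER_INFO");
        props.put("basic.auth.user.info", "<SR_API_KEY>:<SR_API_SECRET>");

        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        return props;
    }
}
```

Older serializer versions used schema.registry.basic.auth.user.info instead of basic.auth.user.info, so the exact key may depend on the io.confluent version in use.
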
6 votes, 4 answers

Kafka consumer unit test with Avro Schema registry failing

I'm writing a consumer which listens to a Kafka topic and consumes a message whenever one is available. I've tested the logic/code by running Kafka locally and it works fine. While writing the unit/component test cases, it fails with Avro…
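
A pattern that often comes up for tests like this, sketched below under the assumption that the io.confluent serializer artifacts (5.3 or later) are on the test classpath: a schema.registry.url beginning with mock:// makes the serializer use an in-memory registry, so no real Schema Registry has to be running. The scope name, topic and schema here are invented for illustration.

```java
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

import java.util.Map;

public class MockRegistrySketch {
    public static void main(String[] args) {
        // A "mock://" URL makes the Confluent serializer use an in-memory
        // schema registry, so tests need no running registry instance.
        KafkaAvroSerializer serializer = new KafkaAvroSerializer();
        serializer.configure(Map.of("schema.registry.url", "mock://unit-tests"), false);

        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Greeting\",\"fields\":"
                        + "[{\"name\":\"message\",\"type\":\"string\"}]}");
        GenericRecord record = new GenericData.Record(schema);
        record.put("message", "hello");

        byte[] bytes = serializer.serialize("greetings-topic", record);
        System.out.println("Serialized " + bytes.length + " bytes against the mock registry");
    }
}
```
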
6 votes, 2 answers

How can we configure value.subject.name.strategy for schemas in Spring Cloud Stream Kafka producers, consumers and KStreams?

I would like to customize the naming strategy of the Avro schema subjects in Spring Cloud Stream Producers, Consumers and KStreams. This would be done in Kafka with the properties key.subject.name.strategy and value.subject.name.strategy ->…
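
For context, the subject naming strategy is a serializer-level setting; at the plain Kafka client level it looks like the sketch below. How these keys are surfaced through Spring Cloud Stream binder and binding configuration is not shown here and would need to be checked against the binder documentation.

```java
import java.util.Properties;

public class SubjectNameStrategyConfig {
    public static Properties avroProducerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("schema.registry.url", "http://localhost:8081");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");

        // Default is TopicNameStrategy (subject "<topic>-value"). RecordNameStrategy
        // names the subject after the fully qualified Avro record name instead;
        // TopicRecordNameStrategy combines both.
        props.put("value.subject.name.strategy",
                "io.confluent.kafka.serializers.subject.RecordNameStrategy");
        return props;
    }
}
```

One plausible route, an assumption rather than something confirmed here, is to pass the same key through spring.cloud.stream.kafka.binder.configuration so it reaches the underlying serializer.
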
6 votes, 1 answer

Multiple Message Types in a Single Kafka Topic with Avro

I have an event-sourced application built on top of Kafka. Currently I have one topic that holds multiple message types, all serialized/deserialized with JSON. The schema registry from Confluent looks like a good approach to message types…
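
A sketch of the consuming side of that approach, assuming the producer uses RecordNameStrategy (or TopicRecordNameStrategy) so several Avro record types can share one topic; the event type names are invented:

```java
import org.apache.avro.generic.GenericRecord;

public class MultiTypeDispatcher {
    // With RecordNameStrategy (or TopicRecordNameStrategy) several event types can
    // share one topic; the consumer can branch on the full Avro record name.
    public static void dispatch(GenericRecord event) {
        switch (event.getSchema().getFullName()) {
            case "com.example.OrderPlaced" -> System.out.println("order: " + event.get("orderId"));
            case "com.example.OrderCancelled" -> System.out.println("cancelled: " + event.get("orderId"));
            default -> System.out.println("unknown event type: " + event.getSchema().getFullName());
        }
    }
}
```
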
6 votes, 1 answer

Cannot connect to single-node Kafka server through Docker

I'm trying to connect to a single-node Kafka server through Docker, but I am getting the following error: %3|1529395526.480|FAIL|rdkafka#producer-1| [thrd:localhost:9092/bootstrap]: localhost:9092/bootstrap: Connect to ipv4#127.0.0.1:9092 failed:…
5 votes, 1 answer

Kafka Streams Protobuf cast exception

I am using Kafka streams to read and process protobuf messages. I am using the following properties for the stream: Properties properties = new Properties(); properties.put(ConsumerConfig.GROUP_ID_CONFIG, kafkaConfig.getGroupId()); …
5 votes, 1 answer

Avro schema with logicalType can't be used with latest confluent-kafka

We are using Kafka with Avro schemas and the schema registry set to FULL compatibility. Our schemas use logicalType fields, for example: { "name": "MyRecord", "type": "record", "fields": [ { "name": "created_at", "type": [ …
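
For comparison, here is a complete, self-contained schema of roughly the shape hinted at above (an illustrative schema, not the asker's actual one), parsed with the plain Avro Java API:

```java
import org.apache.avro.Schema;

public class LogicalTypeSchemaSketch {
    public static void main(String[] args) {
        // An illustrative record with a nullable timestamp-millis field; the
        // logicalType annotates the underlying long type.
        String schemaJson = """
                {
                  "name": "MyRecord",
                  "type": "record",
                  "fields": [
                    {
                      "name": "created_at",
                      "type": ["null", {"type": "long", "logicalType": "timestamp-millis"}],
                      "default": null
                    }
                  ]
                }
                """;
        Schema schema = new Schema.Parser().parse(schemaJson);
        System.out.println(schema.getField("created_at").schema());
    }
}
```
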
5 votes, 1 answer

How to properly register a Protobuf schema with Schema Registry / Kafka-Rest

I am trying to post a Protobuf schema to the Schema Registry using the kafka-rest interface: curl -X POST -H "Content-Type: application/vnd.kafka.protobuf.v2+json" \ -H "Accept: application/vnd.kafka.v2+json" \ --data '{"value_schema":…
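
Worth noting: schemas are registered against the Schema Registry's own REST API (default port 8081) rather than the REST Proxy's produce endpoint, and since Confluent Platform 5.5 a non-Avro schema is marked with a schemaType field. A minimal sketch, with a made-up subject name and .proto body:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterProtobufSchema {
    public static void main(String[] args) throws Exception {
        // Registering directly against Schema Registry, not the REST Proxy:
        // the payload carries the schema text plus schemaType=PROTOBUF.
        String body = "{"
                + "\"schemaType\": \"PROTOBUF\","
                + "\"schema\": \"syntax = \\\"proto3\\\"; message Greeting { string message = 1; }\""
                + "}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8081/subjects/greetings-value/versions"))
                .header("Content-Type", "application/vnd.schemaregistry.v1+json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```
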
5 votes, 2 answers

Spring Boot Embedded Kafka to produce an event using an Avro schema

I have created the below test class to produce an event using AvroSerializer. @SpringBootTest @EmbeddedKafka(partitions = 1, brokerProperties = { "listeners=PLAINTEXT://localhost:9092", "port=9092" }) @TestPropertySource(locations =…
5 votes, 2 answers

Kafka schema registry RestClientException: Unauthorized; error code: 401

I am trying to read data from a Kafka Avro topic using the Avro schema from the Confluent client registry. I am using io.confluent library version 5.4.1. This is the entry in the Gradle file: compile (group: 'io.confluent', name:…
5 votes, 1 answer

docker-compose: make Kafka wait for ZooKeeper and schema-registry wait for Kafka

I read "Docker (Compose) client connects to Kafka too early", but it doesn't say which command to check. How should I configure my Kafka broker so it retries when ZooKeeper is not ready? My schema-registry also fails because the Kafka broker is not…
5 votes, 1 answer

What value to provide for the Kafka schema.registry.ssl.engine.factory.class

I have a Spring Boot application with some Kafka producers/consumers and some integration tests (using embedded Kafka) for them. Everything worked fine until I lifted the Spring Boot version to 2.3 (from 2.1.x) and spring-kafka to 2.6.0. Now…
5 votes, 1 answer

Confluent Schema Registry: POST simple JSON schema with object having single property

OS: Ubuntu 18.x docker image (from dockerhub.com, as of 2020-09-25): confluentinc/cp-schema-registry:latest I am exploring the HTTP API for the Confluent Schema Registry. First off, is there a definitive assertion somewhere about what version of…