Questions tagged [confluent-schema-registry]

Schema Registry provides a store, backed by Kafka, for storing and retrieving Avro schemas. It also provides Avro serializers for Kafka clients.

Schema Registry is part of the Confluent Open Source stream processing platform.

It provides a serving layer for your metadata: a RESTful interface for storing and retrieving Avro schemas. It stores a versioned history of all schemas, supports multiple compatibility settings, and allows schemas to evolve according to the configured compatibility setting. It also provides serializers that plug into Kafka clients and handle schema storage and retrieval for Kafka messages sent in the Avro format.
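As a sketch of that RESTful interface, the snippet below builds the JSON payload that Schema Registry's `POST /subjects/{subject}/versions` registration endpoint expects: a JSON body whose `schema` field is the Avro schema serialized as a string. The subject name `orders-value` and the `Order` record are hypothetical examples, not part of any question on this page.

```python
import json

# A hypothetical Avro schema; Schema Registry stores versioned schemas
# like this under a subject (conventionally "<topic>-value").
avro_schema = {
    "type": "record",
    "name": "Order",
    "fields": [{"name": "id", "type": "string"}],
}

def registration_payload(schema: dict) -> str:
    # The endpoint expects {"schema": "<schema as a JSON *string*>"},
    # i.e. the Avro schema is double-encoded, not nested as raw JSON.
    return json.dumps({"schema": json.dumps(schema)})

payload = registration_payload(avro_schema)
# POST this body to http://<registry>/subjects/orders-value/versions
# with Content-Type: application/vnd.schemaregistry.v1+json
```

On success the registry responds with the global schema id, which serializers then embed in every message.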

1076 questions
0
votes
1 answer

Kafka topic has two Avro classes

I'm trying to understand more about what the schema registry keeps per schema. Where can I go, or what tool can I use, to find the name of the Java package associated with the class that was generated from the Avro file? The specific piece…
0
votes
2 answers

Do we need to manually cache schema registry?

We are currently using Protocol Buffers as the serialization mechanism for Kafka messages. We are going to move to Avro. We tested the Avro Confluent consumer with Schema Registry, and according to those tests, the Avro consumer is a little bit slower compared to…
0
votes
0 answers

Compiling kafka schema-registry fails for building kafka-connect-hdfs

I'm trying to build kafka-connect-hdfs by following this FAQ. While trying to compile schema-registry, I get the following error: [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on…
0
votes
0 answers

Writing failed when using remote Kafka schema_registry

My code is implemented for writing Avro data through a Schema Registry server to a Kafka broker. The local test is good (I set up a local broker and Schema Registry server), but when I changed my configuration file and used the remote Kafka and…
Jack
0
votes
1 answer

Can I use schema registry to get the schema when using the Kafka S3 sink connector?

I have a Kafka topic whose value is in Avro format, with the schema stored in Schema Registry. Now I want to set up an S3 sink, following this:…
0
votes
1 answer

Schema registry: share partially / authorization system

We need to share part of our Schema Registry with another company and don't want them to see all the schemas. They also need to do the same for theirs. Is there any way that each of us can share only part of our schema registry?
0
votes
0 answers

Unable to link Kafka MQTT source connector to InfluxDB sink connector

We're trying to link an MQTT source connector to an InfluxDB sink connector. Right now the former is working fine, but the latter gives the exception below: org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to…
ilcorvo
0
votes
1 answer

Confluent JDBC connector And Flink consumer

We are trying to use the SQL Server JDBC connector with KafkaAvroSerializer, and also provide a customized ProducerInterceptor to encrypt data before sending it to Kafka. On the consumer side, we want to use the Flink connector to decrypt and then use the…
0
votes
1 answer

Schema Registry won't start after upgrading to Confluent 4.1

I have recently upgraded Confluent to 4.1, but Schema Registry seems to have some issues. On confluent start, schema-registry (and consequently ksql-server) cannot start. Here's the error I get in the logs of schema-registry: [2018-04-20…
0
votes
1 answer

How to set Spring Kafka consumer max attempts when using Schema Registry

I am developing a Spring Boot server with Spring Kafka (1.3.2.RELEASE), Apache Avro (1.8.2) and io.confluent's Schema Registry (3.1.2). So every time the Kafka listener gets a Kafka message, it will find the schema id in the message and get the Avro schema…
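The lookup this question describes relies on Confluent's wire format: each serialized message starts with a magic byte (0x00) followed by the schema id as a 4-byte big-endian integer, and the deserializer uses that id to fetch the schema from the registry. A minimal sketch of extracting the id (the sample message and id 42 below are made up for illustration):

```python
import struct

def extract_schema_id(message: bytes) -> int:
    """Return the schema id embedded in a Confluent-framed Kafka message.

    Wire format: byte 0 is the magic byte (0x00), bytes 1-4 hold the
    schema id as a big-endian 32-bit integer, and the Avro payload follows.
    """
    if len(message) < 5 or message[0] != 0:
        raise ValueError("not a Confluent schema-registry framed message")
    (schema_id,) = struct.unpack(">I", message[1:5])
    return schema_id

# Hypothetical message framed with schema id 42:
framed = b"\x00" + struct.pack(">I", 42) + b"avro-payload"
```

Deserializers typically cache the id-to-schema mapping after the first registry lookup, so the REST call happens once per schema rather than once per message.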
Azarea
0
votes
1 answer

How to use JDBC connector with Customized Encryption

We have a requirement to use the JDBC connector to read data from an RDBMS and then use our custom encryption before pushing the data to Kafka, and to decrypt the data on the way out before pushing it to subsequent sinks. To achieve this, do we need to…
0
votes
1 answer

JDBC Confluent Kafka connector and topic per schema

We recently started using the Confluent Kafka JDBC connector to import RDBMS data. With the default configuration settings, it seems that one topic is created for every table in the schema. I would like to know if there is any way to create a topic…
0
votes
2 answers

Troubles with ksql running in docker

I have Confluent Kafka, ZooKeeper, schema-registry and KSQL running in containers on a Kubernetes cluster. Kafka, ZooKeeper and Schema Registry work fine; I can create a topic and write data in Avro format, but when I'm trying to check KSQL and create…
0
votes
1 answer

Kafka Connect HDFS connector with Schema Registry

I referred to the following link to understand the HDFS connector for Kafka: https://docs.confluent.io/2.0.0/connect/connect-hdfs/docs/index.html I was able to export data from Kafka to HDFS with Hive integration. Now I am trying to write Avro records to…
0
votes
1 answer

Confluent Schema Registry - 405 error after creating topic

I am running a local version of Confluent (4.0) on macOS. After starting it up, creating a topic newtopic, and going to http://localhost:18081/subjects/newtopic, I get the following error: {"error_code":405,"message":"HTTP 405 Method Not…
Joe