
A note up front: I am fairly new to Kafka.

I am trying to get all schemas from the Schema Registry, but I cannot do it with a schema registry client alone. It only works if I instantiate a KafkaConsumer first.

I can't understand why. Here's the code (with the consumer in place).

ConsumerConfig is just a class holding all the required configuration, including the Schema Registry URL.

Consumer<String, String> consumer = new KafkaConsumer<>(ConsumerConfig.get());
CachedSchemaRegistryClient client = new CachedSchemaRegistryClient(ConsumerConfig.getSchemaRegistryURL(), 30);
Collection<String> listOfSubjects = client.getAllSubjects();
consumer.close();
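
For reference, here is a rough sketch of what the ConsumerConfig class looks like (the property values below are placeholders, not my real configuration):

import java.util.Properties;

public class ConsumerConfig {

    public static Properties get() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder
        props.put("group.id", "example-group");              // placeholder
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }

    public static String getSchemaRegistryURL() {
        return "http://localhost:8081";                      // placeholder
    }
}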

Without the consumer, I get:

io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: No content to map due to end-of-input

With the consumer, everything works fine. I would appreciate it if someone could shed some light on why this happens, as I see no reason to connect to the actual Kafka cluster via a consumer just to access the Schema Registry, which lives at a different endpoint.

  • Creating a consumer will not get you all the schemas, it'll only be possible to get the schemas for the topic you've assigned to the consumer – OneCricketeer Jan 03 '19 at 17:10
  • Are there any errors in schema registry log? Do you get correct response when calling schema registry using curl - `curl http://localhost:8081/subjects`? – belo Jan 03 '19 at 21:23
  • @cricket_007 I am able to get every schema information without any further coding regarding consumer instantiation. – LeYAUable Jan 06 '19 at 15:10
  • @belo Yes, I tested it on postman and it works just fine – LeYAUable Jan 06 '19 at 15:10
  • Sorry, I don't understand. Consumer doesn't communicate with the registry. The deserializer does. Again, the Consumer isn't needed to get the subjects or schemas (as shown below). Besides, you have not mentioned what version of the Clients or Kafka you are using. If you think this is really an error, feel free to post the issue on Schema Registry Github. – OneCricketeer Jan 07 '19 at 03:00
  • Yes, I also find it weird precisely for the same reasons you just stated...it shouldn't be necessary but weirdly it is not working without it. I'll further investigate. – LeYAUable Jan 07 '19 at 11:07

1 Answer


You don't have to create a KafkaConsumer instance at all; the two are completely independent.

If you just want to get all the subjects and schemas from the Schema Registry, create an instance of CachedSchemaRegistryClient and call the relevant method.

Here is a working example:

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import org.apache.avro.Schema;
import java.util.Collection;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SchemaRegistryExample {
    // Local schema cache (not needed just to list the subjects)
    private final static Map<String, Schema> schemas = new ConcurrentHashMap<>();
    protected static SchemaRegistryClient schemaRegistryClient;

    public static void main(String[] args) {
        String registryUrl = "http://localhost:8081";
        try {
            // 30 is the maximum number of schemas the client will cache
            schemaRegistryClient = new CachedSchemaRegistryClient(registryUrl, 30);
            System.out.println(schemaRegistryClient);
            Collection<String> subjects = schemaRegistryClient.getAllSubjects();
            System.out.println(subjects);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
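
If you also want the actual schema text for each subject (not just the subject names), the same client exposes getLatestSchemaMetadata. A minimal sketch, assuming the registry runs at http://localhost:8081:

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class PrintLatestSchemas {
    public static void main(String[] args) throws Exception {
        // Registry URL is an assumption; adjust to your environment
        SchemaRegistryClient client = new CachedSchemaRegistryClient("http://localhost:8081", 30);
        for (String subject : client.getAllSubjects()) {
            // Latest registered version of the schema under this subject
            SchemaMetadata metadata = client.getLatestSchemaMetadata(subject);
            System.out.println(subject + " (v" + metadata.getVersion() + "): " + metadata.getSchema());
        }
    }
}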
  • Well, that was also my understanding, and it is what I was doing in the code I posted (if you remove the instantiation and closing of the KafkaConsumer). But I get the error I referred to. – LeYAUable Jan 03 '19 at 10:25