I have a question about setting up a stream processor with Kafka when the name of the topic (Kafka broker) differs from the name of the subject (Schema Registry).
spring:
  cloud:
    schema-registry-client:
      endpoint: http://localhost:8081
      cached: true
    stream:
      function:
        definition: process
      default:
        consumer:
          use-native-decoding: true
        producer:
          use-native-encoding: true
          header-mode: none
      bindings:
        process-in-0:
          group: spring-boot-kafka
          destination: abc.bla
          consumer:
            max-attempts: 3
        process-out-0:
          destination: def.bla
      kafka:
        binder:
          auto-add-partitions: false
          auto-create-topics: false
          consumer-properties:
            key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
            value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
            specific.avro.reader: true
            schema.registry.url: http://localhost:8081
            allow.auto.create.topics: false
            auto.register.schemas: false
          producer-properties:
            key.serializer: org.apache.kafka.common.serialization.StringSerializer
            value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
            schema.registry.url: http://localhost:8081
            auto.register.schemas: false
          brokers:
            - localhost:9092
          configuration:
            allow.auto.create.topics: false
            auto.register.schemas: false
            application.id: "${spring.application.name}"
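One knob that looks related is the Confluent serializer's subject name strategy, which can be passed through the binder's pass-through properties. A sketch only: `value.subject.name.strategy` and `RecordNameStrategy` are stock Confluent serializer settings (not Spring Cloud Stream ones), but I have not verified that this resolves the mismatch:

```yaml
# Sketch, not verified: overrides how the subject is derived for values.
# value.subject.name.strategy is a Confluent serializer setting, so it
# goes into the binder's producer-properties pass-through block.
spring:
  cloud:
    stream:
      kafka:
        binder:
          producer-properties:
            value.subject.name.strategy: io.confluent.kafka.serializers.subject.RecordNameStrategy
```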
At first everything seems to work fine with the Kafka broker and the Schema Registry, but as soon as the processor receives an event, the Schema Registry magic starts. Instead of abc, the full topic name abc.bla is sent to the Schema Registry as the subject, and the registry answers with "not found".
Expected: localhost:8081/subjects/abc/versions
Unexpected and wrong: localhost:8081/subjects/abc.bla/versions
error_code: 40401
message: "Subject not found."
I wonder what's wrong, because a plain producer or consumer client seems to be able to derive the correct subject name from the topic without any explicit configuration.
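For reference, the Confluent serializers' default TopicNameStrategy builds the subject purely from the topic name plus a "-key"/"-value" suffix. A minimal sketch of that documented convention (my own helper below, not the Confluent class), which makes me wonder which strategy is actually in play here, since neither abc nor abc.bla matches it:

```java
public class SubjectNames {

    // Mirrors the documented behaviour of Confluent's default
    // TopicNameStrategy: subject = topic name plus "-key" or "-value".
    static String defaultSubject(String topic, boolean isKey) {
        return topic + (isKey ? "-key" : "-value");
    }

    public static void main(String[] args) {
        // With the binding destination from the config above:
        System.out.println(defaultSubject("abc.bla", false)); // abc.bla-value
    }
}
```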
Here is the processor code:
@SpringBootApplication
@EnableSchemaRegistryClient
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    @Bean
    public Function<ABC, DEF> process() {
        return Transformer::transform;
    }
}
Here is the stack trace where I believe the problem could be:
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema: { a long schema }
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Subject not found.; error code: 40401
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:298) ~[kafka-schema-registry-client-7.0.0.jar:na]
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:368) ~[kafka-schema-registry-client-7.0.0.jar:na]
at io.confluent.kafka.schemaregistry.client.rest.RestService.lookUpSubjectVersion(RestService.java:453) ~[kafka-schema-registry-client-7.0.0.jar:na]
at io.confluent.kafka.schemaregistry.client.rest.RestService.lookUpSubjectVersion(RestService.java:440) ~[kafka-schema-registry-client-7.0.0.jar:na]
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getIdFromRegistry(CachedSchemaRegistryClient.java:254) ~[kafka-schema-registry-client-7.0.0.jar:na]
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.getId(CachedSchemaRegistryClient.java:444) ~[kafka-schema-registry-client-7.0.0.jar:na]
at io.confluent.kafka.schemaregistry.client.SchemaRegistryClient.getId(SchemaRegistryClient.java:192) ~[kafka-schema-registry-client-7.0.0.jar:na]
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:73) ~[kafka-avro-serializer-5.3.0.jar:na]
at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:53) ~[kafka-avro-serializer-5.3.0.jar:na]
at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:62) ~[kafka-clients-2.7.1.jar:na]
Does anybody have an idea how I can configure io.confluent.kafka.serializers.KafkaAvroDeserializer or io.confluent.kafka.serializers.KafkaAvroSerializer correctly?
Thanks a lot, Markus