I'm running into a problem that's almost identical to another StackOverflow question, but the solution provided doesn't work for me.
The error I'm getting (at runtime):
2022-09-14 11:53:49,301 [ERROR] [scheduling-1] BindingService - Failed to create producer binding; retrying in 30 seconds {}
java.lang.ClassCastException: class com.sun.proxy.$Proxy152 cannot be cast to class org.springframework.messaging.MessageChannel (com.sun.proxy.$Proxy152 and org.springframework.messaging.MessageChannel are in unnamed module of loader 'app')
at org.springframework.cloud.stream.binder.AbstractMessageChannelBinder.doBindProducer(AbstractMessageChannelBinder.java:92) ~[spring-cloud-stream-3.2.4.jar:3.2.4]
at org.springframework.cloud.stream.binder.AbstractBinder.bindProducer(AbstractBinder.java:152) ~[spring-cloud-stream-3.2.4.jar:3.2.4]
at org.springframework.cloud.stream.binding.BindingService.lambda$rescheduleProducerBinding$4(BindingService.java:351) ~[spring-cloud-stream-3.2.4.jar:3.2.4]
at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54) [spring-context-5.3.22.jar:5.3.22]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
at java.lang.Thread.run(Thread.java:829) [?:?]
But my build.gradle.kts has:
implementation("org.apache.avro:avro:1.11.0")
implementation("io.confluent:kafka-avro-serializer:7.2.1")
implementation("io.confluent:kafka-streams-avro-serde:7.2.1")
implementation("org.springframework.cloud:spring-cloud-starter-stream-kafka:3.2.5")
implementation ("org.springframework.cloud:spring-cloud-stream-binder-kafka-streams:3.2.5")
I've tried reworking my bean as a consumer, a producer (Supplier), and a plain function; the only thing that changes is whether it's the "producer binding" or the "consumer binding" that BindingService fails to create. Here is the Function version (a Consumer variant is sketched after it):
@Bean
public Function<KStream<String, InboundObject>, KStream<String, OutboundObject>> streamFunction() {
    return input -> {
        return input.map((k, v) -> new KeyValue<>(null, null));
    };
}
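For completeness, the Consumer variant was along these lines (a rough sketch: streamConsumer and the no-op body are placeholders, and InboundObject is one of my Avro-generated classes). With this bean instead of the Function, the log just switches to "Failed to create consumer binding" with the same ClassCastException.

import java.util.function.Consumer;

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;

// Rough sketch of the Consumer variant I tried (placeholder body).
@Bean
public Consumer<KStream<String, InboundObject>> streamConsumer() {
    return input -> input.foreach((key, value) -> {
        // no-op terminal operation, just so the topology has a sink
    });
}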
For Serdes, I'm using Avro; the relevant part of my binder configuration is:
default.value.serde: io.confluent.kafka.streams.serdes.avro.GenericAvroSerde
binder:
  producer-properties:
    key.serializer: org.apache.kafka.common.serialization.StringSerializer
    value.serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
    schema.registry.url: http://localhost:9091
  consumer-properties:
    key.deserializer: org.apache.kafka.common.serialization.StringDeserializer
    value.deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
    schema.registry.url: http://localhost:9091
    specific.avro.reader: true
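In case it's relevant, the bindings use the standard functional naming derived from the bean name (streamFunction-in-0 / streamFunction-out-0); roughly this shape, where the destination values are placeholders rather than my real topic names:

spring:
  cloud:
    stream:
      function:
        definition: streamFunction
      bindings:
        streamFunction-in-0:
          destination: inbound-topic    # placeholder topic name
        streamFunction-out-0:
          destination: outbound-topic   # placeholder topic name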