
I am new to Spring Cloud Data Flow and need to listen for messages on a topic from an external Kafka cluster. This external Kafka topic in Confluent Cloud would be my Source, and its messages need to be passed on to my Sink application.

I am also using Kafka as my underlying message broker, which is a separate Kafka instance deployed on Kubernetes. I'm just not sure what the best approach is for connecting to the external Kafka instance. Is there an existing Kafka Source app that I can use, or do I need to create my own Source application to connect to it? Or is it just some kind of configuration that I need to set up to get connected?

Any examples would be helpful. Thanks in advance!

Steve
  • Even though both systems are Kafka, they are different instances and as such constitute the multiple-binder scenario described here: https://docs.spring.io/spring-cloud-stream/docs/3.1.3/reference/html/spring-cloud-stream.html#multiple-binders. Basically, based on your description you don't really need a Source/Sink pair. You can simply have a single application with a Function whose input binding points to the external Kafka and whose output binding points to your own Kafka on Kubernetes (see the configuration sketch after the comments). – Oleg Zhurakousky Aug 02 '21 at 13:26
  • @OlegZhurakousky If I create a single app with a Function, how do I set up the stream in SCDF? From the examples I have seen, you would typically add an app with a Function as type Processor, but can I add it as a Source instead in this case? It won't let me create a stream with just a processor and a sink. Or is there something else I need to add as the Source? – Steve Aug 02 '21 at 15:05

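A minimal sketch of the multiple-binder setup described in the comment above, not a definitive implementation. The function name (bridge), the topic names (external-topic, internal-topic), the broker addresses, and the Confluent Cloud security settings are all placeholders to replace with your own values.

    // Single Spring Cloud Stream application with one pass-through Function.
    import java.util.function.Function;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.context.annotation.Bean;

    @SpringBootApplication
    public class BridgeApplication {

        public static void main(String[] args) {
            SpringApplication.run(BridgeApplication.class, args);
        }

        // Messages read from the external topic are forwarded unchanged
        // to the output binding.
        @Bean
        public Function<String, String> bridge() {
            return payload -> payload;
        }
    }

The corresponding application.yml declares two Kafka binders and points each binding of the function at one of them:

    spring:
      cloud:
        function:
          definition: bridge
        stream:
          binders:
            confluent:                  # external Kafka in Confluent Cloud
              type: kafka
              environment:
                spring:
                  cloud:
                    stream:
                      kafka:
                        binder:
                          brokers: pkc-xxxxx.confluent.cloud:9092   # placeholder
                          configuration:
                            security.protocol: SASL_SSL   # typical Confluent Cloud settings,
                            sasl.mechanism: PLAIN         # adjust to your account
                            sasl.jaas.config: >-
                              org.apache.kafka.common.security.plain.PlainLoginModule
                              required username="<api-key>" password="<api-secret>";
            local:                      # Kafka deployed on Kubernetes
              type: kafka
              environment:
                spring:
                  cloud:
                    stream:
                      kafka:
                        binder:
                          brokers: my-kafka.kafka.svc.cluster.local:9092   # placeholder
          bindings:
            bridge-in-0:
              destination: external-topic    # topic in Confluent Cloud
              binder: confluent
            bridge-out-0:
              destination: internal-topic    # topic on the in-cluster Kafka
              binder: local

With this in place the application itself bridges the two clusters; how best to register it in SCDF (as a Source or a Processor) is the open question in the comments above.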
0 Answers