
I need to create an application that bridges multiple queueing solutions such as RabbitMQ, Kafka, and Azure Event Hubs. I am thinking of using Spring Cloud Stream so that I can define the bridge rules in application.yml. For example, I want to consume messages from Kafka1 (topic: Test) into RabbitMQ (queue: Test1), and from RabbitMQ (queue: Test2) into Kafka (topic: Test3).

To achieve this right now I have to create two functions, one for each bridge, and the messages transfer correctly. But if I want to add one more transfer in a few days, I have to change the code for it. Is there any way to do the binding in Spring Cloud Stream without writing a function for each bridge?

Currently I can achieve this like below

spring:
  cloud:
    stream:
      bindings:
        receive-in-0:
          destination: test
          binder: kafka
        receive-out-0:
          destination: test1
          binder: rabbit

        receive1-in-0:
          destination: test2
          binder: rabbit
        receive1-out-0:
          destination: test3
          binder: kafka
      function:
        definition: receive;receive1
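Spring Cloud Stream resolves per-binding binder names such as `kafka` and `rabbit` from the `spring.cloud.stream.binders` section; without it, all bindings fall back to a single default binder. A minimal sketch of that section, with placeholder broker addresses:

```yaml
spring:
  cloud:
    stream:
      binders:
        kafka:
          type: kafka
          environment:
            spring:
              cloud:
                stream:
                  kafka:
                    binder:
                      brokers: localhost:9092   # placeholder broker address
        rabbit:
          type: rabbit
          environment:
            spring:
              rabbitmq:
                host: localhost                 # placeholder host
```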
      

I created the receive and receive1 functions as two beans, like below.

  @Bean
  public Function<String, String> receive() {
      return message -> message; // pass-through: forward the payload unchanged
  }

  @Bean
  public Function<String, String> receive1() {
      return message -> message; // pass-through: forward the payload unchanged
  }

What I want is to avoid creating these beans, so that I can add N such transfers just by specifying the topics and broker types.
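One way to avoid declaring a bean per bridge is to register the identity functions programmatically. The sketch below uses no Spring APIs; `BridgeFunctions` is a hypothetical helper that turns a definition string like the one in the YAML above into one pass-through function per bridge name. In a real Spring Boot application, each entry could then be registered before the binding machinery starts, e.g. via `GenericApplicationContext.registerBean(name, Function.class, supplier)` from an `ApplicationContextInitializer`.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical helper: derives N identity bridge functions from a
// configured definition string such as "receive;receive1", instead of
// hand-writing one @Bean method per bridge.
public class BridgeFunctions {

    public static Map<String, Function<String, String>> fromDefinition(String definition) {
        Map<String, Function<String, String>> bridges = new LinkedHashMap<>();
        for (String name : definition.split(";")) {
            bridges.put(name.trim(), message -> message); // pass-through bridge
        }
        return bridges;
    }

    public static void main(String[] args) {
        Map<String, Function<String, String>> bridges = fromDefinition("receive;receive1");
        System.out.println(bridges.keySet());                      // [receive, receive1]
        System.out.println(bridges.get("receive").apply("hello")); // hello
    }
}
```

Adding a third transfer would then only require extending the definition string and the bindings in application.yml, not writing new code.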

  • If you want to "**consume** from Kafka", then data **out** doesn't sound like the correct label. In any case, Spring _Dataflow_ might make more sense here – OneCricketeer Apr 27 '22 at 20:03
  • Since the data is coming out of the topic test, I used the label dataOut. I thought I would be able to achieve this with Spring Cloud Stream itself, but I am just missing the core logic of creating multiple binders programmatically. – user3655499 Apr 28 '22 at 01:14
  • I don't understand the question. Are you asking how to add binders programmatically? That would start in your `@Configuration` class. But your question says "without writing any function", in which case I'm not sure you even want to use Spring, as at least some function code will be required; Spring Cloud _Dataflow_ needs less code to route data between binders – OneCricketeer Apr 28 '22 at 13:58
  • @OneCricketeer I have edited the question with more details. – user3655499 Apr 29 '22 at 10:59
  • Yeah, I still don't think Spring should really be used here. More specifically, I don't think there is any "identity function" for routing the data. The Kafka Connect framework is provided by Kafka and has connectors for RabbitMQ, and mirroring to other Kafka clusters. It'll also scale better – OneCricketeer Apr 29 '22 at 13:42
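As a concrete illustration of the Kafka-native route suggested in the comments: MirrorMaker 2, which ships with Apache Kafka, replicates topics between clusters from a single properties file, so adding a new topic to the bridge becomes a config change rather than a code change. A minimal sketch with placeholder cluster addresses:

```properties
# mm2.properties - run with: bin/connect-mirror-maker.sh mm2.properties
clusters = source, target
source.bootstrap.servers = localhost:9092   # placeholder
target.bootstrap.servers = localhost:9093   # placeholder

# replicate topic Test from the source cluster to the target cluster
source->target.enabled = true
source->target.topics = Test
```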

0 Answers