
I have simple @Bean definitions (Java 8 functions) that are mapped to a destination topic via their -out and -in bindings.

@Bean
public Function<String, String> transform() {
    return payload -> payload.toUpperCase();
}

@Bean
public Consumer<String> receive() {
    return payload -> logger.info("Data received: " + payload);
}

.yml config:

spring:
  cloud:
    stream:
      function:
        definition: transform;receive
      bindings:
        transform-out-0:
          destination: myTopic
        receive-in-0:
          destination: myTopic

Now, I want to invoke the transform function via a REST call so that its output goes to the destination topic (i.e. transform-out-0, mapped to myTopic) and is picked up by the consumer from that destination (receive-in-0, mapped to myTopic). Basically, every REST call should spawn a new Kafka producer instance and close it.

How can I achieve this using spring-cloud-stream, please?

Thanks

Angshuman

Angshuman Agarwal

1 Answer


You should use StreamBridge instead of having that transform function. This is the new recommended approach for dynamic destinations in Spring Cloud Stream. Here is the basic idea:

@Autowired
private StreamBridge streamBridge;

@RequestMapping
public void delegateToSupplier(@RequestBody String body) {
    streamBridge.send("transform-out-0", body);
}

and then provide this property through configuration: `spring.cloud.stream.source: transform`

Spring Cloud Stream will create an output binding called transform-out-0 for you. Each time the REST endpoint is called, through StreamBridge, you will send the data to the destination topic.
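
Putting the pieces of the answer together, the full configuration might look like this (a sketch based on the question's topic name; since `StreamBridge` now drives the output binding, the `transform` function is presumably dropped from the function definition, leaving only `receive`):

```yaml
spring:
  cloud:
    stream:
      # Tells Spring Cloud Stream to create the output binding transform-out-0
      source: transform
      function:
        definition: receive
      bindings:
        transform-out-0:
          destination: myTopic
        receive-in-0:
          destination: myTopic
```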

For more info see this.

sobychacko
  • Ah, thanks. So, say I am a message listener on IBM MQ and I want to forward the message to Kafka, I would use `StreamBridge`? Is this also the recommended way of invoking a `Producer` directly (in the native API, we do `Producer.send(...)`)? Otherwise, I see that the `Supplier` bean gets called continuously at a 1-second interval. I had replaced `transform` with a `Supplier` and noticed it getting polled continuously. – Angshuman Agarwal Jun 25 '20 at 14:54
  • If you are a listener on MQ and want to forward the message to Kafka, you don't need `StreamBridge`. A simple function approach should work as shown with your original code. You don't need to deal with `Producer` directly; that is handled for you by the Spring Cloud Stream Kafka binder. Nothing prevents you from using the producer API directly, though, but I don't see the need for your use case. – sobychacko Jun 25 '20 at 15:13
  • As to your second question about `Supplier`, you can control the polling frequency using the properties listed here: https://github.com/spring-cloud/spring-cloud-stream/blob/master/spring-cloud-stream/src/main/java/org/springframework/cloud/stream/config/DefaultPollerProperties.java – sobychacko Jun 25 '20 at 15:16
  • Thanks. I get the supplier polling part now. Sorry, I am a bit lost on the MQ message forwarding. So, for any `Foreign event-driven sources`, be it REST or an MQ listener, the doc says to use `StreamBridge`. With my functional approach above, how will it forward to the Kafka topic without `StreamBridge`? – Angshuman Agarwal Jun 25 '20 at 15:30
  • Sorry, I misunderstood the MQ part. That is kind of a source (supplier) that you want to take an action upon. `StreamBridge` or `Supplier` should work in that case. – sobychacko Jun 25 '20 at 16:21
  • Thank you. I think you meant I can use `Supplier` if I rewrite the `MQ` code with the new functional pattern (`-in`, `-out`). But to plug my existing `MQ` into `KAFKA`, I will need to route it via `StreamBridge`. It would be great if there were a sample showing how to mix Spring's functional pattern with the raw Kafka API. – Angshuman Agarwal Jun 26 '20 at 13:02
  • I want to call a REST endpoint after the consumer consumes from the topic. How can this be done in Spring Cloud Stream? – Sanjeev Sep 23 '21 at 05:11
  • You can use `StreamBridge` directly inside the consumer method. – sobychacko Sep 23 '21 at 14:26
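
Regarding the `Supplier` polling interval discussed in the comments above, the default poller can be tuned through configuration. A sketch (property names per `DefaultPollerProperties` in Spring Cloud Stream 3.x; the 5-second value is just an illustration):

```yaml
spring:
  cloud:
    stream:
      poller:
        # Poll the Supplier every 5 seconds instead of the 1-second default
        fixed-delay: 5000
```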