I have a few requirements for a Spring Cloud Stream application:
- It needs to take a KStream from a single Kafka topic on one cluster and send messages to multiple topics on another cluster.
- On some occasions it needs to send multiple messages based on a single message received.
- All of these messages need to be delivered at least once.
I have looked into using a Function, but I have not been able to work out how to send multiple messages from a single input message. I have also looked into using a Consumer with a Supplier, but I can't see that working well either. The way I am currently sending the messages is with a Consumer, sending via side effects using StreamBridge:
@Bean
@SuppressWarnings("unchecked")
public Consumer<KStream<String, String>> generateMessage() {
    return messages -> {
        final Map<String, KStream<String, String>> splitMessages =
                branchOutput(filterMessages(messages));
        KStream<String, MessageData>[] ksArray = splitMessages
                .values()
                .stream()
                .map(message ->
                        message.mapValues((key, jsonMessage) -> {
                            try {
                                return new MessageData(dataTransformService
                                        .transformMessage(key, jsonMessage, extractTopic(jsonMessage)),
                                        removeTopic(jsonMessage), "");
                            } catch (ClassNotFoundException e) {
                                return new MessageData(Collections.singletonList(CLASS_NOT_FOUND_EXCEPTION),
                                        removeTopic(jsonMessage), e.getMessage());
                            }
                        }))
                .toArray(KStream[]::new);
        ksArray[0].peek((key, value) -> sendMessage(key, value.getTransformedMessages(),
                OUTPUT_BINDING_1, value.getOriginalMessage(), value.getError()));
        ksArray[1].peek((key, value) -> sendMessage(key, value.getTransformedMessages(),
                OUTPUT_BINDING_2, value.getOriginalMessage(), value.getError()));
        ksArray[2].peek((key, value) -> sendMessage(key, value.getTransformedMessages(),
                OUTPUT_BINDING_3, value.getOriginalMessage(), value.getError()));
        ksArray[3].peek((key, value) -> sendMessage(key, value.getTransformedMessages(),
                OUTPUT_BINDING_4, value.getOriginalMessage(), value.getError()));
    };
}
// send message(s) to topic, or forward to the DLQ if there is a message handling exception
private void sendMessage(String key, List<String> transformedMessages, String binding,
        String originalMessage, String error) {
    try {
        for (String transformedMessage : transformedMessages) {
            if (!transformedMessage.equals(CLASS_NOT_FOUND_EXCEPTION)) {
                boolean sendTest = streamBridge.send(binding,
                        new GenericMessage<>(transformedMessage, Collections.singletonMap(
                                KafkaHeaders.KEY, (extractMessageId(transformedMessage)).getBytes())));
                log.debug(String.format("message sent = %s", sendTest));
            } else {
                log.warn(String.format("message transform error: %s", error));
                streamBridge.send(DLQ_OUTPUT_BINDING,
                        new GenericMessage<>(originalMessage, Collections.singletonMap(KafkaHeaders.KEY,
                                key.getBytes())));
            }
        }
    } catch (MessageHandlingException e) {
        log.warn(String.format("message send error: %s", e));
        streamBridge.send(DLQ_OUTPUT_BINDING,
                new GenericMessage<>(originalMessage, Collections.singletonMap(KafkaHeaders.KEY,
                        key.getBytes())));
    }
}
What I really need to know is: is there a better way of carrying out these requirements? If not, is there a way to check for acknowledgements from the external Kafka cluster (which I don't manage) that we are sending to, so that a message can be resent if it was not received?
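For reference, the binder setup I have in mind looks roughly like this (a minimal sketch; the binder names, topic names, and broker addresses are placeholders, and I believe `requiredAcks` is the Kafka binder property that controls how many broker acknowledgements the producer waits for before considering a send successful):

```yaml
spring:
  cloud:
    stream:
      binders:
        # source cluster, consumed via the Kafka Streams binder
        input-cluster:
          type: kstream
          environment:
            spring.cloud.stream.kafka.streams.binder.brokers: source-cluster:9092
        # destination cluster, written to via StreamBridge
        output-cluster:
          type: kafka
          environment:
            spring.cloud.stream.kafka.binder:
              brokers: destination-cluster:9092
              requiredAcks: all   # wait for all in-sync replicas to acknowledge
      bindings:
        generateMessage-in-0:
          destination: input-topic
          binder: input-cluster
        output-binding-1:
          destination: output-topic-1
          binder: output-cluster
```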