
I am consuming data from a Kafka topic using the reactor-kafka library with Spring WebFlux SSE streams. I need to return a special ServerSentEvent when all the messages from the topic have been consumed, i.e. when the maximum topic offset equals the current consumed offset (having subscribed from offset 0), so that clients know there are no more messages on the Kafka topic.

Is it possible to achieve something like this using WebFlux? For example: after every 100 elements consumed from a finite list and sent over the SSE stream as ServerSentEvents, the stream should emit one more ServerSentEvent with the comment "consumed".
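To make the goal concrete, here is a minimal sketch of the desired behaviour using plain Reactor on a finite source (the class name, batch size, and `String` payloads are illustrative; in the real case the elements would be Kafka records wrapped in `ServerSentEvent`s):

```java
import reactor.core.publisher.Flux;

public class SseMarkerSketch {

    // Split the finite source into windows of `batchSize` elements and append
    // a "consumed" marker after each window. The final (possibly partial)
    // window also gets a marker, which doubles as the end-of-stream signal.
    public static Flux<String> withMarkers(Flux<String> source, int batchSize) {
        return source
                .window(batchSize)
                .concatMap(window -> window.concatWith(Flux.just("consumed")));
    }
}
```

So for a source of 250 elements and a batch size of 100, subscribers would see 100 events, a marker, 100 events, a marker, the last 50 events, and a final marker.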

2 Answers


I might be wrong, but this does not make sense to me.
Kafka is event streaming (always on); there is no concept of a "last message" in Kafka.

You could write some code on the consumer side so that, when no event arrives within a certain time, an action is triggered.
But this is not accurate, and does not make much sense either.
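That idle-timeout idea can be sketched with Reactor's `timeout` operator (the class name, method, and marker value are illustrative, not part of any real API): when no record arrives within the given duration, the stream switches to a one-off marker event and completes.

```java
import java.time.Duration;
import reactor.core.publisher.Flux;

public class IdleMarkerSketch {

    // If no element arrives within `idle`, the timeout operator switches to
    // the fallback publisher: a single marker event, after which the stream
    // completes. "No new records for a while" is thus treated as "caught up",
    // which is approximate, as noted above.
    public static Flux<String> withIdleMarker(Flux<String> records, Duration idle, String marker) {
        return records.timeout(idle, Flux.just(marker));
    }
}
```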

sgtcortez
  • True, but if we ignore Kafka completely, is it possible to achieve something like this using WebFlux? i.e. after every 100 elements consumed from any finite list of elements and sent over an SSE stream as ServerSentEvents, the SSE stream should get one more event, a ServerSentEvent with the comment "consumed". – Harish Vashist Jun 29 '21 at 00:55

I don't know if this is what you want.

    @GetMapping(path = "/sse/endpoint", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<Event> sse() {
        // The consumer should be a shared bean rather than a new instance per
        // request; otherwise consume() and the returned Flux use different sinks.
        return kafkaConsumer.receive();
    }

    public class KafkaConsumer {

        // Multicast sink: every SSE subscriber receives the events emitted here.
        private final Sinks.Many<Event> sseEventSender = Sinks.many().multicast()
                .onBackpressureBuffer();

        public Flux<Event> receive() {
            return sseEventSender.asFlux();
        }

        public void consume() {
            if (...) { // condition under which the special event should be emitted
                sseEventSender.tryEmitNext(event);
            }
        }
    }
echooymxq
  • My aim is: 1) read from Kafka; 2) send the Kafka messages as ServerSentEvents derived from the Flux created in the previous step; 3) if any message satisfies a condition, send a special extra message as a ServerSentEvent, e.g. after 100 messages have been consumed from Kafka, send a high-water comment in a ServerSentEvent, because clients rely on such an event to kick-start their processing. I have done 1 and 2 with the help of Spring WebFlux and reactor-kafka's receive method. Now how do I integrate 3 with 1 and 2? – Harish Vashist Jun 29 '21 at 02:53
  • `sseEventSender.tryEmitNext()` can emit any element you want: you can emit an extra message when your condition is met, or else emit the Kafka message as the event. – echooymxq Jun 29 '21 at 03:11
  • It is not either a Kafka message or a special message. The goal is to send all Kafka messages as SSE and, if any condition is satisfied on a message, send an extra special message, e.g. send the first 10 Kafka messages, then the special message, then all the remaining messages present on Kafka. – Harish Vashist Jun 29 '21 at 03:32
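One way to integrate step 3 with steps 1 and 2, sketched with plain `Flux<String>` standing in for the real `ReceiverRecord` payloads (the class name and predicate are illustrative): pass every record through and, whenever a record satisfies the condition, immediately follow it with the extra marker event.

```java
import java.util.function.Predicate;
import reactor.core.publisher.Flux;

public class ConditionalMarkerSketch {

    // Pass every record through as an SSE payload; when a record satisfies
    // the predicate, follow it immediately with the extra marker event.
    // concatMap preserves the original record order.
    public static Flux<String> withConditionalMarker(
            Flux<String> records, Predicate<String> condition, String marker) {
        return records.concatMap(record -> condition.test(record)
                ? Flux.just(record, marker)
                : Flux.just(record));
    }
}
```

Applied to the Flux returned by reactor-kafka's receive method, this keeps all Kafka messages in the stream and injects the special ServerSentEvent only where the condition holds.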