
I have a Kafka broker running on Docker, and my NestJS app is set up as a microservice. I would like to have a route that returns all the existing messages from a topic in the Kafka broker. Is there a way to do this?

Below is my NestJS code; can you help me implement how to get them?

import { Controller, Get, OnApplicationShutdown } from "@nestjs/common";
import { EventPattern, MessagePattern, Payload } from "@nestjs/microservices";
import { Consumer, ConsumerRunConfig, ConsumerSubscribeTopics, Kafka } from "kafkajs";

@Controller('/kafka')
export class KafkaEventController {
    private readonly kafka = new Kafka({
        brokers: ['localhost:29092']
    }).producer()
    private adminKafka: Kafka


    @Get('getmessages')
    public async getConsumers (): Promise<any> {

    }

    @MessagePattern('Cart')
    public async sincronize (@Payload() payload: any): Promise<void> {

    }
}
  • You would not use a producer to **consume** messages. But sure, it [is possible](https://docs.nestjs.com/faq/hybrid-application) to set up the Kafka microservice separately; it doesn't look like you've tried anything... – OneCricketeer Nov 04 '22 at 18:14
  • Also, Kafka topics should be considered infinite, so what do you mean "all messages"? And how do you plan on handling out-of-memory issues while consuming and storing those for an HTTP response? – OneCricketeer Nov 04 '22 at 18:17
  • I tried, but I could only get the messages sent in the event, not the ones already stored in Kafka – Christian Guimarães Nov 04 '22 at 20:29
  • I will separate topics by month, and also, if there is a way for me to get the messages, I could serve them with pagination. Can you help me? – Christian Guimarães Nov 04 '22 at 20:30
  • Can you please share your consumer code? I personally haven't done this in nest myself, since, like I said topics are endless... You'd be better off writing into a postgresql or Mongo database, for example, then querying and paging that data from your api. More specifically, nest Kafka support does not offer any sort of batching functionality, only message by message, and if you try to deploy multiple instances in the same Kafka consumer group, you will never get "all messages" from one instance. There's also a Kafka REST Proxy from Confluent that is open source you could try instead – OneCricketeer Nov 04 '22 at 23:48
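The hybrid-application setup mentioned in the first comment can be sketched roughly as below (a minimal sketch of a `main.ts` using Nest's `connectMicroservice`; the broker address, `clientId`, and `groupId` are assumptions, as is the `AppModule` name):

```typescript
// main.ts -- hybrid app: an HTTP server plus a Kafka microservice listener.
// The broker address, clientId, and groupId are placeholder assumptions.
import { NestFactory } from '@nestjs/core';
import { MicroserviceOptions, Transport } from '@nestjs/microservices';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  // Attach a Kafka microservice to the same application instance,
  // so @MessagePattern handlers and HTTP routes coexist.
  app.connectMicroservice<MicroserviceOptions>({
    transport: Transport.KAFKA,
    options: {
      client: {
        clientId: 'cart-app',
        brokers: ['localhost:29092'],
      },
      consumer: {
        groupId: 'cart-consumer',
      },
    },
  });

  await app.startAllMicroservices(); // start the Kafka listener
  await app.listen(3000);            // start the HTTP server
}
bootstrap();
```

With this in place, the `@MessagePattern('Cart')` handler in the controller receives Kafka messages while the `@Get` route stays reachable over HTTP.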

1 Answer


To get all messages from a specific topic in Kafka you need to create a consumer. In NestJS you can put the consumer initialization code into the `onModuleInit` lifecycle hook, keep an in-memory store that collects each new message, and return that store from your endpoint, something like this:


const store = [];
export class KafkaEventController {

  @Get('getmessages')
  public async getConsumers(): Promise<any> {
    return store;
  }

  @MessagePattern('Cart')
  public async sincronize(@Payload() payload: any): Promise<void> {}

  async onModuleInit() {
    const topic = 'example-topic';
    const kafka = new Kafka({
      clientId: 'admin-kafka',
      brokers: ['localhost:9092'],
    });
    const consumer = kafka.consumer({
      groupId: 'groupId-consumer',
    });
    await consumer.connect();
    await consumer.subscribe({
      topic,
      fromBeginning: true,
    });
    await consumer.run({
      partitionsConsumedConcurrently: 1,
      eachMessage: async ({ message }) => {
        const msg = message.value.toString();
        store.push(JSON.parse(msg));
      },
    });
   
    // To move the offset position in a topic/partition
    await consumer.seek({
      offset: '0',
      topic,
      partition: 0,
    });
  }
}

You should be aware of memory growth: when there are tons of messages in your Kafka topic, storing them all in an in-memory array can exhaust your process memory.
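As for the pagination mentioned in the comments, a simple offset/limit slice over the in-memory store can work. This is a hedged sketch: the `paginate` helper and its `page`/`pageSize` parameters are hypothetical names, not part of Nest or kafkajs:

```typescript
// Paginate an in-memory message store by 1-based page number and page size.
// Out-of-range pages simply return an empty array.
function paginate<T>(items: T[], page: number, pageSize: number): T[] {
  const start = (page - 1) * pageSize;
  return items.slice(start, start + pageSize);
}

// Example: with 5 stored messages and a page size of 2,
// page 3 contains only the last message.
const messages = ['m1', 'm2', 'm3', 'm4', 'm5'];
console.log(paginate(messages, 1, 2)); // [ 'm1', 'm2' ]
console.log(paginate(messages, 3, 2)); // [ 'm5' ]
```

In the controller, `getConsumers` could then read `page` and `pageSize` from `@Query()` parameters and return `paginate(store, page, pageSize)` instead of the whole array, which at least bounds the size of each HTTP response (though not the size of the store itself).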

Mohsen Mahoski