
I'm using node-rdkafka (https://github.com/Blizzard/node-rdkafka) to consume messages. The basic setup works fine, but the handler fires every time I push something to the topic, regardless of whether the previous invocation has finished.

I want the next message to be processed only after the previous handler has completed.

Here is my implementation:

const Kafka = require('node-rdkafka');
const topic = 'create_user_channel';
const consumer = new Kafka.KafkaConsumer({
    'group.id':'consumer',
    'metadata.broker.list': '*******',
    'sasl.mechanisms': 'PLAIN',
    'sasl.username': '********',
    'sasl.password': '********',
    'security.protocol': 'SASL_SSL',
    'enable.auto.commit':false
}, {});

// Connect the consumer.
consumer.connect({timeout: "1000ms"}, (err) => {
    if (err) {
        console.log(`Error connecting to Kafka broker: ${err}`);
        process.exit(-1);
    }

});
let is_pause = false;
consumer.on('ready', (arg)=>{
    console.log('consumer ready.' + JSON.stringify(arg));
    console.log('Consumer is ready');
    consumer.subscribe([topic]);
    setInterval(function() {
        console.log('consumer has consume on :'+timeMs());  
        consumer.consume();
      }, 1000);
});

consumer.on('data',async (data)=>{
    console.log('consumer is consuming data');
    if(!is_pause) {
        is_pause = true;
        if(data && typeof data !== 'undefined') {
            try {
                console.log('consumer received the data');
                consumer.pause([topic]);
                console.log('consumer has pause the consuming');
                await processMessage(data);
                console.log('consumer is resumed');
                consumer.resume([topic]);
                is_pause = false;
            } catch(error) {
                console.log('data consuming error');
                console.log(error);
            }
        } else {
            is_pause = false;
        }
    }
});


Mayank Bansal

1 Answer


You are calling consume() (without any arguments), which returns messages as fast as possible.

If you want to control the consumption pace, you can use the other form, consume(size), which returns up to size Kafka records. For example, consume(1) will return the next Kafka record.

See the node-rdkafka Consumer docs.
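For example, here is a rough, untested sketch of that pattern. It reuses your processMessage function and the enable.auto.commit: false setting from the question (the broker/SASL settings are omitted), and uses commitMessageSync for the manual commit:

const Kafka = require('node-rdkafka');

const topic = 'create_user_channel';
const consumer = new Kafka.KafkaConsumer({
    'group.id': 'consumer',
    'metadata.broker.list': '*******',   // plus your SASL/SSL settings
    'enable.auto.commit': false
}, {});

consumer.connect();

consumer.on('ready', () => {
    consumer.subscribe([topic]);
    consumeNext();   // start the pull loop
});

// Fetch exactly one record, process it, commit it, and only then ask for the next one.
function consumeNext() {
    consumer.consume(1, async (err, messages) => {
        if (err) {
            console.error('consume error', err);
            return setTimeout(consumeNext, 1000);
        }
        if (!messages || messages.length === 0) {
            // Nothing available yet, poll again shortly
            return setTimeout(consumeNext, 1000);
        }
        const message = messages[0];
        try {
            await processMessage(message);        // your existing processing function
            consumer.commitMessageSync(message);  // manual commit, since auto commit is disabled
        } catch (e) {
            console.error('processing failed', e);
        }
        consumeNext();  // the next record is only requested after processing completes
    });
}

With this approach you no longer need pause()/resume() or the is_pause flag: the next record is simply not requested until the previous one has been processed and committed.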

Mickael Maison
  • I tried passing one as an argument too, but now it only reads one message and waits forever. Basically my requirement is to read one message, commit it manually, and then read another; even if there are multiple messages already in the queue, the consumer should consume one, process and commit it, and only then consume the next. – Mayank Bansal Jan 27 '20 at 10:49
  • You need to keep calling `consume(size)` to get the next messages – Mickael Maison Jan 27 '20 at 11:07
  • ` setInterval(() => { this.consumer.consume(); }, 1000); ` I am trying this, but it still only reads the first message – Mayank Bansal Jan 27 '20 at 11:11