
I'm implementing an integration between two software systems. Basically, I need to copy records from one database to another; however, along the way I need to convert the foreign keys, adapting them to the database that will receive them.

To do this I used a Debezium connector to generate the producer messages. That part appears to work fine. The problem is consuming the messages.

I'm using a Node API with TypeORM and KafkaJS to convert the foreign keys and save the records, but messages are consumed far more slowly than they are produced. I've already increased the number of topic partitions to process them in parallel, which helped considerably, but it's still not enough.

await consumer.subscribe({
    topic,
    fromBeginning: true
})

await consumer.run({
    partitionsConsumedConcurrently: 22,
    eachBatch: async ({
        batch,
        resolveOffset,
        heartbeat,
        commitOffsetsIfNecessary
    }) => {
        for (const message of batch.messages) {
            if (message.value) {
                const value = JSON.parse(message.value.toString())
                const payload = value as T
                try {
                    // converts the foreign keys and saves the record
                    await onMessage({ topic: batch.topic, payload })
                } catch (e) {
                    console.log('error:', (e as Error).message)
                }
            }
            resolveOffset(message.offset)

            await commitOffsetsIfNecessary({
                topics: [
                    {
                        topic: batch.topic,
                        partitions: [
                            {
                                partition: batch.partition,
                                offset: message.offset
                            }
                        ]
                    }
                ]
            })

            await heartbeat()
        }
    }
})
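One thing that stands out in the handler above is that it commits offsets and sends a heartbeat on every message, which adds a broker round trip per record. A common variant is to resolve each offset in memory but commit and heartbeat once per batch. The sketch below extracts that pattern into a standalone function whose parameter shapes mirror KafkaJS's eachBatch payload; handleBatch and its onMessage callback are hypothetical names, not part of KafkaJS itself.

```typescript
// Sketch: resolve every offset, but commit/heartbeat once per batch.
// The callback shapes mimic kafkajs's eachBatch arguments; `onMessage`
// stands in for the foreign-key conversion and save logic.
type BatchPayload = {
    batch: {
        topic: string
        partition: number
        messages: { offset: string; value: Buffer | null }[]
    }
    resolveOffset: (offset: string) => void
    heartbeat: () => Promise<void>
    commitOffsetsIfNecessary: () => Promise<void>
}

async function handleBatch(
    { batch, resolveOffset, heartbeat, commitOffsetsIfNecessary }: BatchPayload,
    onMessage: (msg: { topic: string; payload: unknown }) => Promise<void>
): Promise<void> {
    for (const message of batch.messages) {
        if (message.value) {
            const payload = JSON.parse(message.value.toString())
            try {
                await onMessage({ topic: batch.topic, payload })
            } catch (e) {
                console.log('error:', (e as Error).message)
            }
        }
        resolveOffset(message.offset) // marks progress in memory only
    }
    // one commit and one heartbeat per batch instead of per message
    await commitOffsetsIfNecessary()
    await heartbeat()
}
```

With the default autoCommit settings, KafkaJS also commits resolved offsets on its own schedule, so the explicit per-message commit in the original handler is mostly extra network traffic.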

Key conversions are currently done with one database query (SELECT) per record. The database management system is PostgreSQL.
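Since the same source key can appear in many records, one per-record SELECT each time is a lot of repeated work. A minimal sketch of memoizing those lookups, assuming each source id maps to a stable target id: `lookupTargetId` below is a hypothetical stand-in for the SELECT against PostgreSQL, not an API from TypeORM.

```typescript
// Sketch: memoize foreign-key conversions so each distinct source id
// costs one SELECT instead of one SELECT per message.
// `lookupTargetId` is a hypothetical stand-in for the real database query.
const fkCache = new Map<string, number>()

async function convertForeignKey(
    sourceId: string,
    lookupTargetId: (id: string) => Promise<number>
): Promise<number> {
    const cached = fkCache.get(sourceId)
    if (cached !== undefined) return cached // cache hit: no query
    const targetId = await lookupTargetId(sourceId) // the expensive SELECT
    fkCache.set(sourceId, targetId)
    return targetId
}
```

The cache only holds if target ids are immutable during the migration; otherwise it would need invalidation.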

Consumer lag: (screenshot of consumer-lag metrics omitted)

I would like to know how I can improve the performance of the consumer without necessarily increasing my processing power.

OneCricketeer
Lucas Rissi