
I am using kafkajs to read data from a Kafka topic and expose it via an HTTP endpoint for Prometheus to scrape, but I am not able to expose the data from the Kafka topic. I have written a producer and a consumer like this.

produce.js

// import the Kafka constructor from the kafkajs library
const {
    Kafka,
    logLevel
} = require("kafkajs")
const fs = require("fs");
const path = require("path");

// the client ID lets kafka know who's producing the messages
const clientId = "my-app"
// we can define the list of brokers in the cluster
const brokers = ["localhost:9092"]
// this is the topic to which we want to write messages
const topic = "message-log"

// initialize a new kafka client and initialize a producer from it
const kafka = new Kafka({
    clientId,
    brokers,
    // logLevel: logLevel.INFO
})
const producer = kafka.producer({})

// we define an async function that writes a single message containing the metrics file
const produce = async () => {
    await producer.connect()
    // once the producer has connected, we send the message

    try {
        // send a single message to the configured topic;
        // the value is the contents of metrics.txt
        await producer.send({
            topic,
            acks: 1,
            messages: [{
                key: "metrics on premise",
                value: fs.readFileSync(path.join(__dirname,'metrics.txt'), 'utf8'),
            }, ],
        })

        // if the message is written successfully, log it
        console.log("writes:  #####################")
    
    } catch (err) {
        console.error("could not write message " + err)
    }

}

module.exports = produce

index.js

const produce = require("./produce")
const consume = require("./consume")
const fs = require("fs");
const path = require("path");

const express = require('express')
const app = express()
const port = 3003


app.get('/metrics', async (req, res) => {
    //res.send(fs.readFileSync(path.join(__dirname,'topic_message.txt'), 'utf8'))

    consume(res).catch(err => {
        console.error("Error in consumer: ", err)
    })
})

app.listen(port, () => {
    console.log(`Example app listening at http://localhost:${port}`)
})



// call the `produce` function and log an error if it occurs
produce().catch((err) => {
    console.error("error in producer: ", err)
})

Below is the consumer, consume.js:

const {
    Kafka,
    logLevel
} = require("kafkajs")
const fs = require("fs");
const path = require("path");
const clientId = "my-app"
const brokers = ["localhost:9092"]
const topic = "message-log"

const kafka = new Kafka({
    clientId,
    brokers,
    // logCreator: customLogger,
    // logLevel: logLevel.DEBUG,
})
const consumer = kafka.consumer({
    groupId: clientId,
    minBytes: 5,
    maxBytes: 1e6,
    // wait for at most 3 seconds before receiving new data
    maxWaitTimeInMs: 3000,
});

const consume = async (res) => {
    // first, we wait for the client to connect and subscribe to the given topic

    let myString = "";
    await consumer.connect()
    await consumer.subscribe({
        topic,
        fromBeginning: true
    })
    await consumer.run({
        // this function is called every time the consumer gets a new message
        eachMessage: ({
            message
        }) => {
            console.log("Message received ###############################################################################");
            res.send(message.value);
        },
    })

    setTimeout(async () => {
        await consumer.disconnect();
    }, 2000);
}

module.exports = consume

When I hit the endpoint, the consumed message is never sent back in the HTTP response.

Karthik
  • Please see https://stackoverflow.com/help/how-to-ask - we need at least some code, error messages, what you think should be happening vs. what is happening. – Robert Kawecki Mar 31 '21 at 10:58

1 Answer


Unless you're somehow scraping via streaming HTTP responses or using websockets (which you're not in this code), I'm not sure this is a good approach.

If you really want to send Kafka records to Prometheus, push them to the PushGateway from the consumer rather than having Prometheus perform a synchronous HTTP scrape.
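
As a rough illustration, here is a minimal sketch of that idea (not code from the answer): a standalone kafkajs consumer that forwards each record to a Pushgateway over HTTP. It assumes the Pushgateway is reachable at http://localhost:9091, that each message value already contains Prometheus text exposition format (as read from metrics.txt above), and that Node 18+ is used so the global fetch is available; the job name is made up.

// push-metrics.js - sketch only, assumptions noted above
const { Kafka } = require("kafkajs")

const kafka = new Kafka({ clientId: "my-app", brokers: ["localhost:9092"] })
const consumer = kafka.consumer({ groupId: "metrics-pusher" })

const pushgatewayUrl = "http://localhost:9091" // assumed Pushgateway address
const jobName = "on_premise_metrics"           // hypothetical job label

const run = async () => {
    await consumer.connect()
    await consumer.subscribe({ topic: "message-log", fromBeginning: true })
    await consumer.run({
        eachMessage: async ({ message }) => {
            // PUT replaces all metrics stored under this job on the Pushgateway
            const resp = await fetch(`${pushgatewayUrl}/metrics/job/${jobName}`, {
                method: "PUT",
                headers: { "Content-Type": "text/plain" },
                body: message.value.toString(),
            })
            if (!resp.ok) {
                console.error(`Pushgateway returned ${resp.status}`)
            }
        },
    })
}

run().catch((err) => console.error("error in metrics pusher: ", err))

Prometheus would then scrape the Pushgateway's own /metrics endpoint on its normal schedule, so the Express handler that starts a consumer per request would no longer be needed.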

OneCricketeer