I'm writing a Kafka consumer in an ASP.NET web application. Everything seems to work, but every time I run the consumer, it consumes all the messages in the topic.

Here is the consumer config:

ConsumerConfig = new ConsumerConfig
{
    GroupId = _config["KafkaConfig:GroupId"],
    BootstrapServers = _config["KafkaConfig:BootstrapServer"],
    AutoOffsetReset = AutoOffsetReset.Latest,
};

I'm using the Confluent Kafka client for .NET.
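
A consumer built from this config would typically be wired up and run along these lines (a simplified sketch, not the actual application code; the topic name is illustrative and the method is assumed to sit in the same class that builds ConsumerConfig):

using System;
using System.Threading;
using Confluent.Kafka;

public void RunConsumer(CancellationToken stoppingToken)
{
    using var consumer = new ConsumerBuilder<Ignore, string>(ConsumerConfig).Build();
    consumer.Subscribe("my-topic"); // illustrative topic name

    try
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // With EnableAutoCommit left at its default (true), the consumed
            // offsets are committed back to the broker periodically for this GroupId.
            var result = consumer.Consume(stoppingToken);
            Console.WriteLine($"Received: {result.Message.Value}");
        }
    }
    finally
    {
        // Leaves the consumer group cleanly and commits any final offsets.
        consumer.Close();
    }
}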

SerPecchia

    A few questions come to mind. Is consumer auto commit disabled? Is the GroupId changing between runs? How long is the gap between runs of the consumer? – Nic Pegg Dec 06 '22 at 18:59

2 Answers

If you always want just the very last message whenever your app starts, then Kafka is not the correct tool.

You can Seek() your consumer to the very end, grab that offset, then Seek(N-1), but producers may have sent new records between those seek calls.
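
As a rough illustration of that approach with the Confluent .NET client, assuming a single-partition topic (broker address, topic name, and group id below are placeholders), you can query the watermark offsets and read the record just before the high watermark:

using System;
using Confluent.Kafka;

var config = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    GroupId = "last-message-reader",
    EnableAutoCommit = false
};

using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
var partition = new TopicPartition("my-topic", 0);

// The high watermark is the offset of the *next* record to be written,
// so the last existing record sits at High - 1. A producer may still append
// new records between this query and the read below.
var watermarks = consumer.QueryWatermarkOffsets(partition, TimeSpan.FromSeconds(5));
var lastOffset = watermarks.High.Value - 1;

if (lastOffset >= watermarks.Low.Value)
{
    // Assigning at an explicit offset has the same effect as assigning and then seeking.
    consumer.Assign(new TopicPartitionOffset(partition, lastOffset));
    var result = consumer.Consume(TimeSpan.FromSeconds(5));
    Console.WriteLine($"Last message: {result?.Message.Value}");
}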

If you must use Kafka, then you will want to use something like ksqlDB / Kafka Streams or Kafka Connect to create a KTable or database that stores the latest value per key, which you can query remotely.
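
Conceptually, what a KTable materializes is a continuously updated latest-value-per-key view. A hand-rolled sketch of that idea with the Confluent .NET client is below (broker address, topic, and group id are placeholders); ksqlDB / Kafka Streams give you the same thing with fault-tolerant, queryable state stores:

using System;
using System.Collections.Generic;
using Confluent.Kafka;

var config = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    GroupId = "latest-values-builder",
    AutoOffsetReset = AutoOffsetReset.Earliest
};

// The queryable view: one entry per key, holding the newest value seen so far.
var latestByKey = new Dictionary<string, string>();

using var consumer = new ConsumerBuilder<string, string>(config).Build();
consumer.Subscribe("my-topic");

while (true)
{
    var result = consumer.Consume(TimeSpan.FromSeconds(1));
    if (result?.Message.Key == null) continue;

    // Later records overwrite earlier ones, leaving only the latest value per key.
    latestByKey[result.Message.Key] = result.Message.Value;
}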

OneCricketeer

I'm not really into web applications, but in TypeScript (with KafkaJS, for example) you can set a fromBeginning flag:

await consumer.subscribe({topic: 'test-topic', fromBeginning: false});

This makes the consumer only get the messages from the point at which you subscribe onward.
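
For the Confluent .NET client used in the question, the closest equivalent is AutoOffsetReset, which only applies when the consumer group has no committed offset for a partition. A sketch of the relevant settings (broker address and group id are placeholders):

using Confluent.Kafka;

var config = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    // Keep the group id stable between runs so previously committed offsets are found.
    GroupId = "my-stable-group",
    // Only used when the group has no committed offset (or it is out of range):
    // Latest starts at the end of the topic, Earliest at the beginning.
    AutoOffsetReset = AutoOffsetReset.Latest,
    // Default is true; committed offsets let a restarted consumer resume where it left off.
    EnableAutoCommit = true
};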

Hope that helps!

Mapi