
I created 4 topics for a Kafka consumer. The consumer received data (messages) for only about 4 days; after that, it could no longer receive anything. I also checked the producer, and it is successfully sending data. When I create a new topic and assign it to the producer, the producer is able to send messages to that new topic. So the problem is that I need to create a new topic every 3 to 4 days and assign it to the producer. I am unable to find the cause; I have also checked the related log files, and nothing there indicates what the problem is.

    It would be useful if you provided a [mcve] of your code. By default, Kafka stores data for 7 days – OneCricketeer Oct 01 '21 at 14:29
  • How specifically are you checking the producer is working? Are you actually using `GetOffsetShell`, for example, and ensuring the high-watermark is increasing? How are you checking the consumer actually has messages to read and is not stuck at some particular offset? If you describe its consumer group, what do you see? – OneCricketeer Oct 01 '21 at 14:41
  • @OneCricketeer "How specifically are you checking the producer is working?" When I assign a new topic, the consumer is able to consume data. If it gets stuck after some specific offset, can I set any properties in my application's `KafkaProperties` configuration? Like `earliest`, `latest`, etc.? – Bimal Kumar Dalei Oct 02 '21 at 04:51
  • In a regular consumer you need to handle errors on your own. Only in Kafka Streams is there a method for skipping deserialization exceptions. Your slf4j application logging configuration should be able to tell you when/why your code is stopping on which records, not the Kafka client properties... Besides that, if your producer stops working, then so does the consumer, no matter how many times you restart that consumer, since the offset will be committed within the consumer group. So, again, if you describe the consumer group, what do you see? – OneCricketeer Oct 02 '21 at 13:03
  • @OneCricketeer Thanks for your time. I think the problem has been resolved; the Kafka consumer can now consume data regularly without any issue. At the application level I made some changes so that it consumes only the latest data. Before, it consumed the earliest data as well as the latest. – Bimal Kumar Dalei Oct 18 '21 at 04:43
  • Sounds like you set `auto.offset.reset=latest`, or the lag is just zero, or you've added code to seek to the end of the topic – OneCricketeer Oct 18 '21 at 14:12
  • 1
    @OneCricketeer, Yes, right have made some changes in Kafka properties like: " props.put("auto.offset.reset", "latest"); " – Bimal Kumar Dalei Oct 21 '21 at 04:37
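
The fix described in the comments can be sketched as a minimal consumer configuration. This is an illustrative sketch only: the broker address, group id, and class name below are assumptions, not taken from the thread, and it builds just the `Properties` object (constructing an actual `KafkaConsumer` would require the kafka-clients dependency and a running broker):

```java
import java.util.Properties;

public class ConsumerConfigSketch {

    // Builds consumer properties matching the change described in the
    // comments. "bootstrap.servers" and "group.id" values are placeholders.
    static Properties buildProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed address
        props.put("group.id", "my-consumer-group");       // assumed group id
        // With "latest", a consumer that has no committed offset starts
        // reading from the end of the partition instead of replaying all
        // retained records (which "earliest" would do).
        props.put("auto.offset.reset", "latest");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProps().getProperty("auto.offset.reset"));
    }
}
```

Note that `auto.offset.reset` only applies when the group has no valid committed offset; a group that is already committing offsets will resume from its last commit regardless of this setting, which is why the comments also ask about describing the consumer group to check its lag.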

0 Answers