If any message read from Kafka fails to be processed, I want to save these messages in a new topic and retry processing them a certain number of times. Meanwhile, I need to pause the consumer at its original position. How can I do that?
- The consumer would automatically "pause" because there would be an error. Have you tried setting an ErrorHandler? https://github.com/confluentinc/confluent-kafka-dotnet/blob/master/src/Confluent.Kafka/ConsumerBuilder.cs#L37 Then building a new Producer instance within that? – OneCricketeer Jan 10 '22 at 22:27
- For example, I send a request to an API via Kafka to send an invoice. During this sending process, the invoice may be corrupt and fail to send. In such a case, is it possible for the consumer to stop? I think the situation you mentioned applies only to fatal errors. – Gülsen Keskin Jan 11 '22 at 06:48
- Okay, so in your processing logic you would catch the exception, then use a Producer to send the record to a new topic, then commit the offset for the consumer (since the record is corrupt, you wouldn't want to process it again), and then move on consuming new records. Why do you need to pause, though? Why would you want to retry corrupt events? It sounds like you need two dead-letter queues: one for retry-able events, such as timeouts, and another for corrupted data that can be fixed at a later date. – OneCricketeer Jan 11 '22 at 15:52
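The two-queue routing described in the comment above can be sketched in plain Python. This is a minimal illustration only, not Confluent.Kafka code: the topic names, the `process` function, the exception types, and the `retries` field are all hypothetical stand-ins, and in-memory lists replace real Kafka producers and topics. A real retry counter would typically travel in a message header.

```python
MAX_RETRIES = 3

# In-memory stand-ins for Kafka topics (hypothetical names).
topics = {"invoices.retry": [], "invoices.dlq": []}

def produce(topic, message):
    """Stand-in for a Kafka producer: appends the message to a topic list."""
    topics[topic].append(message)

class RetryableError(Exception):
    """Transient failure (e.g. a timeout) worth retrying later."""

class CorruptMessageError(Exception):
    """Permanent failure: the payload itself is bad."""

def process(message):
    """Hypothetical business logic, e.g. posting an invoice to an API."""
    if message["payload"] == "corrupt":
        raise CorruptMessageError(message["payload"])
    if message["payload"] == "flaky":
        raise RetryableError(message["payload"])

def handle(message):
    """Consume one record; on failure, route it instead of pausing the consumer."""
    try:
        process(message)
    except CorruptMessageError:
        # Unfixable now: park it in a dead-letter topic for later inspection.
        produce("invoices.dlq", message)
    except RetryableError:
        retries = message.get("retries", 0)
        if retries < MAX_RETRIES:
            # Re-publish with an incremented retry counter rather than
            # blocking consumption of the original topic.
            produce("invoices.retry", {**message, "retries": retries + 1})
        else:
            produce("invoices.dlq", message)
    # In a real consumer you would commit the offset here in every branch,
    # so the main topic keeps flowing.

for msg in [{"payload": "ok"}, {"payload": "corrupt"}, {"payload": "flaky"}]:
    handle(msg)
```

After the loop, the corrupt record sits in the dead-letter list and the transient failure sits in the retry list with its counter incremented; the good record was processed and routed nowhere. A separate consumer on the retry topic would feed these messages back through `handle`.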
- Many thanks for your advice, time and patience. – Gülsen Keskin Jan 12 '22 at 13:07