
Getting this error:

2020-11-12 20:16:35.463 ERROR 17552 --- [ extractor-3] o.s.k.support.LoggingProducerListener : Exception thrown when sending a message with key='abcdefghi' and payload='{"header": {"id": "ururirehdjd", "Env": "88"}, "etc...' to topic <TOPIC_NAME>:

org.apache.kafka.common.errors.TimeoutException: Topic <TOPIC_NAME> not present in metadata after 60000 ms.

My Spring Batch job bypasses this error and updates the writeCount even though the message is not published to the topic. How can I throw an exception in such scenarios?
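One way to surface the failure (a sketch of the custom-writer workaround suggested in the comments below, not the original poster's code) is to replace the KafkaItemWriter with an ItemWriter that blocks on the future returned by KafkaTemplate.send(), so a failed send propagates and fails the chunk instead of only being logged. Class name, topic, and the timeout value are illustrative, assuming Spring Batch 4.x and Spring Kafka 2.x.

import java.util.List;
import java.util.concurrent.TimeUnit;

import org.springframework.batch.item.ItemWriter;
import org.springframework.kafka.core.KafkaTemplate;

// Illustrative name; any ItemWriter bean wired into the step works the same way.
public class FailFastKafkaItemWriter implements ItemWriter<String> {

    private final KafkaTemplate<String, String> kafkaTemplate;
    private final String topic;

    public FailFastKafkaItemWriter(KafkaTemplate<String, String> kafkaTemplate, String topic) {
        this.kafkaTemplate = kafkaTemplate;
        this.topic = topic;
    }

    @Override
    public void write(List<? extends String> items) throws Exception {
        for (String item : items) {
            // get() rethrows a failed send (the Kafka TimeoutException arrives wrapped
            // in an ExecutionException), so the step fails instead of silently
            // incrementing writeCount.
            kafkaTemplate.send(topic, item).get(65, TimeUnit.SECONDS);
        }
    }
}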

Sonia
  • I think this is related to https://github.com/spring-projects/spring-batch/issues/3773. In the meantime, you can use a custom writer and do something like https://stackoverflow.com/a/64648499/5019386. – Mahmoud Ben Hassine Nov 13 '20 at 08:13
  • Suppose a job run between two dates has 100 messages to publish to the topic. We pull the messages from the database, and on the 50th message the publish fails, my job throws an exception, and the job is marked as failed. If I then run the job manually again between the same dates, is it going to republish the first 50 messages that were already published successfully? – Sonia Nov 17 '20 at 06:42
  • What is the size of your chunk? Do you use a `KafkaTransactionManager` in your Spring Batch job? – Mahmoud Ben Hassine Nov 17 '20 at 07:52
  • The chunk size is 500, and I am not using a KafkaTransactionManager in my job. It is just a basic KafkaItemWriter configured with the topic name to write to. – Sonia Nov 17 '20 at 11:09
  • Since your chunk size is 500 and you have 100 messages, this means you will have a single chunk. Now if the job fails at item 50, the entire chunk will be rolled back and the next run will re-read the first chunk and republish those items. What you are trying to do requires configuring Spring Batch to use a JtaTransactionManager that coordinates a DataSourceTransactionManager (for meta-data tables) and a KafkaTransactionManager (for your item writer), so that when the transaction is rolled back, your items are not published. – Mahmoud Ben Hassine Nov 18 '20 at 09:05
  • 100 messages was just an example; we will have hundreds of thousands of messages. So, as you explained above, the whole chunk will be rolled back in case of any failure. To accomplish this, do I need both a JtaTransactionManager and a KafkaTransactionManager in my job? – Sonia Nov 18 '20 at 11:42
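As the last few comments discuss, rolling the published messages back together with the chunk requires a transactional producer on the Kafka side. Below is a minimal sketch, assuming Spring Batch 4.x and Spring Kafka 2.x, of wiring a KafkaTransactionManager into the step; coordinating it with the DataSourceTransactionManager used for the Batch meta-data tables (the JTA setup mentioned above) is not shown. Bean names and the reader/writer beans are illustrative.

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.transaction.KafkaTransactionManager;

@Configuration
public class KafkaStepConfig {

    @Bean
    public KafkaTransactionManager<String, String> kafkaTransactionManager(
            ProducerFactory<String, String> producerFactory) {
        // The producer factory must be configured with a transactional id prefix
        // for the transaction manager to work.
        return new KafkaTransactionManager<>(producerFactory);
    }

    @Bean
    public Step publishStep(StepBuilderFactory steps,
                            ItemReader<String> reader,
                            ItemWriter<String> writer,
                            KafkaTransactionManager<String, String> kafkaTransactionManager) {
        return steps.get("publishStep")
                .<String, String>chunk(500)
                .reader(reader)
                .writer(writer)
                // Chunk transactions now use the Kafka transaction manager, so a
                // rolled-back chunk does not leave messages published.
                .transactionManager(kafkaTransactionManager)
                .build();
    }
}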

0 Answers