
For example, I see an exception like this:

org.apache.kafka.common.errors.RecordTooLargeException: The message is 10000190 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
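
For reference, `max.request.size` is a producer-side setting. A minimal sketch of where it is configured (the broker address, serializers, and the variable name `producerProps` are illustrative assumptions):

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.ProducerConfig

val producerProps = new Properties()
producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
  "org.apache.kafka.common.serialization.StringSerializer")
producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
  "org.apache.kafka.common.serialization.StringSerializer")
// Default is 1048576 (1 MB); the ~10 MB record in the exception above exceeds it.
producerProps.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "1048576")
```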

Is there a way to learn more about the ProducerRecord that caused this exception?

In a Supervisor or in recoverWith I only have information about the exception itself. I can't wrap anything in a try/catch, because I'm using the built-in Kafka Flow or Kafka Sink. I'd probably have to give up on this integration and drive the Kafka producer manually (see the sketch below), because I see no other solution.
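
Along those lines, here is a minimal sketch of what driving the producer manually from the stream could look like, so the failing record stays in scope when a send fails. It assumes Akka Streams with Scala, the `producerProps` from above, and a hypothetical `records` collection; it is not the built-in Kafka Flow/Sink:

```scala
import scala.concurrent.{ExecutionContext, Future, Promise}
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Sink, Source}
import org.apache.kafka.clients.producer.{Callback, KafkaProducer, ProducerRecord, RecordMetadata}

implicit val system: ActorSystem = ActorSystem("manual-producer")
implicit val ec: ExecutionContext = system.dispatcher

val producer = new KafkaProducer[String, String](producerProps)

// Adapt producer.send to a Future while keeping the record available to callers.
def send(record: ProducerRecord[String, String]): Future[RecordMetadata] = {
  val done = Promise[RecordMetadata]()
  try {
    producer.send(record, new Callback {
      def onCompletion(meta: RecordMetadata, ex: Exception): Unit =
        if (ex ne null) done.failure(ex) else done.success(meta)
    })
  } catch {
    case e: Exception => done.failure(e) // send can also throw synchronously
  }
  done.future
}

Source(records) // records: immutable.Iterable[ProducerRecord[String, String]] (assumed)
  .mapAsync(parallelism = 4) { record =>
    send(record).map(Right(_)).recover {
      case e =>
        // The record is still in scope here, so the failure can be logged with context.
        system.log.error(e, s"Failed to produce to ${record.topic} (key=${record.key})")
        Left(record)
    }
  }
  .runWith(Sink.ignore)
```

Recovering per element keeps the stream alive and pairs each failure with its record, at the cost of losing the built-in stage's batching and supervision.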

Piotr Kozlowski
  • Not sure it's possible, since you're getting the exception *while serializing*, that is, before the `ProducerRecord` is handed off, so you won't have access to the record itself; I'd look for the partition/offset instead. Incidentally, Kafka does not like messages as large as 10 megabytes. – Roberto Congiu Apr 02 '19 at 18:09
  • I know, that is only an example. No matter what exception is thrown, I have no business context for it, so it is difficult to figure out what went wrong. – Piotr Kozlowski Apr 03 '19 at 08:15

0 Answers