I tried to create message chunks in a Kafka producer in Node.js using kafka-node but couldn't find a solution, so now I am creating a Kafka producer in Java and need to send large messages over 1 MB in size. How can I split a message into chunks in the Kafka producer and reassemble the same message on the consumer side?
Viewed 2,107 times
1 Answer
1
Kafka has a maximum payload size. If you need to send larger payloads, but their size is still bounded, you can increase that limit in the broker and producer configuration (message.max.bytes in broker configs and max.request.size in producer configs). 10 MB should still be a reasonable limit.
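As a concrete illustration, the producer-side limit is set when building the client's properties. This is a minimal sketch; the broker address, serializers, and the 10 MB figure are placeholders, and the broker's message.max.bytes (and the consumer's fetch settings) must be raised to match:

```java
import java.util.Properties;

// Producer properties with a raised request-size limit.
// "localhost:9092" is a placeholder broker address.
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.ByteArraySerializer");
// Raise the per-request limit from the 1 MB default to 10 MB.
// The broker's message.max.bytes must allow at least this much as well.
props.put("max.request.size", 10 * 1024 * 1024);
```

These properties would then be passed to `new KafkaProducer<>(props)` as usual.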
LinkedIn maintains (Java) Kafka clients (https://github.com/linkedin/li-apache-kafka-clients) that are capable of fragmenting large messages on the producer and reassembling them on the consumer, but the solution is imperfect:
- it does not work properly with log-compacted Kafka topics
- it has memory overhead on the consumer for reassembly and storage of fragments
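If pulling in that library is not an option, the same idea can be sketched by hand: the producer splits the payload into numbered segments, each carrying a message id, a segment index, and the total segment count, and the consumer concatenates them back. The class and method names below are purely illustrative, not part of any Kafka client API; in practice each segment would be produced with the original message key (or the message id) as the Kafka record key, so all segments land in the same partition and arrive in order:

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.UUID;

// Hypothetical do-it-yourself fragmentation: each segment is prefixed with a
// 24-byte header (16-byte UUID, 4-byte segment index, 4-byte total count).
public class Chunker {
    static final int HEADER = 16 + 4 + 4;

    // Split a payload into segments of at most maxSegment payload bytes.
    static List<byte[]> split(UUID id, byte[] payload, int maxSegment) {
        int total = (payload.length + maxSegment - 1) / maxSegment;
        List<byte[]> out = new ArrayList<>();
        for (int i = 0; i < total; i++) {
            int from = i * maxSegment;
            int len = Math.min(maxSegment, payload.length - from);
            ByteBuffer buf = ByteBuffer.allocate(HEADER + len);
            buf.putLong(id.getMostSignificantBits());
            buf.putLong(id.getLeastSignificantBits());
            buf.putInt(i).putInt(total);
            buf.put(payload, from, len);
            out.add(buf.array());
        }
        return out;
    }

    // Reassemble segments of one message id; order of arrival does not matter.
    static byte[] reassemble(List<byte[]> segments) {
        segments.sort(Comparator.comparingInt(s -> ByteBuffer.wrap(s).getInt(16)));
        int size = segments.stream().mapToInt(s -> s.length - HEADER).sum();
        ByteBuffer out = ByteBuffer.allocate(size);
        for (byte[] s : segments) out.put(s, HEADER, s.length - HEADER);
        return out.array();
    }
}
```

Note this toy version keeps all fragments of an in-flight message in memory on the consumer, the same overhead the LinkedIn clients have, and it does nothing about lost or duplicated segments.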

radai
- Thanks for the reply, but I have to split large messages into segments in the producing client, using partition keys to ensure that all segments are sent to the same partition. The consuming client can then reconstruct the original large message. If you have any code for it, please share. – Vaibhav Shelar Sep 13 '19 at 02:26
- The code I linked above does this. Look at https://github.com/linkedin/li-apache-kafka-clients/blob/master/li-apache-kafka-clients/src/main/java/com/linkedin/kafka/clients/largemessage/ConsumerRecordsProcessor.java – radai Sep 13 '19 at 02:30
- That means I have to add the dependency and set the configuration "large.message.enabled = true" for my producer, right? – Vaibhav Shelar Sep 13 '19 at 15:21
- I imported the li-apache-kafka-clients .jar file and used the configuration as below: props.put("large.message.enabled", "true"); props.put("max.message.segment.bytes", 5000 * 1024); props.put("segment.serializer", DefaultSegmentSerializer.class.getName()); props.put("auditor.class", LoggingAuditor.class.getName()); I'm still getting the error: org.apache.kafka.common.errors.RecordTooLargeException: The message is 1112248 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration. – Vaibhav Shelar Sep 17 '19 at 09:33
- Yes, I changed it; before it was 1 MB, but it didn't work, and my message size is 1.5 MB. – Vaibhav Shelar Sep 18 '19 at 02:27
- How do I implement li-apache-kafka-clients in my Spring Boot application? Please see the link below for my current code. [https://stackoverflow.com/questions/57978818/how-to-use-li-apache-kafka-clients-in-spring-boot-app-to-send-large-message-a] – Vaibhav Shelar Sep 20 '19 at 10:34