I want to send a large message from a producer to Kafka, so I've changed the properties below.
Broker (server.properties)
replica.fetch.max.bytes=317344026
message.max.bytes=317344026
max.message.bytes=317344026
max.request.size=317344026
Producer (producer.properties)
max.request.size=317344026
Consumer (consumer.properties)
max.partition.fetch.bytes=317344026
fetch.message.max.bytes=317344026
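For reference, these size limits are scoped to different components, and two of the names above are in the wrong file: max.message.bytes is a per-topic setting and max.request.size is a producer setting, so neither has any effect in server.properties. A sketch of where each knob belongs (using the value from this question, and assuming Kafka 0.9 as in the script path):

```
# server.properties (broker)
message.max.bytes=317344026
replica.fetch.max.bytes=317344026

# per-topic override (kafka-topics.sh --alter --config), not server.properties
# max.message.bytes=317344026

# producer.properties (producer client)
max.request.size=317344026

# consumer.properties (consumer client)
max.partition.fetch.bytes=317344026   # new consumer
fetch.message.max.bytes=317344026     # old consumer
```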
Still I'm getting the error shown below when I run the producer via Python's Popen and the Kafka CLI.
Code:
import subprocess

def producer(topic_name, content):
    p = subprocess.Popen(
        ['/opt/kafka/kafka_2.11-0.9.0.0/bin/kafka-console-producer.sh',
         '--broker-list', 'localhost:9092', '--topic', 'Hello-Kafka'],
        stdout=subprocess.PIPE, stdin=subprocess.PIPE)
    # Feed the payload through stdin and wait for the process to exit.
    out, err = p.communicate(content)
    print out
Error:
ERROR Error when sending message to topic Hello-Kafka with key: null, value: 1677562 bytes with error: The message is 1677588 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration. (org.apache.kafka.clients.producer.internals.ErrorLoggingCallback)
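Note that kafka-console-producer.sh does not pick up producer.properties automatically, which would explain why the default ~1 MB max.request.size is still in force. The config file has to be passed in explicitly; a possible invocation (assuming the --producer.config option is available in this build):

```
/opt/kafka/kafka_2.11-0.9.0.0/bin/kafka-console-producer.sh \
  --broker-list localhost:9092 \
  --topic Hello-Kafka \
  --producer.config /opt/kafka/kafka_2.11-0.9.0.0/config/producer.properties
```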
And I'm getting the error below when I use the Python module for Kafka (https://github.com/dpkp/kafka-python).
Code:
from kafka import KafkaProducer

def producer(topic_name, content):
    p = KafkaProducer(bootstrap_servers='localhost:9092')
    # .get() blocks until the broker acknowledges (or rejects) the send.
    a = p.send(topic_name, content).get()
    print a
    p.flush()
    p.close()
Error:
kafka.errors.MessageSizeTooLargeError: [Error 10] MessageSizeTooLargeError: The message is 217344026 bytes when serialized which is larger than the maximum request size you have configured with the max_request_size configuration
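kafka-python takes its limit from the max_request_size constructor argument (with underscores), not from producer.properties, so it has to be set in code. A minimal sketch; request_size_for and the 1 KB overhead allowance are a hypothetical helper and an assumption of mine, not part of the library:

```python
# Assumption: ~1 KB is enough headroom for per-record overhead
# (headers, key, protocol framing) on top of the payload itself.
RECORD_OVERHEAD = 1024

def request_size_for(payload_bytes):
    """Return a max_request_size large enough for one payload."""
    return payload_bytes + RECORD_OVERHEAD

def make_producer(bootstrap='localhost:9092', payload_bytes=217344026):
    # kafka-python reads max_request_size at the client, independent
    # of any server.properties edits on the broker.
    from kafka import KafkaProducer
    return KafkaProducer(bootstrap_servers=bootstrap,
                         max_request_size=request_size_for(payload_bytes))
```

The broker-side message.max.bytes (and replica.fetch.max.bytes) still have to be at least as large, or the broker will reject the record even though the client sends it.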
One thing I have tried successfully is splitting the content into chunks, but I'd like a solution that doesn't require dividing the content.
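For completeness, the chunking workaround can be sketched as a pure helper. The name chunk and the 1 MB default are illustrative choices of mine; the consumer side still has to reassemble the pieces in order, for example by sending them all with a common key so they land on one partition:

```python
def chunk(content, size=1000000):
    """Split a bytes payload into pieces no larger than size.

    Each piece stays under the default 1 MB max.request.size, so no
    broker or client size limits need to change.
    """
    return [content[i:i + size] for i in range(0, len(content), size)]
```

Each piece can then be sent with p.send(topic_name, piece) and joined again with b''.join(...) on the consumer side.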