
Getting the following error (Kafka 2.1.0):

2018-12-03 21:22:37.873 ERROR 37645 --- [nio-8080-exec-1] o.s.k.support.LoggingProducerListener : Exception thrown when sending a message with key='null' and payload='{82, 73, 70, 70, 36, 96, 19, 0, 87, 65, 86, 69, 102, 109, 116, 32, 16, 0, 0, 0, 1, 0, 1, 0, 68, -84,...' to topic recieved_sound: org.apache.kafka.common.errors.RecordTooLargeException: The message is 1269892 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.

I tried all the suggestions in various SO posts.

My Producer.properties:

max.request.size=41943040
message.max.bytes=41943040
replica.fetch.max.bytes=41943040
fetch.message.max.bytes=41943040

Server.properties:

socket.request.max.bytes=104857600
message.max.bytes=41943040
max.request.size=41943040
replica.fetch.max.bytes=41943040
fetch.message.max.bytes=41943040

ProducerConfig (Spring Boot):

configProps.put("message.max.bytes", "41943040");
configProps.put("max.request.size", "41943040");
configProps.put("replica.fetch.max.bytes", "41943040");
configProps.put("fetch.message.max.bytes", "41943040");

ConsumerConfig (SpringBoot):

props.put("fetch.message.max.bytes", "41943040");
props.put("message.max.bytes", "41943040");
props.put("max.request.size", "41943040");
props.put("replica.fetch.max.bytes", "41943040");
props.put("fetch.message.max.bytes", "41943040");

I also changed the values from Strings to numbers in the last two files, restarted the brokers multiple times, and created new topics. Initially I was getting org.apache.kafka.common.errors.RecordTooLargeException: The request included a message larger than the max message size the server will accept, which these changes fixed, but I still have no luck with this new error.
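The error message itself points at the client side: the check happens in the producer before the broker is ever contacted, so broker settings cannot fix it. A simplified, hypothetical sketch of what KafkaProducer.ensureValidRecordSize() does (using IllegalStateException in place of Kafka's RecordTooLargeException to keep the sketch self-contained):

```java
// Hypothetical simplification of the client-side size check the producer
// runs before sending. If the serialized record exceeds max.request.size,
// the record is rejected locally -- raising broker-side limits like
// message.max.bytes does not help with this particular error.
public class RecordSizeCheck {

    static final int DEFAULT_MAX_REQUEST_SIZE = 1048576; // 1 MiB, the producer default

    static void ensureValidRecordSize(int serializedSize, int maxRequestSize) {
        if (serializedSize > maxRequestSize) {
            throw new IllegalStateException("The message is " + serializedSize
                    + " bytes when serialized which is larger than the maximum request size"
                    + " you have configured (" + maxRequestSize + ")");
        }
    }

    public static void main(String[] args) {
        // The 1269892-byte payload from the question vs. the 1 MiB default:
        // rejected client-side, matching the stack trace above.
        try {
            ensureValidRecordSize(1269892, DEFAULT_MAX_REQUEST_SIZE);
        } catch (IllegalStateException e) {
            System.out.println("rejected: " + e.getMessage());
        }
        // With max.request.size actually applied to the producer, it passes.
        ensureValidRecordSize(1269892, 41943040);
        System.out.println("accepted with raised limit");
    }
}
```

So the question becomes: why is the raised max.request.size not reaching the producer instance that sends this record?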

– sapy (edited by lloiacono)
  • Your configurations look fine. I am guessing maybe you did not deploy the changes to all brokers? [Can you check the broker config using bin/kafka-configs.sh](https://kafka.apache.org/documentation/#dynamicbrokerconfigs) to make sure your configurations are correct on all brokers? – Gal S Dec 04 '18 at 07:03
  • also add `max.partition.fetch.bytes` – Paizo Dec 04 '18 at 09:43
  • 1
    `max.partition.fetch.bytes` is a soft limit. from documentation: `If the first record batch in the first non-empty partition of the fetch is larger than this limit, the batch will still be returned to ensure that the consumer can make progress`. – Lior Chaga Dec 04 '18 at 11:48
  • You might need to add `kafka.max.partition.fetch.bytes` instead of `max.partition.fetch.bytes` to the client properties – Nir Feb 26 '19 at 08:56
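Pulling the settings from the question and comments together, the key point is *where* each one belongs, not the values. A minimal sketch (values taken from the question; note that the fetch-size settings on the consumer side are soft limits, per the comment above):

```properties
# Producer side (producer.properties / producer config map)
max.request.size=41943040

# Broker side (server.properties; can also be set per topic as max.message.bytes)
message.max.bytes=41943040
replica.fetch.max.bytes=41943040

# Consumer side -- soft limits: an oversized first batch is still returned
# so the consumer can make progress
max.partition.fetch.bytes=41943040
fetch.max.bytes=41943040
```

A key placed in the wrong file (e.g. message.max.bytes in the producer config) is not an error, but it has no effect there.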

3 Answers


Set a breakpoint in KafkaProducer.ensureValidRecordSize() to see what's going on.

With this app

@SpringBootApplication
public class So53605262Application {

    public static void main(String[] args) {
        SpringApplication.run(So53605262Application.class, args);
    }

    @Bean
    public NewTopic topic() {
        return new NewTopic("so53605262", 1, (short) 1);
    }

    @Bean
    public ApplicationRunner runner(KafkaTemplate<String, String> template) {
        return args -> template.send("so53605262", new String(new byte[1024 * 1024 * 2]));
    }

}

I get

The message is 2097240 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.

as expected; when I add

spring.kafka.producer.properties.max.request.size=3000000

(which is the equivalent of your config but using Spring Boot properties), I get

The request included a message larger than the max message size the server will accept.

If debugging doesn't help, perhaps you can post a complete small app that exhibits the behavior you see.
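If the config is built in code rather than in application.properties, the setting has to end up in the map that actually backs the producer factory in use. A minimal sketch of just that map (class and method names here are illustrative; in a real app the map would feed a DefaultKafkaProducerFactory):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: building the producer config map in code.
// "max.request.size" is the value of ProducerConfig.MAX_REQUEST_SIZE_CONFIG.
public class ProducerProps {

    static Map<String, Object> producerConfig() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put("max.request.size", 3000000);
        // Broker-side keys such as message.max.bytes or replica.fetch.max.bytes
        // have no effect here (the client logs a warning about unknown configs);
        // they belong in server.properties.
        return configProps;
    }

    public static void main(String[] args) {
        System.out.println(producerConfig().get("max.request.size")); // prints 3000000
    }
}
```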

– Gary Russell

You can change the message size in the Kafka properties files on the server.

In the default server.properties file (e.g. under /usr/local/kafka/config):

message.max.bytes=26214400

In producer.properties:

# the maximum size of a request in bytes
max.request.size=26214400

Do the same for the consumer.

– Adesh (edited by OneCricketeer)

You should set this config, but note that FETCH_MAX_BYTES_CONFIG is a consumer setting, not a producer one:

props.put(ConsumerConfig.FETCH_MAX_BYTES_CONFIG, "41943040");
– Pablo Cegarra