
I keep getting this error message:

The message is 1169350 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.

As indicated in other Stack Overflow posts, I am trying to set the "max.request.size" configuration in the producer as follows:

.writeStream
.format("kafka")
.option(
  "kafka.bootstrap.servers",
  config.outputBootstrapServer
)
.option(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "10000000")

But this is not working. Am I setting this correctly? Is there a different way to set this property under Spark Structured Streaming?

DilTeam

1 Answer


If I remember correctly, you have to prefix all Kafka properties with "kafka.". Could you try this?

.option(s"kafka.${ProducerConfig.MAX_REQUEST_SIZE_CONFIG}", "10000000")
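For context, a fuller sketch of what the corrected sink might look like (the `df` variable, topic name, and checkpoint path are placeholders, not from your question):

```scala
import org.apache.kafka.clients.producer.ProducerConfig

df.writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", config.outputBootstrapServer)
  // The "kafka." prefix tells Spark to pass the property through
  // to the underlying Kafka producer; without it, the option is ignored.
  .option(s"kafka.${ProducerConfig.MAX_REQUEST_SIZE_CONFIG}", "10000000")
  .option("topic", "output-topic")                  // placeholder topic
  .option("checkpointLocation", "/tmp/checkpoint")  // placeholder path
  .start()
```

Note that the broker may also need its own limits raised (`message.max.bytes`, or `max.message.bytes` at the topic level), otherwise it can still reject the larger records even after the producer accepts them.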
facha