We encountered an issue where producing a message to the topic resulted in the following error from the producer code:
Confluent.Kafka.ProduceException`2[System.String,com.capgroup.comet.compliance.ComplianceResponse]: Broker: Message size too large
However, the message still gets sent to the topic successfully!
On the broker, we set the topic's max.message.bytes to 1 MB. We determined that if we did not set MaxMessageBytes on the producer, or explicitly set it to 1 MB (its default value), we would get this error. I want to stress that the message was still published, and the broker seemed happy with it.
Still, we wanted to address this error. We noticed that if we increased the producer's MaxMessageBytes to 5 MB while keeping the broker's max.message.bytes at 1 MB, the message was sent without the (benign) producer error.
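For reference, here is a minimal sketch of the producer setup we are describing, using the Confluent.Kafka ProducerConfig property MessageMaxBytes (the setting we refer to above); the bootstrap server and value types are placeholders:

```csharp
using Confluent.Kafka;

// Broker-side topic config (unchanged): max.message.bytes = 1 MB

var config = new ProducerConfig
{
    BootstrapServers = "localhost:9092", // placeholder
    // Client-side limit. Leaving this at its ~1 MB default produced the
    // "Message size too large" error; raising it to 5 MB made the error
    // go away, while the broker still enforces its own 1 MB topic limit.
    MessageMaxBytes = 5 * 1024 * 1024
};

using var producer = new ProducerBuilder<string, byte[]>(config).Build();
```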
We send the message in Avro binary format. We do not bundle the schema into the payload/header.
Can someone explain this behavior? For now we have set MaxMessageBytes to something generously large and will depend on the broker to enforce the 1 MB limit.