
I am a newbie to Flink/Avro. I am trying the Flink 1.14.4 Table API to read Avro-encoded messages from Kafka (plain Avro, not the Confluent wire format). I am not able to read any of the messages; every read fails with the following exception:

Caused by: java.lang.UnsupportedOperationException: Cannot read strings longer than 2147483639 bytes
    at org.apache.avro.io.BinaryDecoder.readString(BinaryDecoder.java:282)
    at org.apache.avro.io.ResolvingDecoder.readString(ResolvingDecoder.java:208)
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:469)
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:459)
    at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:191)
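
For context, the table is declared roughly like this (a sketch; the topic, servers, and column names are placeholders for my actual setup):

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class ReadAvroFromKafka {
        public static void main(String[] args) {
            TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

            // Plain 'avro' format, i.e. not 'avro-confluent'.
            tEnv.executeSql(
                "CREATE TABLE events (" +
                "  id STRING," +
                "  payload STRING" +
                ") WITH (" +
                "  'connector' = 'kafka'," +
                "  'topic' = 'my-topic'," +
                "  'properties.bootstrap.servers' = 'localhost:9092'," +
                "  'properties.group.id' = 'flink-avro-test'," +
                "  'scan.startup.mode' = 'earliest-offset'," +
                "  'format' = 'avro'" +
                ")");

            tEnv.executeSql("SELECT * FROM events").print();
        }
    }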

Surely I don't have any string anywhere near 2 GB long. The whole topic only holds a few MB, so I don't think the problem is in the encoded messages themselves. What could be the issue here?

Note: I wrote a small standalone program (no Flink involved) that consumes the same Kafka messages and deserialises them myself, and it works perfectly fine, roughly like the sketch below. The problem only shows up when going through Flink.
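
The standalone check looks more or less like this (again a sketch; bootstrap servers, topic name, and schema are placeholders for my own):

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.BinaryDecoder;
    import org.apache.avro.io.DecoderFactory;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class PlainAvroCheck {
        public static void main(String[] args) throws Exception {
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
              + "{\"name\":\"id\",\"type\":\"string\"},"
              + "{\"name\":\"payload\",\"type\":\"string\"}]}");

            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "avro-check");
            props.put("key.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
            props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
            props.put("auto.offset.reset", "earliest");

            GenericDatumReader<GenericRecord> reader = new GenericDatumReader<>(schema);
            try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("my-topic"));
                ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<byte[], byte[]> record : records) {
                    // Decode the raw value bytes directly; no framing expected.
                    BinaryDecoder decoder =
                        DecoderFactory.get().binaryDecoder(record.value(), null);
                    GenericRecord event = reader.read(null, decoder);
                    System.out.println(event);
                }
            }
        }
    }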

  • follow the stack trace or debug your code... surely there must be some errors. maybe somewhere you might have something like `new byte[a & Integer.MAX_VALUE]` or `new byte[Integer.MAX_VALUE]` for a buffer where `a` becomes `-1` or something. just debug your code. good debugging :). [see the sketch after these comments] – D. Sikilai Jul 06 '22 at 11:23
  • Were you able to sort this out? I am facing the same problem. It specifically surfaced when I used the KafkaSource class instead of FlinkKafkaConsumer. Changing from the former to the latter helped once, but the issue is transient and didn't appear every time. – mjennet Dec 17 '22 at 05:18
  • I'm also getting the same error, but in a different context: reading an Avro-encoded message from Kafka. Were you able to sort the issue out? – Sajith Apr 11 '23 at 12:52
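
A hypothetical sketch of the failure mode the first comment hints at: Avro strings are length-prefixed with a varint, so if the decoder starts at the wrong byte offset (for example, because the bytes carry framing the reader does not expect), the first varint it sees is taken as a string length and can come out absurdly large. The record name, schema, and junk prefix below are made up for illustration:

    import java.io.ByteArrayOutputStream;

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.BinaryEncoder;
    import org.apache.avro.io.DecoderFactory;
    import org.apache.avro.io.EncoderFactory;

    public class MisalignedDecode {
        public static void main(String[] args) throws Exception {
            Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"R\",\"fields\":["
              + "{\"name\":\"s\",\"type\":\"string\"}]}");

            GenericRecord rec = new GenericData.Record(schema);
            rec.put("s", "hello");

            // Write a perfectly valid record, but prefix it with a few junk
            // bytes standing in for framing the reader does not know about.
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            out.write(new byte[] {(byte) 0x80, (byte) 0x80, (byte) 0x80,
                                  (byte) 0x80, (byte) 0x80, 0x01});
            BinaryEncoder enc = EncoderFactory.get().binaryEncoder(out, null);
            new GenericDatumWriter<GenericRecord>(schema).write(rec, enc);
            enc.flush();

            try {
                // The decoder reads the junk prefix as the varint length of
                // the first string field: here it decodes to 2^34, far past
                // the ~2 GB Java array limit.
                new GenericDatumReader<GenericRecord>(schema).read(null,
                    DecoderFactory.get().binaryDecoder(out.toByteArray(), null));
            } catch (Exception e) {
                System.out.println(e);  // UnsupportedOperationException:
                                        // Cannot read strings longer than ...
            }
        }
    }

Running this should reproduce the same `Cannot read strings longer than 2147483639 bytes` message, which suggests checking whether the bytes Flink hands to the Avro decoder are exactly the bytes the writer produced, with no extra prefix.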

0 Answers