
I have been getting the following exception in my microservice since my last release. Because the exception is truncated (ellipsis ...), I am not able to make sense of it.

As per my understanding, Kafka records can have a null key (in which case a round-robin strategy is used to choose the partition).
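
For illustration, producing a record with a null key is perfectly legal; here is a minimal sketch (the broker address and topic name are placeholders):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class NullKeyProducerDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // A null record key is accepted; the partitioner then chooses the
            // partition itself instead of hashing the key.
            producer.send(new ProducerRecord<>("my-topic-name", null, "some payload"));
        }
    }
}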

I am out of ideas! Any idea as to what could be causing it?

{"@timestamp":"2023-02-21 15:58:16.947","@version":"1","message":"stream-client [Servicename-954725e4-a291-4fce-8f7e-77f8b5eeab6b] Encountered the following exception during processing and Kafka Streams opted to SHUTDOWN_CLIENT. The streams client is going to shut down now. ","logger":"org.apache.kafka.streams.KafkaStreams","thread":"Servicename-954725e4-a291-4fce-8f7e-77f8b5eeab6b-StreamThread-5","level":"ERROR","stacktrace":"org.apache.kafka.streams.errors.StreamsException: Exception caught in process. taskId=0_16, processor=KSTREAM-SOURCE-0000000000, topic=my-topic-name, partition=16, offset=263041154, stacktrace=java.lang.NullPointerException\n\tat org.apache.kafka.common.header.internals.RecordHeader.key(RecordHeader.java:45)\n\tat org.springframework.cloud.stream.binder.kafka.streams.AbstractKafkaStreamsBinderProcessor.lambda$null$6(AbstractKafkaStreamsBinderProcessor.java:498)\n\tat java.base/java.lang.Iterable.forEach(Iterable.java:75)\n\tat org.springframework.cloud.st...

1 Answer


As per my understanding, Kafka records can have a null key...

That is true for records, but not for record headers (each of which has a key and value).

You somehow have a RecordHeader (in record.headers()) with a null key, which should be impossible:

public RecordHeader(String key, byte[] value) {
    Objects.requireNonNull(key, "Null header keys are not permitted");
    this.key = key;
    this.value = value;
}

public RecordHeader(ByteBuffer keyBuffer, ByteBuffer valueBuffer) {
    this.keyBuffer = Objects.requireNonNull(keyBuffer, "Null header keys are not permitted");
    this.valueBuffer = valueBuffer;
}

The NPE is on the keyBuffer here, while trying to extract the key from the buffer:

public String key() {
    if (key == null) {
        key = Utils.utf8(keyBuffer, keyBuffer.remaining()); // NPE here when keyBuffer is null
        keyBuffer = null;
    }
    return key;
}
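
To see the constructor guard in action, here is a small hypothetical sketch (the class name is made up) showing that the public RecordHeader(String, byte[]) constructor rejects a null key outright:

import org.apache.kafka.common.header.internals.RecordHeader;

public class HeaderNullKeyDemo {
    public static void main(String[] args) {
        try {
            // The String constructor refuses null keys up front.
            new RecordHeader(null, "some-value".getBytes());
        } catch (NullPointerException e) {
            // Prints: Null header keys are not permitted
            System.out.println(e.getMessage());
        }
    }
}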
  • Thanks a lot Mr. Russell. Like you said, this situation should be impossible given `Objects.requireNonNull(key, "Null header keys are not permitted");`. Could it be an API version issue, with spring-cloud-stream-binder-kafka using an older version of this RecordHeader API? I will check further. – Andy Feb 23 '23 at 17:50
  • Unlikely, the line number (45) lines up. Headers were introduced in version 0.11.0.0 and the code is the same there: https://github.com/apache/kafka/blob/e18335dd953107a61d89451932de33d33c0fd207/clients/src/main/java/org/apache/kafka/common/header/internals/RecordHeader.java#L31-L41 Perhaps run in a debugger to see where it is happening. – Gary Russell Feb 23 '23 at 17:57
  • Hello Mr. Russell, the above exception happens on the production system. In order to debug, I am trying to print the failing record, including all its headers, but I don't see the headers on my Kafka consumer console. I am using the following command: `kafka-console-consumer --bootstrap-server kafka-broker:9092 --consumer.config my.properties --partition 47 --offset 14803 --topic my-topic.0 --formatter kafka.tools.DefaultMessageFormatter --property print.key=true --property print.value=false --property print.timestamp=true --property print.headers=true --property parse.key=true ` – Andy Feb 27 '23 at 12:34
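
If the console consumer will not show the headers, one alternative is to fetch that single record programmatically and dump the headers with a plain KafkaConsumer. Below is a minimal sketch; the broker address, topic, partition, and offset are taken from the command in the comment above, and any security settings that live in my.properties would need to be added to the Properties as well:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.header.Header;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;

public class DumpRecordHeaders {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-broker:9092"); // placeholder broker
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // no group id, so no auto commit

        TopicPartition tp = new TopicPartition("my-topic.0", 47); // values from the comment above
        try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
            consumer.assign(Collections.singletonList(tp));
            consumer.seek(tp, 14803L);
            ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<byte[], byte[]> record : records.records(tp)) {
                if (record.offset() != 14803L) {
                    continue; // only interested in the single suspect record
                }
                System.out.println("offset=" + record.offset() + ", headers:");
                for (Header header : record.headers()) {
                    System.out.println("  key=" + header.key()
                            + ", value=" + (header.value() == null ? "null" : new String(header.value())));
                }
            }
        }
    }
}

Note that if one of the headers really is malformed, the header.key() call in the loop may throw the same NullPointerException, which would at least pin down the offending record and header.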