
I am reading a schema with SpecificAvroSerde from the Confluent Schema Registry, but I am getting the error below:

org.apache.kafka.common.errors.InvalidConfigurationException: Unauthorized; error code: 401
[TestStreamProcess-StreamThread-1] ERROR org.apache.kafka.streams.KafkaStreams - stream-client [TestStreamProcess] Encountered the following exception during processing and the registered exception handler opted to SHUTDOWN_CLIENT. The streams client is going to shut down now. 
org.apache.kafka.streams.errors.StreamsException: Deserialization exception handler is set to fail upon a deserialization error. If you would rather have the streaming pipeline continue after a deserialization error, please set the default.deserialization.exception.handler appropriately.
    at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:82)
    at org.apache.kafka.streams.processor.internals.RecordQueue.updateHead(RecordQueue.java:176)
    at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:112)
    at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:185)
    at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:957)
    at org.apache.kafka.streams.processor.internals.TaskManager.addRecordsToTasks(TaskManager.java:1009)
    at org.apache.kafka.streams.processor.internals.StreamThread.pollPhase(StreamThread.java:907)
    at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:720)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:583)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:556)
Caused by: org.apache.kafka.common.errors.InvalidConfigurationException: Unauthorized; error code: 401

I am loading the configuration required to connect to the Schema Registry from a config.properties file:

schema.registry.url=<confluent Schema Registry URL>
basic.auth.credentials.source=USER_INFO
schema.registry.basic.auth.user.info=<Schema Registry API Key>:<schema Registry Secret Key>
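For completeness, this is how that file is read into a `Properties` object; a minimal sketch, where the class name and path argument are illustrative:

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.Properties;

    public class ConfigLoader {
        // Loads config.properties from disk; the path is supplied by the caller.
        public static Properties load(String path) throws IOException {
            Properties props = new Properties();
            try (FileInputStream in = new FileInputStream(path)) {
                props.load(in);
            }
            return props;
        }
    }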

serdesConfig has already been set up, along with the necessary imports:

import static io.confluent.kafka.serializers.AbstractKafkaSchemaSerDeConfig.*;
import static java.util.Optional.ofNullable;

    public static SpecificAvroSerde<TestStream> getTestStreamSerde(Properties props) {
        SpecificAvroSerde<TestStream> testStreamSerde = new SpecificAvroSerde<>();
        testStreamSerde.configure(getSerdeProps(props), false);
        return testStreamSerde;
    }

    protected static Map<String, String> getSerdeProps(Properties props) {
        final HashMap<String, String> map = new HashMap<>();

        final String schemaUrlConfig = props.getProperty(SCHEMA_REGISTRY_URL_CONFIG);
        map.put(SCHEMA_REGISTRY_URL_CONFIG, ofNullable(schemaUrlConfig).orElse(""));
        return map;
    }

        final KStream<String, TestStream> testStream = builder.stream("input-topic",
                Consumed.with(Serdes.String(), getTestStreamSerde(props.getProperties())));
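As a cross-check, the map handed to SpecificAvroSerde.configure() should carry the basic-auth settings alongside the URL; a minimal sketch using the same property-name strings as the config.properties above (the class name, helper name, and placeholder values are illustrative):

    import java.util.HashMap;
    import java.util.Map;
    import java.util.Properties;

    public class SerdePropsSketch {
        // Builds the serde configuration map; the three property names
        // match the entries in config.properties.
        static Map<String, String> getSerdeProps(Properties props) {
            Map<String, String> map = new HashMap<>();
            map.put("schema.registry.url",
                    props.getProperty("schema.registry.url", ""));
            map.put("basic.auth.credentials.source",
                    props.getProperty("basic.auth.credentials.source", "USER_INFO"));
            map.put("schema.registry.basic.auth.user.info",
                    props.getProperty("schema.registry.basic.auth.user.info", ""));
            return map;
        }

        public static void main(String[] args) {
            Properties props = new Properties();
            props.setProperty("schema.registry.url", "https://psrc-example.confluent.cloud");
            props.setProperty("basic.auth.credentials.source", "USER_INFO");
            props.setProperty("schema.registry.basic.auth.user.info", "SR_KEY:SR_SECRET");
            System.out.println(getSerdeProps(props).size());
        }
    }

If any of the three entries is missing from this map, the serde's Schema Registry client falls back to an unauthenticated request, which also produces a 401.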

The Schema Registry Maven plugin is already declared in pom.xml:

<plugin>
    <groupId>io.confluent</groupId>
    <artifactId>kafka-schema-registry-maven-plugin</artifactId>
    <version>${confluent.version}</version>
</plugin>

Can you help me identify what I am missing from my configuration?

OneCricketeer
Charmee Lee

1 Answer


If you're running into this authentication error, the first thing to check is whether you're using the correct API key.

For example, you might be using the cluster API key and secret rather than the schema registry key and secret.

The cluster API key and the Schema Registry API key are separate credentials: one grants access at the cluster level, the other at the Schema Registry level. Confluent calls these resource-specific API keys.
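To make the distinction concrete, here is a sketch of the two separate credential pairs a Confluent Cloud Kafka Streams client carries; the class name, endpoints, and key placeholders are illustrative, not real values:

    import java.util.Properties;

    public class CredentialSplitSketch {
        public static Properties clientProps() {
            Properties props = new Properties();
            // Cluster API key/secret: authenticates the Kafka broker connection via SASL.
            props.setProperty("bootstrap.servers", "pkc-example.confluent.cloud:9092");
            props.setProperty("security.protocol", "SASL_SSL");
            props.setProperty("sasl.mechanism", "PLAIN");
            props.setProperty("sasl.jaas.config",
                    "org.apache.kafka.common.security.plain.PlainLoginModule required "
                            + "username='CLUSTER_KEY' password='CLUSTER_SECRET';");
            // Schema Registry API key/secret: a DIFFERENT pair; using the
            // cluster pair here is what yields the 401 above.
            props.setProperty("schema.registry.url", "https://psrc-example.confluent.cloud");
            props.setProperty("basic.auth.credentials.source", "USER_INFO");
            props.setProperty("schema.registry.basic.auth.user.info", "SR_KEY:SR_SECRET");
            return props;
        }
    }

Swapping the cluster pair into `schema.registry.basic.auth.user.info` compiles and connects to Kafka fine, which is why the failure only surfaces later as a deserialization-time 401.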