
I am using this tech stack:

Spark version: 3.3.1
Scala version: 2.12.15
Hadoop version: 3.3.4
Kafka version: 3.3.1

I am trying to read data from a Kafka topic through Spark Structured Streaming, but I am facing the error mentioned below. The code I am using is:

For reading data from the Kafka topic:

result_1 = spark.readStream \
                   .format("kafka") \
                   .option("kafka.bootstrap.servers", "localhost:9092") \
                   .option("subscribe", "sampleTopic1") \
                   .option("startingOffsets", "latest") \
                   .load()
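
As a side note, the Kafka source exposes `key` and `value` as binary columns, so once the stream starts they will print as raw bytes on the console. A minimal sketch of the usual cast (the column names below are the standard Kafka source schema, not anything specific to this job):

```python
# Sketch: cast the binary key/value columns from the Kafka source to
# strings before writing to the console, and keep a couple of the
# metadata columns for debugging.
decoded = result_1.selectExpr(
    "CAST(key AS STRING)",
    "CAST(value AS STRING)",
    "topic",
    "offset",
)
```

This fragment assumes `result_1` from the read snippet above and a running Spark session, so it is not runnable standalone.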

For writing the data to the console:

trans_detail_write_stream = result_1 \
        .writeStream \
        .trigger(processingTime='1 seconds') \
        .outputMode("update") \
        .option("truncate", "false") \
        .format("console") \
        .start()

trans_detail_write_stream.awaitTermination()

For execution I am using the following command:

spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.3.1 streamer.py
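
A `NoSuchMethodError` inside `org.apache.spark.kafka010` usually means the `spark-sql-kafka-0-10` connector version does not match the Spark runtime that `spark-submit` actually launches (for example, a pip-installed PySpark shadowing a separately installed Spark). One way to cross-check, assuming both are on the PATH:

```shell
# Version of the Spark runtime that spark-submit actually launches
spark-submit --version

# Version of PySpark installed via pip, if any; this must match the
# runtime above and the connector coordinate (here 3.3.1 / Scala 2.12)
pip show pyspark
```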

I am facing this error:

java.lang.NoSuchMethodError: org.apache.spark.kafka010.KafkaTokenUtil$.needTokenUpdate(Ljava/util/Map;Lscala/Option;)

and later in the logs it gives me this exception too:

"StreamingQueryException: Query [id = 600dfe3b-6782-4e67-b4d6-97343d02d2c0, runId = 197e4a8b-699f-4852-a2e6-1c90994d2c3f] terminated with exception: Writing job aborted"

Please suggest a solution.
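
To rule out a version mismatch from inside the job itself, the versions the running session actually sees can be printed. This is a diagnostic sketch, assuming `spark` is the active `SparkSession`; it is not runnable outside a Spark job:

```python
# Diagnostic sketch: print the Spark and Scala versions the running job
# actually uses; both must match the connector coordinate
# spark-sql-kafka-0-10_2.12:3.3.1.
print("Spark :", spark.version)
print("Scala :", spark.sparkContext._jvm.scala.util.Properties.versionNumberString())
```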

Edit: Screenshot for Spark Version


  • NoSuchMethodError usually comes when you have a version conflict. Maybe some components are using Kafka version A and other components in your application are using version B. In this case, version A requires the method KafkaTokenUtil$.needTokenUpdate, but your application includes Kafka version B, which doesn't have such a method – Young Jan 16 '23 at 10:11
  • I have only one version of Kafka installed, i.e. 3.3.1. Is it possible that this error comes from the jar I am using here, "spark-sql-kafka-0-10_2.12:3.3.1"? – Muhammad Affan Jan 16 '23 at 13:30
  • The Kafka version doesn't matter, since that class doesn't come from kafka.apache.org itself. Assuming you're sure you have Spark 3.3.1 and Scala 2.12, then yes, that's the correct package – OneCricketeer Jan 16 '23 at 13:32
  • OneCricketeer makes a good point. I didn't notice that the error comes from org.apache.spark.kafka010.KafkaTokenUtil. In your case, it's the Spark version you need to check – Young Jan 16 '23 at 15:32
  • I am using Spark version 3.3.1. The screenshot is shown in the edited question – Muhammad Affan Jan 17 '23 at 06:57

0 Answers