<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.0</version>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.12</artifactId>
    <version>2.4.0</version>
    <scope>provided</scope>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.11</artifactId>
    <version>1.6.3</version>
</dependency>

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.12</version>
</dependency>

I am using the versions above. Why am I getting this error?

bartektartanus
Ravi
1 Answer


You're mixing libraries built for two different Scala binary versions: 2.12 and 2.11. Scala binary versions are not compatible with each other, so every Scala artifact on the classpath must carry the same suffix.

Please use either

spark-core_2.12
spark-streaming_2.12
spark-streaming-kafka_2.12
scala-library 2.12.x

(note that spark-streaming-kafka_2.12 does not exist) or

spark-core_2.11
spark-streaming_2.11
spark-streaming-kafka_2.11
scala-library 2.11.x
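For the 2.11 route, the POM from the question would look like the fragment below. This is a sketch built only from the coordinates already in the question; the only changes are the aligned `_2.11` suffixes and the matching scala-library version:

```xml
<!-- All Scala artifacts share the same 2.11 binary suffix -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.4.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.11</artifactId>
    <version>1.6.3</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.12</version>
</dependency>
```

A common way to keep these in sync is a single `<scala.binary.version>` property referenced from each `artifactId`, so the suffix can only be changed in one place.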

Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V

Dmytro Mitin
  • I guess spark-streaming-kafka_2.12 does not exist. I used the 2.11 version but am getting a different error. – Ravi Sep 28 '20 at 10:05
  • .factory.BeanCreationException: Error creating bean with name 'javaPairInputDStream' defined in class path resource [com/dreampay/reconsonsumer/config/Configuration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.spark.streaming.api.java.JavaPairInputDStream]: Factory method 'javaPairInputDStream' threw exception; nested exception is org.apache.spark.SparkException: org.apache.spark.SparkException: Error getting partition metadata for 'demo'. Does the topic exist? – Ravi Sep 28 '20 at 10:06
  • I can see the demo topic already exists in my local Kafka. – Ravi Sep 28 '20 at 10:14
  • @Ravi You should create a new question with details about your bean config and what you're doing in Spark and Kafka. – Dmytro Mitin Sep 28 '20 at 10:16
  • @Ravi Great. What was the issue? (If you're satisfied with the answer you can accept it.) – Dmytro Mitin Sep 28 '20 at 10:18
  • I commented out the line kafkaParams.put("auto.offset.reset", "smallest") in my code and it worked. – Ravi Sep 29 '20 at 11:03
  • I am creating a pipeline using Spring Boot, Kafka, and Spark for big file processing (computation, validation, etc.). – Ravi Sep 29 '20 at 11:07
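For reference, the old 0.8-style spark-streaming-kafka API being used here configures the consumer through a plain `Map<String, String>`. Below is a minimal sketch of that params map with the offending line commented out, as the comment above describes; the broker address `localhost:9092` is a placeholder assumption, not taken from the question:

```java
import java.util.HashMap;
import java.util.Map;

public class KafkaParamsExample {

    // Builds the kafkaParams map used by the 0.8-style direct stream.
    // "localhost:9092" is a hypothetical broker address.
    static Map<String, String> buildKafkaParams() {
        Map<String, String> kafkaParams = new HashMap<>();
        kafkaParams.put("metadata.broker.list", "localhost:9092");
        // Commenting out this line is what resolved the error in the thread above:
        // kafkaParams.put("auto.offset.reset", "smallest");
        return kafkaParams;
    }

    public static void main(String[] args) {
        Map<String, String> params = buildKafkaParams();
        System.out.println(params.containsKey("auto.offset.reset")); // false
    }
}
```

The resulting map is then passed to the streaming context's Kafka stream factory together with the topic set.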