
I am trying to integrate Kafka and Spark. I have created a producer in Kafka and I want to consume its messages in a Spark cluster.

I am using the below command to submit the Spark job, but it is giving an error. It seems I am missing some dependent jar files. Can you please help me with the error?

My current Spark version: 2.1.1.2.6.2.0-205

Python version: 2.7.5
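
spark-submit command (the --jars variant; I have also tried other jar versions and --packages, as shown in the comments below):

./spark-submit --master local[*] --jars /ldaphome/user/spark-streaming-kafka-0-8-assembly_2.11-2.2.0.jar,/ldaphome/user/spark-core_2.11-2.2.0.jar /ldaphome/user/test.py 100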

Python Script:

import sys
import os

from pyspark import SparkContext, SparkConf
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

if __name__ == "__main__":
    conf = SparkConf().setMaster("local").setAppName("Spark-Kafka Integration")
    sc = SparkContext(conf=conf)
    ssc = StreamingContext(sc, 2)

    lines = KafkaUtils.createStream(ssc, 'localhost:2181', 'raw-event-streaming-consumer', {'NewTopic': 1})
    print(lines)

    ssc.start()
    ssc.awaitTermination()

Error logs:

 File "/usr/hdp/2.6.2.0-205/spark2/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o37.createStream.
: java.lang.NoClassDefFoundError: org/apache/spark/Logging
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:91)
        at org.apache.spark.streaming.kafka.KafkaUtils$.createStream(KafkaUtils.scala:168)
        at org.apache.spark.streaming.kafka.KafkaUtilsPythonHelper.createStream(KafkaUtils.scala:632)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:280)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:745)
Comments:

  • I am using below command to submit spark job. ./spark-submit --master local[*] --jars /ldaphome/user/spark-streaming-kafka-0-8-assembly_2.11-2.2.0.jar,/ldaphome/user/spark-core_2.11-2.2.0.jar /ldaphome/user/test.py 100 – Sumit D Dec 04 '17 at 08:33
    After some [search](https://stackoverflow.com/questions/40287289/java-lang-noclassdeffounderror-org-apache-spark-logging) it looks like a version mismatch between your Kafka jars and your spark. You are using Kafka jars for 2.2.0 spark, not for 2.1. Could you please change the jars and try again? – mkaran Dec 04 '17 at 09:00
  • Hi Karan, I tried with 2.1 jar also but it is giving the same error. ./spark-submit --master local[*] --jars /ldaphome/user/spark-streaming-kafka-0-8_2.11-2.1.0.jar,/ldaphome/user/spark-core_2.11-2.1.1.jar /ldaphome/user/test.py 100 – Sumit D Dec 04 '17 at 09:12
  • Hmm ok, two questions: 1. what is the `..spark-core..` jar for and 2. what about trying this way: `--packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.1.1` instead of `--jars` ? – mkaran Dec 04 '17 at 09:24
  • Hi Karan, Even I tried with packages but still getting the same error. ./spark-submit --master local[*] --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.1.1 /ldaphome/user/test.py 100 – Sumit D Dec 04 '17 at 10:05

0 Answers