
I have set up a three-node Hadoop cluster and am trying to run Spark on top of Hadoop's YARN and HDFS.

I have set the necessary environment variables (HADOOP_HOME, HADOOP_CONF_DIR, SPARK_HOME, etc.).

Now, when I try to start Spark's master process with start-master.sh, it fails with an exception. The relevant part of the log file is below:

Spark Command: /usr/local/java/bin/java -cp /usr/local/spark/conf/:/usr/local/spark/jars/*:/usr/local/hadoop/etc/hadoop/ -Xmx1g org.apache.spark.deploy.master.Master --host master.hadoop.cluster --port 7077 --webui-port 8080

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)

Since this is a NoClassDefFoundError for org/slf4j/Logger, I understand that the class is missing from the classpath, but I don't know which jar provides it or where that jar should come from. Does it come bundled with the Spark download?
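For reference, this is how I checked whether any slf4j jar is visible on the classpath the launcher builds — a minimal check, assuming the install paths from the Spark command shown above:

```shell
# A Spark binary distribution normally bundles slf4j-api (and a logging
# binding) in its jars/ directory, which is on the -cp shown above.
# These paths come from that command; adjust them if your layout differs.
ls /usr/local/spark/jars/ 2>/dev/null | grep -i slf4j \
  || echo "no slf4j jars found in /usr/local/spark/jars"
```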

Can anyone help me fix this issue?

CuriousMind
