
I am trying to run Hadoop (HDFS and YARN) in a multi-node cluster (2 nodes), but the resource manager fails to start on the slave node. It fails with the exception below: it cannot find the class javax.activation.DataSource (which is present in Java 8).

Versions I tried:

  • Hadoop 3.1.3 / Java 1.8.0_u251 and 1.8.0_u152
  • Hadoop 3.2.1 / Java 1.8.0_u251

All the above combinations give the same error.

        at com.google.inject.internal.InjectorImpl$2.get(InjectorImpl.java:1012)
        ... 52 more
Caused by: java.lang.ClassNotFoundException: javax.activation.DataSource
        at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
        at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:521)
        ... 86 more
2020-05-08 07:31:07,375 INFO org.apache.hadoop.yarn.server.nodemanager.NodeManager: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NodeManager at rajesh2-VirtualBox/127.0.1.1
************************************************************/

Also, surprisingly, the resource manager runs fine on the master node (which has the same Hadoop and Java versions as the slave node).

Please help. Thanks.

Note - HDFS runs fine. Only YARN has issues.

UPDATE: There are other StackOverflow questions that discuss the same exception, but those are running on Java 9 or above. Java 8 should not have this issue.
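One way to confirm which JVM actually produced the trace: the java.base/ module prefixes in the stack frames above only appear on JDK 9 or newer, so the failing daemon was not running on Java 8. A quick check on the slave node (a minimal sketch; the exact paths depend on your installation):

    # What the shell resolves by default (this is what a Hadoop daemon falls
    # back to if JAVA_HOME is not set in its environment)
    java -version

    # What you expect Hadoop to use
    echo $JAVA_HOME
    $JAVA_HOME/bin/java -version   # should report 1.8.0_xxx for Java 8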

  • I might be wrong, but your stack trace does look like it was spit out by JDK 9 or higher. – mazaneicha May 08 '20 at 02:23
  • You are right. Hadoop was using the system's JRE. Although I had set JAVA_HOME to Java 8, it was somehow picking up the system's JRE (at least for the NodeManager). For everyone's sake: I set JAVA_HOME in $HADOOP_HOME/etc/hadoop/hadoop-env.sh and then it worked fine. – Learner May 09 '20 at 08:30
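Following up on the resolution in the comments: setting JAVA_HOME inside hadoop-env.sh makes the Hadoop daemons (including the NodeManager) use that JDK regardless of the login shell's environment. A minimal sketch, assuming a typical OpenJDK 8 location (adjust the path to your actual Java 8 install, and apply it on every node):

    # In $HADOOP_HOME/etc/hadoop/hadoop-env.sh
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

    # Restart YARN so the daemons pick up the change
    $HADOOP_HOME/sbin/stop-yarn.sh
    $HADOOP_HOME/sbin/start-yarn.sh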

0 Answers