
I know this has to do with a difference between the Java versions at compile time and at runtime; however, I think I have set all the environment variables properly, so I don't really know what is still causing this issue.

$ java -version
java version "1.7.0_79"
Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)
$ javac -version
javac 1.7.0_79
$ echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home
$ hadoop version
Hadoop 2.7.1

In RStudio, I have

> Sys.getenv("JAVA_HOME")
[1] "/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home"
> library(rhdfs)
Loading required package: rJava

HADOOP_CMD=/usr/local/Cellar/hadoop/2.7.1/bin/hadoop

Be sure to run hdfs.init()
Warning message:
package ‘rJava’ was built under R version 3.1.3 
> hdfs.init()
Error in .jnew("org/apache/hadoop/conf/Configuration") : 
  java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
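
In case it's useful, here is how I can ask the JVM that rJava actually loaded for its version (it may differ from the command-line java shown above; this uses rJava's .jcall interface):

library(rJava)
.jinit()
# Query the embedded JVM (the one rJava loaded), which may not be
# the same JVM as the command-line java shown above.
.jcall("java/lang/System", "S", "getProperty", "java.version")
.jcall("java/lang/System", "S", "getProperty", "java.home")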

I also set JAVA_HOME in Hadoop's hadoop-env.sh to the 1.7.0 JDK:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home

I would really appreciate it if someone could point out what's going on here.


1 Answer

You've no doubt searched around and found that Java major version 51 corresponds to 1.7, so you're close.

The clearest way to actually figure this out is to inspect the class file under scrutiny -- org.apache.hadoop.conf.Configuration. Below is the beginning of the class file format. Notice that minor_version and major_version are the 2nd and 3rd fields, respectively. They tell you which version the class was compiled for, and therefore the minimum runtime you'll need to execute it.

struct Class_File_Format {
   u4 magic_number;

   u2 minor_version;
   u2 major_version;

   u2 constant_pool_count;

   cp_info constant_pool[constant_pool_count - 1];

   u2 access_flags;

   /* this_class, super_class, interfaces, fields,
      methods, and attributes follow */
};
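
If you'd rather not decode those bytes by hand, here is a minimal sketch in R that reads the two version fields straight out of the jar. The jar path is an assumption based on the Homebrew install shown in the question; point it at wherever hadoop-common actually lives on your machine.

# Pull the first 8 bytes of the class file out of the jar.
# NOTE: this jar path is an assumption based on the Homebrew layout
# from the question -- adjust it to your actual hadoop-common jar.
jar <- "/usr/local/Cellar/hadoop/2.7.1/libexec/share/hadoop/common/hadoop-common-2.7.1.jar"
con <- unz(jar, "org/apache/hadoop/conf/Configuration.class", open = "rb")
header <- readBin(con, "raw", n = 8)
close(con)

# Bytes 1-4 are the magic number (0xCAFEBABE); bytes 5-6 are
# minor_version and bytes 7-8 are major_version, both big-endian.
readBin(header[7:8], "integer", size = 2, signed = FALSE, endian = "big")

If that prints 51, the class targets Java 7 and any 1.7-or-newer JVM can load it, which would suggest the JVM rJava loaded inside R is older than the java on your PATH.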