I am trying to submit a Hadoop MapReduce job from a CDH3u4 cluster to a cluster running CDH4.3. (The fs.default.name and mapred.job.tracker configuration parameters are set to point to the CDH4.3 cluster.) The command and full stack trace are below.

1) Can we submit a Hadoop job to a remote cluster running a different version?
2) Is there a workaround to do this?

hadoop jar Standalone.jar

Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 7 cannot communicate with client version 4
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:129)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:255)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:217)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1563)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1597)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1579)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:111)
    at com.poc.standalone.HDFSRemoteAccess.main(HDFSRemoteAccess.java:43)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
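
For reference, the call at HDFSRemoteAccess.java:43 is essentially a plain FileSystem.get() against the remote cluster. A minimal reconstruction of the client code (the hostnames and ports are placeholders, not the real cluster addresses):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class HDFSRemoteAccess {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Both parameters point at the remote CDH4.3 cluster
        conf.set("fs.default.name", "hdfs://cdh43-namenode.example.com:8020");
        conf.set("mapred.job.tracker", "cdh43-jobtracker.example.com:8021");
        // The RemoteException above is thrown here: the CDH3u4 client
        // speaks IPC version 4, while the CDH4.3 NameNode expects version 7
        FileSystem fs = FileSystem.get(conf);
        System.out.println(fs.getUri());
    }
}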

2 Answers

You need to set the HADOOP_PREFIX environment variable to point to the directory where you have a Hadoop 2.x.x installation.

e.g.

export HADOOP_PREFIX=/path/to/hadoop-2.2.0
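
Once exported, verify which client libraries actually get picked up (for example with "hadoop version", or by checking the classpath the job is launched with); the IPC mismatch only goes away when the client-side jars match the remote cluster's major version.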

I faced this exception when I was trying to connect to HDFS. I'm using CDH 4.6.

I solved this issue by adding the Cloudera Maven dependencies. You can find a dependency list here.

First, you should check your dependencies.
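
For example, a sketch of the Maven setup, assuming a CDH 4.6 cluster and the hadoop-client artifact (the repository URL and version string here are assumptions; match them against Cloudera's documentation and your cluster):

<repositories>
  <repository>
    <id>cloudera</id>
    <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
  </repository>
</repositories>

<dependencies>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <!-- assumed CDH 4.6 artifact version; use the one matching your cluster -->
    <version>2.0.0-cdh4.6.0</version>
  </dependency>
</dependencies>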

Another point: you should try the fs.defaultFS config parameter instead of (or alongside) the fs.default.name parameter, because fs.default.name is deprecated in CDH4.
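
For example, a fragment of the client code (the namenode address is a placeholder):

Configuration conf = new Configuration();
// fs.defaultFS is the current name for this setting in CDH4
conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");
// fs.default.name is deprecated but still honored as an alias
conf.set("fs.default.name", "hdfs://namenode.example.com:8020");
FileSystem fs = FileSystem.get(conf);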

1) You should have the dependencies for both versions and may be able to switch between them.
2) Take a look here to see how to keep different versions of the dependencies.
