I am trying to submit a Hadoop MapReduce job from a CDH3u4 cluster to a cluster running CDH4.3. The fs.default.name and mapred.job.tracker configuration parameters are set to point to the CDH4.3 cluster.

1) Can we submit a Hadoop job to a remote cluster running a different version?
2) Is there a workaround to do this?

The command and the resulting stack trace are below:
hadoop jar Standalone.jar
Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 7 cannot communicate with client version 4
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:129)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:255)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:217)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1563)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1597)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1579)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:111)
    at com.poc.standalone.HDFSRemoteAccess.main(HDFSRemoteAccess.java:43)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
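For reference, HDFSRemoteAccess.java boils down to something like the following sketch (the host names and ports are placeholders for the actual CDH4.3 NameNode and JobTracker addresses; line 43 in the trace is the FileSystem.get(conf) call):

package com.poc.standalone;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HDFSRemoteAccess {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the CDH3u4 client at the remote CDH4.3 cluster.
        // "cdh4-nn", "cdh4-jt" and the ports are placeholder values.
        conf.set("fs.default.name", "hdfs://cdh4-nn:8020");
        conf.set("mapred.job.tracker", "cdh4-jt:8021");

        // The RemoteException is thrown here: the CDH3 RPC client (version 4)
        // cannot negotiate with the CDH4 NameNode (server IPC version 7).
        FileSystem fs = FileSystem.get(conf);
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
    }
}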