
How can the version of a remote Hadoop cluster be identified through an API or web service, i.e. whether the cluster is a 1.x type or a 2.x type? Is there any API or web service available for this?

I have researched WebHDFS and the Hadoop FileSystem API but could not find a way to do it.

Pankaj Khattar

3 Answers


One way of doing it is by identifying exceptions (more or less a hit-and-trial method):

If you use the 1.x API in the client and connect to a 2.x Hadoop cluster, or vice versa, with:

    final Configuration conf = new Configuration();
    final String uri = "hdfs://remoteHostName:9000/user/myusername";
    final FileSystem fs = FileSystem.get(URI.create(uri), conf);

then the following exception is thrown:

Exception in thread "main" org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4

The above exception indicates that the client API and the remote Hadoop cluster are not compatible, but it does not give a definitive method to identify the version of the remote Hadoop cluster.
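As a rough sketch, the IPC mismatch message itself can be parsed to guess the remote major line. The mapping used below (server IPC 9 for a 2.x cluster, IPC 4 for a 1.x client/server) is taken from the exception quoted above and should be treated as an assumption for any other release:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class IpcVersionProbe {
    private static final Pattern SERVER_IPC =
        Pattern.compile("Server IPC version (\\d+)");

    // Returns a rough guess at the remote Hadoop line, or null
    // if the message is not an IPC-mismatch message.
    static String guessRemoteLine(String exceptionMessage) {
        Matcher m = SERVER_IPC.matcher(exceptionMessage);
        if (!m.find()) return null;
        int ipc = Integer.parseInt(m.group(1));
        if (ipc >= 9) return "2.x or later";
        if (ipc == 4) return "1.x";
        return "unknown (IPC " + ipc + ")";
    }

    public static void main(String[] args) {
        String msg = "Server IPC version 9 cannot communicate with client version 4";
        System.out.println(guessRemoteLine(msg)); // prints: 2.x or later
    }
}
```

This only works when the connection actually fails; a compatible client learns nothing from it.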

Pankaj Khattar

If you have Ambari installed (as many clusters will, especially those based on HDP) you can get cluster version information by doing a GET to 'http://your.ambari.server/api/v1/clusters'. The resulting JSON will contain something that looks like:

{ "href" : "http://your.ambari.server/api/v1/clusters",
   "items" : [
      {  "href" : "http://your.ambari.server/api/v1/clusters/c1",
         "Clusters" : { "cluster_name" : "c1",
                        "version" : "HDP-1.2.0"  }  }
   ]
}

The full API reference can be found at: https://github.com/apache/ambari/blob/trunk/ambari-server/docs/api/v1/index.md

and the specifics of this call are at: https://github.com/apache/ambari/blob/trunk/ambari-server/docs/api/v1/clusters.md

Cloudera seems to have something that is at least similar, although I don't know if it's backed by Ambari: http://cloudera.github.io/cm_api/apidocs/v1/path__clusters.html
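Assuming the response shape shown above, the version field can be pulled out of the Ambari JSON even without a JSON library. A minimal sketch in plain Java (a real client would fetch the payload with an HTTP GET first):

```java
public class AmbariVersion {
    // Extract the value of the first "version" field from an
    // Ambari /api/v1/clusters response, e.g. "HDP-1.2.0".
    static String extractVersion(String json) {
        int i = json.indexOf("\"version\"");
        if (i < 0) return null;
        int start = json.indexOf('"', json.indexOf(':', i) + 1) + 1;
        int end = json.indexOf('"', start);
        return json.substring(start, end);
    }

    public static void main(String[] args) {
        // Sample payload mirroring the response shown above.
        String sample = "{ \"Clusters\" : { \"cluster_name\" : \"c1\","
                + " \"version\" : \"HDP-1.2.0\" } }";
        System.out.println(extractVersion(sample)); // prints: HDP-1.2.0
    }
}
```

For anything beyond a one-off probe, a proper JSON parser is the safer choice.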

RickH
  • Thanks Rick for the reply, but in the case I am working on there is no Ambari installed; it's a pure Apache installation and I need to figure out the versions. Thanks for the clue, though. – Pankaj Khattar Apr 17 '14 at 13:39

Can you check through the Linux `ssh` command? If you have a username and password for that cluster, run the command below and you will get the Hadoop version.

    ssh username@clusterip hadoop version

It will then prompt for the password of the remote machine; enter it and the version will be printed.

Example: Hadoop 1.1.2
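The first line of `hadoop version` output names the release, so once you have captured it over SSH you can map it to a major line. A small sketch of that parsing step:

```java
public class HadoopVersionParse {
    // Extract the release number from the first line of
    // `hadoop version` output, e.g. "Hadoop 1.1.2" -> "1.1.2".
    static String parse(String firstLine) {
        if (firstLine == null || !firstLine.startsWith("Hadoop ")) return null;
        return firstLine.substring("Hadoop ".length()).trim();
    }

    public static void main(String[] args) {
        String version = parse("Hadoop 1.1.2");
        System.out.println(version); // prints: 1.1.2
        System.out.println(version.startsWith("1.") ? "1.x cluster"
                                                    : "2.x cluster");
    }
}
```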

user3539638
  • Yes, that's an option which I have tried earlier and it worked too, but in this case I don't have the username/password of the server. – Pankaj Khattar Apr 17 '14 at 17:09