
Some information about my dev env:

openstack: juno

hadoop: 2.4.1

pywebhdfs: 0.4.0

I created a Hadoop cluster through the OpenStack Sahara client API, and now I want to write a file to HDFS using pywebhdfs (in order to launch jobs on the created cluster). However, an error occurs when creating the HDFS file; the error message is:

HTTPConnectionPool(host='vanillacluster-vanillacluster-slave-vanilla-002.novalocal', port=50075): Max retries exceeded with url: /webhdfs/v1/user/hadoop/test/pg20417.txt?op=CREATE&user.name=hadoop&namenoderpcaddress=vanillacluster-vanillacluster-master-vanilla-001:9000&overwrite=false (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f2455bd5750>: Failed to establish a new connection: [Errno -2] Name or service not known',))
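For context, the write is attempted roughly like the sketch below (a minimal example, assuming the pywebhdfs client is pointed at the namenode's WebHDFS port 50070 and the hostnames match the ones in the error; the actual code may differ):

    from pywebhdfs.webhdfs import PyWebHdfsClient

    # Point the client at the namenode's WebHDFS endpoint (default port 50070).
    hdfs = PyWebHdfsClient(
        host='vanillacluster-vanillacluster-master-vanilla-001',
        port='50070',
        user_name='hadoop')

    # WebHDFS answers the CREATE with a redirect to a datanode (port 50075).
    # The client then has to resolve the datanode's hostname, which is where
    # the "Name or service not known" error above is raised.
    with open('pg20417.txt', 'rb') as f:
        hdfs.create_file('user/hadoop/test/pg20417.txt', f.read())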

I checked /etc/hosts, and every node's IP address and hostname are listed there. Does anyone know how to track down the cause of this problem? Thanks.

1 Answer


I solved this issue by modifying /etc/hosts and mapping the IP address to the hostname of each node, like this:

x.x.x.x vanillacluster-vanillacluster-slave-vanilla-002

x.x.x.x vanillacluster-vanillacluster-master-vanilla-001
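To verify the fix, you can check that each node resolves from the machine running pywebhdfs; a quick sketch (the hostnames are just the ones from the error message, adjust them to your cluster):

    import socket

    # Node hostnames taken from the error message; replace with your own.
    nodes = [
        'vanillacluster-vanillacluster-master-vanilla-001',
        'vanillacluster-vanillacluster-slave-vanilla-002',
    ]

    for name in nodes:
        try:
            print(name, '->', socket.gethostbyname(name))
        except socket.error as exc:
            print(name, 'does not resolve:', exc)

If every name prints an IP address, the WebHDFS redirect to the datanode should no longer fail with "Name or service not known".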