I am installing Hadoop on Windows 7 64-bit with Cygwin. After I formatted Hadoop successfully, I tried to start it with the command start-dfs.sh, but it reports:
]tarting namenodes on [DATASCIENCES01
: Name or service not knownstname datasciences01
Starting datanodes
]tarting secondary namenodes [0.0.0.0
: Name or service not knownstname 0.0.0.0
]tarting journal nodes [
: Name or service not knownstname
2018-06-26 10:47:10,208 WARN util.Shell: Did not find winutils.exe: {}
java.io.FileNotFoundException: Could not locate Hadoop executable: C:\cygwin64\usr\local\hadoop-3.1.0\bin\winutils.exe -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.getQualifiedBinInner(Shell.java:620)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:593)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:690)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
at org.apache.hadoop.hdfs.tools.GetConf$Command.<clinit>(GetConf.java:86)
at org.apache.hadoop.hdfs.tools.GetConf.<clinit>(GetConf.java:136)
2018-06-26 10:47:10,540 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
I used the command bash -x ./start-dfs.sh to try to debug:
]tarting namenodes on [DATASCIENCES01
+ hadoop_uservar_su hdfs namenode /cygdrive/c/cygwin64/usr/local/hadoop-3.1.0/bin/hdfs --workers --config /usr/local/hadoop-3.1.0/etc/hadoop --hostnames $'DATASCIENCES01\r' --daemon start namenode
+ declare program=hdfs
+ declare command=namenode
+ shift 2
+ declare uprogram
+ declare ucommand
+ declare uvar
+ declare svar
+ hadoop_privilege_check
+ [[ 1063539 = 0 ]]
+ /cygdrive/c/cygwin64/usr/local/hadoop-3.1.0/bin/hdfs --workers --config /usr/local/hadoop-3.1.0/etc/hadoop --hostnames $'DATASCIENCES01\r' --daemon start namenode
: Name or service not knownstname datasciences01
I can see that my hostname is passed as 'DATASCIENCES01\r' instead of 'DATASCIENCES01'.
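As a side note, one way to make such a stray carriage return visible is cat -A, which renders control characters (a sketch on a stand-in file, since I cannot reproduce the exact workers file here):

```shell
# Create a stand-in for a config file saved with Windows (CRLF) line endings
printf 'DATASCIENCES01\r\n' > workers_demo

# cat -A shows the CR as "^M" and the end of line as "$",
# so a CRLF-terminated line prints as: DATASCIENCES01^M$
cat -A workers_demo
```

If the hostname line ends in ^M$ rather than just $, the file still carries Windows line endings.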
I guess the \r corresponds to the typical Windows end-of-line, so I used dos2unix to convert the file core-site.xml, where the NameNode name is referenced:
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/hadoop_data</value>
    <description>directory for hadoop data</description>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://DATASCIENCES01:54311</value>
    <description>data to be put on this URI</description>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://DATASCIENCES01:54311</value>
    <description>Use HDFS as file storage engine</description>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>
But that doesn't remove the '\r' at the end.
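Since converting core-site.xml alone didn't help, the \r may come from another file Hadoop reads (for example etc/hadoop/workers or hadoop-env.sh). A sketch of how to find and convert every file in a directory that still contains CRLF line endings (shown on a demo directory; the real target would be /usr/local/hadoop-3.1.0/etc/hadoop, and sed -i is used as an equivalent to dos2unix):

```shell
# Demo: a config dir with one CRLF-infected file
mkdir -p demo_conf
printf 'DATASCIENCES01\r\n' > demo_conf/workers

# List every text file under the dir that still contains a carriage return
grep -rlI $'\r' demo_conf

# Strip the trailing CR in place on each match
# (equivalent to: dos2unix demo_conf/workers)
grep -rlI $'\r' demo_conf | xargs -r sed -i 's/\r$//'

# Should now print nothing
grep -rlI $'\r' demo_conf || echo "no CRLF files left"
```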
I found a similar topic at this link:
When Hadoop 2.7 start on windows 7 64 bits: Name or service not knownstname localhost
But there is still no answer.
Thanks in advance