
I am installing Hadoop on Windows 7 64-bit with Cygwin. After formatting HDFS successfully, I want to start it with the command start-dfs.sh, but it reports:

]tarting namenodes on [DATASCIENCES01
: Name or service not knownstname datasciences01
Starting datanodes
]tarting secondary namenodes [0.0.0.0
: Name or service not knownstname 0.0.0.0
]tarting journal nodes [
: Name or service not knownstname

 2018-06-26 10:47:10,208 WARN util.Shell: Did not find winutils.exe: {}

java.io.FileNotFoundException: Could not locate Hadoop executable: C:\cygwin64\usr\local\hadoop-3.1.0\bin\winutils.exe -see https://wiki.apache.org/hadoop/WindowsProblems
    at org.apache.hadoop.util.Shell.getQualifiedBinInner(Shell.java:620)
    at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:593)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:690)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
    at org.apache.hadoop.hdfs.tools.GetConf$Command.<clinit>(GetConf.java:86)
    at org.apache.hadoop.hdfs.tools.GetConf.<clinit>(GetConf.java:136)
2018-06-26 10:47:10,540 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I used the command bash -x ./start-dfs.sh to try to debug it:

]tarting namenodes on [DATASCIENCES01
+ hadoop_uservar_su hdfs namenode /cygdrive/c/cygwin64/usr/local/hadoop-3.1.0/bin/hdfs --workers --config /usr/local/hadoop-3.1.0/etc/hadoop --hostnames $'DATASCIENCES01\r' --daemon start namenode
+ declare program=hdfs
+ declare command=namenode
+ shift 2
+ declare uprogram
+ declare ucommand
+ declare uvar
+ declare svar
+ hadoop_privilege_check
+ [[ 1063539 = 0 ]]
+ /cygdrive/c/cygwin64/usr/local/hadoop-3.1.0/bin/hdfs --workers --config /usr/local/hadoop-3.1.0/etc/hadoop --hostnames $'DATASCIENCES01\r' --daemon start namenode
: Name or service not knownstname datasciences01

And I can see that my hostname is referred to as 'DATASCIENCES01\r' instead of 'DATASCIENCES01'.

I guess that the \r corresponds to the typical Windows end-of-line character, so I used dos2unix to convert core-site.xml, the file where the NameNode name is referenced:

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/hadoop_data</value>
    <description>directory for hadoop data</description>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://DATASCIENCES01:54311</value>
    <description>data to be put on this URI</description>
  </property>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://DATASCIENCES01:54311</value>
    <description>Use HDFS as file storage engine</description>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
</configuration>

But it doesn't remove this '\r' at the end.
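
For reference, this is roughly what I did, plus a small check I am considering (the paths are from my install and may differ on another machine; the grep lines are only an idea to list which files still contain a carriage return, since I am not sure core-site.xml is the only file involved):

cd /usr/local/hadoop-3.1.0/etc/hadoop
dos2unix core-site.xml                        # strip the Windows CRLF line endings (this is what I ran)
grep -rlU $'\r' .                             # idea, not yet tried: list config files here that still contain a CR
grep -rlU $'\r' /usr/local/hadoop-3.1.0/sbin  # same check on the start-up scripts themselves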

I found a similar topic at this link:

When Hadoop 2.7 start on windows 7 64 bits: Name or service not knownstname localhost

But there is still no answer.

Thanks in advance

Cielronix
  • copy and paste the text. Do not link images – matzeri Jun 26 '18 at 09:08
  • You are going in the right direction, but you need to run the command the opposite way, because you want to convert the text files from Unix to DOS. – Abhinav Jun 26 '18 at 09:38
  • If I use the unix2dos command, it will not delete the Windows \r carriage return. I've tried it anyway and it hasn't changed anything, but thanks for helping me – Cielronix Jun 26 '18 at 09:45
  • Hadoop has run on Windows for several years. You don't need Cygwin, but you do need winutils – OneCricketeer Jun 26 '18 at 13:37
  • @Cielronix Earlier I had this issue because I copied a script from Windows to my Linux machine and I had to run the dos2unix command for it. That is why I thought the opposite might work in your case. Did you find any other solution? – Abhinav Jun 26 '18 at 13:38
  • @cricket_007 so you think that the warning I get could have an impact on the error I'm seeing? – Cielronix Jun 28 '18 at 08:10
  • @Abhinav I tried dos2unix and unix2dos and it is still not working. I'm trying to fix the "Did not find winutils.exe" warning; perhaps it has a direct impact on my problem – Cielronix Jun 28 '18 at 08:12
  • You can find that winutils executable on GitHub if you search for it there. Download the one for your version of Hadoop... I also suggest you actually read the link in the error, which explains your problem and how to fix it: https://wiki.apache.org/hadoop/WindowsProblems – OneCricketeer Jun 28 '18 at 08:19

0 Answers