
I am a beginner in Hadoop.

Two big issues that I am trying to fix:

1.) While starting Hadoop through cmd (start-all.cmd), my NameNode is not starting. It shows the error below.

SHUTDOWN_MSG: Shutting down NameNode at Admin/192.168.1.6

I tried to fix it with the following steps:

a) First, get the hostname of my computer by running the hostname command. My hostname is Admin. Then add the line 192.168.1.6 localhost Admin to the /etc/hosts file.

But I cannot find a file named hosts in the etc folder. I am confused about whether I have to create a file (.txt) in the etc folder or do something else.

2.) Also, I noticed that when I run start-all.cmd, three folders are created automatically: hortonworks, USER and tmp. I am wondering why. Please help.

1 Answer


/etc/hosts is a plain text file with exactly that name (no extension) on almost all Linux/Unix operating systems. Are you running a Linux OS? You can check whether your Linux OS is using it by running

grep hosts /etc/nsswitch.conf

Look for a line in the output that says

hosts: files dns

If you see the word files in the hosts row, your system should use the file.

You can view its contents by running cat /etc/hosts.
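For reference, a minimal /etc/hosts typically looks something like the following (the 192.168.1.6/Admin entry matches the question; the addresses and names are otherwise illustrative):

```
127.0.0.1    localhost
192.168.1.6  Admin
```

Each line maps one IP address to one or more names, separated by whitespace.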

You can run the following (as root, since /etc/hosts is normally writable only by root) to add your entry to hosts, then run cat /etc/hosts to check it.

echo "192.168.1.6 localhost Admin" >> /etc/hosts
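If you re-run that echo, you get duplicate lines. A small sketch like the following appends the entry only when it is missing, and tries it on a copy of the file first so a mistake cannot break name resolution (the IP and the name Admin come from the question; adjust for your machine):

```shell
# Work on a copy first; apply to the real /etc/hosts only once it looks right.
HOSTS_COPY=$(mktemp)
cp /etc/hosts "$HOSTS_COPY" 2>/dev/null || touch "$HOSTS_COPY"

ENTRY="192.168.1.6 localhost Admin"

# Append the entry only if that exact line is not already present (idempotent).
grep -qxF "$ENTRY" "$HOSTS_COPY" || echo "$ENTRY" >> "$HOSTS_COPY"

# Show the result.
grep -F "$ENTRY" "$HOSTS_COPY"
```

Once the copy looks right, add the same line to /etc/hosts itself as root.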

Blake Russo