
I was trying to set up a Hadoop single-node cluster on my PC running Ubuntu 12.10, following the tutorial on Michael-noll.com.

  • Everything went smoothly until I executed the '/usr/local/hadoop/bin/start-all.sh' command.

  • After that I checked the Java process status: "/usr/local/hadoop$ jps"

Then I found that the DataNode, TaskTracker, and NameNode were not listed (i.e., not running).
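
For comparison, on a working single-node setup of this Hadoop version, jps should list all five daemons, something like the following (the process IDs here are only illustrative):

    hduser@ubuntu:/usr/local/hadoop$ jps
    2287 TaskTracker
    2149 JobTracker
    1938 DataNode
    2085 SecondaryNameNode
    2349 Jps
    1788 NameNode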

Can anyone help with this situation?

venkatvb
christy
  • Without the error messages, no one will be able to help you! – Praveen Sripati Oct 24 '12 at 16:16
  • If you followed the installation instructions, all log output should be available in /usr/local/hadoop/logs. Please look at the log files and post relevant stacktraces if you need assistance in interpreting them. – rretzbach Oct 26 '12 at 07:23

5 Answers


Reset your core-site.xml file:

  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/rlk/hduser</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost/</value>
  </property>
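
If you change hadoop.tmp.dir like this, the directory must already exist and be writable by the user running Hadoop, and HDFS usually needs to be reformatted before the daemons come up. A minimal sketch, assuming the path from the snippet above and a Hadoop user named rlk (substitute your own):

    # create the tmp dir and hand it to the Hadoop user (user/group "rlk" assumed)
    sudo mkdir -p /home/rlk/hduser
    sudo chown rlk:rlk /home/rlk/hduser
    # reformat HDFS (this erases any existing HDFS data) and restart
    /usr/local/hadoop/bin/hadoop namenode -format
    /usr/local/hadoop/bin/start-all.sh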
Bucket
  • Oh dude.. this answer saved my life. When I had localhost:9000 it didn't work.. thanks a ton – Dharam May 15 '15 at 10:16

Format the NameNode and start all the services again with the start-all.sh command, then check with jps that they are running. This can happen if you quit without stopping the services, so before you quit, make sure you have stopped all of them with stop-all.sh.
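
A minimal sketch of that sequence, assuming the tutorial's install location of /usr/local/hadoop (note that formatting the NameNode erases any existing HDFS data):

    /usr/local/hadoop/bin/stop-all.sh               # stop any half-started daemons
    /usr/local/hadoop/bin/hadoop namenode -format   # reformat HDFS (destroys data)
    /usr/local/hadoop/bin/start-all.sh              # start all daemons again
    jps                                             # NameNode, DataNode, etc. should now be listed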

venkatvb
  1. Well, I think you installed Hadoop to /usr/local/hadoop.

  2. When you run start-all.sh, Hadoop will write logs to the /usr/local/hadoop/logs directory; however, Hadoop may not have write permission to that directory.

  3. Please check all the log files in /usr/local/hadoop/logs/ to find the problem.

  4. How to solve it?

    4.1 Modify /usr/local/hadoop/conf/hadoop-env.sh, i.e. add export HADOOP_LOG_DIR=/tmp/hadoop/logs (see the sketch after this list).

    4.2 Restart Hadoop, run jps again, and check the log files.

  5. I suggest you post the relevant log here if you run into a problem again. :)
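
A sketch of steps 3 and 4, assuming the default paths above:

    # check who owns the log directory and whether your user can write to it
    ls -ld /usr/local/hadoop/logs
    # point Hadoop at a log directory you can definitely write to
    echo 'export HADOOP_LOG_DIR=/tmp/hadoop/logs' >> /usr/local/hadoop/conf/hadoop-env.sh
    mkdir -p /tmp/hadoop/logs
    # restart and check again
    /usr/local/hadoop/bin/stop-all.sh
    /usr/local/hadoop/bin/start-all.sh
    jps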

venkatvb
brian.chen

You should have write permission on the directory that you set in core-site.xml for the property hadoop.tmp.dir. I have explained this in this link: Hadoop Series Single Node Installation.
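
For example, if hadoop.tmp.dir points at /app/hadoop/tmp (the path used in the Michael Noll tutorial; substitute your own) and the Hadoop user is hduser in group hadoop, a minimal permissions fix could be:

    sudo mkdir -p /app/hadoop/tmp
    sudo chown hduser:hadoop /app/hadoop/tmp   # make the Hadoop user the owner
    sudo chmod 750 /app/hadoop/tmp             # owner rwx, group r-x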

venkatvb
Harry

Package Verification

  1. $ rpm -ql hadoop-0.20-conf-pseudo

Format NameNode

  1. $ sudo -u hdfs hdfs namenode -format

  2. $ for service in /etc/init.d/hadoop*; do sudo $service stop; done

Start HDFS

  1. Start HDFS

    $ for service in /etc/init.d/hadoop-hdfs-*; do sudo $service start; done

Check NameNode Web UI

  5. Open http://localhost:50070 to verify the NameNode is running. By following the above steps you can start your NameNode and DataNode services.

venkatvb