
I have Hadoop installed on Ubuntu 16.10.

Everything works fine: I can upload an input file to HDFS and run MapReduce jobs. But when I reboot my PC, all HDFS blocks are reported as corrupt and the NameNode starts in safe mode.

So after every reboot I have to do the following (the exact commands are sketched below):

1) Leave safe mode

2) Delete all corrupted blocks with

hdfs fsck / -delete

3) Re-upload the input file

Then it works fine until the next reboot.
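For reference, the commands for the three steps above look roughly like this (the local file name and HDFS path are just examples from my setup):

    # 1) force the NameNode out of safe mode
    hdfs dfsadmin -safemode leave

    # 2) delete the blocks that fsck reports as corrupt
    hdfs fsck / -delete

    # 3) re-upload the input file (example local file and HDFS path)
    hdfs dfs -mkdir -p /user/hduser/input
    hdfs dfs -put ~/input.txt /user/hduser/input/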

Can someone please suggest a solution for this? Thanks.

2 Answers

I solved my problem. I used this link to check my config files: http://www.bogotobogo.com/Hadoop/BigData_hadoop_Install_on_ubuntu_single_node_cluster.php

I had forgotten to run sudo chown -R hduser:hadoop /usr/local/hadoop_tmp on my HDFS directory.
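In case it helps anyone else, this is roughly what I did to verify the ownership and restart HDFS afterwards (the path is from my setup, yours may differ):

    # check who owns the HDFS storage directory
    ls -ld /usr/local/hadoop_tmp

    # fix it if it is not owned by the Hadoop user
    sudo chown -R hduser:hadoop /usr/local/hadoop_tmp

    # restart HDFS so the daemons pick up the change
    stop-dfs.sh
    start-dfs.sh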

Another approach: create a folder such as /dfs/ on your machine, open hdfs-site.xml (or hdfs-default.xml), and set the property "dfs.namenode.name.dir". By default this property points under hadoop.tmp.dir, which usually resolves to /tmp and is cleared on reboot, which is why the NameNode metadata disappears.

    Example:
    <property>
      <name>dfs.namenode.name.dir</name>
      <value>/dfs/</value>
      <description>Determines where on the local filesystem the DFS name node
          should store the name table(fsimage).  If this is a comma-delimited list
          of directories then the name table is replicated in all of the
          directories, for redundancy. </description>
    </property>
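If you point the property at a brand-new directory, it has to exist, be writable by the user that runs the NameNode, and be formatted once before the NameNode will start (formatting creates a fresh, empty filesystem, so only do it on a new, empty name directory). A rough sketch, assuming the hduser:hadoop account from the other answer:

    # create the new name directory and hand it to the Hadoop user
    sudo mkdir -p /dfs
    sudo chown -R hduser:hadoop /dfs

    # format the empty name directory once, then restart HDFS
    hdfs namenode -format
    stop-dfs.sh
    start-dfs.sh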