I have Hadoop installed on Ubuntu 16.10.
Everything works fine: I can upload my input file to HDFS and run MapReduce jobs. But whenever I reboot my PC, all HDFS blocks become corrupted and the NameNode starts in safe mode.
So after every reboot I have to do the following (the exact commands are sketched below):
1) Leave safe mode
2) Delete all the corrupted blocks with
hdfs fsck / -delete
3) Re-upload the input file
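Concretely, this is roughly what I run each time (the file name and HDFS path here are just examples, not my real ones):

hdfs dfsadmin -safemode leave                  # step 1: take the NameNode out of safe mode
hdfs fsck / -delete                            # step 2: delete the corrupted/missing blocks
hdfs dfs -put input.txt /user/hduser/input     # step 3: re-upload the input file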
Then it works fine until the next reboot.
Can someone please suggest a solution for this? Thanks.