
I'm running this command to resolve the problem of corrupt HDFS files:

hdfs fsck /

And I get this warning:

Connecting to namenode via http://master1:50070
FSCK started by root (auth:SIMPLE) from /192.168.1.30 for path / at Mon Oct 24 05:06:23 EDT 2016
FSCK ended at Mon Oct 24 05:06:23 EDT 2016 in 1 milliseconds
Permission denied: user=root, access=READ_EXECUTE, inode="/accumulo":accumulo:accumulo:drwxr-x--x

Any help, please!

G.Saleh

3 Answers


You cannot execute fsck as a normal user. You should run it as the hdfs superuser:

sudo -u hdfs hdfs fsck /

If it's only the /accumulo directory you care about, you can try the command below:

sudo -u accumulo hdfs fsck /accumulo
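
Once fsck runs with sufficient permissions, a couple of its standard options help with the original goal of dealing with corrupt files. A quick sketch (note that -delete permanently removes the corrupt files, so use it with care):

sudo -u hdfs hdfs fsck / -list-corruptfileblocks   # print only the corrupt blocks/files
sudo -u hdfs hdfs fsck / -delete                   # delete the corrupt files (data loss, use with care)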
BruceWayne

You should run this command as the "hdfs" user:

sudo -u hdfs hdfs fsck /
facha

It's a permission issue.

Run this command from the Hadoop bin directory and then try again:

hadoop fs -chmod -R 777 /accumulo
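
If you'd rather not open the directory up that wide, you can first check who owns it; a quick sketch using the standard FsShell listing (the owner and group of /accumulo show up in the output):

hadoop fs -ls /

Note that the chmod itself has to be run by a user allowed to change /accumulo, for example the hdfs superuser: `sudo -u hdfs hadoop fs -chmod -R 777 /accumulo`.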

To disable permission checking, set the property below in hdfs-site.xml and restart your cluster:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
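
Depending on your Hadoop version, the property name differs: on Hadoop 2.x the current key is dfs.permissions.enabled (dfs.permissions is the older, deprecated name), so the equivalent entry would look like:

<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>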
Kumar
  • It doesn't work, but when I do `sudo -u hdfs hadoop fs -chmod -R 777 /accumulo` it works fine. But when I do this I get another "permission denied" on another inode. – G.Saleh Oct 25 '16 at 08:45