
I have a small application that reads a file from my local machine and writes the data into HDFS.

Now I want to list the files present in the HDFS folder, say HadoopTest. When I try to do that, I get the exception below:

org.apache.hadoop.security.AccessControlException: Permission denied: user=rpoornima, access=EXECUTE, inode="/hbase/HadoopTest/Hadoop_File_1.txt":rpoornima:hbase:-rw-r--r--
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:205)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:161)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:128)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:4547)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkTraverse(FSNamesystem.java:4523)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListingInt(FSNamesystem.java:3312)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getListing(FSNamesystem.java:3289)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getListing(NameNodeRpcServer.java:652)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getListing(ClientNamenodeProtocolServerSideTranslatorPB.java:431)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44098)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)

I'm not sure how to resolve this issue. Kindly give your inputs.

1 Answer


Your exception is clear enough to show the problem.

As the exception says:

    Permission denied: user=rpoornima, access=EXECUTE,
    inode="/hbase/HadoopTest/Hadoop_File_1.txt":rpoornima:hbase:-rw-r--r--

This means your account rpoornima has only -rw-r--r-- permission (no execute) on the file /hbase/HadoopTest/Hadoop_File_1.txt. So you have to use another account with full privileges to do the execution.
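
For reference, a minimal listing sketch in Java (the NameNode URI and class name are assumptions; the directory path is from the question). Note from the `checkTraverse` frames in the stack trace that whichever user runs this needs EXECUTE (traverse) permission on every directory component of the path:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ListHadoopTest {
        public static void main(String[] args) throws Exception {
            // The NameNode URI is an assumption; substitute your cluster's address.
            FileSystem hdfs = FileSystem.get(new URI("hdfs://localhost:8020"), new Configuration());
            // List the directory itself, not an individual file inside it.
            for (FileStatus status : hdfs.listStatus(new Path("/hbase/HadoopTest"))) {
                System.out.println(status.getPermission() + " " + status.getPath());
            }
            hdfs.close();
        }
    }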


UPDATE

If you want to give access to a specified user, use the chown command (to change ownership) or chmod (to change the permission bits).

chown

Usage: hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ]

Change the owner of files. The user must be a super-user. Additional information is in the Permissions Guide.

Options

The -R option will make the change recursively through the directory structure.
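
For example, a concrete invocation of the documented command (the path is from the question; chown must be run by a superuser such as the hdfs user, and the owner/mode values here are assumptions):

    hadoop fs -chown -R rpoornima:hbase /hbase/HadoopTest
    hadoop fs -chmod -R 755 /hbase/HadoopTest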

  • Thank you for the solution. The other users 'hdfs' and 'root' have full access, but I'm not sure how to access the 'HadoopTest' folder as user 'root' from Java code. Any idea how to access it as a different user? – dove4evr Aug 31 '15 at 07:03
  • The exception says you are running as `user=rpoornima`, which has no privilege on the specified directory, not as `root`. – luoluo Aug 31 '15 at 08:56
  • The user rpoornima is given full access in the passwd file: rpoornima:x:102:106. I figured out that each time I write a new file, the permission gets reset to -rw-r--r--. Hence, in the Java code I set the permission as given below: `Path path = new Path((hdfsurl.append("/hbase/HadoopTest/Hadoop_File_"+filecount+".txt")).toString()); os = hdfs.create(path); hdfs.setPermission(path, new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL));` After this the file is written with the permission below: -rw-rw-rw- 3 rpoornima hbase HadoopTest/Hadoop_File_1.txt – dove4evr Sep 01 '15 at 05:46
  • Still, when I execute the Java code, I get the same exception: org.apache.hadoop.security.AccessControlException: Permission denied: user=rpoornima, access=EXECUTE, inode="/hbase/HadoopTest/Hadoop_File_1.txt":rpoornima:hbase:-rw-rw-rw- – dove4evr Sep 01 '15 at 05:47
  • You failed to add `EXECUTE` access, as the exception says: `-rw-rw-rw-`. – luoluo Sep 01 '15 at 06:03
  • How do I add EXECUTE access in Java code? I implemented `hdfs.setPermission(path, new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL)); hdfs.setPermission(path, new FsPermission(FsAction.EXECUTE, FsAction.ALL, FsAction.ALL));`, but I'm still getting the access exception. – dove4evr Sep 01 '15 at 06:41
  • If I have execute access, what will the permission syntax look like? – dove4evr Sep 01 '15 at 06:42
  • `-rwxrwxrwx` is full access for a file. – luoluo Sep 01 '15 at 06:47
  • Any idea how to write the file in HDFS with execute access? I tried various methods; still not working. Currently my code looks like this: `hdfs = FileSystem.get(new URI(hdfsurl.toString()), conf); Path path = new Path((hdfsurl.append("/hbase/HadoopTest/Hadoop_File_"+filecount+".txt")).toString()); os = hdfs.create(path); hdfs.setPermission(path, new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL)); hdfs.setPermission(path, new FsPermission(FsAction.EXECUTE, FsAction.ALL, FsAction.ALL)); IOUtils.copyBytes(is, os, 4096, false);` – dove4evr Sep 01 '15 at 06:59
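
Pulling the comment thread together, a minimal sketch of the write-then-setPermission approach (the NameNode URI, class name, file name, and sample data are assumptions based on the snippets above). Note that setPermission replaces the whole permission set, so the second call with FsAction.EXECUTE for the owner would drop the owner's read/write bits; a single FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL) call already yields -rwxrwxrwx:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsAction;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class WriteWithPermission {
        public static void main(String[] args) throws Exception {
            // The NameNode URI and file name are assumptions from the comments above.
            FileSystem hdfs = FileSystem.get(new URI("hdfs://localhost:8020"), new Configuration());
            Path path = new Path("/hbase/HadoopTest/Hadoop_File_1.txt");
            try (FSDataOutputStream os = hdfs.create(path)) {
                os.writeBytes("sample data");
            }
            // setPermission sets all bits at once; ALL/ALL/ALL is -rwxrwxrwx (777).
            // Do not follow it with a second call that narrows the owner to EXECUTE only.
            hdfs.setPermission(path, new FsPermission(FsAction.ALL, FsAction.ALL, FsAction.ALL));
            hdfs.close();
        }
    }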