
Looking at Cloudera's installation instructions, I don't see any mention of how to run jobs as regular users.

When I try to run a sample job, this is what I get:

hadoop jar /usr/lib/hadoop/hadoop-*-examples.jar pi 2 100000
Number of Maps  = 2
Samples per Map = 100000
Wrote input for Map #0
Wrote input for Map #1
Starting Job
org.apache.hadoop.security.AccessControlException: org.apache.hadoop.security.AccessControlException: Permission denied: user=myuser, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x

One solution would be to set the permissions of the HDFS root ("/") to allow writes by all users.
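For reference, a sketch of that approach (generally discouraged on shared clusters, since it leaves the HDFS root world-writable; assumes the HDFS superuser account is named hdfs):

```shell
# Open "/" in HDFS to writes from all users -- insecure, quick workaround only
sudo -u hdfs hadoop fs -chmod 777 /
```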

Another solution I'm seeing online is to set the property mapreduce.jobtracker.staging.root.dir, but I'm not sure where that should be set: http://getsatisfaction.com/cloudera/topics/unable_to_run_mapreduce_job_in_cdh3_cluster_permission_denied

I'm guessing there is a standard way this is handled; surely not every user running Hadoop jobs has root access, and leaving the HDFS root wide open can't be standard practice either.
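The usual convention, as far as I can tell, is to give each job-submitting user a home directory under /user in HDFS so staging and output paths land there instead of "/". A sketch, assuming the HDFS superuser is hdfs and the job user is myuser:

```shell
# Create a per-user home directory in HDFS and hand ownership to that user
sudo -u hdfs hadoop fs -mkdir /user/myuser
sudo -u hdfs hadoop fs -chown myuser:myuser /user/myuser
```

With that in place, jobs run as myuser can write under /user/myuser without touching the permissions on "/".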

Edit: still stuck on this, but I reposted the question to Cloudera's mailing list. Hopefully someone there or here will reply :) thanks!

Dolan Antenucci

1 Answer


I was able to get this working with the following setting (in mapred-site.xml):

<configuration>
    <property>
        <name>mapreduce.jobtracker.staging.root.dir</name>
        <value>/user</value>
    </property>

    <!-- ... -->

</configuration>

A restart of the jobtracker service was required as well (special thanks to Jeff on the Hadoop mailing list for helping me track down the problem!)
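For example, something like the following; the exact service name varies by distribution and version, and the CDH3-style name used here is an assumption:

```shell
# Restart the jobtracker so it picks up the new staging root setting
sudo service hadoop-0.20-jobtracker restart
```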
