
I work at a large corporate organization where we have a Hadoop cluster. I got the admin to install virtualenv on all the Hadoop worker nodes so that I can submit mrjob jobs with standard Python dependencies that may not be installed on the worker nodes. As per the documentation here, this is what my mrjob.conf file looks like:

runners:
  hadoop:
    setup:
    - virtualenv venv
    - . venv/bin/activate
    - pip install nltk
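
To make the setup concrete, here is a minimal sketch of the kind of job it is meant to support (illustrative only; nltk.word_tokenize also needs the punkt tokenizer data, which pip install nltk does not download):

from mrjob.job import MRJob


class MRNltkTokenCount(MRJob):
    """Count tokens in the input using nltk's tokenizer."""

    def mapper(self, _, line):
        # Import nltk inside the task method so the import happens on the
        # worker node, after the setup commands have activated the virtualenv.
        import nltk
        for token in nltk.word_tokenize(line):
            yield token.lower(), 1

    def combiner(self, token, counts):
        yield token, sum(counts)

    def reducer(self, token, counts):
        yield token, sum(counts)


if __name__ == '__main__':
    MRNltkTokenCount.run()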

I have a simple job that uses the nltk package. I can verify that this setup script runs on the worker nodes (I can add simple commands, such as writing some data to a file in /tmp, and they work). However, I get the following error:

New python executable in venv/bin/python
Installing setuptools............done.
Installing pip...
  Error [Errno 13] Permission denied while executing command /storage5/hadoop/map...env/bin/easy_install /usr/share/python-virtualenv/pip-1.1.tar.gz
...Installing pip...done.
Traceback (most recent call last):
  File "/usr/bin/virtualenv", line 3, in <module>
    virtualenv.main()
  File "/usr/lib/python2.7/dist-packages/virtualenv.py", line 938, in main
    never_download=options.never_download)
  File "/usr/lib/python2.7/dist-packages/virtualenv.py", line 1054, in create_environment
    install_pip(py_executable, search_dirs=search_dirs, never_download=never_download)
  File "/usr/lib/python2.7/dist-packages/virtualenv.py", line 643, in install_pip
    filter_stdout=_filter_setup)
  File "/usr/lib/python2.7/dist-packages/virtualenv.py", line 976, in call_subprocess
    cwd=cwd, env=env)
  File "/usr/lib/python2.7/subprocess.py", line 679, in __init__
    errread, errwrite)
  File "/usr/lib/python2.7/subprocess.py", line 1249, in _execute_child
    raise child_exception
OSError: [Errno 13] Permission denied
java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
    at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:362)
    at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:572)
    at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:136)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:57)
    at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1438)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)

What may be causing this error?


1 Answer


Thanks for this idea for deploying packages to the cluster.

As for your problem, it looks like the user the Hadoop task runs as doesn't have permission to write to the directory in which the virtualenv is being created.
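
One way to check (just a sketch, assuming the setup commands are run as an ordinary shell script from the task's working directory and that their output ends up in the task's stderr logs) is to add a couple of diagnostic commands before the virtualenv call:

runners:
  hadoop:
    setup:
    - id
    - pwd
    - ls -ld .
    - virtualenv venv
    - . venv/bin/activate
    - pip install nltk

If the directory does turn out to be writable, it may also be worth checking whether the filesystem it lives on is mounted noexec, since the traceback shows virtualenv failing with Errno 13 while trying to execute the venv/bin/easy_install script it has just written.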
