
I have an account on a supercomputing cluster where I've installed some packages using e.g. "pip install --user keras".

When using qsub to submit jobs to the queue, I try to make sure the system can see my local packages by putting "export PYTHONPATH=$PYTHONPATH:$HOME/.local/lib/python2.7/site-packages/keras" in the script.

However, the resulting log file still complains that there is no package called keras. How can I make sure the system finds my packages?
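For context, the submission script looks roughly like this (a simplified sketch; the resource requests and the script name train.py are placeholders, not my exact setup):

#!/bin/bash
#PBS -l nodes=1:ppn=1
#PBS -l walltime=00:10:00

# make user-installed packages visible to Python on the compute node
export PYTHONPATH=$PYTHONPATH:$HOME/.local/lib/python2.7/site-packages/keras

python train.py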

user1634426

2 Answers


If you are using PBS Professional, export PYTHONPATH in your environment and then submit the job with the "-V" option to qsub. This makes qsub capture all of your environment variables and export them to the job. Otherwise, try the "-v" option (note the lowercase v) and pass your environment variable key/value pairs with it, e.g. qsub -v HOME=/home/user job.sh
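A minimal sketch of both approaches (the job script name and the site-packages path are placeholders, assuming a typical pip install --user layout):

# PBS Professional: export in the submitting shell, then forward the whole environment
export PYTHONPATH=$PYTHONPATH:$HOME/.local/lib/python2.7/site-packages
qsub -V job.sh

# Or forward only the variables you name
qsub -v PYTHONPATH=$HOME/.local/lib/python2.7/site-packages job.sh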

wabbit

Are the Python packages accessible from the compute nodes? If they aren't, the first step is to get them installed somewhere the compute nodes can see.
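A quick way to check is to submit a trivial job that lists the install location from a compute node and then look at the job's output file (a sketch; the path assumes a typical pip install --user layout):

echo 'ls $HOME/.local/lib/python2.7/site-packages' | qsub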

Once the packages are accessible, it should just be a matter of setting up your environment correctly. For Torque, you can set the environment on a per-job basis using -V and/or -v. The -V option exports the environment from the shell where the job is submitted into the job. If you only want to send a few variables, -v may be all you need:

qsub script.sh -v PYTHONPATH=<desiredpath>[,var2name=var2value[,...]]
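For example, to forward a user-local site-packages directory (the path is an assumption based on a typical pip install --user layout):

qsub script.sh -v PYTHONPATH=$HOME/.local/lib/python2.7/site-packages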

Documentation

dbeer