We are facing the problem that many people want to run different scientific software on our high-performance computing cluster. Every user requires a different set of libraries and library versions, and we do not want the administrator to have to install new libraries every time.
So we are thinking about using Docker containers for this purpose: every user can set up their own container with the userland libraries they require and then run their batch-processing jobs using this container.
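To make that concrete, here is a minimal sketch of the workflow I imagine, driving the Docker CLI from Python (the user name, image tag, Dockerfile location, and job script are made-up placeholders):

```python
import subprocess

# Each user builds an image from their own Dockerfile, which installs
# the userland libraries they need (paths and names are hypothetical).
subprocess.run(
    ["docker", "build", "-t", "alice/simulation:1.0", "/home/alice/docker"],
    check=True,
)

# A batch job then runs inside that image, with the user's scratch
# directory bind-mounted into the container for input/output data.
subprocess.run(
    [
        "docker", "run", "--rm",
        "-v", "/scratch/alice:/data",
        "alice/simulation:1.0",
        "python", "/data/run_job.py",
    ],
    check=True,
)
```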
However, as I see it, Docker is mainly focused on services rather than batch-processing jobs: usually you have a (e.g. web) service that is supposed to run all the time and process new jobs (which is basically always the same task with new input data) as soon as they come in.
Our situation is quite different: a new user should be able to set up new tasks to run on the hardware, and each batch-processing job should simply be allotted a certain amount of resources.
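As far as I understand, plain `docker run` can already enforce such per-job limits through its `--cpus` and `--memory` flags, so a resource-capped job might look roughly like this (numbers and names are again placeholders):

```python
import subprocess

# Run the job capped at 4 CPU cores and 8 GiB of RAM; Docker enforces
# these limits via cgroups. Image and command are hypothetical.
subprocess.run(
    [
        "docker", "run", "--rm",
        "--cpus", "4",
        "--memory", "8g",
        "alice/simulation:1.0",
        "python", "/data/run_job.py",
    ],
    check=True,
)
```

What this does not cover is queueing and scheduling such jobs across the cluster's nodes, which is the part I would rather not build myself.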
I am thus wondering whether there is already a solution for this scenario. I had a look at https://github.com/NERSC/shifter, which seems to go in the right direction, but its development has stalled.