
I am using my department's computing cluster with Sun Grid Engine.

When I need to run multiple R jobs, I usually write shell scripts named s01.sh, s02.sh, ..., s50.sh, whose contents are 'R CMD BATCH r01.r', 'R CMD BATCH r02.r', ..., 'R CMD BATCH r50.r' respectively.

Then I open PuTTY, log in, and have to type 'qsub s01.sh', 'qsub s02.sh', and so on.

If there are hundreds of jobs, typing hundreds of commands by hand is real labor. Is there a way to issue all of these 'qsub' commands at once?

user67275

2 Answers


Assuming the scripts to be run are in the current folder:

for file in s*.sh; do qsub "$file"; done
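If you want to see what would be submitted before committing, a small dry-run variant of the same loop can print each command first. This is a sketch, not tested against SGE; it assumes the s*.sh naming from the question and that `qsub` is on your PATH when you pass `run`:

```shell
#!/bin/bash
# Dry run by default: print the qsub commands that would be issued.
# Pass "run" as the first argument to actually submit the jobs.
for file in s*.sh; do
    [ -e "$file" ] || continue      # skip if the glob matched nothing
    if [ "$1" = "run" ]; then
        qsub "$file"
    else
        echo "qsub $file"
    fi
done
```

Running it with no arguments just lists the jobs, so you can sanity-check the file list before submitting for real.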
Hamfry

I think you just need to run the qsub commands sequentially, since qsub itself should be fairly quick. (The submitted commands will probably run in parallel.)

You just need a loop.

Assuming you've already created the r*.r files, this is easy to do with a small shell script:

#!/bin/bash

for file in r*.r ; do
    # Derive the script name: r01.r -> s01.sh
    script=$(echo "$file" | sed 's/^r/s/;s/\.r$/.sh/')
    (
        echo "#!/bin/sh"
        echo "R CMD BATCH $file"
    ) > "$script"
    chmod +x "$script"
    qsub "$script"
done
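As an aside, the name transformation done with sed above can also be written with bash parameter expansion, which avoids spawning a subshell and a sed process per file. A minimal sketch of just that step:

```shell
#!/bin/bash
# Derive s01.sh from r01.r using parameter expansion instead of sed.
file=r01.r
script=${file/#r/s}      # replace the leading "r" with "s" -> s01.r
script=${script%.r}.sh   # strip the ".r" suffix and append ".sh" -> s01.sh
echo "$script"           # prints s01.sh
```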
Keith Thompson
  • Thank you, but I still don't understand. Should I type the code directly, or should I make a blahblah.sh file and put your code into the .sh file? – user67275 Oct 15 '16 at 02:45
  • @user67275: The latter is much easier. Create a script called, say, `submit-jobs.sh` containing the commands I wrote, run `chmod +x submit-jobs.sh`, then run the script. (Please note that I haven't tested it; I don't have `qsub` on my system.) – Keith Thompson Oct 15 '16 at 02:47