
I have a .sh script that reads lines from a .txt file to use as parameters for another shell script, processing two lines at a time in parallel. Currently, my code reads ALL lines in the file (two at a time) for the first .sh call, then ALL lines for the second .sh call, then ALL lines for the last .sh call.

Problem: I need the first two lines to go through the first .sh, then the second .sh, then the last .sh, and THEN loop back and process the next two lines. Help! :)

Now:

cat XXXXX.txt | while read -r line; do
export step=${line//\"/}
export step=ExecuteModel_${step//,/_}
export pov=$line

"$dir/hpm_ws_client.sh" processCalcScriptOptions "$appName" "$pov" "$layers" "$stages" "" "$stages" "$stages" FALSE > "$dir/${step}_ProcessID.log"
"$dir_shell_model/check_process_status.sh" "$dir" "$step" > "$dir/${step}_Monitor.log" &

"$dir/hpm_ws_client.sh" processCalcScriptOptions "$appName" "$pov" "$layers" "" "" "$stage_final" "" TRUE > "$dir/${step}_ProcessID.log"
"$dir/check_process_status.sh" "$dir" "$step" > "$dir/${step}_Monitor.log" &

"$dir/hpm_ws_client.sh" processGenealogyExecutionPaths "$appName" "$pov" "$layers" "$genPath_01" > "$dir/${step}_ProcessID.log"
"$dir/check_process_status.sh" "$dir" "$step" > "$dir/${step}_Monitor.log" &

if (( ++i % 2 == 0 )); then
    echo "Waiting..."
    wait
fi
done
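
The loop structure being asked for - run each pair of lines through all the steps before moving on to the next pair - can be sketched like this. This is a self-contained sketch, not the asker's actual code: `echo` stands in for the real hpm_ws_client.sh and check_process_status.sh calls, and povs_demo.txt is a made-up input file.

```shell
#!/bin/bash
# Self-contained demo: echo stands in for the real script calls.
printf '"a,1"\n"a,2"\n"b,1"\n"b,2"\n' > povs_demo.txt

while read -r pov1 && read -r pov2; do
    # Launch the model executions for this pair of lines in parallel...
    for pov in "$pov1" "$pov2"; do
        echo "execute model for $pov" &
    done
    wait    # ...and block until both have finished...
    # ...then run the serial step for each line, one at a time.
    for pov in "$pov1" "$pov2"; do
        echo "transfer data for $pov"
    done
done < povs_demo.txt
```

The `wait` between the two inner loops is what keeps the serial step from overlapping with the parallel executions of the same pair.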
  • It's really hard to see what you are trying to do - can you simplify it and also show the file(s) that you are trying to read? It seems to me that the way to do this is with `GNU Parallel`, which can read one or more arguments from one or more input files and process as many at a time as you like. It will probably remove all your loops and make your code several times smaller - I am guessing around 3-5 lines in total - and more efficient and more readable. – Mark Setchell Apr 29 '15 at 17:42
  • @MarkSetchell, I'd tend to suggest the GNU xargs `-P` flag long before GNU Parallel -- much simpler implementation (when used with `-0` or `-d $'\n'`, to avoid codepaths related to compatibility with some of the worst-considered bits of POSIX), in the "obviously no bugs" vs "no obvious bugs" sense. – Charles Duffy Apr 29 '15 at 19:32
  • Tangentially, what's with all the `export`s? By quick glance, none of these variables need to be exported unless the tools you call require them; but certainly, nothing needs to be exported more than once. – tripleee Apr 29 '15 at 19:37
  • @tripleee the exports are used to define the names of log files for the various processes....the tools do require them as well, this is just a piece of a much larger puzzle....that piece I am not too concerned with, I am just looking to process the TRANSFER DATA shell, in serial, when the first two parallel executions are complete, which is tracked by check_process_status.sh – Preston Alexander Apr 29 '15 at 19:52
  • for clarification....this works, but they are all in parallel, which will cause the execution of TRANSFER DATA to fail because the tool we are using can only handle one transfer at a time (although it can handle multiple executions simultaneously...I know, craziness!) – Preston Alexander Apr 29 '15 at 19:55
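
The `xargs -P` approach mentioned in the comments can be sketched like this (a minimal, self-contained sketch; `echo pair:` is a placeholder for the real worker script):

```shell
# GNU xargs: -d '\n' makes each input line one argument,
# -n 2 hands two lines at a time to the command, and
# -P 4 runs up to four commands in parallel.
printf 'a1\na2\nb1\nb2\n' | xargs -d '\n' -n 2 -P 4 echo pair:
```

Each worker invocation receives one pair of lines; the pairing follows input order even though the workers run concurrently.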

2 Answers


I cannot see what you are really trying to do, but hope one of these syntaxes will help - either reading two lines at a time, or loading the parameters into an array and re-using them.

So, if your file.txt looks like this:

line 1
line 2
line 3
line 4
line 5
line 6

Example 1 - with two reads

#!/bin/bash
while read a && read b; do
   echo "$a, $b"
done < file.txt

Output

line 1, line 2
line 3, line 4
line 5, line 6

Example 2 - with a bash array

#!/bin/bash
declare -a params
while IFS=$'\n' read -r z; do
    params+=("${z}")
done < file.txt

# Now print the elements out
for (( i=0; i<${#params[@]}; i++ )); do
   echo "${params[$i]}"
done

Output

line 1
line 2
line 3
line 4
line 5
line 6

Example 3 - with GNU Parallel

Or, as I suggested in my comment, use GNU Parallel like this

parallel -k -L2 echo {1} {2} < file.txt

Output

line 1 line 2
line 3 line 4
line 5 line 6

where `-k` means "keep the output in order" and `-L2` means "take 2 lines at a time from file.txt".

This has the advantage that, if you want to run 8 scripts at a time in parallel, you just specify `-j 8` to `parallel` and the job is done.
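
For instance (a sketch; `./process.sh` is a placeholder for your own script):

```shell
# Two lines per job, up to 8 jobs at once, output kept in input order:
parallel -k -j 8 -L2 ./process.sh {1} {2} < file.txt
```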

Mark Setchell

Well, it's not pretty and hopefully someone will offer a more elegant way to read successive pairs of lines in those loops, but one possibility is using a variable to track where you are, like

LOCATION=0
for file in first second last; do
    LOCATION=$((LOCATION+2))
    LINES=$(head -n "$LOCATION" "$file" | tail -n 2)
    # process lines
done
Eric Renouf
  • @MarkSetchell thanks for that, it's helpful to understand the approaches I could use here. I came up with a partial solution by just using && between the shell scripts and checking for two running processes. The problem I now have is that the second shell script I am running with the parameters (dependent on the success of the first one) needs to be run in serial. I am thinking maybe an if statement? – Preston Alexander Apr 29 '15 at 19:12
  • Having a numeric index to tell you how many lines to read *again* is a common anti-pattern. Try something like `while read first; do read second; command something "$first" "$second"; done` – tripleee Apr 30 '15 at 03:47