
What I have done so far is:

#!/bin/bash

exec 2> >(sed 's/^/ERROR= /')

var=$(
        sleep 1
        hostname
        ifconfig | wc -l
        ls /sfsd
        ls hasdh
        mkdir /tmp/asdasasd/asdasd/asdasd
        ls /tmp
)

echo "$var"

This does prepend ERROR= at the start of each error line, but it displays all the errors first and then the stdout lines, not in the order in which they were produced.

If we skip storing the output in a variable and execute the commands directly, the output comes out in the desired order.

Any expert opinion would be appreciated.

avg598

3 Answers


The primary problem with your script is that the command substitution $(...) only captures the subshell's standard output; the subshell's standard error still just flows through to the parent shell's standard error. As it happens, you've redirected the parent shell's standard error in a way that ends up populating the parent shell's standard output; but that completely circumvents the $(...), which is only capturing the subshell's standard output.

Do you see what I mean?
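
For instance, here is a minimal sketch of that behavior (a contrived demo, not your actual commands): the line written to standard error bypasses the capture entirely and flows into the parent's sed, while only the standard-output line lands in the variable:

#!/bin/bash
# Parent's stderr now feeds sed; sed's output goes to the parent's stdout (the terminal).
exec 2> >(sed 's/^/ERROR= /')

var=$(
    echo "to stdout"        # captured by $(...)
    echo "to stderr" >&2    # NOT captured; it goes to the inherited stderr, i.e. into sed
)

echo "captured: $var"
# The ERROR= line shows up on the terminal whenever sed gets scheduled,
# independently of the captured line, because it bypassed the capture entirely.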

So, you can fix that by redirecting the subshell's standard error in a way that ends up populating its standard output, which is what gets captured:

var=$(
    exec 2> >(sed 's/^/ERROR= /')
    sleep 1
    hostname
    ifconfig | wc -l
    ls /sfsd
    ls hasdh
    mkdir /tmp/asdasasd/asdasd/asdasd
    ls /tmp
)

echo "$var"

Even so, this does not guarantee proper ordering of lines. The problem is that sed is running in parallel with everything else in the subshell, so while it's just received an error-line and is busy planning to write to standard output, one of the later commands in the subshell can be plowing ahead and already writing more things to standard output!

You can improve that by launching sed separately for each command, so that the shell will wait for sed to complete before proceeding to the next command:

var=$(
    sleep 1 2> >(sed 's/^/ERROR= /')
    hostname 2> >(sed 's/^/ERROR= /')
    { ifconfig | wc -l ; } 2> >(sed 's/^/ERROR= /')
    ls /sfsd 2> >(sed 's/^/ERROR= /')
    ls hasdh 2> >(sed 's/^/ERROR= /')
    mkdir /tmp/asdasasd/asdasd/asdasd 2> >(sed 's/^/ERROR= /')
    ls /tmp 2> >(sed 's/^/ERROR= /')
)

echo "$var"

Even so, sed will be running concurrently with each command, so if any of those commands is a complicated command that writes both to standard output and to standard error, then the order that that command's output is captured in may not match the order in which the command actually wrote it. But this should probably be good enough for your purposes.
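
For example, here is a contrived sketch with a hypothetical both function (not one of your commands) that alternates between the two streams; even with its own sed, the error lines can come out regrouped:

both () {
    echo "out 1"
    echo "err 1" >&2
    echo "out 2"
    echo "err 2" >&2
}

var=$( both 2> >(sed 's/^/ERROR= /') )

echo "$var"
# Possible result: the ERROR= lines end up grouped after the out lines, e.g.
#   out 1
#   out 2
#   ERROR= err 1
#   ERROR= err 2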

You can improve the readability a bit by creating a wrapper function for the simple-command (non-pipeline) case:

var=$(
    function fix-stderr () {
       "$@" 2> >(sed 's/^/ERROR= /')
    }

    fix-stderr sleep 1
    fix-stderr hostname
    fix-stderr eval 'ifconfig | wc -l'   # using eval to get a simple command
    fix-stderr ls /sfsd
    fix-stderr ls hasdh
    fix-stderr mkdir /tmp/asdasasd/asdasd/asdasd
    fix-stderr ls /tmp
)

echo "$var"
ruakh
  • Your answer explains lots of things that were happening unexpectedly, thank you. – avg598 Aug 15 '16 at 00:38
  • One problem with adding `2> >(sed 's/^/::ERROR:: /')` on each line is that if we want to execute another script file, we would run `./test.sh 2> >(sed 's/^/::ERROR:: /')`, which still gives randomly ordered output. – avg598 Aug 15 '16 at 00:40
  • @avg598: Yes, that's right. The whole Unix concept of "streams" is extremely powerful, but it doesn't offer any sort of concept of preserving chronological order between separate streams. You'd have to change your `./test.sh` to support the same sort of functionality. – ruakh Aug 15 '16 at 00:52

The sed command runs asynchronously from the rest of the shell; its output goes straight to the terminal as soon as it processes its input from the commands in the command substitution. The standard output of those commands, however, is captured in $var and not displayed until the echo command runs.

Even if you weren't capturing the output, there is a chance the standard error and standard output of those commands wouldn't appear as you expect, because the sed command that ultimately produces the error messages might not be scheduled by the OS when you expect it to be, delaying the appearance of the error messages.

When you run a command in the usual way from the terminal, that command's standard error and standard output point to the same file: the terminal itself. As such, writes to the file maintain the order in which they occur in the program. As soon as you pipe one or the other to another process, you lose all control over how the two are spliced back together, if ever. In your case, you are redirecting standard error to sed, which writes modified lines back to standard output. But you have no control over when the OS schedules sed to run and when your shell runs, so you can't control the order in which lines are written.
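
As a hypothetical illustration (a made-up writer function, not part of your script):

writer () {
    echo "stdout 1"
    echo "stderr 1" >&2
    echo "stdout 2"
    echo "stderr 2" >&2
}

writer              # both streams point at the terminal, so the four lines appear in the order written
writer 2> >(cat)    # stderr detours through cat; the relative order of the lines is no longer guaranteed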

It helps to redirect standard error separately for each command:

tag_error () { sed 's/^/ERROR= /'; }

hostname 2> >(tag_error)
{ ifconfig | wc -l ; } 2> >(tag_error)
# etc

but this still doesn't guarantee that writes from within the same program are ordered as if they were all writing to the same file.

(ruakh has covered how to combine this with capturing standard output, so I won't bother adding it now. See his answer.)

chepner
  • You can't; there are no markers to indicate which lines came from `sed` and which lines in `$var` correspond to each individual command in the command substitution. The relative output is only merged in the "regular" case because each process's stdout and stderr is the exact same file handle. – chepner Aug 14 '16 at 22:44
  • The whole prepending thing I wanted to do was to mark the lines to identify which are from stdout and which are from stderr, while at the same time maintaining the order. Is there any other way to do so? – avg598 Aug 14 '16 at 23:14

One possible solution would be putting the commands in an array then execute them within a loop:

declare -a cmds=('sleep 1' 'hostname' 'eval ifconfig | wc -l' 'ls /sfsd' 'ls /tmp' 'ls hasdh')

for i in "${cmds[@]}"; do
    $i 2> >(sed -E 's/^/ERROR=/')
done

When an error occurs, it should print in the same order in which it occurred during execution. Using a command such as sh script.sh within the array should also reveal any stdout or stderr from the resulting external script. For piped commands, an eval will likely be needed as well.
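
If you also want everything captured in a variable, as in the question, a minimal sketch combining this loop with a command substitution might look like the following; the ordering caveats discussed in the other answers still apply:

declare -a cmds=('sleep 1' 'hostname' 'eval ifconfig | wc -l' 'ls /sfsd' 'ls /tmp' 'ls hasdh')

var=$(
    for i in "${cmds[@]}"; do
        $i 2> >(sed -E 's/^/ERROR=/')
    done
)

echo "$var"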

l'L'l
  • But it won't work if one of the commands is to run a script file that produces both stderr and stdout. – avg598 Aug 14 '16 at 23:20
  • I'm not sure that I follow... If you had a command such as `sh myscript.sh` within the array it would still output both `stdout` and `stderr`. – l'L'l Aug 14 '16 at 23:25
  • Ok, let me verify, meanwhile check this [link](http://stackoverflow.com/questions/38947524/bash-how-to-store-output-of-set-of-commands-in-3-variables-out-err-combined). Main purpose of doing this is in that link. – avg598 Aug 14 '16 at 23:31
  • I would recommend deleting that question and editing your question here with any additional concerns. – l'L'l Aug 14 '16 at 23:32
  • Redirecting standard error separately for each command is a good idea; storing the commands in an array is not. – chepner Aug 14 '16 at 23:33
  • @l'L'l, I think this question limits the solution to that question. There might be other ways to do that. This is why I asked the two separate questions. – avg598 Aug 14 '16 at 23:34
  • @chepner, What other suggestion would you have, and why not store the commands in an array? – l'L'l Aug 14 '16 at 23:34
  • Read [Bash FAQ 050](http://mywiki.wooledge.org/BashFAQ/050) for details. – chepner Aug 14 '16 at 23:36
  • @l'L'l, can you please remove the -ve vote on the other question, since these two questions are different? – avg598 Aug 14 '16 at 23:40
  • @avg598: What makes you think I downvoted it? If you do, you're incorrect. – l'L'l Aug 14 '16 at 23:41
  • @chepner: Thanks for the link, and what it talks about makes sense. So I'm guessing by what it says that parsing the command from a .txt file would be better, although I'm not sure how that would be a whole lot different actually if an array of commands was constructed properly. I'm sort of in a hurry though, so can't add additional info to my answer atm; feel free to edit it if you want, cheers! :) – l'L'l Aug 14 '16 at 23:43
  • @l'L'l, I checked by executing `./test.sh 2> >(sed -E 's/^/ERROR=/')`, and it prints the error lines at the end, after all the stdout lines. So it doesn't work. – avg598 Aug 14 '16 at 23:52