
I'm writing a script that runs several commands, and these commands print some output on STDOUT (and STDERR as well, but that's no problem). I need the script to write a .tar.gz file to STDOUT, so when the output of those other commands also goes to STDOUT, the result is an invalid .tar.gz file.

So, in short: is it possible to send the output of the first commands to the screen (I still want to see it) but not via STDOUT? I would also like to keep STDERR untouched, so that only error messages appear there.

A simple example of what I mean. This would be my script:

#!/bin/bash

# the output of these commands shouldn't go to STDOUT, but still appear on screen
some_cmd foo bar
other_cmd baz

#the following command creates a tar.gz of the "whatever" folder,
#and outputs the result to STDOUT
tar zc whatever/

I've tried messing with exec and the file descriptors, but I still can't get it to work:

#!/bin/bash

# save STDOUT to #3
exec 3>&1

# the output of these commands should go to #3 and screen, but not STDOUT
some_cmd foo bar
other_cmd baz

# restore STDOUT
exec 1>&3

# the output of this command should be the only one that goes to STDOUT
tar zc whatever/

I guess I need to close STDOUT after the first exec and reopen it again afterwards, or something like that, but I can't find the right way to do it (right now the result is the same as if I hadn't added the execs).

Carlos Campderrós
  • I think you should add this comment to your question: "I want to output the tar.gz to stdout because this script will be called from a remote client over ssh using ssh-keys (`ssh user@remoteserver the_script.sh > file.tar.gz`)." – Maciek Sawicki Apr 16 '12 at 21:12

3 Answers


By default, stdout is the screen: when nothing is redirected, there isn't a separation between stdout and "the screen".

In this instance, I would just redirect stdout to stderr temporarily with exec 1>&2 within a subshell. This will cause the commands' output to be shown on screen, but it won't be part of the script's stdout stream.

#!/bin/bash

# the output of these commands shouldn't go to STDOUT, but still appear on screen

# Start a subshell
(
    exec 1>&2           # Redirect stdout to stderr for the rest of the subshell
    some_cmd foo bar
    other_cmd baz
)
# At the end of the subshell, the file descriptors are 
# as they usually are (no redirection) as the subshell has exited.

#the following command creates a tar.gz of the "whatever" folder,
#and outputs the result to STDOUT
tar zc whatever/
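
If you'd rather stay close to the exec approach from the question, here is a minimal sketch of one way to complete it (untested against your actual commands): point STDOUT at STDERR with exec, then restore it afterwards.

#!/bin/bash

# save the original STDOUT on fd 3, then point STDOUT at STDERR
exec 3>&1 1>&2

# these now print to the terminal via STDERR, not via the script's STDOUT
some_cmd foo bar
other_cmd baz

# restore STDOUT from fd 3 and close fd 3
exec 1>&3 3>&-

# only this output ends up on the script's STDOUT
tar zc whatever/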

Is there a reason you need to pipe the output of this script into something else? Typically you'd just have tar write to a file using the -f flag, or redirect only the tar command: tar zc whatever > filename.tar.gz (unless you were putting it onto a device such as a tape, or using it as a form of copy).

webtoe
  • I want to output the tar.gz to stdout because this script will be called from a remote client over ssh using ssh-keys (`ssh user@remoteserver the_script.sh > file.tar.gz`). – Carlos Campderrós Apr 16 '12 at 10:34
  • I considered the option of sending it to the `STDERR`, but if there's a way to leave the `STDERR` alone with only errors I would prefer that. I thought of sending output from the commands to `readlink -f /proc/$$/fd/1`, but I was searching for a more elegant way with shell builtins. – Carlos Campderrós Apr 16 '12 at 10:38
  • I assumed as much. I think the easiest way would be to avoid trying to pipe as you are doing and just use two commands: `ssh` to run your script and generate the tar file, and then `scp`/`rsync` to fetch it. Or you could set up a network pipe using `netcat` and send the output of tar over that: see [here](http://compsoc.dur.ac.uk/~djw/tarpipe.html) and the rough sketch after these comments. – webtoe Apr 16 '12 at 11:37
  • These other options are less desirable than redirecting STDOUT to STDERR for some commands. Anyway, thanks for giving it some more thought; your efforts are appreciated :) – Carlos Campderrós Apr 16 '12 at 11:51
  • STDOUT is whatever FD 1 is attached to. In `cmd > /dev/null`, STDOUT is `/dev/null`. – Tom Hale Jun 30 '18 at 08:36
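
A rough sketch of the netcat tar-pipe mentioned in the comments above; the host name (client_host) and port (7000) are placeholders, and nc option syntax differs between netcat implementations, so treat the flags as illustrative:

# on the receiving client: listen on a port and write whatever arrives to a file
nc -l -p 7000 > file.tar.gz

# on the remote server: pipe the archive straight into netcat
tar zc whatever/ | nc client_host 7000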

Wouldn't it be much easier to use the -f switch to tell tar to write to a file?

tar zcf whatever.tar.gz whatever/

If that doesn't do what you want, then you will have to individually redirect each command that may write to STDOUT:

some_cmd foo bar 1>&2
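
Put together, a minimal sketch of the whole script with that per-command redirection (using the placeholder commands from the question):

#!/bin/bash

# each noisy command goes to STDERR, so it still shows on the terminal
some_cmd foo bar 1>&2
other_cmd baz 1>&2

# STDOUT now carries nothing but the archive
tar zc whatever/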
user9517

I think you are looking for some kind of multiplexing.

Here is a simple example of how to append a timestamp to every stdout line: http://www.podciborski.co.uk/programming/perl/add-a-timestamp-to-every-stdout-line/
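
As a shell-only sketch of the same idea (the linked example uses Perl; the command name is the placeholder from the question), you could prefix each line like this:

some_cmd foo bar 2>&1 | while IFS= read -r line; do
    printf '%s %s\n' "$(date '+%H:%M:%S')" "$line"
done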

You could use some special tag instead of a timestamp to mark the log lines. Then you would have to strip those lines on the other end, so usage would look like this:

ssh user@remoteserver the_script.sh | create_tar.sh filename

create_tar.sh would be a script that prints the lines carrying the log tag and redirects the other lines to a file.
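
A very rough sketch of what such a filter could look like. None of these names or conventions come from the answer: it assumes the remote script prefixes its log lines with LOG: and base64-encodes the tar stream so the binary archive data stays line-safe (base64 -d is GNU coreutils syntax).

#!/bin/bash
# create_tar.sh (hypothetical) - split a tagged stream into screen output and an archive
# usage: ssh user@remoteserver the_script.sh | ./create_tar.sh file.tar.gz

outfile=$1

while IFS= read -r line; do
    case $line in
        LOG:*) printf '%s\n' "${line#LOG:}" >&2 ;;  # tagged log lines go to the terminal via STDERR
        *)     printf '%s\n' "$line" ;;             # everything else is base64-encoded archive data
    esac
done | base64 -d > "$outfile"

On the sending side, the_script.sh would then have to tag its own output accordingly, e.g. pipe each command's output through sed 's/^/LOG:/' and pipe the tar output through base64 before writing it to STDOUT.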

Maciek Sawicki
  • No, this doesn't serve my purpose, as it complicates things and I would also need to have the `create_tar.sh` script on every client that calls my script. – Carlos Campderrós Apr 16 '12 at 11:26
  • How about changing the script to produce a logfile and bundling it into the tar archive? Usage would be like this: `ssh user@remoteserver the_script.sh | tar -x -f - ; cat logfile` – Maciek Sawicki Apr 16 '12 at 21:11