3

I am trying to run a Perl script on Linux and log all output sent to STDOUT and STDERR to a file using:

open (STDOUT, "| tee -i $transcript_file");
open (STDERR, "| tee -ai $transcript_file");

The script that uses this works roughly as follows:

  1. Create an environment for running a tool. Has many print, warn and possibly die statements.
  2. Run the tool (currently using the system command). This produces a lot of output which I want to appear on STDOUT, but not in the logfile (the tool creates its own logfile).
  3. Analyze the results, cleanup and exit. Has many print, warn and possibly die statements.

Everything works correctly, except that I would like to exclude the output of step 2 from the log. Is there a simple way to achieve this?

Thanks,

PS: This is my first question on stackoverflow. Please help me in asking questions correctly if I have not done so.

Hans Lub
Abhishek
  • Very OK question. In general: don't use tag names ("perl") in your title. I changed the title to state the programming problem, instead of the effect you want to achieve with it. – Hans Lub Feb 06 '15 at 12:48

2 Answers

4

I agree with Sobrique's advice to use a special function print_and_log. But if you really want to do it the way you set out to, you can dup STDOUT and STDERR, redirect them to your log, and then use open3 to run your tool with the dup'ed original standard output and error file descriptors.

use IPC::Open3;

# dup the old standard output and error 
open(OLDOUT, ">&STDOUT") or die "Can't dup STDOUT: $!\n";
open(OLDERR, ">&STDERR") or die "Can't dup STDERR: $!\n";

# reopen stdout and stderr
open (STDOUT, "|tee $transcript_file") or die "Can't reopen STDOUT: $!\n";
open (STDERR, ">&STDOUT")              or die "Can't reopen STDERR: $!\n";

# print statements now write to log
print "Logging important info: blah!\n";
print STDERR "OOPS!\n";

# run your command; its output will go to the original stdout and stderr
# open3() die()s on failure instead of returning -1, so wrap it in eval {} to catch errors
my $pid = open3("<&STDIN", ">&OLDOUT", ">&OLDERR", $command);

# wash those dishes....
waitpid( $pid, 0 );
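
If you also need the tool's exit status (as the comments below mention), a minimal sketch, assuming the $command and the dup'ed handles from the snippet above are in scope: wrap the open3/waitpid pair in eval and read the status out of $?.

# sketch only: trap open3 failures and recover the tool's exit code
my $exit_code;
eval {
    my $pid = open3("<&STDIN", ">&OLDOUT", ">&OLDERR", $command);
    waitpid($pid, 0);
    $exit_code = $? >> 8;    # the child's exit status sits in the high byte of $?
};
warn "Could not run $command: $@" if $@;
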
Hans Lub
  • Hmm, nice line of thinking. Hadn't considered that approach. Certainly would lend itself to some more complicated tasks well. – Sobrique Feb 06 '15 at 12:32
  • Thanks, this worked wonderfully. It took some time to capture return value of the shell script (exit EXITCODE), but using $? >> 255 helped along with using eval like you suggested.. – Abhishek Feb 09 '15 at 14:12
0

Given you're reassigning STDOUT and STDERR, the short answer is no. You're capturing everything on STDOUT, which includes your intermediate output.

You could probably close/reopen STDOUT for the bit where you didn't want it logging. But I'd suggest instead that you might want to consider what you're trying to accomplish - would a 'print_and_log' subroutine do what you're after?
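
Such a print_and_log routine isn't spelled out in either answer; a minimal sketch of what it could look like, assuming the $transcript_file from the question:

# open the transcript once, near the top of the script
open(my $log_fh, '>>', $transcript_file) or die "Can't open $transcript_file: $!\n";

sub print_and_log {
    my @msg = @_;
    print @msg;             # normal STDOUT, still visible on the terminal
    print {$log_fh} @msg;   # and a copy into the transcript
}

Steps 1 and 3 would then call print_and_log(...) instead of print, while the tool's output in step 2 only ever reaches STDOUT.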

Sobrique
  • `system` calls always write to file descriptors 1 and 2. These usually correspond to `STDOUT` and `STDERR` in Perl, but it is possible to manipulate them to map to other file descriptors. – mob Feb 06 '15 at 20:59
  • Not exactly. Manipulating STDOUT & STDERR only affects most internal printing inside Perl; unfortunately it doesn't affect externally called commands. Only physically reopening works everywhere: system streams 1 & 2 must be reopened and attached to the selected stream, file, terminal or .... /dev/null. – Znik Jan 05 '17 at 12:00