
I need to launch a number of long-running processes with subprocess.Popen, and would like to have the stdout and stderr from each automatically piped to separate log files. Each process will run simultaneously for several minutes, and I want two log files (stdout and stderr) per process to be written to as the processes run.

Do I need to continually call p.communicate() on each process in a loop in order to update each log file, or is there some way to invoke the original Popen command so that stdout and stderr are automatically streamed to open file handles?

jww

3 Answers


You can pass stdout and stderr as parameters to Popen():

subprocess.Popen(args, bufsize=0, executable=None, stdin=None, stdout=None,
                 stderr=None, preexec_fn=None, close_fds=False, shell=False,
                 cwd=None, env=None, universal_newlines=False, startupinfo=None,
                 creationflags=0)

For example:

>>> import subprocess
>>> with open("stdout.txt","wb") as out, open("stderr.txt","wb") as err:
...    subprocess.Popen("ls",stdout=out,stderr=err)
... 
<subprocess.Popen object at 0xa3519ec>
>>> 
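Extending that to the question's actual scenario, a sketch (filenames and commands are illustrative, not from the answer): several processes run concurrently, each streaming stdout and stderr into its own pair of log files, with no `communicate()` loop at all.

```python
import subprocess
import sys

procs = []
files = []
for i in range(3):
    # One pair of log files per process.
    out = open("proc%d.stdout.log" % i, "wb")
    err = open("proc%d.stderr.log" % i, "wb")
    p = subprocess.Popen(
        [sys.executable, "-c",
         "import sys; print(%d); print(%d, file=sys.stderr)" % (i, i)],
        stdout=out, stderr=err)
    procs.append(p)
    files.append((out, err))

for p in procs:
    p.wait()  # the OS does the piping; nothing to pump from Python
for out, err in files:
    out.close()
    err.close()

out_results = [open("proc%d.stdout.log" % i).read().strip() for i in range(3)]
err_results = [open("proc%d.stderr.log" % i).read().strip() for i in range(3)]
```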
Saullo G. P. Castro
John La Rooy
  • Woa! I had no idea you could chain `open`s like that in a context manager. #mindblown – hwjp Oct 31 '14 at 18:18
  • Doesn't work for me! `with open("server.log", "wb") as out: sp.call(server_cmd, stdout = out)` – Ayush Apr 15 '15 at 13:53
  • Why does it have to be "wb"? Is there anyway to force the output to be unicode? – O.rka Jul 26 '19 at 20:04
  • 1
    You can use the encode argument with open like so with open("abc.txt","wb",encoding="utf-8") as file_object – Dhruv Marwha Aug 09 '19 at 06:06
  • 1
    How long do those filehandles remain valid? If the `with` scope closes, but the subprocess is still running beyond that, do the file handles remain valid until the subprocess terminates? Do they then automatically close? – davidA Jan 13 '22 at 02:25
  • @davidA, the file handles close at the end of the with block. Whatever is done with the Popen object should occur within the with block – John La Rooy Jan 19 '22 at 00:55
  • @JohnLaRooy Does it though? I've tried running `subprocess.Popen("sleep 30 && echo OK", shell=True, stdout=f)` within a `with` block and it worked just fine, even though the file was closed before the process actually printed to `stdout`. – Aratz Feb 28 '23 at 14:30
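The last comment's observation can be checked directly. A sketch (names are illustrative): Popen duplicates the file descriptor into the child process at spawn time, so the child keeps a valid handle even after the Python file object is closed at the end of the `with` block.

```python
import os
import subprocess
import sys
import tempfile

path = os.path.join(tempfile.mkdtemp(), "child.log")
with open(path, "wb") as out:
    # The child gets its own duplicate of this descriptor.
    p = subprocess.Popen(
        [sys.executable, "-c", "import time; time.sleep(1); print('OK')"],
        stdout=out)
# Our file object is closed here, but the still-running child holds its
# own copy of the descriptor and can keep writing until it exits.
p.wait()
with open(path) as f:
    content = f.read().strip()
```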

Per the docs,

stdin, stdout and stderr specify the executed program's standard input, standard output and standard error file handles, respectively. Valid values are PIPE, an existing file descriptor (a positive integer), an existing file object, and None.

So just pass the open-for-writing file objects as named arguments stdout= and stderr= and you should be fine!
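A minimal sketch of that (filenames and the child command are illustrative): open one file per stream and pass the file objects straight to Popen.

```python
import subprocess
import sys

with open("stdout.log", "wb") as out, open("stderr.log", "wb") as err:
    p = subprocess.Popen(
        [sys.executable, "-c",
         "import sys; print('to stdout'); print('to stderr', file=sys.stderr)"],
        stdout=out, stderr=err)
    p.wait()  # keep the handles open until the child has exited

with open("stdout.log") as f:
    out_text = f.read().strip()
with open("stderr.log") as f:
    err_text = f.read().strip()
```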

Alex Martelli
  • Thanks. I could have sworn I tried that before and got an error, but that's exactly what I was hoping would work. –  Feb 25 '10 at 04:53
  • That doesn't work for me. I am simultaneously running two processes and save the stdout and stderr from both into one log file. If the output gets too big, one of the subprocesses hangs; don't know which. I can't use formatting in a comment so I'll append an "answer" below. – jasper77 Sep 22 '10 at 21:01

I am simultaneously running two subprocesses and saving the output from both into a single log file. I have also built in a timeout to handle hung subprocesses. When the output gets too big, the timeout always triggers, and none of the stdout from either subprocess gets saved to the log file. The answer posted by Alex above does not solve it.

import os
import signal
import subprocess
import sys
import time

# Currently open log file.
log = None

# If we send stdout to subprocess.PIPE, the tests with lots of output fill up
# the pipe and make the script hang. So, write the subprocess's stdout
# directly to the log file.
def run(cmd, logfile):
    global log
    p = subprocess.Popen(cmd, shell=True, universal_newlines=True,
                         stderr=subprocess.STDOUT, stdout=logfile)
    log = logfile
    return p


# To make a subprocess capable of timing out
class Alarm(Exception):
    pass

def alarm_handler(signum, frame):
    log.flush()
    raise Alarm


####
## This function runs a given command with the given flags, and records the
## results in a log file.
####
def runTest(cmd_path, flags, name):

    log = open(name, 'w')

    print("header", file=log)
    log.flush()

    cmd1_ret = run(cmd_path + "command1 " + flags, log)
    log.flush()
    cmd2_ret = run(cmd_path + "command2", log)
    sys.stdout.flush()

    start_timer = time.time()  # time how long this took to finish

    signal.signal(signal.SIGALRM, alarm_handler)
    signal.alarm(5)  # seconds

    try:
        cmd1_ret.communicate()

    except Alarm:
        print("myScript.py: Oops, taking too long!")
        os.kill(cmd1_ret.pid, signal.SIGKILL)
        os.kill(cmd2_ret.pid, signal.SIGKILL)

    end_timer = time.time()
    print("closing message", file=log)

    log.close()
jasper77
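On Python 3.3+, the SIGALRM-and-`kill -9` machinery above can usually be replaced by the `timeout` argument to `communicate()`. A hedged sketch (the child command and log filename are illustrative, not from the answer):

```python
import subprocess
import sys

with open("run.log", "w") as log:
    # A deliberately hung child: sleeps far longer than our timeout.
    p = subprocess.Popen(
        [sys.executable, "-c", "import time; time.sleep(60)"],
        stdout=log, stderr=subprocess.STDOUT)
    try:
        p.communicate(timeout=2)  # seconds
        timed_out = False
    except subprocess.TimeoutExpired:
        p.kill()         # replaces the os.system("kill -9 ...") dance
        p.communicate()  # reap the killed process
        timed_out = True
```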