I noticed that subprocess.call waits for the command to finish before proceeding with the Python script, but it gives me no way of getting the stdout, unlike subprocess.Popen. Are there any alternative function calls that would wait until the command finishes and still give me the output? (I also tried Popen.wait.)

NOTE: I'm trying to avoid an os.system call

result = subprocess.Popen([commands...,
                        self.tmpfile.path()], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = result.communicate()
print out+"HIHIHI"

my output:

HIHIHI

NOTE: I am trying to run wine with this.

Ciro Santilli OurBigBook.com
Stupid.Fat.Cat
  • *subprocess.call()* can be used to read out/err; [please check the manual](http://docs.python.org/2/library/subprocess.html). Ensure the command doesn't generate a lot of output. – tuxuday Nov 15 '12 at 13:25

5 Answers

I am using the following construct, although you might want to avoid shell=True. This gives you the output and error message for any command, and the error code as well:

process = subprocess.Popen(cmd, shell=True,
                           stdout=subprocess.PIPE, 
                           stderr=subprocess.PIPE)

# wait for the process to terminate
out, err = process.communicate()
errcode = process.returncode
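
Note that on Python 3 `out` and `err` come back as byte strings with this construct. A minimal sketch of decoding them, using a hypothetical echo command in place of a real one:

import subprocess

# hypothetical command that writes to both streams; shell=True as in the construct above
process = subprocess.Popen('echo hello && echo oops >&2', shell=True,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)

# wait for the process to terminate
out, err = process.communicate()
print(out.decode(), err.decode())          # bytes -> str
print('exit status:', process.returncode)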
Alex

subprocess.check_output(...)

calls the process, raises if its error code is nonzero, and otherwise returns its stdout. It's just a quick shorthand so you don't have to worry about PIPEs and things.
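
A minimal sketch, using hypothetical echo/false commands; a nonzero exit status raises subprocess.CalledProcessError:

import subprocess

# the command's stdout is returned as bytes; a nonzero exit raises CalledProcessError
out = subprocess.check_output(['echo', 'hello'])
print(out)

try:
    subprocess.check_output(['false'])               # always exits with status 1
except subprocess.CalledProcessError as exc:
    print('command failed with code', exc.returncode)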

Katriel

If your process gives a huge stdout and no stderr, communicate() might be the wrong way to go due to memory restrictions.

Instead,

process = subprocess.Popen(cmd, shell=True,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE)

# read the output line by line as it is produced
for line in process.stdout:
    do_something(line)

# wait for the process to terminate so that returncode is set
errcode = process.wait()

might be the way to go.

process.stdout is a file-like object which you can treat like any other such object; in particular:

  • you can read() from it
  • you can readline() from it and
  • you can iterate over it.

The latter is what I do above in order to get its contents line by line.
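
As a concrete sketch, with a hypothetical wine invocation and a `do_something` that just prints each line with a prefix (`universal_newlines=True` makes the lines plain strings rather than bytes):

import subprocess

# hypothetical command
process = subprocess.Popen(['wine', 'program.exe'],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           universal_newlines=True)

for line in process.stdout:
    print('stdout:', line.rstrip())    # do_something(line)

errcode = process.wait()               # returncode is set once the process has exited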

Stan Prokop
glglgl
  • Could you explain to me what the for loop does? – Stupid.Fat.Cat Nov 15 '12 at 13:35
  • @Stupid.Fat.Cat It reads the `stdout` produced from the process line by line and processes each line. – glglgl Mar 08 '16 at 19:51
  • I get `ValueError: I/O operation on closed file` for the operation `for line in process.stdout` – Yash89 Feb 10 '17 at 23:49
  • @glglgl You might want to change process(line) to something like do_something(line) to avoid confusion with the process identifier/variable that you set on the first line. :) – NYCeyes May 24 '17 at 17:53
  • do_something(line) was not clear to me; I just want to display each line of output: `for line in process.stdout: print("stdout :", line)` and `for line in process.stderr: print("stderr :", line)` – MaxiReglisse Apr 16 '19 at 13:02

I'd try something like:

#!/usr/bin/python
from __future__ import print_function

import shlex
from subprocess import Popen, PIPE

def shlep(cmd):
    '''shlex split and popen
    '''
    parsed_cmd = shlex.split(cmd)
    ## if parsed_cmd[0] not in approved_commands:
    ##     raise ValueError("Bad User!  No output for you!")
    proc = Popen(parsed_cmd, stdout=PIPE, stderr=PIPE)
    out, err = proc.communicate()
    return (proc.returncode, out, err)

... In other words let shlex.split() do most of the work. I would NOT attempt to parse the shell's command line, find pipe operators and set up your own pipeline. If you're going to do that then you'll basically have to write a complete shell syntax parser and you'll end up doing an awful lot of plumbing.

Of course this raises the question: why not just use Popen with the shell=True (keyword) option? This lets you pass a string (no splitting or parsing) to the shell and still gather up the results to handle as you wish. My example here won't process any pipelines, backticks, file descriptor redirection, etc. that might be in the command; they'll all appear as literal arguments to the command. Thus it is still safer than running with shell=True. I've given a silly example of checking the command against some sort of "approved command" dictionary or set, though it would make more sense to normalize that into an absolute path unless you intend to require that the arguments be normalized prior to passing the command string to this function.
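
A quick usage sketch, assuming the shlep() function above and an ordinary ls command (on Python 3 the captured streams are byte strings, hence the decode()):

returncode, out, err = shlep('ls -l /tmp')
if returncode != 0:
    print('command failed:', err.decode())
else:
    print(out.decode())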

Jim Dennis
  • I like this example, raising attention to shlex as a proper way to parse command line input. Humor in the example, too (shlep() and the exception comments). :) – NYCeyes May 24 '17 at 18:39

With Python 3.8 this works for me, for instance to execute a Python script within the venv:

import subprocess
import sys
res = subprocess.run(
        [
          sys.executable, # venv3.8/bin/python
          'main.py', 
          '--help',
        ], 
        stdout=subprocess.PIPE, 
        text=True
)
print(res.stdout)
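
If stderr is needed as well, a variant sketch: capture_output=True (available since Python 3.7) pipes both streams:

import subprocess
import sys

res = subprocess.run(
    [sys.executable, 'main.py', '--help'],   # same hypothetical command as above
    capture_output=True,                     # pipes both stdout and stderr
    text=True,
)
print(res.returncode, res.stdout, res.stderr)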
Jaroslav Bezděk
Christian Schulzendorff