I have a subprocess that can output a lot of data to stdout. When too much data is generated, the subprocess hangs because it blocks waiting for the stdout pipe buffer to be drained.
Here is a small example:
test.py
#!/usr/local/bin/python2.7
# test.py
import subprocess
proc = subprocess.Popen(["python", "./_ping_con.py"], stdout=subprocess.PIPE)
while proc.poll() is None:  # busy-wait until the child exits...
    pass
print proc.stdout.read()    # ...then read everything at once
...and the subprocess:
#!/usr/local/bin/python2.7
# _ping_con.py
print(96000 * "*") # Hangs here because it's too much data for the stdout pipe
What I'd like to know is: can this buffer be expanded to allow more data to be handled? If not, is there a different way I could send my data that would avoid this issue? Or, in the main process, is there a way to tell that the stdout pipe is full and do a read?
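For reference, here is a sketch of the kind of alternative I have in mind, using proc.communicate() (the file name test_communicate.py is just a placeholder; it reuses _ping_con.py from above). My understanding is that communicate() keeps reading the pipe while the child is still running, so the OS pipe buffer (often only on the order of 64 KB) should never fill up:
test_communicate.py
#!/usr/local/bin/python2.7
# test_communicate.py -- hypothetical variant of test.py above
import subprocess
proc = subprocess.Popen(["python", "./_ping_con.py"], stdout=subprocess.PIPE)
# communicate() reads stdout until EOF and then waits for the process
# to exit, so the child never blocks on a full pipe.
out, _ = proc.communicate()
print len(out)  # 96000 '*' characters plus a newline
Is something like this the right direction, or is there a way to keep the poll() loop and still avoid the hang?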