Below is a simple Python program that logs rsyslog data arriving via stdin to py.output.txt.
My issue is that it doesn't log the data from stdin to the output file in real time.
If ishome.py runs as a background child process (of rsyslog), no output is sent to py.output.txt; only when I stop the master process does py.output.txt receive the output.
As a background process
When I terminate rsyslog, it sends an EOF to its child process ishome.py, and that might be what triggers the actual write-out of the data.
As a foreground process
However, when I run ishome.py as a foreground process, py.output.txt gets updated in real time for every new entry. I do not need to close ishome.py for each new event to be written out.
Bash Output
>>ps
root 4328 1 0 21:04 ? 00:00:00 /usr/sbin/rsyslogd -c5
root 4360 4328 1 21:04 ? 00:00:00 python /home/pi/script/ishome.py
>>pi@rasp ~/script $ cat py.output.txt
>>pi@rasp ~/script $ sudo service rsyslog stop
[ ok ] Stopping enhanced syslogd: rsyslogd.
>>pi@rasp ~/script $ cat py.output.txt
2016-01-24 21:05:32.112457 :2016-01-24T22:04:22+00:00 192.168.0.198
2016-01-24 21:05:32.113029 :2016-01-24T22:04:33+00:00 192.168.0.198
ishome.py
#!/usr/bin/env python
import sys
from datetime import datetime
filename = open("/home/pi/script/py.output.txt", 'a', 0)  # buffering=0: unbuffered append (Python 2)
sys.stdout = filename
for line in sys.stdin:
    print (str(datetime.now())+' :'+line)
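One thing I am wondering about is whether the for-loop iteration over sys.stdin is itself buffered when stdin is a pipe (in Python 2 the file iterator uses an internal read-ahead buffer). A variant I have sketched but not yet tried as a background child of rsyslog (assuming Python 2, since open() accepts buffering=0 for a text file) reads via readline() and flushes explicitly:

#!/usr/bin/env python
# Sketch: read stdin via readline() instead of the file iterator, which in
# Python 2 fills an internal read-ahead buffer when stdin is a pipe.
import sys
from datetime import datetime

outfile = open("/home/pi/script/py.output.txt", 'a', 0)  # unbuffered append

for line in iter(sys.stdin.readline, ''):  # readline() returns '' only at EOF
    outfile.write(str(datetime.now()) + ' :' + line)
    outfile.flush()  # redundant with buffering=0, but makes the intent explicit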
At first I believed that stdin was buffered and that the stream was only processed at closure. However, when I look at the time at which a stdin line was processed, I clearly see that stdin is processed in real time. Only the write-out is... not happening?
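To double-check that claim in isolation, a small standalone test along these lines (buffer_test.py is just a placeholder name) could be fed from a slow producer, stamping each line to stderr, which is unbuffered:

#!/usr/bin/env python
# buffer_test.py - stamp each stdin line the moment it arrives; stderr is
# unbuffered, so output buffering cannot hide the timing.
import sys
from datetime import datetime

for line in sys.stdin:
    sys.stderr.write(str(datetime.now()) + ' got: ' + line)

Running e.g. while true; do date; sleep 2; done | python buffer_test.py should show whether lines arrive every two seconds or only in a burst when the pipe closes.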
I've tested this scenario with hundreds of input lines that are written to a MongoDB via PyMongo. Again, the DB is only updated when the process is terminated.
Any ideas what is causing this delay in writing? I would like every new event to be written in real time to my output (be it a DB or a file).