
I am trying to find the easiest way to share a real-time variable between two scripts. The first script will read sensor data, and the second will make calculations based on the real-time data from the first. I want to run them separately, and I want to be able to kill the second script and start it again without any problems.

I would like the second script to print the real-time data whenever it is started.


Update:

I have finally got some time to play with os.pipe(). I managed to run some scripts that use os.fork(), but when I tried to split one script into two separate programs I started having issues.

The program I started from, which works:

    #!/usr/bin/python
    import os, sys

    r, w = os.pipe()
    processid = os.fork()
    if processid:
        os.close(w)
        r = os.fdopen(r)
        print("Parent reading")
        text = r.read()
        print("text =", text)
        sys.exit(0)
    else:
        os.close(r)
        w = os.fdopen(w, 'w')
        print("Child writing")
        w.write("Text written by child...")
        w.close()
        print("Child closing")
        sys.exit(0)

Based on that script I tried to write my own separate scripts.

First script that prints time to pipe:

    #!/usr/bin/python
    import os, sys, time

    stdout = sys.stdout.fileno()
    r, w = os.pipe()
    #os.close(r)
    w = os.fdopen(w, 'w')
    i = 0
    while i < 1000:
        i = i + 1
        w.write(str(i) + " ")
        time.sleep(1)

Second script that reads time from pipe:

    #!/usr/bin/python
    import os, sys, time

    r, w = os.pipe()
    r = os.fdopen(r)
    text = r.read()
    print(text)

When I try to run my scripts, nothing happens. Any suggestions as to what I am doing wrong? Maybe I missed some details about standard input and output and os.pipe()?

  • The simplest mechanism is to use the filesystem. Have script1 append to a text file, then have script2 poll for new lines. – user590028 Jun 27 '17 at 19:14
  • ...or to use a named pipe. With files or pipes, be aware of buffering. – cdarke Jun 27 '17 at 19:32
  • I think pipe is something I was looking for. I will need to learn how it works now. Thanks! – Łukasz Żurek Jun 27 '17 at 22:12
  • Or you could use multithreading/multiprocessing and a queue inside a 3rd script and have the 2 scripts interact there. – Adonis Jul 24 '17 at 16:31
  • Is there any particular module for multiprocessing? – Łukasz Żurek Jul 24 '17 at 22:31
  • @ŁukaszŻurek (when replying to someone please use "@username", so that the user can be notified, it helps a lot), and yes there is: https://docs.python.org/3.6/library/multiprocessing.html I can write a small example if you're interested – Adonis Jul 25 '17 at 12:28
  • @asettouf I am new here so I will keep in mind your advice I tried to go through that module but there are too many things. I just need to write a script I will base on. Can you write me short example or just explain how multiprocess module works? I just barely start pipes. – Łukasz Żurek Jul 26 '17 at 20:31
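Following the named-pipe suggestion from the comments, here is a minimal sketch of how a FIFO created with os.mkfifo can connect two independently started processes. The path and values are made up, and os.fork is used here only to keep the demo in one runnable file; in the real setup each branch would be its own script opening the same path, which is what lets the reader be killed and restarted without touching the writer:

```python
import os
import tempfile

# Hypothetical path; any agreed-upon filesystem location works.
fifo_path = os.path.join(tempfile.mkdtemp(), "sensor_fifo")
os.mkfifo(fifo_path)

if os.fork() == 0:
    # Writer side -- in the real setup this is the sensor script.
    # open() on a FIFO blocks until the other end is opened too.
    with open(fifo_path, "w") as w:
        for i in range(3):
            w.write("%d\n" % i)
            w.flush()  # flush so the reader sees each value promptly
    os._exit(0)
else:
    # Reader side -- a separately started script opens the same path.
    received = []
    with open(fifo_path) as r:
        for line in r:
            received.append(line.strip())
    print(received)
```

Note that unlike os.pipe(), which only makes sense between a parent and the children it forks (the two separate scripts each create their own unrelated pipe), a FIFO is addressed by its path, so unrelated processes can find it.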

1 Answer


Edit: just to illustrate a crude example of a possible poison-pill approach. Unfortunately, Python's multiprocessing queues do not implement a peek call, so I used a second queue:

Writer:

import time
import sys

def write(queue_msg, queue_term):
    i = 0
    sys.stdout = open("writer.out", "w")
    sys.stderr = open("writer_err.out", "w")
    stop_writing = True
    while i < 1000:
        if not stop_writing:
            print("Writer at iteration ", i)
            queue_msg.put("Iteration " + str(i))  # put a value on the queue
            i += 1
        time.sleep(1)
        # Check that the queue is not empty, otherwise use a neutral value
        q_term_val = queue_term.get() if not queue_term.empty() else "CONTINUE"
        print(q_term_val)
        if q_term_val == "START":
            print("Start working")
            stop_writing = False
        elif q_term_val == "STOP":
            print("Stop working")
            stop_writing = True
        sys.stderr.flush()  # flush stdout and stderr to disk
        sys.stdout.flush()

Reader:

import time
import sys

def read(queue_msg, queue_term):
    sys.stdout = open("reader.out", "w")  # redirect process stdout to file "reader.out"
    sys.stderr = open("reader_err.out", "w")
    queue_term.put("START")
    while True:
        value = queue_msg.get()
        print(value)  # write the value to the reader.out file, as mentioned above
        if int(value.split(" ")[1]) > 10:
            print("Stop working, I am tired...")
            queue_term.put("STOP")
            time.sleep(5)  # wait 5 seconds before sending START to the worker again
            queue_term.put("START")
        time.sleep(1)
        sys.stderr.flush()
        sys.stdout.flush()

Main:

import reader
import writer
import multiprocessing as mp

def main():
    q_msg = mp.Queue()
    q_term = mp.Queue()
    r = mp.Process(target=reader.read, args=(q_msg, q_term,))
    w = mp.Process(target=writer.write, args=(q_msg, q_term,))
    r.start()
    w.start()
    print("Processes started, check that in your folder there are a reader.out and a writer.out files")

if __name__ == "__main__": # "workaround" needed on Windows
    main()

You should then find a reader.out file (and a writer.out) in your folder, showing that the reader indeed read the values the writer put on the queue. In writer.out you can see that the writer stops writing to the queue when it receives the poison pill, and that after the reader's time.sleep(5) call it starts again from the right position.

Adonis
  • @asettouf Thanks for your help. There is one thing that I am missing. It is sharing data, but through a queue: if I delay the reader script with time.sleep(3), it is not going to print real-time data but data that was queued. – Łukasz Żurek Jul 29 '17 at 08:49
  • @ŁukaszŻurek I'm not sure I understand your comment; if you want real time, that will be hard to achieve with Python and most OSes. Now if you're worried about the `time.sleep` calls, you can remove them; the scripts still work, they simply don't wait between each loop. – Adonis Jul 30 '17 at 10:53
  • I have mentioned that earlier. I want to run one script to count time and a second script to read the current time, even if I close and rerun the second script. Your example will print every value in the queue. When I added `time.sleep(3)` I expected to get times 1, 4, 7, 10, ... – Łukasz Żurek Jul 30 '17 at 19:26
  • @ŁukaszŻurek I edited my post to illustrate a possible solution to your issue, using 2 queues to synchronize the work between each worker when needed. – Adonis Aug 02 '17 at 12:15
  • I did not expect it would be that difficult to solve such a simple problem. At this moment I have tried to run your script but it freezes without any result/output. I get your idea; I will take a deeper look at your script this evening. What if we combine the poison pill with pipes, just to trigger writing into the pipe? The second script reads at a lower frequency, so it should be OK? And about pipes, do I have to read every piece of data written to the pipe? It looks like communication between a few programs is quite a big deal. – Łukasz Żurek Aug 03 '17 at 10:05
  • @ŁukaszŻurek "It freezes", have you checked the directory where you run the script for 2 files "writer.out", and "reader.out"? (I tested those scripts both on Linux and Windows). I never had the occasion to use `os.pipe`, though I'd guess you can do the same as above with the file descriptors returned by the call. Books and protocols are written to manage program intercommunication, see for instance [the AMQP Protocol](https://en.wikipedia.org/wiki/Advanced_Message_Queuing_Protocol) – Adonis Aug 03 '17 at 10:12
  • @ŁukaszŻurek If you feel that an answer has been provided please mark it as accepted. It helps keep the focus on unanswered questions. If answers did not help, can you provide a feedback as to what is missing? – Adonis Aug 08 '17 at 10:17
  • I can't currently check it due to my vacation ;-) When I am back I will play with it to see if it works. Thanks for your help! Cheers – Łukasz Żurek Aug 10 '17 at 07:36