
I have a script which could be run 1, 2, ... x times in parallel.

foo.py &
foo.py &
...

Each instance has to check for the existence of a piece of hardware (open every FTDI device found, read some data, close it). foo.py scans for FTDI devices once a second and then sleeps (the user then chooses one of the found devices to connect to).
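To make the timing concrete, each instance effectively runs a loop along these lines (enumerate_ftdi_devices() and probe() are placeholders for whatever FTDI enumeration/read calls are actually used, e.g. via pyftdi or libftdi bindings):

    import time

    def enumerate_ftdi_devices():
        """Placeholder for the real FTDI enumeration call."""
        return []

    def probe(device):
        """Placeholder: open the device, read some identifying data, close it."""
        pass

    while True:
        for device in enumerate_ftdi_devices():
            probe(device)
        time.sleep(1)  # scan once a second, then sleep until the next scan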

Now obviously each instance cannot open the same FTDI device at the same time, so I need some form of "communication" between the instances for locking purposes.

What I have considered so far:

  1. a lock file (slow... but if I have to; a rough sketch of this approach is below the list)
  2. sockets (if a known socket is already bound, that tells the other instances that a foo.py is currently querying the USB bus for FTDI devices)
  3. Python's multiprocessing library & its Lock (I cannot see how this could work, since independently launched foo.py instances have no way of sharing the Lock, or even knowing how many of them are running)
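For option 1, here is a rough sketch of how a lock file could serialise the scan across independently started foo.py instances, assuming a POSIX system (fcntl is not available on Windows); the lock path is just an assumption, any path writable by all instances would do. fcntl.flock blocks until the lock is free, so with a once-a-second scan it should not add noticeable delay:

    import fcntl
    import time

    LOCK_PATH = "/tmp/ftdi_scan.lock"  # assumed location, shared by all instances

    def scan_with_lock():
        with open(LOCK_PATH, "w") as lockfile:
            # Block until no other foo.py instance holds the lock, then take it.
            fcntl.flock(lockfile, fcntl.LOCK_EX)
            try:
                # Safe to enumerate/open/read/close FTDI devices here;
                # no other instance can be doing the same at this moment.
                pass
            finally:
                fcntl.flock(lockfile, fcntl.LOCK_UN)
        # The lock is also released automatically when the file object is closed.

    while True:
        scan_with_lock()
        time.sleep(1)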

Anyone have any ideas?

Naib
  • You may want to read up on semaphores and race conditions: http://en.wikipedia.org/wiki/Semaphore_(programming) http://en.wikipedia.org/wiki/Race_condition – Tymoteusz Paul Jan 04 '14 at 00:25
  • A locking file is too slow? For running a smallish number of processes that sleep for a minute at a time between tasks? – abarnert Jan 04 '14 at 01:12
  • Apologies, I meant once per second. I am aware of race conditions (I'm mainly hardware) as well as semaphores. This is partly why I am asking; I am not too sure of the best way to do this. – Naib Jan 04 '14 at 13:18

0 Answers