
There is a pytest test suite that is run in parallel across multiple processes via pytest-xdist. Some of these tests write to the same file, which becomes problematic when several processes write at once.

I thought that doing something like this would do the trick, but it hasn't been successful:

In util.py:

import multiprocessing

LOCK = multiprocessing.Lock()

And in test_something.py:

from util import LOCK
...
def test_something():
    ...
    LOCK.acquire()
    write_to_file()
    LOCK.release()
    ...

Sometimes the tests hang, and sometimes there is a read/write conflict on the file.

Am I placing the LOCK at an incorrect location? Is there a way to pass a global object across all of the tests? Or am I thinking about it the wrong way?

hainabaraka
  • I'm not familiar with `pytest-xdist`, but you need to make sure that the lock is actually shared with all the processes, i.e. that (on Linux/macOS) all child processes get forked from the original process in which you created the lock, or that you pass the lock to those child processes explicitly via `multiprocessing.Process(target=..., args=(LOCK,))`. So the question is: how does `pytest-xdist` create the processes, and can you somehow adjust the way this is done? – balu Jan 11 '22 at 15:18
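A minimal sketch of what the comment describes: a `multiprocessing.Lock` is only shared if children inherit it via fork or receive it explicitly as an argument. The helper name and file name below are illustrative, not from the question.

```python
import multiprocessing

def append_line(lock, path, text):
    # The lock received as an argument is the one shared by the parent;
    # a lock created at import time in each child would NOT be shared.
    with lock:
        with open(path, "a") as f:
            f.write(text + "\n")

if __name__ == "__main__":
    lock = multiprocessing.Lock()
    procs = [
        multiprocessing.Process(target=append_line,
                                args=(lock, "out.txt", f"line {i}"))
        for i in range(4)
    ]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```

This works for processes you start yourself; pytest-xdist workers are started by the xdist controller, so a module-level lock created at import time is a different object in every worker.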

1 Answer


For that, you'll probably want to use the filelock library: https://py-filelock.readthedocs.io/en/latest/

Run `pip install filelock`, then try the examples, e.g.:

from filelock import FileLock

lock = FileLock("high_ground.txt.lock")
with lock:
    with open("high_ground.txt", "a") as f:
        f.write("You were the chosen one.")
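Because pytest-xdist workers are separate processes, an OS-level file lock like this works where a module-level `multiprocessing.Lock` does not. A hedged sketch of how it might look in a test; `write_to_file` and the file names are illustrative, not from the question:

```python
from filelock import FileLock

def write_to_file(path, text):
    # Illustrative helper: the .lock file lives on disk, so it is
    # visible to every xdist worker process, and writes to `path`
    # are serialized across workers.
    with FileLock(str(path) + ".lock"):
        with open(path, "a") as f:
            f.write(text + "\n")

def test_something(tmp_path_factory):
    # tmp_path_factory.getbasetemp().parent is shared by all xdist
    # workers, so every worker resolves the same file here.
    shared = tmp_path_factory.getbasetemp().parent / "results.txt"
    write_to_file(shared, "hello from a worker")
```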
Michael Mintz