I have a Python 2.7 script running on Linux that crashes with `IOError: [Errno 24] Too many open files`. When I run `lsof -p <script_pid>` to see which files the script has open, I see a steadily growing number of `anon_inode` entries.
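(A quick way to watch the count from inside the process without re-running lsof, on Linux; this helper is my own diagnostic addition, not part of the script's logic:)

```python
import os

def open_fd_count():
    # /proc/self/fd has one entry per descriptor currently open in this
    # process, so its length tracks the leak over time (Linux only).
    return len(os.listdir('/proc/self/fd'))
```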
The script first downloads files from S3 using `eventlet` for concurrency, then processes the downloaded files using `multiprocessing.dummy` for multithreading. I have run the multithreaded code in isolation and found that it only leaks file descriptors when I include the following eventlet monkey patching:
`patcher.monkey_patch(thread=False)`
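For context, this is roughly the shape of the script. The function bodies, pool sizes, and names below are placeholders standing in for my real code, not the actual implementation:

```python
import eventlet
from eventlet import patcher

# Patch everything except thread, so multiprocessing.dummy still gets
# real OS threads rather than green threads.
patcher.monkey_patch(thread=False)

from multiprocessing.dummy import Pool as ThreadPool


def download(key):
    # Phase 1 worker: eventlet-friendly S3 download; returns a local path.
    pass


def process(path):
    # Phase 2 worker: processes one downloaded file.
    pass


keys = ['bucket/key-1', 'bucket/key-2']  # placeholder S3 keys

# Phase 1: concurrent downloads on green threads.
green_pool = eventlet.GreenPool(20)
paths = list(green_pool.imap(download, keys))

# Phase 2: multithreaded processing. In isolation, this part only leaks
# descriptors when the monkey patching above is present.
thread_pool = ThreadPool(4)
thread_pool.map(process, paths)
thread_pool.close()
thread_pool.join()
```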
Any ideas on how I could resolve this would be much appreciated!