
I have a list in Python that I share between a few different processes (a multiprocessing ListProxy) and I need the ability to limit this list to a certain number of elements. The unit is running on a Raspberry Pi with all data stored in memory, so the list has to stay bounded over time. For context, the list stores log messages, and we are willing to remove the earliest items as it fills up, so it's essentially a rolling list. I know I could use a plain list and just pop items off the front, but then I'm running an O(n) operation because every remaining element has to shift over by one. I can use a queue, stack, or array for this, as long as it is multiprocessing friendly. I know I could create a linked list, but since one isn't supported out of the box for multiprocessing I would have to write it myself, so I wanted to reach out in case someone else has a better idea.
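
For reference, this is roughly what I'm doing today (a simplified sketch; the names and the cap of 1000 are just placeholders):

import multiprocessing

MAX_MESSAGES = 1000  # placeholder cap

def worker(log_messages):
    # every process appends to the shared ListProxy
    log_messages.append("some log message")
    # trimming means deleting from the front, which shifts every
    # remaining element over -- the O(n) cost I want to avoid
    while len(log_messages) > MAX_MESSAGES:
        del log_messages[0]

if __name__ == "__main__":
    manager = multiprocessing.Manager()
    log_messages = manager.list()
    procs = [multiprocessing.Process(target=worker, args=(log_messages,))
             for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()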

Thanks in advance!

Phillip McMullen
  • You probably want a deque, a double-ended queue, to allow for push/pop from both ends. Usually implemented with a doubly linked list, I think. – a p Feb 21 '18 at 18:39
  • Perhaps a `collections.deque`: https://stackoverflow.com/questions/8554153/is-this-deque-thread-safe-in-python – juanpa.arrivillaga Feb 21 '18 at 18:40
  • Yup, you're right @ap. Thankfully, Python already anticipated such a [case](https://docs.python.org/3.6/library/collections.html#deque-objects). – Christian Dean Feb 21 '18 at 18:41
  • I saw deque, but while it is thread-safe it isn't supported for multiprocessing. I might be able to combine two different data structures, though: only one process really needs to interact with the LogMessages directly, so I could store the incoming messages in a queue and have that one process push them onto the deque. – Phillip McMullen Feb 21 '18 at 18:54
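
To illustrate the bounded behaviour the comments above point to, here is a minimal single-process sketch of `collections.deque` with `maxlen` (the cap of 3 is only for demonstration):

from collections import deque

log = deque(maxlen=3)             # keeps only the 3 most recent entries
for i in range(5):
    log.append("message %d" % i)  # once full, the oldest entry is dropped automatically

print(list(log))                  # ['message 2', 'message 3', 'message 4']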

1 Answer


Based on the already existing comments, we can turn the thread-safe (but not process-safe) `collections.deque` into a multiprocessing-safe deque object. It involves the SyncManager from multiprocessing.managers. This will create a "manager process" which coordinates access to the deque object for all other processes, and the resulting deque proxy can be shared:

from collections import deque
from multiprocessing.managers import SyncManager

class DequeueManager(SyncManager):
    '''
    A manager that allows multi-process access to a shared
    deque object.
    '''
    pass
# Register the deque type so the manager can create shared instances of it
DequeueManager.register("deque", deque)



# code before all the other processes are forked
deque_manager = DequeueManager()
deque_manager.start()

# get a shareable, multiprocessing-safe deque proxy object
deque = deque_manager.deque()

# fork processes and share queue object

# more info about SyncManager: https://docs.python.org/2/library/multiprocessing.html#multiprocessing.managers.SyncManager
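
A rough usage sketch building on the DequeueManager class above (the worker function, process count, and cap of 1000 are my own illustration, not part of the original answer): arguments passed to `deque_manager.deque(...)` are forwarded to `deque()` inside the manager process, so you can set `maxlen` there to get the rolling, size-limited behaviour from the question.

import multiprocessing

def worker(shared_deque, n):
    # each child process appends through the proxy; the manager process
    # serialises access for us
    for i in range(n):
        shared_deque.append("message %d from %s"
                            % (i, multiprocessing.current_process().name))

if __name__ == "__main__":
    deque_manager = DequeueManager()
    deque_manager.start()

    # maxlen gives the bounded, rolling log asked about
    shared_deque = deque_manager.deque(maxlen=1000)

    procs = [multiprocessing.Process(target=worker, args=(shared_deque, 10))
             for _ in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    # read entries back through the proxy's methods; the default proxy
    # exposes deque's public methods (append, popleft, ...) but not
    # len() or indexing
    print(shared_deque.popleft())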

For future readers: if you are going for a long-term production app, you should probably invest the time to set up RabbitMQ or a similar queueing system for this use case, though.

quantumbyte