I need to size a box classification system where boxes accumulate in a queue until the packaging quantity is reached; then they all leave the queue at a fixed rate.
This happens in real time alongside production. I can estimate the volume for each SKU, but I can't predict the order in which boxes will arrive at the classification/sorting facility. However, I can use previous manufacturing data to test the algorithm.
The key question is: how would you estimate the number of bins/queues needed to accomplish the sorting, while minimizing the "all queues in use" condition?
I considered queueing theory, but I want to run simulations with the known data (the arrivals are not totally random), and most of what I found assumes random arrivals to the queues.
I'm starting to write a Python script to model the queue behavior myself, with fixed times given for queue evacuation.
Any suggestions?
Thanks in advance.
Ideally, the solution should be Python-based.

The expected output should be the number of queues in use vs. time and, in the case of a limited number of queues, the number of boxes "discarded" vs. time.
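Since you already plan to write the simulation yourself, here is a minimal sketch of the kind of discrete-time model I would start from. All names (`simulate`, `batch_qty`, `drain_rate`, the arrival format) are made up for illustration, and it makes simplifying assumptions: one SKU per queue, a draining queue still occupies a physical bin, and a box whose SKU has no open queue is discarded when all bins are busy. You would replace the arrival list with events replayed from your historical manufacturing data.

```python
def simulate(arrivals, n_queues, batch_qty, drain_rate):
    """Discrete-time simulation of the sorting queues.

    arrivals  : list of (time_step, sku) events, sorted by time
    n_queues  : number of physical queues/bins available
    batch_qty : boxes of one SKU needed before a queue starts draining
    drain_rate: boxes removed per time step once a queue is draining

    Returns (used_per_step, discarded_per_step), one entry per time step.
    """
    queues = {}    # sku -> box count, queues still accumulating
    draining = {}  # sku -> remaining boxes, queues being emptied
    # Run long enough for the last batch to drain completely.
    horizon = arrivals[-1][0] + batch_qty // drain_rate + 2
    used, discarded = [], []
    i = 0
    for t in range(horizon):
        # 1. Drain active queues at the fixed evacuation rate.
        for sku in list(draining):
            draining[sku] -= drain_rate
            if draining[sku] <= 0:
                del draining[sku]
        # 2. Route this step's arrivals.
        lost = 0
        while i < len(arrivals) and arrivals[i][0] == t:
            sku = arrivals[i][1]
            i += 1
            if sku in queues:
                queues[sku] += 1
            elif len(queues) + len(draining) < n_queues:
                queues[sku] = 1           # open a new queue for this SKU
            else:
                lost += 1                  # all bins busy -> box discarded
            if queues.get(sku, 0) >= batch_qty:
                # Batch complete: move the queue to the draining set
                # (merging with a same-SKU drain, a simplification).
                draining[sku] = draining.get(sku, 0) + queues.pop(sku)
        used.append(len(queues) + len(draining))
        discarded.append(lost)
    return used, discarded
```

The two returned lists are exactly the outputs you describe (queues in use vs. time, discards vs. time), so you can sweep `n_queues` over a range, replay the historical data for each value, and pick the smallest count that keeps `sum(discarded)` at zero (or below an acceptable threshold). For finer-grained timing, a discrete-event library such as SimPy would let you drop the fixed time step.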