Update 1: It seems like BlockingCollection suggested below is the perfect fit, and I will try to apply that first thing tomorrow. Thank you for the replies.
Update 2: Indeed it seems to perform admirably. Thanks again for all the help.
__
I have a scientific application in C# with two internal tasks that need to communicate:
Tasks: (a) rapidly acquires a large dataset (sometimes larger than memory), (b) processes each entry.
To prevent getting the entire dataset at once, I plan for (a) to take some amount of data at a time, then for (b) to process all of that, and repeat until done. (a) and (b) both have multiple threads, but are in the same executable.
The data comes out of (a) as a list of small, individually processable chunks, so I'm wondering what the fastest strategy is to keep this dance going between the two in C# (Windows .NET Standard), and whether anyone has experience that can help this decision. My plan is either to:
Have a list in A that B acquires a lock{} on once it reaches, say, 1000 entries, to stop A's threads from adding more data while processing is done.
Have A send B an event when the list exceeds, say, 1000 entries, and then have B pause A's threads while it processes.
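(For anyone landing here later: as the updates above mention, the BlockingCollection route worked out. Below is a minimal sketch of that bounded producer/consumer pattern — the item type `int`, the capacity of 1000, and the total count are placeholders standing in for the real chunk type and sizes; with a bounded capacity, the producer blocks automatically once the consumer falls behind, so no manual lock{} or events are needed.)

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Pipeline
{
    // Runs a bounded producer/consumer pass and returns how many
    // items the consumer processed. Capacity caps memory use.
    public static int RunPipeline(int totalItems, int capacity)
    {
        int processed = 0;

        using (var queue = new BlockingCollection<int>(boundedCapacity: capacity))
        {
            // Producer, i.e. side (a): Add blocks whenever the queue
            // already holds `capacity` items, throttling acquisition.
            var producer = Task.Run(() =>
            {
                for (int i = 0; i < totalItems; i++)
                    queue.Add(i);
                queue.CompleteAdding(); // signal "no more data"
            });

            // Consumer, i.e. side (b): GetConsumingEnumerable blocks
            // until data arrives and ends cleanly after CompleteAdding.
            var consumer = Task.Run(() =>
            {
                foreach (var chunk in queue.GetConsumingEnumerable())
                    processed++; // real code would process the chunk here
            });

            Task.WaitAll(producer, consumer);
        }

        return processed;
    }

    static void Main()
    {
        Console.WriteLine(RunPipeline(totalItems: 10_000, capacity: 1000));
    }
}
```

Multiple producer threads can call `Add` and multiple consumer threads can each iterate `GetConsumingEnumerable()` on the same collection, which matches (a) and (b) both being multithreaded.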
I'm not a very experienced programmer when it comes to these things, and wonder if anyone has insights that may help, or some terms I can Google to clarify? (: Help is appreciated (: