
I'm a newbie to the C#/.NET world, and I'm creating a .NET API service that writes to an MS Access database table. Access takes full table locks, and there will be up to 20 people writing to the table at the same time (we are getting rid of Access, but not soon enough!). We need to ensure writers don't get locked out.

While a threading solution would work in most languages, Apple recommends using Dispatch Queues for concurrency challenges like the one above (see excerpt below): requests to a shared resource are queued and processed one at a time to avoid conflicts.

Is there an equivalent approach or idiom in C#?

I did see this SO question, but it did not really have an answer in the correct context I need.

For more details:

Excerpt from Dispatch Queues Objective-C docs:

Dispatch queues are a C-based mechanism for executing custom tasks. A dispatch queue executes tasks either serially or concurrently but always in a first-in, first-out order. (In other words, a dispatch queue always dequeues and starts tasks in the same order in which they were added to the queue.) A serial dispatch queue runs only one task at a time, waiting until that task is complete before dequeuing and starting a new one. By contrast, a concurrent dispatch queue starts as many tasks as it can without waiting for already started tasks to finish.

Dispatch queues have other benefits:

• They provide a straightforward and simple programming interface.
• They offer automatic and holistic thread pool management.
• They provide the speed of tuned assembly.
• They are much more memory efficient (because thread stacks do not linger in application memory).
• They do not trap to the kernel under load.
• The asynchronous dispatching of tasks to a dispatch queue cannot deadlock the queue.
• They scale gracefully under contention.
• Serial dispatch queues offer a more efficient alternative to locks and other synchronization primitives.

The tasks you submit to a dispatch queue must be encapsulated inside either a function or a block object

EDIT: It turns out the .NET equivalent of a "main loop" is the main thread, where you can queue requests to process your code; it is also where all the UI work is typically done. According to this SO question, The Windows GUI Main Loop in C#...where is it?, you can access it via Application.Run and a Timer.

SilentNot
    Since most of the time the sequential order guarantee of dispatch queues is useless you can just use thread pools which exist in basically every language there is - they will be more efficient to boot with. If you need sequential dispatching you're usually doing GUI work in which case your framework provides the right tool (such as dispatchers for WPF/WinForms). For the rare situations where you need such behaviour yourself a simple blocking queue or similar will do fine, but I'd question the design. – Voo Feb 18 '17 at 20:31
  • Thanks @Voo, let me look into those options. – SilentNot Feb 19 '17 at 19:34

1 Answer


I found your question interesting and related to an area of the .NET framework I could use more knowledge about, so I did a little research on the topic. Here you go:

There are several .NET options for managing threads that might help with what you are trying to do. The standouts are TaskScheduler.QueueTask and ThreadPool.QueueUserWorkItem. BackgroundWorker is also possibly applicable to your architecturally unenviable situation.

Neither the documentation for TaskScheduler nor for ThreadPool mentions any guarantee about the order of queued items. If the order of your threaded tasks matters then, based on my admittedly limited knowledge of the .NET framework, you might want to guarantee the ordering yourself: a queue-starter method that takes each write request and either writes to the database immediately or queues the write until write access is available. That is going to be a little messy, because you will need to lock for concurrency. This GitHub-hosted code from a similar SO question might be good for that; I have not tested it.
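A minimal sketch of that queue-starter idea, assuming a plain in-process lock is acceptable. `AccessWriter` and `Write` are hypothetical names, and the real OleDb call is omitted; the list just records completed writes so the behavior is observable:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Sketch: serialize all Access writes through one in-process gate.
// "Write" stands in for the real OleDb code, which is omitted.
public static class AccessWriter
{
    private static readonly object WriteGate = new object();
    private static readonly List<string> Completed = new List<string>();

    public static void Write(string record)
    {
        // Only one thread at a time passes this gate, so our own
        // threads never contend for the Access table lock.
        lock (WriteGate)
        {
            // ... OleDbCommand.ExecuteNonQuery() would go here ...
            Completed.Add(record);
        }
    }

    public static int CompletedCount
    {
        get { lock (WriteGate) { return Completed.Count; } }
    }

    public static void Main()
    {
        var threads = new List<Thread>();
        for (int i = 0; i < 20; i++)        // simulate 20 writers
        {
            var t = new Thread(n => Write("record-" + n));
            threads.Add(t);
            t.Start(i);
        }
        threads.ForEach(t => t.Join());
        Console.WriteLine(CompletedCount);  // prints 20
    }
}
```

Note this only serializes writers inside one process; it does nothing for other applications holding the Access lock.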

Alternatively, you could queue your write tasks to a thread pool limited to one thread at a time using SetMaxThreads. Be aware that ThreadPool.SetMaxThreads is process-wide and cannot be set below the number of processors, so no guarantees from me about that being appropriate for your situation, or ever, though it does seem simple.
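A lighter-weight option (assuming .NET 4.5+) is ConcurrentExclusiveSchedulerPair, whose ExclusiveScheduler runs at most one task at a time — probably the closest built-in analogue to a serial dispatch queue. A rough sketch; `SerialQueueDemo` and `RunSerial` are names I made up for illustration:

```csharp
using System;
using System.Threading.Tasks;

// Sketch: a serial "dispatch queue" via ConcurrentExclusiveSchedulerPair.
// Tasks queued through the exclusive factory never run concurrently.
public static class SerialQueueDemo
{
    public static int RunSerial(int taskCount)
    {
        var pair = new ConcurrentExclusiveSchedulerPair();
        var serial = new TaskFactory(pair.ExclusiveScheduler);

        int counter = 0;
        var tasks = new Task[taskCount];
        for (int i = 0; i < taskCount; i++)
        {
            // No lock needed around counter: the exclusive scheduler
            // guarantees these delegates run one at a time.
            tasks[i] = serial.StartNew(() => counter++);
        }
        Task.WaitAll(tasks);
        return counter;
    }

    public static void Main()
    {
        Console.WriteLine(RunSerial(5)); // prints 5
    }
}
```

Unlike SetMaxThreads, this limits only the tasks you route through that scheduler, not the whole process.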

Hope that helps.

EDIT: Further research into the similar SO question pointed out by the original poster: ConcurrentQueue, new in .NET 4.0, lets tasks be dequeued safely from multiple threads without the extra locking code a regular Queue would need. That approach would also allow multiple tasks to run simultaneously where possible, instead of always waiting for the previous task to complete as in the single-thread thread-pool approach.
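One caveat: ConcurrentQueue by itself is just a thread-safe data structure; something still has to consume it. A common pattern wraps it in BlockingCollection with a single dedicated consumer, which gives dispatch-queue-like FIFO behaviour. A hedged sketch — `WritePipeline` and `ProcessAll` are illustrative names, and the actual Access write is omitted:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

// Sketch: serial write pipeline — many producers, one consumer.
// BlockingCollection here wraps a ConcurrentQueue, so order is FIFO.
public static class WritePipeline
{
    public static int ProcessAll(IEnumerable<string> records)
    {
        var pending = new BlockingCollection<string>(new ConcurrentQueue<string>());
        int written = 0;

        // Single consumer: the only code that ever touches Access.
        Task consumer = Task.Run(() =>
        {
            // Blocks until items arrive; exits after CompleteAdding().
            foreach (string record in pending.GetConsumingEnumerable())
            {
                // ... the actual OleDb write to Access would go here ...
                written++;
            }
        });

        foreach (string r in records)   // producers (here: one loop)
            pending.Add(r);
        pending.CompleteAdding();
        consumer.Wait();
        return written;
    }

    public static void Main()
    {
        var records = new List<string>();
        for (int i = 0; i < 20; i++) records.Add("record-" + i);
        Console.WriteLine(ProcessAll(records)); // prints 20
    }
}
```

Pending.Add can be called safely from any number of request threads, while the consumer drains the queue one write at a time.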

stackuser83
  • Thanks @stackuser83, this stuff gets a bit hairy so I want to look into it a bit a see what make sense. – SilentNot Feb 19 '17 at 19:36
  • I definitely need last in wins. @stackuser83, [your SO reference](http://stackoverflow.com/questions/12375339/threadsafe-fifo-queue-buffer?noredirect=1&lq=1) led me to [ConcurrentQueue](http://stackoverflow.com/questions/12375339/threadsafe-fifo-queue-buffer?noredirect=1&lq=1) which bears investigation!! – SilentNot Feb 20 '17 at 13:45
  • The QueueUserWorkItem will also work if I limit the thread queue to 1, but ConcurrentQueue is the most elegant. If you can work that into your answer I'll accept your answer!! – SilentNot Feb 21 '17 at 12:52
  • Oops @stackuser83, sorry I was late on the upvote. I missed the notification. – SilentNot Mar 03 '17 at 23:02