I need to know how I can have a small block of adjacent lines of code run sequentially without any other thread being able to have CPU time.

if (status != Status.Pending) 
    return;
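// <- in the gap between the check above and the assignment below,
//    another thread can pass the same check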

status = Status.Locked;

Background

I'm developing an application with a user interface on a main thread that fills a list of outstanding asynchronous tasks that an army of helper-threads chip away at. I began development on custom objects to represent these activities before realizing that Tasks are built into .NET. For this project, I'd like to continue with my custom task-like objects.

I have threads wait around for tasks to appear in the list; the first thread to lock a task gets to keep it. The very first line of each task used to set its status to Locked - but even in the tiny gap between finding the task and locking it, multiple threads occasionally take the same task.
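To make that gap concrete, here is a minimal sketch of the pattern; the names `MyTask`, `ActivityQueue.Tasks` and the `Status` values are assumptions standing in for the real custom objects, and `FirstOrDefault` comes from System.Linq.

// Each helper thread runs a loop like this, scanning the shared list
// and only afterwards marking the task it found.
while (true)
{
    MyTask task = ActivityQueue.Tasks.FirstOrDefault(t => t.Status == Status.Pending);
    if (task == null)
    {
        Thread.Sleep(10);            // nothing to do yet, poll again
        continue;
    }

    // <- a second thread can find the very same task in this gap...
    task.Status = Status.Locked;     // ...so both end up "owning" it
    task.Run();
}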


Priorities, Priorities

My first idea was to look into how thread priorities are handled. This documentation seems to describe what I was hoping for - a high-priority thread taking all the resources before any lower-priority threads get a turn. It seemed like a good idea to precede my body of work with temporarily raising the current thread's priority:

Thread.CurrentThread.Priority = ThreadPriority.Highest;

But a code example from the same site shows that, over a period of time, the threads all receive enough attention that I couldn't rely on one high-priority thread completing its body of work uninterrupted. I suspect that multiple CPU cores are the culprit.

// The example displays output like the following:
//    ThreadOne   with      Normal priority has a count =   755,897,581
//    ThreadThree with AboveNormal priority has a count =   778,099,094
//    ThreadTwo   with BelowNormal priority has a count =     7,840,984

Oh, and here is the nail in the coffin for that idea, delivered by our glorious leader Jeff Atwood.


Freeze!

My next idea was to freeze every other helper-thread temporarily, then thaw them after my block is complete. Urk! In a multi-core CPU environment I can imagine this code potentially running on two cores at the same time, and that would be game over:

foreach (Thread thread in ActivityQueue.threads)
    if (thread != Thread.CurrentThread)
        thread.Suspend();
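// Note: Thread.Suspend is obsolete, and suspending a thread that happens
// to hold a lock can deadlock the whole process.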

What else?

What other options do I have to check whether something is locked and then either lock it or back out, when the same code could be running simultaneously on other threads? Examples using the built-in Task object are welcome.


This question was marked as a duplicate. I'd say the fundamental answer there from @Daniel Hilgarth - "No, you can't" - points me away from that direction. As pointed out in the comments, I've boo-boo'd with a typical XY question - but that was easy for me to identify only in hindsight.

  • You're scratching at an XY Problem here. Your stated problem is "How do I make sure nothing interrupts this particular thread between these two lines", but your actual problem is "How do I write multithreaded code properly" - and the answer to that is too broad. – Jamiec Sep 01 '17 at 11:45
  • Possible duplicate of [Prevent context-switching in timed section of code (or measure then subtract time not actually spent in thread)](https://stackoverflow.com/questions/16917092/prevent-context-switching-in-timed-section-of-code-or-measure-then-subtract-tim) – mjwills Sep 01 '17 at 11:45
  • `Interlocked.CompareExchange` – L.B Sep 01 '17 at 11:49
  • "_multiple threads occasionally take the same task_" - Then all you need to do is synchronize the task queue ... or better, use a concurrent one in the first place. – Fildor Sep 01 '17 at 11:49
  • Windows is not a real-time operating system. User applications cannot prevent context switches. That's from the bad old days of cooperative multitasking on Win16, and that didn't work out too well. If you want one thread to stay out of the business of another thread, use mutual exclusion (with `lock` statements, or, if you're on tasks, [`AsyncLock`](https://github.com/StephenCleary/AsyncEx/wiki/AsyncLock)). There are no race conditions there -- one lock, one owner. – Jeroen Mostert Sep 01 '17 at 11:49
  • Or simply write code so that it *isn't* affected by interruptions. What is the *actual* problem and the *actual* question? The question text doesn't help at all. If anything, you shouldn't be trying to mess with the number of threads. Remember, *consumer* CPUs have 4-20 hyperthreaded cores nowadays. – Panagiotis Kanavos Sep 01 '17 at 11:57
  • There are some good ideas here; what @Fildor said has given me an idea. Instead of having threads pluck tasks from a queue, having one thread in charge of divvying them out should fix my problem. – Walledhouse Sep 01 '17 at 11:58
  • As stated, if you need to prevent another thread from executing the code, `lock`. But if predictable execution of your code depends on nothing happening on unrelated threads doing other things, something is already very wrong. What is it about this code that will cause it to fail because another thread is doing something else? Whatever it is, that's the problem. – Scott Hannen Sep 01 '17 at 11:58
  • For example, if you use *Task.Run* and pass each task the data it needs, you can have as many concurrent threads as possible. You can use an `ActionBlock<T>`, post messages to it and not care if it uses 1 or 10 concurrent tasks to process them – Panagiotis Kanavos Sep 01 '17 at 11:59
  • @Walledhouse please explain in *one sentence* what you want to do. Most likely, .NET already supports the scenario. For example, if you want to post messages to a queue for concurrent processing, you can use `ActionBlock<T>`. Or you can create a pipeline of TPL Dataflow blocks, each of which will use a different thread/task to process messages. In almost every case though you'll have to avoid global state – Panagiotis Kanavos Sep 01 '17 at 12:02
  • The general problem (check a value, if it's equal to a specific value, change it to something else) is easily addressed by the `Interlocked.CompareExchange` family of functions (but you wouldn't be able to use `Enum`s, specifically) – Damien_The_Unbeliever Sep 01 '17 at 12:02
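For what it's worth, here is a minimal sketch of the `Interlocked.CompareExchange` approach suggested in the comments above. It assumes the status is kept in an int field rather than an enum (as noted, the Interlocked overloads won't accept an enum directly), and the `WorkItem` name is made up.

using System.Threading;

class WorkItem
{
    public const int Pending = 0;
    public const int Locked = 1;

    private int _status = Pending;

    // Atomically: "if _status is Pending, change it to Locked".
    // Returns true for exactly one caller - the thread that won the race;
    // every other caller sees the already-Locked value and backs out.
    public bool TryLock()
    {
        return Interlocked.CompareExchange(ref _status, Locked, Pending) == Pending;
    }
}

Only the thread for which TryLock() returns true goes on to run the task; everyone else simply moves on to the next item.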

1 Answer

"thread that fills a list of outstanding asynchronous tasks" - that sentence holds the key to your problem. You don't need to synchronize the execution of a task; you need to synchronize the taking from the queue.

The easiest way would be to use a concurrent collection in the first place, so that no two threads will ever be able to dequeue the same item (a task, in your case).
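For illustration (the `MyTask` type is a stand-in for your custom task objects), `ConcurrentQueue<T>` hands each item to exactly one caller:

using System.Collections.Concurrent;

var queue = new ConcurrentQueue<MyTask>();

// UI thread: publish work as it appears.
queue.Enqueue(new MyTask());

// Any helper thread: TryDequeue removes the item atomically, so no two
// threads can ever receive the same task.
if (queue.TryDequeue(out MyTask task))
    task.Run();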

BlockingCollection, for example, also provides many other conveniences besides being concurrent: you can wait for items to become available without spin-waiting, and you can easily signal "no more items to be expected" to gracefully terminate consumers.
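A minimal sketch of that, again with a hypothetical `MyTask` type - consumers block inside `GetConsumingEnumerable()` until items arrive, and `CompleteAdding()` is the "no more items" signal:

using System.Collections.Concurrent;
using System.Threading;

var tasks = new BlockingCollection<MyTask>();

// Each helper thread runs this loop, e.g. new Thread(ConsumerLoop).Start().
// Every item is handed to exactly one consumer, and the loop ends once
// CompleteAdding() has been called and the collection is drained.
void ConsumerLoop()
{
    foreach (MyTask task in tasks.GetConsumingEnumerable())
        task.Run();
}

// UI thread: add work as it appears...
tasks.Add(new MyTask());

// ...and signal graceful shutdown when nothing more is expected.
tasks.CompleteAdding();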

Mind that I am assuming you are not using the Task-based Asynchronous Pattern. If you are, there are also built-in mechanisms to achieve your desired behavior.
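If you are on the TAP side, the `ActionBlock<T>` mentioned in the comments is one such mechanism (it ships in the System.Threading.Tasks.Dataflow NuGet package). A rough sketch, again assuming a `MyTask` type:

using System.Threading.Tasks.Dataflow;

// The block invokes the delegate once per posted item;
// MaxDegreeOfParallelism caps how many items run concurrently.
var worker = new ActionBlock<MyTask>(
    task => task.Run(),
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 4 });

// UI thread: post work as it appears.
worker.Post(new MyTask());

// Shutdown: refuse further items, then wait for in-flight ones to finish.
worker.Complete();
await worker.Completion;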

Fildor
  • @Downvoter: Care to explain? I am willing to learn! – Fildor Sep 05 '17 at 13:27
  • I solved my problem thanks to this advice. It seems obvious to me now: previously, a list existed and threads plucked problems from the list, and it was possible for two threads to take the same problem. Now, one thread manages the list and _assigns_ problems to threads - obviously, this one thread isn't going to make the mistake of assigning two threads the same problem. – Walledhouse Oct 27 '17 at 22:10
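For completeness, a rough sketch of the dispatcher idea from that last comment; `TakeNextPendingTaskOrNull()` is a hypothetical helper that only the dispatcher thread ever calls, so no task can be handed out twice:

using System.Threading;

void DispatcherLoop()
{
    while (true)
    {
        MyTask task = TakeNextPendingTaskOrNull();     // only this thread touches the list
        if (task == null)
        {
            Thread.Sleep(10);                          // nothing pending right now
            continue;
        }

        task.Status = Status.Locked;                   // safe: assignment happens on one thread only
        ThreadPool.QueueUserWorkItem(_ => task.Run()); // hand the work to a pool thread
    }
}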