
I have a high priority thread which offers some events that tell about its actions:

    public event EventHandler<EventArgs> Connecting;
    private void _MyMethod()
    {
       // ...some code
       OnConnecting(my_event_args);
       // ...more code
    }

The thread should not be blocked by whatever processing the subscribers do in their event handlers, as would happen with plain-vanilla event handling:

    protected virtual void OnConnecting(EventArgs e)
    {
        EventHandler<EventArgs> temp = Connecting; // for thread-safety
        if (temp != null) temp(this, e); // execute subscriber within "my" thread
    }

The first solution is to simply pass the event (delegate) as lambda to Task.Run:

    protected virtual void OnConnecting(EventArgs e)
    {
        EventHandler<EventArgs> temp = Connecting; // for thread-safety
        if (temp != null) Task.Run(() => temp(this, e));
    }

Am I mistaken, or does this implicitly sequence all subscribers in a non-parallel order of execution, just as calling them directly would? Is the next version a sensible way of placing all subscribers into the thread pool in parallel?

    protected virtual void OnConnecting(EventArgs e)
    {
        EventHandler<EventArgs> temp = Connecting; // for thread-safety
        if (temp == null) return;
        foreach (var t in temp.GetInvocationList()) Task.Run(() => t.DynamicInvoke(this, e));
    }

Also, somewhere I read that lambdas aren't as performant as simply calling a delegate. Is there a way to get rid of the lambda, or was that recommendation wrong?
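For reference, here is a minimal sketch of the parallel version that avoids `DynamicInvoke` by casting each invocation-list entry back to the typed `EventHandler<EventArgs>` (the `Producer`, `OnConnectingAsync`, and `Connect` names and the `Task.WhenAll` return value are illustrative choices, not part of my actual code):

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

class Producer
{
    public event EventHandler<EventArgs> Connecting;

    // Snapshot the delegate, then invoke each subscriber on the thread
    // pool through the typed EventHandler<T>, avoiding the reflection
    // cost of DynamicInvoke.
    protected virtual Task OnConnectingAsync(EventArgs e)
    {
        EventHandler<EventArgs> temp = Connecting; // for thread-safety
        if (temp == null) return Task.CompletedTask;

        var tasks = temp.GetInvocationList()
                        .Cast<EventHandler<EventArgs>>()
                        .Select(h => Task.Run(() => h(this, e)));

        // Returning Task.WhenAll lets the caller observe completion and
        // exceptions instead of firing and forgetting.
        return Task.WhenAll(tasks);
    }

    public Task Connect() => OnConnectingAsync(EventArgs.Empty);
}

class Demo
{
    static async Task Main()
    {
        var p = new Producer();
        p.Connecting += (s, e) => Console.WriteLine("subscriber 1");
        p.Connecting += (s, e) => Console.WriteLine("subscriber 2");
        await p.Connect(); // both subscribers ran on pool threads
    }
}
```

The subscribers run in an unpredictable order relative to each other, which is exactly the behavior I am asking about.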

Vroomfondel
    The event handlers were designed to run sequentially because that is how a `MulticastDelegate` works (an event is essentially a `MulticastDelegate`), so every developer will expect that. You will need to make the non-standard behavior explicit so that everyone reading the code is aware of it. `Task.Run` is "fire-and-forget" - keep that in mind (no exception handling, no clue when the task is completed, no way to cancel, etc.). The performance issue is not with lambdas but rather with `DynamicInvoke` in your case - that's slow. Look [here](https://stackoverflow.com/a/1516165/2846483). – dymanoid Sep 25 '19 at 11:23
  • @dymanoid thanks for the info. While I don't dispute what you said, I find it puzzling that entirely unconnected subscribers are forced to run sequentially - the order will be entirely unpredictable in most scenarios anyway. Is it because of possible access to the sender object? Is it unusual to decouple event producer and consumer by using different execution contexts? I was under the impression that the event facility had something like that as an underlying idea. – Vroomfondel Sep 25 '19 at 11:29
  • _" Is it unusual to decouple event producer and consumer by using different execution contexts"_ - No. Think "UI Thread". – Fildor Sep 25 '19 at 11:30
  • @Fildor ok, but that is not what is happening in the naive first solution, is it? – Vroomfondel Sep 25 '19 at 11:33
  • 1
    The events in C# were designed to run on the event raiser thread - single-threaded. That is why they are executed sequentially. It's the design decision. We can agree or disagree, but that's how it works. If you want different execution contexts, implement it by yourself. There are e.g. event aggregator implementations that allow such kind of things. And see the answer I referenced too. – dymanoid Sep 25 '19 at 13:04
  • 1
    @Vroomfondel No, you are right. I just wanted to say that such a concept is not out of the world. A long time ago, I also had to do something similar to ensure near-real-time properties for a system. (Well, that's what "they" called it. In fact it was just "don't make the eventhandling slow down that one important computation") It's a mouthful of work but definitely not impossible. – Fildor Sep 25 '19 at 13:20
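dymanoid's fire-and-forget caveat can be seen directly in a small sketch (hypothetical demo code, not from the question): an exception thrown inside a `Task.Run` handler never reaches the raising thread unless the returned `Task` is awaited or inspected.

```csharp
using System;
using System.Threading.Tasks;

class FireAndForgetDemo
{
    static void Main()
    {
        // Fire-and-forget: the returned Task is discarded, so the
        // exception faults the Task silently - the raising thread
        // never observes it.
        Task.Run(() => throw new InvalidOperationException("lost"));

        Console.WriteLine("raiser continues unaffected");

        // Waiting on (or awaiting) the Task surfaces the exception:
        var t = Task.Run(() => throw new InvalidOperationException("observed"));
        try
        {
            t.Wait();
        }
        catch (AggregateException ex)
        {
            Console.WriteLine(ex.InnerException.Message); // prints "observed"
        }
    }
}
```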

0 Answers