
Here's my scenario: multiple producers, single consumer. The consumer works synchronously, and should only process the last input (even abort processing of what is currently being processed and start with new input if any).

So I've connected a BroadcastBlock to an ActionBlock:

var broadcastBlock = new BroadcastBlock<PresenceStateChange>(
    (input) => input,
    new DataflowBlockOptions { CancellationToken = serverShutDownSource.Token });
var processTeamPresenceUpdateBlock = new ActionBlock<PresenceStateChange>(
    (data) => processData(data),
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 1 });
broadcastBlock.LinkTo(processTeamPresenceUpdateBlock, new DataflowLinkOptions { PropagateCompletion = true });

The data class I'm using contains the data to be processed (processing involves sending something across the network) and a CancellationTokenSource, so that processing can be aborted if new data enters the pipeline.

So here's my PresenceStateChange class:

public class PresenceStateChange
{
    public string Data { get; set; }

    public CancellationTokenSource ShutDownSource { get; set; }
}

Data is generated as follows:

private void generateData()
{
    string data = null;
    lock (myLock)
    {
        data = "data element " + messageCounter;
        messageCounter++;
        if (cancelLastJobSource != null)
            cancelLastJobSource.Cancel();
        cancelLastJobSource = new CancellationTokenSource();
    }
    log("Generated data : " + data, 4);
    PresenceStateChange change = new PresenceStateChange { Data = data, ShutDownSource = cancelLastJobSource };
    broadcastBlock.Post(change);
}

And processing works as follows:

private async Task processData(PresenceStateChange change)
{
    log("processing data " + change.Data, 4);
    try
    {
        // Task.Delay throws TaskCanceledException when the token fires,
        // so the abort case must be handled in a catch block; otherwise the
        // unhandled exception would fault the ActionBlock.
        await Task.Delay(2000, change.ShutDownSource.Token).ConfigureAwait(false);
        log("processing data " + change.Data + " is complete", 4);
    }
    catch (TaskCanceledException)
    {
        log("processing of data " + change.Data + ", was aborted", 3);
    }
}

All this works just fine and does what it is supposed to do.

Now, when I try to apply this to my real problem, processData has a return value. When I post data to the broadcastBlock, I need to get something back that I can await and that tells me whether the operation was successful: either a) the message was sent across the network (currently simulated with Task.Delay), or b) processing was aborted because a newer message was posted. Since BroadcastBlock.Post returns nothing more than an indicator of whether the post was accepted, I'm wondering if there's a way to handle this requirement in TPL Dataflow, or whether I have to look elsewhere (pointers on what that elsewhere would look like would also be appreciated).
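One way to approximate this requirement is a sketch (not a definitive answer): carry a TaskCompletionSource<bool> in each message, have the processing delegate complete it, and await it after posting. The CompletionSource property below is an addition of mine, not part of the question's class; the rest mirrors the names used above.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

public class PresenceStateChange
{
    public string Data { get; set; }
    public CancellationTokenSource ShutDownSource { get; set; }
    // Added for this sketch: lets the poster await the outcome of processing.
    public TaskCompletionSource<bool> CompletionSource { get; set; }
}

public static class Sketch
{
    public static async Task Main()
    {
        var block = new ActionBlock<PresenceStateChange>(async change =>
        {
            try
            {
                // Simulated network send, abortable via the message's token.
                await Task.Delay(100, change.ShutDownSource.Token).ConfigureAwait(false);
                change.CompletionSource.TrySetResult(true);   // sent successfully
            }
            catch (TaskCanceledException)
            {
                change.CompletionSource.TrySetResult(false);  // aborted by a newer message
            }
        }, new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 1 });

        var change = new PresenceStateChange
        {
            Data = "data element 0",
            ShutDownSource = new CancellationTokenSource(),
            CompletionSource = new TaskCompletionSource<bool>()
        };
        block.Post(change);
        bool sent = await change.CompletionSource.Task;  // true: sent, false: aborted
        Console.WriteLine(sent);
    }
}
```

This keeps the dataflow pipeline intact and smuggles the "return channel" through the message itself, at the cost of every producer having to construct and hold on to the TaskCompletionSource.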

user3566056
  • Dataflow doesn't support that out of the box. You can try doing that yourself if you like. – i3arnon Oct 02 '15 at 15:38
  • You wouldn't happen to have any pointers where to begin with? It looks rather daunting since I basically need a Task from the ActionSource and there's not really any mechanism that would get data back across the pipeline. – user3566056 Oct 02 '15 at 17:59
  • I wouldn't do that at all. It doesn't fit with the mindset of TPL Dataflow (or the actor paradigm as a whole). – i3arnon Oct 02 '15 at 18:35
  • I finally went down another route after all. I do have my classes to perform perfectly serialized execution based on an msdn article, and every task I queue actually returns a task (and wouldn't you know.. took a CancellationToken).. so it was a quick matter of changing all Action to Func<Task> and now I can track my tasks and abort them at will. – user3566056 Oct 05 '15 at 17:04
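The serialized-execution route the asker ended up taking could be sketched roughly as below. SerialExecutor is a hypothetical name of mine (the MSDN-based class the asker refers to is not shown in the question); the idea is to chain each job onto the previous one's task and hand the caller a Task<T> it can await or observe being aborted.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical helper: runs queued work strictly one at a time and
// returns a Task<T> per item so callers can await or observe an abort.
public class SerialExecutor
{
    private Task _tail = Task.CompletedTask;
    private readonly object _lock = new object();

    public Task<T> Enqueue<T>(Func<CancellationToken, Task<T>> work, CancellationToken token)
    {
        lock (_lock)
        {
            // Chain onto the previous job so execution stays serialized.
            var next = _tail.ContinueWith(_ => work(token), token,
                TaskContinuationOptions.None, TaskScheduler.Default).Unwrap();
            // Observe faults/cancellations so the chain keeps going.
            _tail = next.ContinueWith(t => { var _ = t.Exception; });
            return next;
        }
    }
}

public static class Demo
{
    public static async Task Main()
    {
        var executor = new SerialExecutor();
        CancellationTokenSource lastCts = null;

        Task<bool> Send(string data)
        {
            lastCts?.Cancel();                        // abort the previous job, if any
            lastCts = new CancellationTokenSource();
            var token = lastCts.Token;
            return executor.Enqueue(async ct =>
            {
                await Task.Delay(100, ct);            // simulated network send
                return true;
            }, token);
        }

        var first = Send("data element 1");
        var second = Send("data element 2");          // cancels the first

        Console.WriteLine(await second);
        try { await first; }
        catch (OperationCanceledException) { Console.WriteLine("first aborted"); }
    }
}
```

Compared to the dataflow pipeline, this gives up the block topology but gets a Task per queued item for free, which matches what the question was asking for.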

0 Answers