I have an interface like this:

interface IProcessor{
    IObservable<Item> Process(Item item);
}

I have an array of workers:

IProcessor[] _workers = ....

I want to pass an item through all the workers:

var ret = Observable.Return(item);
for (var i = 0; i < _workers.Length; i++)
{
    int index = i;
    ret = ret
        .SelectMany(r => _workers[index].Process(r))
    ;
}
return ret;

I'm not too happy with how this looks -- is there a cleaner way?

Sergey Azarkevich

2 Answers

This works for me:

IObservable<Item> ret = _workers.Aggregate(
    Observable.Return(item),
    (rs, w) =>
        from r in rs
        from p in w.Process(r)
        select p);
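
The composed sequence is consumed like any other observable; just as a rough usage sketch (assuming the `item` and `ret` variables above):

ret.Subscribe(
    result => Console.WriteLine("Final item: " + result),
    ex => Console.WriteLine("A worker failed: " + ex.Message),
    () => Console.WriteLine("Pipeline completed"));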

Please keep in mind that this kind of aggregation of observables - both in your question and in my answer - can quickly cause memory issues (i.e. a stack overflow). In my tests I could get 400 workers working, but 500 caused a crash.
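
If you want to see where that limit sits for yourself, a quick way is to chain a large array of trivial workers; `PassThroughProcessor` below is just a made-up stub for the test, not something from the question:

class PassThroughProcessor : IProcessor
{
    // Trivial worker that simply re-emits the item it was given.
    public IObservable<Item> Process(Item item)
    {
        return Observable.Return(item);
    }
}

// Around 400 of these chained fine in my tests; 500 overflowed the stack.
IProcessor[] _workers = Enumerable.Range(0, 400)
    .Select(_ => (IProcessor)new PassThroughProcessor())
    .ToArray();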

You're better off changing your IProcessor so it doesn't use observables, and implementing your observable like this:

interface IProcessor{
    Item Process(Item item);
}

var f =
    _workers.Aggregate<IProcessor, Func<Item, Item>>(
            i => i,
            (fs, p) => i => p.Process(fs(i)));

var ret = Observable.Start(() => f(item), Scheduler.ThreadPool);
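
To make the folding explicit: with three workers w1, w2 and w3 (names invented purely for illustration), the Aggregate above builds, in effect:

Func<Item, Item> f = i => w3.Process(w2.Process(w1.Process(i)));

Observable.Start then runs that single synchronous chain once on the thread pool and pushes the final Item (or any exception thrown along the way) to its subscribers.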

With this approach I can get over 20,000 nested workers before hitting a stack overflow, and the results are almost instantaneous up to that level.

Enigmativity
  • I have about 3-10 processors in the array, so a stack overflow is not a problem. Also, it seems the thread-pool solution will be limited by the pool size, whereas here there will be many concurrent asynchronous IO operations. But I only started learning Rx a couple of weeks ago, so I can't say for sure how they will interact. – Sergey Azarkevich Sep 27 '11 at 08:21
  • @SergeyAzarkevich - You could change `Scheduler.ThreadPool` to `Scheduler.NewThread`, but you might find that you get worse performance as each new thread takes over 1MB of memory to create. – Enigmativity Sep 27 '11 at 08:59

Maybe something like this?

var item = new Item();
_workers
  .ToObservable()
  .SelectMany(worker => worker.Process(item))
  .Subscribe(processed => ...);

I assumed that the workers can process the item in parallel.

P.S. If you'd like sequential processing, it would be:

var item = new Item();
_workers
  .ToObservable()
  .Select(worker => worker.Process(item))
  .Concat()
  .Subscribe(processed => ...);
Sergey Aldoukhov
  • Yes, I want sequential processing, but all the processors should be 'chained': the result of the first passed to the second, the result of the second to the third, and so on. Also, a processor can throw an error or eat the item, and that outcome should be provided to the subscriber. I will try the second example, but it seems it doesn't work the way I want. – Sergey Azarkevich Sep 27 '11 at 08:04
  • .Concat would make the processing sequential, with `item` holding the state. – Sergey Aldoukhov Sep 27 '11 at 14:49
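
To illustrate the chained behaviour asked about in the comment above: with the SelectMany/Aggregate chaining from the question and the first answer, a worker that throws reaches the subscriber as OnError, and a worker that "eats" the item ends the chain without a value. A minimal sketch, with two stub processors invented purely for illustration:

class FailingProcessor : IProcessor
{
    // Simulates a worker that rejects the item: the subscriber sees OnError.
    public IObservable<Item> Process(Item item)
    {
        return Observable.Throw<Item>(new InvalidOperationException("rejected"));
    }
}

class SwallowingProcessor : IProcessor
{
    // Simulates a worker that eats the item: nothing is emitted,
    // so the chain completes without producing a value.
    public IObservable<Item> Process(Item item)
    {
        return Observable.Empty<Item>();
    }
}

Placing either of these anywhere in _workers propagates that outcome through the whole chain to the final subscriber.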