
I have the following code:

Subject<Action> _uiActions = new Subject<Action>();
_uiSubscription = _uiActions
    .Buffer(TimeSpan.FromMilliseconds(200))
    .Where(x => x.Any())
    .ObserveOnDispatcher()
    .Subscribe(ExecuteOnUi);

Is there some elegant way to limit the number of items that are pushed into the function ExecuteOnUi every 200 milliseconds?

Ideally, I'd like to process no more than 50 items every 200 milliseconds, which would help prevent the UI from freezing.

Contango

2 Answers


There's an overload of Buffer that also takes a count; it will cap each buffer at the size you want.
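A minimal sketch of that overload applied to the pipeline from the question (the 200 ms / 50-item figures are taken from the question; the rest mirrors the asker's code):

```csharp
// Buffer(TimeSpan, int) closes each buffer when either 200 ms elapse or
// 50 items arrive, whichever comes first. Note this caps buffer *size*,
// not rate: if 100 items arrive within 200 ms, ExecuteOnUi is simply
// called twice in that window.
_uiSubscription = _uiActions
    .Buffer(TimeSpan.FromMilliseconds(200), 50)
    .Where(x => x.Any())
    .ObserveOnDispatcher()
    .Subscribe(ExecuteOnUi);
```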

Gluck
  • This won't rate limit. If he receives 100 values within the 200ms ExecuteOnUi will get called twice. – James Hay Oct 29 '15 at 14:51
  • 1
    Sure, but I don't think that was required by the author, I think he wanted not to have a single long UI operation, but several small ones, which will leave up room for other UI events (paint, clicks ...) to be processed, although I may be wrong, in which case your answer may apply. – Gluck Oct 29 '15 at 20:15

I'm assuming that if more than 50 values arrive within 200 ms, the newest should be held back until the next timer event.

I can't think of an elegant way to do this by composing existing operators, but you could write an extension method to do it for you. The solution below may be a little crude: over a long-running sequence that produces values faster than they are drained, the internal queue will grow without bound, but for sequences that complete at some point it may do the job.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reactive.Concurrency;
using System.Reactive.Disposables;
using System.Reactive.Linq;

public static IObservable<IEnumerable<TSource>> Limit<TSource>(
    this IObservable<TSource> source,
    int count,
    TimeSpan timeSpan,
    IScheduler scheduler)
{
    return Observable.Create<IEnumerable<TSource>>(
        observer =>
            {
                // Incoming values are queued here and drained in batches.
                var buffer = new Queue<TSource>();

                var guard = new object();

                var sourceSub = source
                    .Subscribe(x =>
                            {
                                lock (guard)
                                {
                                    buffer.Enqueue(x);
                                }
                            },
                        observer.OnError,
                        observer.OnCompleted);

                // Every tick, drain at most 'count' items from the queue.
                var timer = Observable.Interval(timeSpan, scheduler)
                    .Subscribe(_ =>
                        {
                            var batch = new List<TSource>();

                            lock (guard)
                            {
                                // '<' (not '<=') so a batch never exceeds 'count' items.
                                while (batch.Count < count && buffer.Any())
                                {
                                    batch.Add(buffer.Dequeue());
                                }
                            }

                            // Skip ticks where nothing was queued.
                            if (batch.Count > 0)
                            {
                                observer.OnNext(batch.AsEnumerable());
                            }
                        });

                return new CompositeDisposable(sourceSub, timer);
            });
}
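Wired into the pipeline from the question, usage would look something like this (a sketch; the choice of TaskPoolScheduler.Default for the timer is an assumption, any IScheduler will do):

```csharp
// Hypothetical usage of the Limit extension above: drain at most
// 50 queued actions every 200 ms, then marshal each batch to the UI thread.
_uiSubscription = _uiActions
    .Limit(50, TimeSpan.FromMilliseconds(200), TaskPoolScheduler.Default)
    .ObserveOnDispatcher()
    .Subscribe(ExecuteOnUi);
```

Since Limit already skips empty batches, the Where(x => x.Any()) step from the original pipeline is no longer needed.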
James Hay
  • 1
    This is utterly brilliant code! I will see if I can test it out today. I found a solution yesterday: add a ConcurrentQueue to decouple the producer and consumer, this solution worked well (but its not quite as clean and reusable as yours). – Contango Oct 30 '15 at 08:10