
I hope someone can point me in the right direction, because I am struggling with combining concurrency and the ability to cancel queued requests in RxJS. I will try to explain this as a sequence of events. Say we have observable A, which receives an array of strings.

Events:

1. A observes: ['dog', 'cat', 'elephant', 'tiger'].
2. Downstream checks whether the network response for each string is cached. If it is, the value comes from the cache; if not, it is requested from the web and the observable is saved to the cache with publishReplay / shareReplay.
3. There is a limit of 2 network requests happening at once, so it tries to fetch 'dog' and 'cat' from the API (this operation takes over 2000ms).
4. After 1000ms, A observes another set of values: ['dog', 'rat', 'horse', 'rabbit'].

Next the following should happen: I don't want the 'dog' and 'cat' requests to be cancelled, I want them to finish, but 'elephant' and 'tiger' from the first frame should be ignored. Once 'dog' and 'cat' get their responses, 'rat' and 'horse' from the second frame should be requested from the network, and finally, once either of those resolves, 'rabbit' should be requested.
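
To make the intended ordering concrete, the request log I am after should look roughly like this ('dog' comes from the cache, so it is not requested again, and each request still takes about 2000ms):

requesting dog
requesting cat
(dog and cat responses arrive)
requesting rat
requesting horse
(the first of those responses arrives)
requesting rabbit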

I have tried switching between defer and from for the networkObservable; the behavior changes, but neither variant is what I want. Here is my current code:


import { of, from, merge } from "rxjs";
import {
  delay, startWith, map, filter, mergeMap, tap,
  publish, publishReplay, refCount, take, switchAll
} from "rxjs/operators";

const cache = new Map();

// Fake Promise to fake an API request
function later(delay, value) {
  console.log('requesting', value);
  return new Promise(resolve => setTimeout(resolve, delay, value));
}

// Emits the first frame immediately and the second frame after 1000ms
const testObservable = of(['dog', 'rat', 'horse', 'rabbit']).pipe(
  delay(1000),
  startWith(['dog', 'cat', 'elephant', 'tiger'])
);

testObservable.pipe(
  map(array => from(array).pipe(
    publish(arrayObservable => {
      // Ids already in the cache are replayed one at a time
      const cachedObservable = arrayObservable.pipe(
        filter(id => cache.has(id)),
        mergeMap(id => cache.get(id), 1)
      );
      // Uncached ids hit the network, at most 2 requests at a time
      const uncachedObservable = arrayObservable.pipe(
        filter(id => !cache.has(id)),
        mergeMap(id => {
          const networkObservable = from(later(2000, id)).pipe(
            tap(e => console.log('response', e)),
            map(e => 'parsed: ' + e),
            tap(e => console.log('parsed', e)),
            publishReplay(1),
            refCount(),
            take(1)
          );
          cache.set(id, networkObservable);
          return networkObservable;
        }, 2)
      );
      return merge(cachedObservable, uncachedObservable);
    })
  )),
  // A new frame makes switchAll unsubscribe from the previous frame
  switchAll()
).subscribe();

This results in an output of:

requesting dog
requesting cat
requesting rat
requesting horse
response dog
parsed parsed: dog
response rat
parsed parsed: rat
requesting rabbit
response horse
parsed parsed: horse
response rabbit
parsed parsed: rabbit

This is close to the wanted behavior, but with one glaring defect: 'rat' and 'horse' are being requested without waiting for 'dog' and 'cat' to resolve. However, 'tiger' and 'elephant' have been properly disposed of, so that part works.

Will I have to create a separate subject which handles the requests?


1 Answer


I have tried to wire up a solution for this interesting problem, at least as far as I have understood it.

The starting point is testObservable, which is a stream of Array<string>. Every string in these arrays represents a potential request to a back-end service. There cannot be more than 2 requests in flight at any given time, so some kind of queuing mechanism has to be in place. For this I use the concurrency parameter of mergeMap.
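
In isolation, the concurrency parameter is what queues the extra values. A minimal sketch of that behavior (the ids and the 2000ms delay are just placeholders):

import { from, of } from "rxjs";
import { mergeMap, delay } from "rxjs/operators";

// At most 2 inner observables are subscribed at once; the remaining values
// are buffered and only start when one of the running ones completes.
from(["dog", "cat", "elephant", "tiger"]).pipe(
  mergeMap((id) => of(id).pipe(delay(2000)), 2)
).subscribe((id) => console.log("done", id));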

The key point here is that, any time a new array is emitted by testObservable, any request for a string from a previously emitted array that has not yet been sent to the remote service should be stopped.

So I start by creating a stream of objects, each containing the string that is the input for the remote service call as well as a stop indicator, like this:

testObservable
  .pipe(
    mergeMap((a) => {
      i++;
      if (arrays[i - 1]) {
        arrays[i - 1].stop = true;
      }
      const thisArray = { stop: false, i };
      arrays[i] = thisArray;
      return from(a.map((_v) => ({ v: _v, ssTop: arrays[i] }))).pipe(
        mergeMap((d) => {
          // d is an object containing the parameter for the remote call and an object of type {stop: boolean, i: number}
          // for every string of every array a d is emitted
        }, 2)
      );
    })
  )

Then, for each d emitted, I can implement logic that makes sure a call to the remote service is performed only if the stop flag is not set to true, like this:

d.ssTop.stop
  ? NEVER
  : from(later(2000, d.v))

Notice that the stop flag for the i-th array is set to true as soon as the (i+1)-th array is emitted by testObservable, ensuring that no calls related to the i-th array are made after the (i+1)-th array has been emitted.

This is how the complete code could look:

import { of, from, NEVER } from "rxjs";
import { delay, startWith, mergeMap, map } from "rxjs/operators";

const cache = new Map();

// Fake Promise to fake an API request
function later(delay, value) {
  console.log("requesting", value);
  return new Promise((resolve) => setTimeout(resolve, delay, value));
}

const testObservable = of(["dog", "rat", "horse", "rabbit"]).pipe(
  delay(1000),
  startWith(["dog", "cat", "elephant", "tiger"])
);

let i = 0;
let arrays: { stop: boolean; i: number }[] = [];

testObservable
  .pipe(
    mergeMap((a) => {
      i++;
      // Mark every previously emitted array as stopped so that its queued,
      // not-yet-started requests are skipped.
      if (arrays[i - 1]) {
        arrays[i - 1].stop = true;
      }
      const thisArray = { stop: false, i };
      arrays[i] = thisArray;
      return from(a.map((_v) => ({ v: _v, ssTop: arrays[i] }))).pipe(
        mergeMap((d) => {
          return d.ssTop.stop
            ? NEVER // this array has been superseded, never start the request
            : cache.has(d.v)
            ? of(`${d.v} is returned from cache`)
            : from(later(2000, d.v)).pipe(
                map((v: any) => {
                  cache.set(v, v);
                  return `${v} is the returned value ${d.ssTop.i}`;
                })
              );
        }, 2) // at most 2 requests in flight per array
      );
    })
  )
  .subscribe({
    next: (d) => {
      console.log(d);
    },
  });
  • Hi @picci, I tested out your solution, thanks for taking the time to look at this. One defect with your proposal is that "cat", which was not part of the second frame, gets returned by the observable. Also, from what I can see, the network request for "dog" is made twice, so something is strange with the cache. – eXigiouS Apr 17 '20 at 04:20
  • @eXigiouS here are the responses to your 2 points: "cat" is the second item of the first array emitted, so I would expect it to be passed to the remote service and returned. Regarding the cache, it is true that you see 2 calls for "dog"; this is because, when the second call is performed, the first call has not yet returned, so no value for "dog" is present in the cache yet. If you want to stop or delay the second call, the solution would have to be made even more sophisticated. – Picci Apr 17 '20 at 07:20
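
Following up on that last comment, one possible direction for the "more sophisticated" variant is a minimal sketch only, not a drop-in fix: cache the in-flight observable itself, shared with shareReplay (similar to the publishReplay / refCount idea in the question), so a second lookup for "dog" reuses the pending request instead of firing a new one. The helper name cachedLater and the inflightCache map are purely illustrative, and later is the fake-request function from the code above.

import { from } from "rxjs";
import { shareReplay } from "rxjs/operators";

const inflightCache = new Map();

// Illustrative helper: returns one shared observable per value, so concurrent
// and later subscribers reuse the same pending (or already completed) request.
function cachedLater(delayMs, value) {
  if (!inflightCache.has(value)) {
    inflightCache.set(value, from(later(delayMs, value)).pipe(shareReplay(1)));
  }
  return inflightCache.get(value);
}

// In the answer's inner mergeMap, from(later(2000, d.v)) could then be
// replaced by cachedLater(2000, d.v).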