I hope someone can point me in the right direction, because I am struggling to combine concurrency with the ability to cancel queued requests in RxJS. I will try to explain this as a sequence of events. Say we have observable A, which receives an array of strings.
Events:

1. A observes: ['dog', 'cat', 'elephant', 'tiger'].
2. Downstream checks whether the network response for each string is cached. If it is, the value comes from the cache; if not, it is requested from the web and the observable is saved to the cache with publishReplay / shareReplay.
3. There is a limit of 2 network requests happening at once, so it starts fetching 'dog' and 'cat' from the API (this operation takes over 2000ms).
4. After 1000ms, A observes another set of values: ['dog', 'rat', 'horse', 'rabbit'].
The following should then happen: I don't want the 'dog' and 'cat' requests to be cancelled, I want them to finish, but 'elephant' and 'tiger' from the first emission should be ignored. Once 'dog' and 'cat' get their responses, 'rat' and 'horse' from the second emission should be requested from the network, and finally, once either of those resolves, 'rabbit' is requested.
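To make the desired ordering concrete, here is (roughly) the timeline I am after, based on the 2000ms fake request and the 1000ms delay in the code below:

// t ≈ 0     ['dog', 'cat', 'elephant', 'tiger'] arrives
//           -> 'dog' and 'cat' start (limit of 2); 'elephant' and 'tiger' wait
// t ≈ 1000  ['dog', 'rat', 'horse', 'rabbit'] arrives
//           -> the still-queued 'elephant' and 'tiger' are dropped
//           -> 'dog' is already in flight, so its pending observable is reused
// t ≈ 2000  'dog' and 'cat' resolve -> 'rat' and 'horse' start
// t ≈ 4000  'rat' / 'horse' resolve -> 'rabbit' starts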
Here is my current code. I have tried switching between defer and from for the networkObservable; the behavior differs between the two, but neither is what I want.
import { of, from, merge } from 'rxjs';
import {
  delay, startWith, map, publish, filter, flatMap,
  tap, publishReplay, refCount, take, switchAll
} from 'rxjs/operators';

const cache = new Map();

// Fake promise to simulate an API request
function later(delay, value) {
  console.log('requesting', value);
  return new Promise(resolve => setTimeout(resolve, delay, value));
}

// Emits the first array immediately and the second one after 1000ms
const testObservable = of(['dog', 'rat', 'horse', 'rabbit']).pipe(
  delay(1000),
  startWith(['dog', 'cat', 'elephant', 'tiger'])
);

testObservable.pipe(
  map(array => from(array).pipe(
    publish(arrayObservable => {
      // Ids already in the cache are served from their cached observable
      const cachedObservable = arrayObservable.pipe(
        filter(id => cache.has(id)),
        flatMap(id => cache.get(id), 1)
      );
      // Uncached ids are fetched with a concurrency limit of 2
      const uncachedObservable = arrayObservable.pipe(
        filter(id => !cache.has(id)),
        flatMap(id => {
          const networkObservable = from(later(2000, id)).pipe(
            tap(e => console.log('response', e)),
            map(e => 'parsed: ' + e),
            tap(e => console.log('parsed', e)),
            publishReplay(1),
            refCount(),
            take(1)
          );
          cache.set(id, networkObservable);
          return networkObservable;
        }, 2)
      );
      return merge(cachedObservable, uncachedObservable);
    })
  )),
  // A new array replaces (and unsubscribes from) the previous inner observable
  switchAll()
).subscribe();
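For reference, the defer variant I mentioned only changes how networkObservable is constructed inside the flatMap callback, roughly like this (defer only calls later() when the observable is subscribed to, whereas from needs the promise, and therefore the request, created up front):

import { defer } from 'rxjs';

const networkObservable = defer(() => later(2000, id)).pipe(
  tap(e => console.log('response', e)),
  map(e => 'parsed: ' + e),
  tap(e => console.log('parsed', e)),
  publishReplay(1),
  refCount(),
  take(1)
);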
Running the from version results in this output:
requesting dog
requesting cat
requesting rat
requesting horse
response dog
parsed parsed: dog
response rat
parsed parsed: rat
requesting rabbit
response horse
parsed parsed: horse
response rabbit
parsed parsed: rabbit
This is close to the wanted behavior, but with one glaring defect: 'rat' and 'horse' are requested immediately instead of waiting for 'dog' and 'cat' to resolve. However, 'tiger' and 'elephant' have been properly disposed of, so that part works. I suspect the reason is that each emitted array gets its own flatMap(..., 2), so the limit of 2 applies per emission rather than across all pending requests.
Will I have to create a separate subject which handles the requests?
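To clarify what I mean by a separate subject: something along these lines, where all uncached ids funnel through one shared queue with a global concurrency of 2. This is only a sketch of the idea; it does not handle discarding queued ids when a new array arrives, which is exactly the part I can't work out:

import { Subject, from } from 'rxjs';
import { mergeMap } from 'rxjs/operators';

// One global queue: at most 2 requests in flight, shared across all emissions
const requestQueue = new Subject();

requestQueue.pipe(
  mergeMap(id => from(later(2000, id)), 2)
).subscribe(value => console.log('queued response', value));

// Each new array would push its uncached ids into the queue:
// array.filter(id => !cache.has(id)).forEach(id => requestQueue.next(id));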