I have a web server that needs to fetch and cache around 1000 items from a slow API.
I would like the first request, and any request that hits a TTL expiration, to kick off this operation but then continue handling the request. The handler has other values to return, and it can serve stale or null data for the soon-to-be-refreshed entries.
I've recreated a simplified example:
// ... within some handleRequest function
if (
  !cache.isFetching &&
  (!cache.isSet || cache.isStale)
) {
  // start fetch/cache, but don't block
  fetchAndCacheThatStuff();
}
// ... continue with request handling
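Since fetchAndCacheThatStuff() is deliberately not awaited, I also attach a .catch so a failed refresh clears the flag instead of becoming an unhandled rejection. A minimal self-contained sketch of that intent (the cache object and the always-failing fetch here are stand-ins, not my real code):

```javascript
// Stand-ins so this snippet runs on its own; the real cache and fetch live elsewhere.
const cache = { isFetching: true, isSet: false, isStale: true };
async function fetchAndCacheThatStuff() {
  throw new Error('upstream API down'); // simulate a failed refresh
}

// Fire-and-forget with error handling: a rejected refresh clears isFetching
// so a later request can retry, instead of wedging the cache forever.
fetchAndCacheThatStuff().catch((err) => {
  cache.isFetching = false;
  console.error('cache refresh failed:', err.message);
});
```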
I am trying to reason about how/when this will block when requests are coming in during cache refreshing:
async function fetchAndCacheThatStuff() {
  cache.isFetching = true;
  for (const twoHundoIds of chunk(oneThousandIds, 200)) {
    const result = await fetchItems(twoHundoIds);
    await cache.mset(result.map(mapResultToCacheEntries));
  }
  cache.isSet = true;
  cache.isStale = false;
  cache.isFetching = false;
}
Is there a time when the server will be waiting for a response and unable to process another incoming request?
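To make the timing concrete, here is a self-contained sketch of what I believe happens: each await inside the refresh yields back to the event loop, so a request arriving mid-refresh gets handled between chunks. (sleep, the two-chunk loop, and the 25 ms "incoming request" are all stand-ins for illustration.)

```javascript
// Minimal stand-in for the slow API: resolves after a delay.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const events = [];

// Simulates one chunked fetch-and-cache pass (2 chunks of ~50 ms each).
async function fetchAndCacheThatStuff() {
  for (const chunkId of [1, 2]) {
    await sleep(50); // stand-in for fetchItems() + cache.mset()
    events.push(`cached chunk ${chunkId}`);
  }
}

async function main() {
  const refresh = fetchAndCacheThatStuff(); // fire, don't await yet
  await sleep(25);                          // another request arrives 25 ms later
  events.push('handled another request');   // runs between the two chunks
  await refresh;
  console.log(events);
  // logs: [ 'handled another request', 'cached chunk 1', 'cached chunk 2' ]
}

main();
```

If this mental model is right, the awaits only suspend fetchAndCacheThatStuff itself, never the server's ability to accept new connections.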