I'm working on a back-end service that is asynchronous in nature. That is, we have multiple jobs that run asynchronously, and their results are written to some record.
This record is basically a class wrapping a `HashMap` of results (keys are `job_id`).
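For context, the record looks roughly like this (class and field names are illustrative, not my actual code):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative only: a record aggregating per-job results.
public class JobRecord {
    private final String recordId;
    // Results keyed by job_id; concurrent because jobs complete asynchronously.
    private final Map<String, String> resultsByJobId = new ConcurrentHashMap<>();

    public JobRecord(String recordId) {
        this.recordId = recordId;
    }

    public void putResult(String jobId, String result) {
        resultsByJobId.put(jobId, result);
    }

    public Map<String, String> getResults() {
        return resultsByJobId;
    }
}
```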
The thing is, I don't want to calculate or know in advance how many jobs are going to run (if I knew, I could `cache.invalidate()` the key once all the jobs have completed).
Instead, I'd like to have the following scheme:
- Set an expiry for new records (i.e. `expireAfterWrite`)
- On expiry, write (actually upsert) the record to the database
- If a cache miss occurs, `load()` is called to fetch the record from the database (if not found, create a new one)
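To make the scheme concrete, here is a minimal sketch of what I have in mind with Caffeine, reusing the `JobRecord` sketched above (`Database`, `find`, and `upsert` are placeholders for my persistence layer, and the expiry duration is arbitrary):

```java
import java.time.Duration;
import java.util.Optional;

import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;
import com.github.benmanes.caffeine.cache.RemovalCause;

public class RecordCacheSketch {

    private final Database database = new Database(); // placeholder persistence layer

    private final LoadingCache<String, JobRecord> cache = Caffeine.newBuilder()
            // Expire a record some time after it was (re)written to the cache.
            .expireAfterWrite(Duration.ofMinutes(5))
            // On expiry, flush (upsert) the record to the database.
            .removalListener((String recordId, JobRecord record, RemovalCause cause) -> {
                if (cause == RemovalCause.EXPIRED && record != null) {
                    database.upsert(recordId, record);
                }
            })
            // On a cache miss, load from the database, or create a new record.
            .build(recordId -> database.find(recordId).orElse(new JobRecord(recordId)));

    public void addResult(String recordId, String jobId, String result) {
        cache.get(recordId).putResult(jobId, result);
    }

    // Placeholder for whatever persistence layer I actually use.
    static class Database {
        Optional<JobRecord> find(String recordId) { return Optional.empty(); }
        void upsert(String recordId, JobRecord record) { /* write to DB */ }
    }
}
```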
The problem: I tried Caffeine, but records aren't expired at exactly the time they are supposed to be. I then read this SO answer about Guava's Cache, and I guess a similar mechanism works for Caffeine as well.
So the problem is that a record can "wait" in the cache for quite a while, even though it has already been completed. Is there a way to overcome this issue? That is, is there a way to "encourage" the cache to invalidate expired items?
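The only workaround I could come up with is to force the cache's maintenance work myself on a schedule, roughly like this (the one-minute interval is arbitrary); is that the intended approach, or is there something better built in?

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import com.github.benmanes.caffeine.cache.Cache;

public class CacheMaintenance {

    private final ScheduledExecutorService maintenance =
            Executors.newSingleThreadScheduledExecutor();

    // Periodically trigger Caffeine's maintenance so that expired entries are
    // actually evicted (and the removal listener fired) soon after they expire,
    // instead of waiting for the next read/write on the cache.
    public void start(Cache<?, ?> cache) {
        maintenance.scheduleAtFixedRate(cache::cleanUp, 1, 1, TimeUnit.MINUTES);
    }
}
```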
That led me to question my approach. Would you consider my solution a good practice?
P.S. I'm willing to switch to other caching solutions, if necessary.