
I'm using this to create my cache:

Cache<Long, Info> cache = newBuilder()
                .expireAfterWrite(getCacheMaxNbDays(), DAYS)
                .maximumSize(getCacheMaxSize())
                .build();

I was wondering what happens when I call cache.asMap() while eviction is in progress on some entries. Will cache.asMap() wait until the eviction operation is done, or will it return only those entries that eviction is not currently processing?

My second question is: is expireAfterWrite a blocking operation? In other words, how does Caffeine behave when reads, writes, and eviction operations (expireAfterWrite, expireAfterAccess, etc.) run concurrently on the same entries?

1 Answer


The cache is built on top of ConcurrentHashMap, so most of its behavior applies. That includes lock-free reads and fine-grained locking for writes. A computeIfAbsent-style call may be either a read or a write depending on whether the entry is present, so it may block. The asMap() call is simply a view, so there is no waiting or overhead to obtain it.
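As a small sketch of the view semantics (the key and value types, and the `maximumSize` here, are arbitrary; this assumes the Caffeine library is on the classpath): asMap() is a live view, not a snapshot, so changes made through either interface are visible through the other.

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.util.concurrent.ConcurrentMap;

class AsMapDemo {
    static String demo() {
        Cache<Long, String> cache = Caffeine.newBuilder()
                .maximumSize(100)
                .build();
        // Obtain the view before any writes; it is not a copy.
        ConcurrentMap<Long, String> view = cache.asMap();
        cache.put(1L, "a"); // mutate through the cache...
        view.put(2L, "b");  // ...or through the view; both see the same data
        return view.get(1L) + cache.getIfPresent(2L);
    }
}
```

Because the view is just an adapter over the same backing map, obtaining it costs nothing and it never blocks on in-flight evictions.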

An eviction is performed through a Map.compute, which blocks other writes to that entry. While the computation is in progress, a read will still observe the entry; once the computation completes, it won't.
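This per-entry behavior can be observed with ConcurrentHashMap directly. In the sketch below (the latches are just scaffolding to hold the computation open), a removal via compute stands in for an eviction: a concurrent lock-free read still sees the old value while the compute is running, and sees the removal only after it completes.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

class ComputeDemo {
    static String demo() throws Exception {
        ConcurrentHashMap<Long, String> map = new ConcurrentHashMap<>();
        map.put(1L, "old");
        CountDownLatch inCompute = new CountDownLatch(1);
        CountDownLatch release = new CountDownLatch(1);
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<?> evict = pool.submit(() -> map.compute(1L, (k, v) -> {
            inCompute.countDown();
            try { release.await(); } catch (InterruptedException ignored) { }
            return null; // mapping to null removes the entry, like an eviction
        }));
        inCompute.await();
        String duringCompute = map.get(1L); // lock-free read still observes "old"
        release.countDown();
        evict.get();
        String afterCompute = map.get(1L);  // null once the removal completes
        pool.shutdown();
        return duringCompute + "/" + afterCompute;
    }
}
```

A second writer (e.g. another compute on the same key) would block until the in-flight computation finishes, which is the per-entry locking the answer describes.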

A read will validate that the entry is not expired and, if it is, simulate a cache miss to avoid returning a stale result. If this is a computeIfAbsent-type call then its loader will replace the expired entry, handle removal notifications, etc. Another thread may in parallel be trying to evict that entry, but the per-entry locking makes these operations atomic and the loser handles it gracefully.
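A minimal sketch of that read-time validation, using Caffeine's Ticker to substitute a controllable fake clock (the key/value types and durations are arbitrary): once the fake clock passes the expiry deadline, a read behaves as a miss even if eviction has not yet physically removed the entry.

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

class ExpiryDemo {
    /** Returns the values read before and after the fake clock passes expiry. */
    static String[] readAroundExpiry() {
        AtomicLong nanos = new AtomicLong();
        Cache<Long, String> cache = Caffeine.newBuilder()
                .expireAfterWrite(1, TimeUnit.DAYS)
                .ticker(nanos::get) // fake clock under our control
                .build();
        cache.put(1L, "info");
        String before = cache.getIfPresent(1L);       // present and fresh
        nanos.addAndGet(TimeUnit.DAYS.toNanos(2));    // advance past the deadline
        String after = cache.getIfPresent(1L);        // expired: read behaves as a miss
        return new String[] { before, after };
    }
}
```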

An expired entry is not usable, so a loading call will block until a fresh value is loaded. The refreshAfterWrite setting can be combined with expiration to hide that penalty, which would otherwise appear as a periodic latency spike while calls for active content wait on a fresh load. When an entry is eligible for refresh but not yet expired, the next read will trigger a background reload. If instead the entry is not accessed within the refresh interval, it will expire and be evicted. This way active content stays fresh and fast, while inactive content fades away.
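A sketch of that combination (the intervals, the fake Ticker, and the counting loader are illustrative; the same-thread executor just makes the normally asynchronous refresh observable in a single thread): a read past the refresh interval returns the current value immediately while the reload runs, and subsequent reads see the refreshed value.

```java
import com.github.benmanes.caffeine.cache.Caffeine;
import com.github.benmanes.caffeine.cache.LoadingCache;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;

class RefreshDemo {
    static String[] demo() {
        AtomicLong nanos = new AtomicLong();
        AtomicInteger loads = new AtomicInteger();
        LoadingCache<Long, String> cache = Caffeine.newBuilder()
                .refreshAfterWrite(30, TimeUnit.MINUTES) // reload active entries
                .expireAfterWrite(1, TimeUnit.HOURS)     // hard deadline for inactive ones
                .ticker(nanos::get)                      // fake clock for the demo
                .executor(Runnable::run)                 // run the refresh on the calling thread
                .build(key -> "v" + loads.incrementAndGet());
        String first = cache.get(1L);                    // initial load
        nanos.addAndGet(TimeUnit.MINUTES.toNanos(45));   // past refresh, before expiry
        String stale = cache.get(1L);                    // returns current value, triggers reload
        String fresh = cache.get(1L);                    // observes the refreshed value
        return new String[] { first, stale, fresh };
    }
}
```

In production the refresh runs on the configured executor (ForkJoinPool.commonPool() by default), so the triggering read never waits on the reload.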

Ben Manes
  • "While the computation is in progress a read will observe the entry and won't after it completes." This means that a read (via Cache.asMap) will observe the entry until the computation completes, and after that it will validate that the entry is not expired? – Mamadou Bachir Barry Nov 17 '21 at 20:51
  • @MamadouBachirBarry Yes. `ConcurrentHashMap` will return the existing value for any read of an entry until the compute completes and updates the mapping. If a read obtains an expired entry then the validation will ignore it, and the eviction's compute will complete, removing it. From the client's perspective they only observe non-expired entries, load on a miss, and might temporarily observe a `Map.size()` that counts some of these entries pending removal. – Ben Manes Nov 17 '21 at 21:33
  • Thank you @Ben for detailed responses – Mamadou Bachir Barry Nov 17 '21 at 21:47