
(Context) I need to build a cache that loads information from a database and makes it available to the application. This data will not be changed by this application and will be heavily read by lots of threads. The data might (rarely) be changed from another app; updating it every hour is more than enough. I guess Ehcache would be a good fit, creating a read-through cache.

I thought about a cache like this:

import java.time.Duration;
import org.ehcache.Cache;
import org.ehcache.CacheManager;
import org.ehcache.config.builders.CacheConfigurationBuilder;
import org.ehcache.config.builders.CacheManagerBuilder;
import org.ehcache.config.builders.ExpiryPolicyBuilder;
import org.ehcache.config.builders.ResourcePoolsBuilder;

CacheManager cacheManager = CacheManagerBuilder.newCacheManagerBuilder()
        .withCache("preConfigured",
                CacheConfigurationBuilder.newCacheConfigurationBuilder(Integer.class, String.class, ResourcePoolsBuilder.heap(100))
                        .withExpiry(ExpiryPolicyBuilder.timeToLiveExpiration(Duration.ofHours(1)))
                        .withLoaderWriter(new CacheDataProvider()))
        .build();

cacheManager.init();

Cache<Integer, String> myCache = cacheManager.getCache("preConfigured", Integer.class, String.class);
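
(For completeness, CacheDataProvider is my own class implementing Ehcache's CacheLoaderWriter. A minimal sketch could look like the following, assuming a recent Ehcache 3.x where only load/write/delete must be implemented; findValueInDatabase is just a placeholder for the real query, and write/delete are unsupported because this application never changes the data.)

import org.ehcache.spi.loaderwriter.CacheLoaderWriter;

public class CacheDataProvider implements CacheLoaderWriter<Integer, String> {

    @Override
    public String load(Integer key) throws Exception {
        // Called by the cache on a miss (or expired entry) in read-through mode
        return findValueInDatabase(key);
    }

    @Override
    public void write(Integer key, String value) {
        // This application never modifies the data
        throw new UnsupportedOperationException("read-only cache");
    }

    @Override
    public void delete(Integer key) {
        throw new UnsupportedOperationException("read-only cache");
    }

    private String findValueInDatabase(Integer key) {
        // Placeholder: the real JDBC/JPA lookup would go here
        return "value-" + key;
    }
}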

Imagine the cache entry is expired (after 1 hour), a thread asks for the entry with key 1, and shortly after another thread asks for the same entry 1. When the 1st thread asks for entry 1, the cache will see it is expired and ask the LoaderWriter to load entry 1.

My question is: while the LoaderWriter is loading entry 1 for the first request, would the cache make the second thread wait until entry 1 is loaded, or could the second request trigger an (almost) simultaneous load of entry 1, resulting in the same entry being loaded twice by the LoaderWriter?

Robson Hermes

1 Answer


Ehcache 3 in cache-through mode will block all access to a given key while the loader executes. This also means that loader performance is important.

In your scenario, this means a single request will go to the database; once the first thread has finished loading entry 1 into the cache, the second thread will simply read it from the cache.

Note that this mechanism is not about thread safety (your loader must still be thread safe, as it can be invoked concurrently for different keys) but about efficient resource usage. It assumes that loading from the DB is much slower than a cache hit.
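
If you want to verify this yourself, a small experiment along these lines should show a single loader invocation for two concurrent gets of the same key. CountingLoader and the sleep are just test stubs to make the load slow and observable, not a suggested implementation:

import java.time.Duration;
import java.util.concurrent.atomic.AtomicInteger;

import org.ehcache.Cache;
import org.ehcache.CacheManager;
import org.ehcache.config.builders.CacheConfigurationBuilder;
import org.ehcache.config.builders.CacheManagerBuilder;
import org.ehcache.config.builders.ExpiryPolicyBuilder;
import org.ehcache.config.builders.ResourcePoolsBuilder;
import org.ehcache.spi.loaderwriter.CacheLoaderWriter;

public class LoaderBlockingDemo {

    // Counts how many times the loader is actually invoked
    static final AtomicInteger LOADS = new AtomicInteger();

    static class CountingLoader implements CacheLoaderWriter<Integer, String> {
        @Override
        public String load(Integer key) throws Exception {
            LOADS.incrementAndGet();
            Thread.sleep(2000);   // simulate a slow database query
            return "value-" + key;
        }
        @Override
        public void write(Integer key, String value) { }
        @Override
        public void delete(Integer key) { }
    }

    public static void main(String[] args) throws InterruptedException {
        CacheManager cacheManager = CacheManagerBuilder.newCacheManagerBuilder()
                .withCache("demo",
                        CacheConfigurationBuilder.newCacheConfigurationBuilder(Integer.class, String.class, ResourcePoolsBuilder.heap(100))
                                .withExpiry(ExpiryPolicyBuilder.timeToLiveExpiration(Duration.ofHours(1)))
                                .withLoaderWriter(new CountingLoader()))
                .build(true);

        Cache<Integer, String> cache = cacheManager.getCache("demo", Integer.class, String.class);

        // Two threads request the same (not yet loaded) key at almost the same time
        Runnable reader = () -> System.out.println(Thread.currentThread().getName() + " got " + cache.get(1));
        Thread t1 = new Thread(reader);
        Thread t2 = new Thread(reader);
        t1.start();
        t2.start();
        t1.join();
        t2.join();

        // Per the blocking behaviour described above, this should print 1
        System.out.println("Loader invocations: " + LOADS.get());
        cacheManager.close();
    }
}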

Louis Jacomet