
I have a bulk operation for obtaining data from the database, so rather than using refresh() to update individual items, I want to take advantage of the bulk call to update the cache periodically.

I have a couple of approaches in mind. Since get() is called on the cache all the time, I wonder which approach is optimal.

Approach 1: periodically call getAll(), which invokes loadAll()

LoadingCache<String, String> cache = CacheBuilder.newBuilder()
        .ticker(ticker)
        .refreshAfterWrite(5, TimeUnit.MINUTES)
        .expireAfterAccess(30, TimeUnit.MINUTES)
        .build(cacheLoader);

private static final class BulkCacheLoader extends CacheLoader<String, String> {
    @Override
    public String load(final String key) {
        return db.getItem(key);
    }

    @Override
    public ListenableFuture<String> reload(final String key, final String oldValue) {
        ListenableFutureTask<String> task = ListenableFutureTask.create(new Callable<String>() {
            public String call() {
                return db.getItem(key);
            }
        });
        executor.execute(task);
        return task;
    }

    @Override
    public Map<String, String> loadAll(final Iterable<? extends String> keys) {
        return db.bulkGetItems(keys);
    }
}

private class RefresherRunnable implements Runnable {
    private volatile boolean stopped = false;

    @Override
    public void run() {
        while (!stopped) {
            try {
                List<String> keys = getKeys();
                cache.getAll(keys);
            } catch (Exception e) {
                // log
            }

            final long nextSleepInterval = 1000 * 60;
            try {
                Thread.sleep(nextSleepInterval);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}
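One caveat worth noting for this approach: per the Guava wiki, getAll only issues loads for keys absent from the cache; keys that are already cached are returned as-is and are only reloaded once refreshAfterWrite elapses. Below is a minimal stdlib-only sketch (no Guava) of that semantics; GetAllSketch, bulkLoad, and the "fresh-"/"stale-" values are hypothetical stand-ins for cache.getAll and db.bulkGetItems.

```java
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of getAll-style semantics: only keys absent from the
// cache are handed to the bulk loader; present entries are returned as-is.
public class GetAllSketch {
    public static Map<String, String> cache = new ConcurrentHashMap<>();

    // Stand-in for db.bulkGetItems: returns a fresh value per requested key.
    public static Map<String, String> bulkLoad(List<String> keys) {
        Map<String, String> out = new HashMap<>();
        for (String k : keys) {
            out.put(k, "fresh-" + k);
        }
        return out;
    }

    // Stand-in for LoadingCache.getAll: bulk-load only the absent keys.
    public static Map<String, String> getAll(List<String> keys) {
        List<String> absent = new ArrayList<>();
        for (String k : keys) {
            if (!cache.containsKey(k)) {
                absent.add(k);
            }
        }
        if (!absent.isEmpty()) {
            cache.putAll(bulkLoad(absent));
        }
        Map<String, String> result = new LinkedHashMap<>();
        for (String k : keys) {
            result.put(k, cache.get(k));
        }
        return result;
    }

    public static void main(String[] args) {
        cache.put("a", "stale-a");                       // already cached
        Map<String, String> r = getAll(Arrays.asList("a", "b"));
        System.out.println(r.get("a"));                  // stale-a: not re-loaded
        System.out.println(r.get("b"));                  // fresh-b: bulk-loaded
    }
}
```

So a periodic getAll keeps entries warm, but it does not by itself overwrite values that are already cached; those wait for the per-key reload triggered by refreshAfterWrite.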

Approach 2

Just use cache.asMap().putAll()

private class RefresherRunnable implements Runnable {
    private volatile boolean stopped = false;

    @Override
    public void run() {
        while (!stopped) {
            try {
                List<String> keys = getKeys();
                Map<String, String> keyValueMap = db.bulkGetItems(keys);
                cache.asMap().putAll(keyValueMap);
            } catch (Exception e) {
                // log
            }

            final long nextSleepInterval = 1000 * 60;
            try {
                Thread.sleep(nextSleepInterval);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}
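Unlike getAll, putAll unconditionally overwrites whatever is cached, and since Guava's asMap() view is a ConcurrentMap, a keySet().retainAll(...) on it can also evict keys that no longer exist in the db. Here is a stdlib-only sketch of one refresh pass; PutAllRefreshSketch and refreshOnce are hypothetical names, and a plain ConcurrentHashMap stands in for cache.asMap().

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of one Approach-2 refresh pass, with a plain
// ConcurrentHashMap standing in for cache.asMap().
public class PutAllRefreshSketch {
    public static ConcurrentHashMap<String, String> cacheView = new ConcurrentHashMap<>();

    // freshFromDb stands in for the result of db.bulkGetItems(keys).
    public static void refreshOnce(Map<String, String> freshFromDb) {
        cacheView.putAll(freshFromDb);                       // overwrite stale values
        cacheView.keySet().retainAll(freshFromDb.keySet());  // drop keys gone from the db
    }

    public static void main(String[] args) {
        cacheView.put("a", "stale-a");
        cacheView.put("gone", "stale-gone");                 // no longer in the db
        refreshOnce(Map.of("a", "fresh-a", "b", "fresh-b"));
        System.out.println(cacheView.get("a"));              // fresh-a
        System.out.println(cacheView.containsKey("gone"));   // false
    }
}
```

The retainAll step is one way to address removing items that have been deleted from the db, which per-key refresh cannot do on its own.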

From https://github.com/google/guava/wiki/CachesExplained#inserted-directly it sounds like loading values through getAll() is preferable to inserting them directly via asMap().putAll(), so approach 1 is preferred, I guess?

  • Could you give an example, please? The loading cache should be effective only without any `bulk` operations, i.e. the cache will throw away least-recently-used/expired items automatically. Rules for evicting such items can be set using the CacheBuilder fluent interface, e.g. `CacheBuilder.newBuilder().maximumSize(1000).refreshAfterWrite(1, TimeUnit.MINUTES).build(...);` – Victor Gubin Apr 04 '18 at 15:51
  • I have overridden reload() with an async call to load a single item from the db, but that is still a single-item call. I have an API that can bulk-load from the db and I want to take advantage of it, so I can reload the whole cache with one call. I know there is no bulk refresh, so I'm thinking of periodically calling getAll(), which will call loadAll(), which calls the bulk API. – simplygaogao Apr 04 '18 at 16:08
  • OK, since Guava Cache is [`similar to ConcurrentMap`](https://github.com/google/guava/wiki/CachesExplained), I think it will not affect the get. But you don't need the `expireAfterAccess(30, TimeUnit.MINUTES)` in your case; it simply has no effect in your code. – Victor Gubin Apr 04 '18 at 16:36
  • Thank you, I just updated the post with the approaches I have in mind – simplygaogao Apr 04 '18 at 16:39
  • I do want eviction rules, as some items don't exist in the db anymore, so I want to remove them from the cache too. – simplygaogao Apr 04 '18 at 16:41

0 Answers