
I want to use a Java collection (a list, a map, etc.) to cache some data so that I can check this cache instead of going directly to the database. My only worry is the collection size: I want the cache to hold, say, only 1000 entries; once that count is reached, I want to remove the oldest entry before putting in a new one. Is this possible?

trincot
Wael
  • Yes, but how would this speed up processing? – Makoto Aug 14 '12 at 12:49
  • instead of querying the database to get the id of an object, I check the map first to see if I searched for this same item in the database before. – Wael Aug 14 '12 at 12:51
  • If it's a small number of entries (as in less than 10,000), you wouldn't want to do unnecessary optimization. Databases are pretty quick about getting results for you without the need to buffer them, thus using up more memory. – Makoto Aug 14 '12 at 12:53
  • @Makoto Maybe it is an ultra-low latency financial application. Then a database lookup is a no-go. But we don't know that... – maba Aug 14 '12 at 13:38
  • You may want to read this: [Soft reference LinkedHashMap in Java?](http://stackoverflow.com/questions/862648/soft-reference-linkedhashmap-in-java) – maba Aug 14 '12 at 13:54

5 Answers

7

You should have a look at LinkedHashMap. If you override removeEldestEntry, you can control when the oldest entry in the map gets removed (the check happens whenever put or putAll is called).
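A minimal sketch of this approach (the limit of 1000 comes from the question; the helper name newCache and the demo values are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LruCacheDemo {

    public static <K, V> Map<K, V> newCache(int maxEntries) {
        // accessOrder = true gives least-recently-used eviction;
        // pass false (or use the default constructor) for pure insertion order.
        return new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                // Returning true tells the map to drop the eldest entry.
                return size() > maxEntries;
            }
        };
    }

    public static void main(String[] args) {
        // For the question's use case this would be newCache(1000);
        // a capacity of 2 makes the eviction visible here.
        Map<Integer, String> cache = newCache(2);
        cache.put(1, "a");
        cache.put(2, "b");
        cache.put(3, "c"); // evicts key 1, the eldest entry
        System.out.println(cache.containsKey(1)); // false
        System.out.println(cache.keySet());       // [2, 3]
    }
}
```

Note that the returned map is not thread-safe; wrap it with Collections.synchronizedMap if several threads share the cache.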

Mathias Schwarz
3

You can use the caching utilities provided by Google Guava: http://code.google.com/p/guava-libraries/wiki/CachesExplained
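A short sketch of what that looks like, assuming Guava is on the classpath (the key and the fallback value are made up; maximumSize(1000) matches the question's limit):

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class GuavaCacheDemo {
    public static void main(String[] args) {
        Cache<String, Long> idCache = CacheBuilder.newBuilder()
                .maximumSize(1000) // entries beyond this size are evicted
                .build();

        Long id = idCache.getIfPresent("some-key"); // null on a cache miss
        if (id == null) {
            id = 42L; // in real code: fetch the id from the database
            idCache.put("some-key", id);
        }
        System.out.println(id);
    }
}
```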

Ionuț G. Stan
1

Depending on the 'weight' of each cached object, there are multiple variants. Select whichever fits your use case best:

  • Fixed size cache (can be implemented using a Collection and tracking its size). This works well if the objects are reasonably small and the memory footprint can be well estimated ahead of time. The other answers basically illustrate ways to implement this type.

  • Dynamic cache with automatic eviction through the garbage collector. This works well if the objects to be cached are big (or of wildly varying size, e.g. files or images) and you want to use as much heap as is available for the cache. The cache manages a collection of java.lang.SoftReference to keep the objects alive. The garbage collector will reclaim cached objects when it needs memory (by clearing the reference). A disadvantage of this approach is that you have no control over eviction: the GC decides when and which objects are evicted.

  • Combination of both, (small) fixed size cache for most recent hits, dynamic GC'd one for second level.

None of these will cause OutOfMemoryErrors when configured appropriately.
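The second variant can be sketched with java.lang.ref.SoftReference; the class name SoftCache and its shape are illustrative, not a standard API:

```java
import java.lang.ref.SoftReference;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative GC-managed cache: entries stay alive only until the
// garbage collector needs memory and clears their soft references.
public class SoftCache<K, V> {
    private final Map<K, SoftReference<V>> map = new ConcurrentHashMap<>();

    public void put(K key, V value) {
        map.put(key, new SoftReference<>(value));
    }

    public V get(K key) {
        SoftReference<V> ref = map.get(key);
        V value = (ref == null) ? null : ref.get();
        if (value == null) {
            map.remove(key); // drop entries whose referent was reclaimed
        }
        return value;
    }
}
```

As the answer notes, you cannot control when the GC clears these references; a get may return null for an entry you put earlier.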

Durandal
  • And here is a discussion on that topic: [Soft reference LinkedHashMap in Java?](http://stackoverflow.com/questions/862648/soft-reference-linkedhashmap-in-java) – maba Aug 14 '12 at 13:53
0

Java offers an interface called Queue and several implementations of this interface.

You can see which one is the best choice for your problem. Take a look:

http://docs.oracle.com/javase/1.5.0/docs/api/java/util/Queue.html

http://docs.oracle.com/javase/tutorial/collections/implementations/queue.html
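One way a Queue could back a fixed-size cache is to pair it with a map: the queue remembers insertion order, the map holds the data. The class name FifoCache and this layout are one possible sketch, not part of the JDK:

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

// Illustrative FIFO cache: once maxEntries keys are stored,
// adding a new key evicts the oldest one first.
public class FifoCache<K, V> {
    private final int maxEntries;
    private final Queue<K> order = new ArrayDeque<>();
    private final Map<K, V> map = new HashMap<>();

    public FifoCache(int maxEntries) {
        this.maxEntries = maxEntries;
    }

    public void put(K key, V value) {
        if (!map.containsKey(key)) {
            if (order.size() >= maxEntries) {
                map.remove(order.poll()); // evict the oldest entry
            }
            order.add(key);
        }
        map.put(key, value); // updating an existing key keeps its position
    }

    public V get(K key) {
        return map.get(key);
    }
}
```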

Renato Lochetti
0

Apache Commons has a circular FIFO buffer (CircularFifoBuffer). I guess that is what you are looking for.

From its docs:

CircularFifoBuffer is a first in first out buffer with a fixed size that replaces its oldest element if full.

Or else you can create your own class extending AbstractQueue from the Java library.

Ravi Bhatt