
I'm thinking about implementing the first layer of caching in my Android app. I was considering SoftReferences as a sure way to avoid OOM exceptions, but since there are many articles about how Android frees these up "too soon", I decided to look into android.util.LruCache instead.

Question: How do I size it properly for the actual device? It all sounds very nice that an LRU cache is the real solution rather than SoftReferences, but if you're really keen to avoid OOM exceptions, committing any number of megabytes of hard references feels extremely unsafe. It's just unsafe if you ask me; anyway, this seems to be the only option. I was looking into getMemoryClass to find out the app's heap limit on the actual device (plus checking the free heap size before sizing the cache). The baseline is 16 MB, which sounds OK, but I've seen devices (the G1, for example, back in the day) throw OOM exceptions at around 5 MB of heap usage (according to Eclipse MAT). I know the G1 is very old, but the point is that my experience doesn't really align with the 16 MB baseline the documentation mentions. Therefore I'm completely uncertain how I should size an LRU cache if I want the largest one I can reasonably get (I'd be happy with 8 MB and would go as small as 1 MB on a low-spec device).
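Roughly the sizing strategy I have in mind, as a sketch (the CacheSizer name, the heap/8 fraction and the 1-8 MB clamp are just my own placeholders, not anything from the docs):

import android.app.ActivityManager;
import android.content.Context;

public final class CacheSizer {
    // My own arbitrary bounds: at least 1 MB, at most 8 MB.
    private static final int MIN_BYTES = 1 * 1024 * 1024;
    private static final int MAX_BYTES = 8 * 1024 * 1024;

    /** Picks a cache size as a fraction of the per-app heap, clamped to 1-8 MB. */
    public static int cacheSizeBytes(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        int heapBytes = am.getMemoryClass() * 1024 * 1024; // heap limit reported in MB, converted to bytes
        return Math.min(MAX_BYTES, Math.max(MIN_BYTES, heapBytes / 8));
    }
}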

Thanks for any hints.

Edit: The Android LRU cache class I'm referring to: http://developer.android.com/reference/android/util/LruCache.html


2 Answers


I think a valid way to calculate the LruCache size is outlined in the dev guide:

// Per-app heap limit for this device, in megabytes.
int memClass = ((ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE)).getMemoryClass();
// Use one eighth of the heap for the cache, converted to bytes.
int cacheSize = 1024 * 1024 * memClass / 8;

More information can be found here: http://developer.android.com/training/displaying-bitmaps/cache-bitmap.html
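To tie that to the cache itself, a rough sketch of constructing the LruCache with the cacheSize computed above (sizeOf returns bytes here so it matches the byte-based maxSize; bitmapCache and someBitmap are just placeholder names, and android.util.LruCache / android.graphics.Bitmap imports are assumed):

LruCache<String, Bitmap> bitmapCache = new LruCache<String, Bitmap>(cacheSize) {
    @Override
    protected int sizeOf(String key, Bitmap value) {
        // Return the entry size in the same unit as cacheSize, i.e. bytes.
        return value.getRowBytes() * value.getHeight();
    }
};

// Usage: someBitmap is a placeholder for a decoded thumbnail.
bitmapCache.put("thumb:42", someBitmap);
Bitmap cached = bitmapCache.get("thumb:42");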

Moritz
  • Yeah, I ended up using the memory class in a similar way. I still find it too much guesswork, but I didn't come across a more exact method. – user289463 Jun 07 '12 at 14:17
  • Make sure that sizeOf also returns the size in bytes. In the link above they returned bitmap.getByteCount() / 1024 (in kilobytes! This works of course, but then your cacheSize should be 1024 * memClass / 8). – DominicM Feb 17 '15 at 20:53
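For illustration, the kilobyte-based variant DominicM describes would look roughly like this (reusing memClass from the answer above; both maxSize and sizeOf are in KB):

int cacheSizeKb = 1024 * memClass / 8; // cache budget in kilobytes

LruCache<String, Bitmap> cache = new LruCache<String, Bitmap>(cacheSizeKb) {
    @Override
    protected int sizeOf(String key, Bitmap value) {
        return value.getByteCount() / 1024; // entry size in kilobytes as well
    }
};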

From your question it's a bit hard to understand what you are asking, but let me give it a shot.

Various caching products (AppFabric, memcached, NCache and ScaleOut) have a 1 MB per-object limit. I think ScaleOut does offer some kind of customization.

But all of these are server-side products. For an Android device, which will most probably host only a single local cache, I would probably go with a max of 64 KB per object. I mean, why would anyone need more than 64 KB per object on a device? Just my guess.
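android.util.LruCache has no built-in per-object limit, so if you wanted to enforce a cap like that, one rough sketch (putIfSmallEnough is just a hypothetical helper, and 64 KB is only my guess from above) would be to skip oversized entries:

private static final int MAX_ENTRY_BYTES = 64 * 1024; // the 64 KB ceiling suggested above

static void putIfSmallEnough(LruCache<String, Bitmap> cache, String key, Bitmap bitmap) {
    int bytes = bitmap.getRowBytes() * bitmap.getHeight();
    if (bytes <= MAX_ENTRY_BYTES) {
        cache.put(key, bitmap);
    }
    // Anything larger is simply not cached in memory.
}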

If I were you, I would study memcached (the most famous open-source caching solution), and maybe ScaleOut, since it's easy to get a hello-world working with ScaleOut too, and decide proportionally.

Siddharth
  • Hi Siddharth, on Android a very typical purpose of caching (memory & storage) is caching lists and thumbnails. One thumbnail can easily be 20 KB in itself, and you would display 40-50 thumbnails in a grid. I'm talking about megabytes. The question is more about finding the proper size from the Android-specific perspective. It doesn't relate to server-side solutions: you know your server's hardware, but you don't know all the 700 different hardware configurations on which your Android app will run. – user289463 Feb 23 '12 at 14:39
  • So if I am not mistaken, Android does not have secondary storage that is slower than memory, right? It's all the same memory, primary and secondary. So why would anyone want an LRU cache? I wonder. – Siddharth Feb 24 '12 at 08:25
  • There is the RAM which I'm going to use with the LRU cache. This is the fastest. Then there is Internal Storage which is the next fastest. There is also External Storage which can be anything, even a slow SD Card. I'm talking about RAM here as my L1 cache. – user289463 Feb 24 '12 at 11:52
  • Do you know the speed difference between the fastest (RAM) and the internal storage that is next fastest? The SD card is not good; maybe you can create an LRU cache on the SD card for old objects? A mobile app may be active for no more than 30 mins, I guess, like the Facebook app. So maybe take some scenarios that you want to support to start with, or the scenario that you want to demonstrate with, and then go from there. That is one way to narrow down the sizing, I guess. – Siddharth Feb 24 '12 at 12:03
  • I'm 100% sure I need to use RAM. My question is specific to the usage of the Android LRU cache (android.util.LruCache); I need to find the right strategy for sizing it. There are over 700 different hardware configurations on the market, so this is not a trivial question. – user289463 Feb 24 '12 at 12:07
  • It isn't, but you need to start somewhere. All products/features start with scenarios in mind, and you can take that approach too. Start with a scenario that you hope to support for most of your consumers, like Facebook, or a game with 5 screens and 30 moving characters. Size that up and set your defaults, then allow configuration at both load time and runtime. Can you talk a bit about the scenarios you hope to support? – Siddharth Feb 24 '12 at 12:12