
In my app I have several list views that contain thumbnails. Today I'm starting the refactoring and I want to implement LRU caching. I'm following the Android guidelines, but I'm wondering whether it is better to initialize only one LRU cache for the entire app, or one LRU cache for each list view. I'm afraid of OutOfMemoryError. So I have the following questions that I can't answer by myself:

  • Is one LRU cache initialized with the singleton pattern a good idea?

  • If memory is low, can the following initialization of the LRU cache lead to an OutOfMemoryError?

@Override
protected void onCreate(Bundle savedInstanceState) {
    ...
    // Get max available VM memory, exceeding this amount will throw an
    // OutOfMemory exception. Stored in kilobytes as LruCache takes an
    // int in its constructor.
    final int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);

    // Use 1/8th of the available memory for this memory cache.
    final int cacheSize = maxMemory / 8;

    mMemoryCache = new LruCache<String, Bitmap>(cacheSize) {
        @Override
        protected int sizeOf(String key, Bitmap bitmap) {
            // The cache size will be measured in kilobytes rather than
            // number of items.
            return bitmap.getByteCount() / 1024;
        }
    };
    ...
}
  • If memory is low, is the LRU cache automatically released? I'm wondering whether the app will have problems releasing memory when I use an LRU cache (could the app crash because of an OutOfMemoryError?)

  • Can a single LRU cache for the entire app be a problem?

  • Can more than one LRU cache in the same app be a problem?
aeroxr1

1 Answer

I had a lot of problems with this, so I can give you some helpful answers here.

If memory is low, is the LRU cache automatically released? I'm wondering whether the app will have problems releasing memory if I use an LRU cache

The LRU Cache will not automatically release memory. You will need to evict the entries programmatically.
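For example, assuming the mMemoryCache field and cacheSize value from your onCreate() above, eviction has to be triggered explicitly along these lines (just a sketch; imageKey is a hypothetical key):

// Drop a single entry, e.g. when a thumbnail is known to be stale:
mMemoryCache.remove(imageKey);

// Shrink the cache to half its budget when you detect memory pressure:
mMemoryCache.trimToSize(cacheSize / 2);

// Or throw everything away:
mMemoryCache.evictAll();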

Can a single LRU cache for the entire app be a problem?

Can more than one LRU cache in the same app be a problem?

The LruCache class is a generic class with a type for the key and a type for the value. I would say you want to have one LRU Cache for each object type that you are caching. You are doing what I did: caching Bitmaps and keying with Strings.
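As for the singleton part of your question: one common way to share a single Bitmap cache across all your list views is a small singleton wrapper around LruCache. This is only a sketch of the pattern (the class and method names are mine, not from your code):

import android.graphics.Bitmap;
import android.util.LruCache;

// One process-wide cache per cached type; here only Bitmaps keyed by String.
public final class ThumbnailCache {

    private static ThumbnailCache sInstance;

    private final LruCache<String, Bitmap> mBitmaps;

    private ThumbnailCache() {
        // Same budget as your onCreate(): 1/8th of the max heap, measured in KB.
        final int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);
        mBitmaps = new LruCache<String, Bitmap>(maxMemory / 8) {
            @Override
            protected int sizeOf(String key, Bitmap bitmap) {
                return bitmap.getByteCount() / 1024;
            }
        };
    }

    public static synchronized ThumbnailCache getInstance() {
        if (sInstance == null) {
            sInstance = new ThumbnailCache();
        }
        return sInstance;
    }

    public Bitmap get(String key) {
        return mBitmaps.get(key);
    }

    public void put(String key, Bitmap bitmap) {
        mBitmaps.put(key, bitmap);
    }

    public void clear() {
        mBitmaps.evictAll();
    }
}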

Just a side note: Be careful with using bitmap.getByteCount(). The JavaDocs say this:

As of KITKAT, the result of this method can no longer be used to determine memory usage of a bitmap. See getAllocationByteCount().
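So on KitKat and later, a version-checked sizeOf() along these lines avoids under-counting reused bitmaps (again just a sketch, still measuring the cache in kilobytes; it needs android.os.Build):

@Override
protected int sizeOf(String key, Bitmap bitmap) {
    // A reused bitmap can occupy more memory than getByteCount() reports,
    // so prefer getAllocationByteCount() where it exists (API 19+).
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
        return bitmap.getAllocationByteCount() / 1024;
    }
    return bitmap.getByteCount() / 1024;
}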

I was using an LRU Cache for Bitmaps. I thought it would be sufficient to override Application.onTrimMemory() and clear the cache whenever this method was called.
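That approach looked roughly like this (a sketch; MyApplication and the ThumbnailCache singleton sketched above are my names, not code from my actual app):

import android.app.Application;

public class MyApplication extends Application {
    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        // Clear the whole cache whenever the system reports memory pressure.
        ThumbnailCache.getInstance().clear();
    }
}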

Also, I did the same thing you did and set a cache size based on some percentage of heap memory available to the app.

But here's what would happen: when my app was in a low-memory state and tried to download an image without enough free memory, the GC would run, but BitmapFactory would still be unable to allocate memory for the Bitmap. The logs showed that the onTrimMemory() method was being called asynchronously, sometimes almost a full second after the OutOfMemoryError was thrown!

HEY GOOGLE: IF THE SYSTEM CAN'T TELL ME I'M LOW ON MEMORY UNTIL AFTER OutOfMemoryError IS THROWN, HOW IN THE HELL AM I SUPPOSED TO MANAGE MY MEMORY?

Insanity. Sheer, utter insanity.

Here's what I ended up doing: I would catch OutOfMemoryError around the bitmap decode, clear the cache there, then retry the image request to the server. The exact thing they tell you not to do. But it ended up fixing my problem. The app is a lot more stable now.
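In outline, that workaround looks something like this (a sketch only; decodeWithRetry() and the two-attempt loop are hypothetical names, and my real code retried the full server request rather than just the decode):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

final class BitmapDecoder {
    // Last-resort recovery: if the decode blows the heap, dump the cache and retry.
    static Bitmap decodeWithRetry(byte[] data) {
        for (int attempt = 0; attempt < 2; attempt++) {
            try {
                return BitmapFactory.decodeByteArray(data, 0, data.length);
            } catch (OutOfMemoryError e) {
                ThumbnailCache.getInstance().clear();  // free whatever we can
            }
        }
        return null;  // caller has to cope with a missing thumbnail
    }
}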

So after you implement your LRU Cache, make sure you stress test your app; try to get it into a low memory situation and see how it behaves. For me it worked best when using an emulator. The emulator would have a small 96M heap limit, but if it had a high screen resolution, the image resources would scale to be pretty big, which made pushing the memory to the max fairly easy.

If you are displaying thumbnails, but you are getting images from a server and they might be larger than your ImageView, make sure you read this article: Loading Large Bitmaps Efficiently | Android Developers to learn how to download bitmaps of an appropriate size without wasting memory.
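The core of that article is decoding only the image bounds first, then re-decoding with an inSampleSize that roughly matches the target ImageView. A condensed sketch of that pattern (the names are mine and the target sizes are placeholders):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

final class SampledDecoder {
    // Decode a byte array to a bitmap no larger than roughly reqWidth x reqHeight.
    static Bitmap decodeSampled(byte[] data, int reqWidth, int reqHeight) {
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inJustDecodeBounds = true;            // read dimensions only
        BitmapFactory.decodeByteArray(data, 0, data.length, options);

        int inSampleSize = 1;
        while (options.outWidth / (inSampleSize * 2) >= reqWidth
                && options.outHeight / (inSampleSize * 2) >= reqHeight) {
            inSampleSize *= 2;                        // powers of two decode cheapest
        }

        options.inSampleSize = inSampleSize;
        options.inJustDecodeBounds = false;           // now decode the pixels
        return BitmapFactory.decodeByteArray(data, 0, data.length, options);
    }
}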

You might also experiment with just letting the system do the caching and set up an HttpResponseCache.
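Installing it is a one-off call, typically from your Application class. A sketch, assuming the MyApplication class from earlier; the 10 MB size and the "http" directory name are arbitrary choices:

import android.app.Application;
import android.net.http.HttpResponseCache;
import android.util.Log;

import java.io.File;
import java.io.IOException;

public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        try {
            File cacheDir = new File(getCacheDir(), "http");
            HttpResponseCache.install(cacheDir, 10L * 1024 * 1024);  // 10 MB on disk
        } catch (IOException e) {
            // The app still works without the cache; just log and move on.
            Log.w("Cache", "HTTP response cache installation failed", e);
        }
    }
}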

Whatever you end up doing, make sure to stress your app and see how it behaves when there's not much heap left.

And be prepared to deal with a little frustration.

kris larson
  • Hi! At the moment I'm initializing a WeakHashMap in the ListView constructor, and I put the thumbnails in that map. I was thinking it was better to use an LRU cache, but after reading your answer I am a bit scared.. – aeroxr1 Sep 17 '16 at 07:36
  • I just looked at `WeakHashMap`. I didn't realize that it's the *keys* that have the weak reference in that class, not the values. For that reason, I think you might want to make that map something like `Map<String, WeakReference<Bitmap>>`. I'm sort of wishing now that I had done that to begin with. Since the `LruCache` class doesn't have any implicit ties to the GC, it's pretty much useless. So you may be on the right track. Just test the crap out of it, and if it behaves acceptably, declare victory and call it a day. – kris larson Sep 17 '16 at 13:44
  • In the WeakHashMap is it the keys that have the weak reference??? Really? I didn't know that! – aeroxr1 Sep 17 '16 at 13:49
  • That's why I looked it up. I can't remember using it before, so I go to the JavaDocs to check my assumptions. This kind of thing has happened to me so many times that checking the docs is my instinctive first reaction. – kris larson Sep 17 '16 at 14:00
  • `WeakHashMap` would be good to use in a case where the life cycle of the values is dependent on the life cycle of the keys. However, that's not really the situation we have in this case. We don't want to keep external references to the keys, so the map would need strong references to them. I think `Map<String, WeakReference<Bitmap>>` is the way to go. I'm seriously thinking about rewriting this part of my app now and ditching `LruCache`. – kris larson Sep 17 '16 at 14:14
  • The problem with "our" idea is that, theoretically, the size of our cache is unlimited, unlike an LRU cache. – aeroxr1 Sep 17 '16 at 14:29
  • I think I can fix that. I'll start with `LinkedHashMap`, which already has some LRU capability: http://chriswu.me/blog/a-lru-cache-in-10-lines-of-java/ (see the sketch after these comments). Now I just have to figure out how to know exactly when the GC is going to clear the weak reference so I can update the current cache size. I'm going to try this in my app. If I get some good working code, I'll update the answer so you can see what I came up with. – kris larson Sep 18 '16 at 09:18
  • Wonderful :D But can I ask why you don't want to use the LRU cache anymore? – aeroxr1 Sep 18 '16 at 09:23
  • Because it's not integrated with the GC. My problem was that I used `Application.onTrimMemory()` to tell me when to clear my cache, but when you don't even have enough heap to allocate a bitmap, `onTrimMemory` doesn't get called until *after* the `OutOfMemoryError` is thrown (and caught). In the logs I can see the GC running trying to clear some memory for the bitmap before the OOM, so I know if my cache works with the GC and doesn't wait for `Application.onTrimMemory()`, it'll do the right thing. Much cleaner than catching `OutOfMemoryError`. – kris larson Sep 18 '16 at 09:37
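For reference, a rough sketch of the direction discussed in the last few comments: a `LinkedHashMap` in access order that evicts its eldest entry, holding values behind `WeakReference`s so the GC can reclaim bitmaps on its own. The class name is made up, and the size limit counts entries rather than kilobytes, which is one of the details still to be worked out:

import java.lang.ref.WeakReference;
import java.util.LinkedHashMap;
import java.util.Map;

class WeakLruMap<K, V> extends LinkedHashMap<K, WeakReference<V>> {

    private final int maxEntries;

    WeakLruMap(int maxEntries) {
        super(16, 0.75f, true);   // accessOrder = true gives LRU iteration order
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, WeakReference<V>> eldest) {
        // Called by put(); dropping the eldest entry caps the map at maxEntries.
        return size() > maxEntries;
    }

    V getValue(K key) {
        WeakReference<V> ref = get(key);
        return ref != null ? ref.get() : null;   // null if the GC already cleared it
    }
}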