
I'm running a memcached service on my Windows system, and I've configured my dev settings with the following cache settings:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
        'TIMEOUT': 3600,
        'OPTIONS': {
            'MAX_ENTRIES': 100
        }
    }
}

And I'm setting and getting the content in/from cache using the following code:

import pickle
from logging import info  # assuming `info` is a module-level logging helper

from django.core.cache import cache

def get_element(fsid):
    element = cache.get(str(fsid))  # get element from memcached if present
    if element is None:
        info("cache miss for fsid-" + str(fsid))
        fs = FlowStatusModel.objects.get(pk=fsid)
        pickle_path = fs.pickle
        gcs_pickle_path = fs.gcs_pickle
        try:
            info("reading from local disk")
            with open(pickle_path, 'rb') as handle:
                element = pickle.load(handle)
        except (OSError, pickle.UnpicklingError):
            info("local disk failed. copying file from cloud storage bucket to local disk")
            create_and_copy(gcs_pickle_path, pickle_path)
            # read the freshly copied local file, not the GCS path
            with open(pickle_path, 'rb') as handle:
                element = pickle.load(handle)
        info("adding fsid-" + str(fsid) + " to cache with 1 hour timeout")
        cache.set(str(fsid), element, 3600)
    return element
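
The read-through pattern above can be sketched in isolation with a plain dict standing in for memcached; `load_from_disk` is a hypothetical stand-in for the model lookup and pickle load:

```python
# Minimal read-through cache sketch: a dict stands in for memcached,
# and load_from_disk() stands in for the model lookup + unpickling.
_cache = {}

def load_from_disk(fsid):
    # hypothetical slow path; real code would unpickle from disk or GCS
    return {"fsid": fsid, "payload": "heavy object"}

def get_element(fsid):
    key = str(fsid)
    element = _cache.get(key)            # fast path: cache hit
    if element is None:
        element = load_from_disk(fsid)   # slow path: cache miss
        _cache[key] = element            # populate for the next call
    return element

first = get_element(42)    # miss: loads from "disk"
second = get_element(42)   # hit: served from the dict
print(first is second)     # True: both calls return the same cached object
```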

I see in the log that there is always a cache miss, and I could not figure out whether Django was able to set the element in the cache. If I try a simple set and get from python manage.py shell, I'm able to get the data back. I ran memcached.exe -vv from the command prompt to see if it's receiving the requests, but in neither scenario (dev server or manage.py shell) do I see any set or get information printed on the console. I appreciate any help in figuring out the issue.

Jo Kachikaran

1 Answer


Probably because of this:

    'OPTIONS': {
        'MAX_ENTRIES': 100
    }

That's a very small number, so you are probably seeing a great deal of thrashing: each time you add something to the cache, it's very likely evicting something else. That's probably what's happening to your fsid key.

It is also possible that you have run afoul of memcached's maximum item size, which is 1 MB by default. You can raise it by starting memcached with the -I flag, for example:

memcached.exe -I 12m

However, it should be noted that if you are storing objects that large, memcached may not be ideal for you.
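
A quick way to check whether an object will fit is to measure its pickled size before caching. This sketch (plain Python; `element` is a hypothetical stand-in for the object being cached) compares it against memcached's 1 MB default limit:

```python
import pickle

MEMCACHED_DEFAULT_LIMIT = 1024 * 1024  # memcached's default max item size: 1 MB

# hypothetical stand-in for the object you want to cache
element = {"rows": list(range(100_000))}

payload = pickle.dumps(element, protocol=pickle.HIGHEST_PROTOCOL)
size = len(payload)
print("pickled size: %d bytes" % size)

if size > MEMCACHED_DEFAULT_LIMIT:
    print("too large for the default item size; raise -I or use another store")
else:
    print("fits within the default 1 MB limit")
```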

e4c5
  • I'm hardly inserting 5 entries as of now. Do you think that is still a problem? – Jo Kachikaran Jun 27 '16 at 02:04
  • How about using memcached tools to confirm that this the case? It will show you how many items the cache actually contains – e4c5 Jun 27 '16 at 02:08
  • I did `telnet 127.0.0.1 11211` and I see that `STAT cmd_set` is not getting changed when memcached is hit by django server however it is increasing when I try to set from `python manage.py shell`. I think there is something wrong with the Django server. Do you see any issues in my current configuration? By the way I did not add any cache middleware classes to the `MIDDLEWARE_CLASSES` field in the settings file. – Jo Kachikaran Jun 27 '16 at 02:27
  • Is there any limit on the size of the object I could set as a value? The pickle that I'm trying to set as the value is around 10MB on disk. – Jo Kachikaran Jun 27 '16 at 02:30
  • Looks like the limit is 1MB for the values. I'll see what I can do to change that or I'll see if redis fits my needs. Thanks for your time. – Jo Kachikaran Jun 27 '16 at 02:44
  • Nope. I tried setting the item size to 64mb but it still did not work. Perhaps the pickle file when unpickled is taking much more than that, and it seems the maximum value I could set is only 128mb. In my case it is common to have pickle files ranging from 10mb to 100mb, so apparently memcached does not fit my needs. I'm using cachetools to cache these pickles, but the downside is that with my apache server I have to set the number of processes to 1, otherwise the object will be read from disk for every process. – Jo Kachikaran Jun 29 '16 at 01:35
  • Yes, an unpickled file takes up a lot more memory than it takes on disk. What are you doing with these pickles anyway? Shall we discuss that in another question? – e4c5 Jun 29 '16 at 01:42
  • [Here](http://stackoverflow.com/questions/38089482/caching-python-objects-with-apache-server) I posted another question explaining my scenario. – Jo Kachikaran Jun 29 '16 at 02:44
  • Great, will have a look – e4c5 Jun 29 '16 at 03:14