
I have a Redis server that I query in almost every Django view to fetch some cached data. I've read through a few Stack Overflow questions and learned that making a new Redis connection via r = redis.StrictRedis(host='localhost', port=6379, db=0) for every single web request is bad, and that I should be using connection pooling.

Here is the approach I came up with for connection pooling in Django:

In settings.py, so I can pull it up easily in any Django view, since it acts like a global variable:

# Redis Settings
import redis
REDIS_CONN_POOL_1 = redis.ConnectionPool(host='localhost', port=6379, db=0)

In some views.py:

import redis
from django.conf import settings

REDIS_CONN_POOL_1 = settings.REDIS_CONN_POOL_1
r = redis.Redis(connection_pool=REDIS_CONN_POOL_1)
r.get("foobar")  # Whatever operation

So, my question is: Is this the right way to do connection pooling in Django? Is there a better approach used by those of you who have dealt with a similar scenario? Either way, this should be better than my old approach of opening and closing a new connection to Redis on every request.

EDIT: My understanding of why it's wrong to open a new connection on every request comes from this Stack Overflow question.

user1757703

1 Answer


A better approach would be to set up Redis as your Django cache backend with the Django Redis cache app. It gives you a ready-made solution for your problem, and you can use Django's official cache API to reach Redis whenever you want to get or set cached data. You also avoid compatibility issues in your application if you later decide to switch your cache backend to something else.

Here's an easy to follow tutorial:

Using Redis as Django's session store and cache backend
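To make this concrete, here is a minimal sketch of what the settings and view code might look like with the django-redis-cache app the tutorial uses (the backend path matches that app; the key, timeout, and helper function are illustrative):

```python
# settings.py -- assumes `pip install django-redis-cache`
CACHES = {
    'default': {
        'BACKEND': 'redis_cache.RedisCache',
        'LOCATION': 'localhost:6379',
        'OPTIONS': {'DB': 0},
    },
}

# views.py -- Django's generic cache API; no direct redis handling needed
from django.core.cache import cache

def my_view(request):
    value = cache.get('foobar')
    if value is None:
        value = expensive_lookup()              # hypothetical helper
        cache.set('foobar', value, timeout=300) # cache for 5 minutes
    ...
```

The cache backend manages its own connection pooling internally, which is exactly what the question was trying to set up by hand.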

martintrapp
  • This seems like a good solution for one DB. Any idea how to specify multiple DBs or switch between DBs when performing operations via `from django.core.cache import cache`? From what I've gathered reading Django redis cache docs, we can only specify a single DB. – user1757703 Dec 13 '14 at 08:59
  • Well, the number of DBs in Redis is limited (16 by default). It'd be much easier to use key prefixes. You can even write your own helper methods to reach them without much mess. Why do you need multiple DBs, by the way? – martintrapp Dec 13 '14 at 11:31
  • I need multiple DBs because I organize my data into the multiple DBs based on the type of data it is. – user1757703 Dec 13 '14 at 17:55
  • You can set up multiple caches in Django's `CACHES`, and use a different DB number, see http://django-redis-cache.readthedocs.io/en/latest/advanced_configuration.html#database-number – Tino Dec 09 '16 at 11:02
  • Can you use mset, mget, lpush etc etc with Django Redis Cache? – ignabe Mar 07 '17 at 15:58
  • The problem with the cache backend is that it uses pickle. Which means you can't share your cache, you get upgrade issues, and you increase your attack surface. – Bite code Feb 21 '18 at 16:49
  • @nachopro I don't think so, because Django must stay compatible with N cache servers, so they kept the API very simple. – EralpB Jun 03 '19 at 12:56
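Following up on the multiple-DB discussion above: Django's CACHES setting can hold several named caches, each pointing at a different Redis DB number, which you then reach through django.core.cache.caches. A sketch, with made-up alias names:

```python
# settings.py -- the aliases 'default' and 'analytics' are illustrative
CACHES = {
    'default': {
        'BACKEND': 'redis_cache.RedisCache',
        'LOCATION': 'localhost:6379',
        'OPTIONS': {'DB': 0},
    },
    'analytics': {
        'BACKEND': 'redis_cache.RedisCache',
        'LOCATION': 'localhost:6379',
        'OPTIONS': {'DB': 1},
    },
}

# anywhere in the app:
from django.core.cache import caches

analytics_cache = caches['analytics']      # backed by Redis DB 1
analytics_cache.set('pageviews:42', 1000)
```

This keeps the per-data-type separation the asker wanted while still going through Django's cache API rather than raw Redis clients.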