
My project works as follows: every 4 hours I download data from a source, deserialise it and store it in the database. At the same time, clients retrieve this data from my database nearly around the clock.

I would like to have no downtime while updating the database. I asked a similar question before (Django use two databases for seamless updates), but I've since learned about caching and now think that two databases is real over-engineering.

What is the best option in Django for caching the database and/or the requests (I use django-rest-framework for the API), for example something like the per-view cache sketched below?

I don't think this question is opinion-based at all; I'm just looking for the correct way to do this in Django.
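To make the caching part concrete, this rough sketch shows the kind of per-view caching I have in mind; `Item`, `ItemSerializer` and the 4-hour timeout are placeholders from my setup, not working code:

```python
# views.py -- rough sketch of what I mean by "request caching".
# Item and ItemSerializer are placeholders for my actual models/serializers.
from django.utils.decorators import method_decorator
from django.views.decorators.cache import cache_page
from rest_framework import viewsets

from .models import Item
from .serializers import ItemSerializer


class ItemViewSet(viewsets.ReadOnlyModelViewSet):
    queryset = Item.objects.all()
    serializer_class = ItemSerializer

    # Cache the list response until roughly the next 4-hourly import.
    @method_decorator(cache_page(4 * 60 * 60))
    def list(self, request, *args, **kwargs):
        return super().list(request, *args, **kwargs)
```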

asked by vladfau
  • why don't you spin up a background task using e.g. Celery, so that you can still process incoming requests? – windwarrior Mar 16 '15 at 20:46
    This is a bit too broad a question because there are so many right answers... It will also invite opinion-based answers like mine, which is frowned upon here. About the update, just make all operations within a single transaction and you will be fine. Changes go live when you commit, no downtime or inconsistency. For cache, I like to set the appropriate expire headers and let the clients and/or intermediate proxies do the caching. – Paulo Scardine Mar 16 '15 at 20:50
  • My update process drops all the data in the database and inserts about 30k items. If I do things like `bulk_update` and `transaction.savepoint()` (see the sketch below these comments), will that be fine? – vladfau Mar 16 '15 at 20:55
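For concreteness, here is a minimal sketch of the single-transaction update Paulo describes. The `Item` model and the `parse_source()` helper are placeholders for my actual import code, and it assumes a database with transactional isolation (e.g. PostgreSQL), so readers keep seeing the old rows until the commit:

```python
# update_items.py -- sketch of the 4-hourly import wrapped in one transaction.
# Item and parse_source() are placeholders for the real model and deserialiser.
from django.db import transaction

from .models import Item


def refresh_items(raw_payload):
    new_items = [Item(**fields) for fields in parse_source(raw_payload)]
    with transaction.atomic():
        # Old rows stay visible to concurrent readers until the commit,
        # so there is no window where clients see an empty table.
        Item.objects.all().delete()
        Item.objects.bulk_create(new_items, batch_size=1000)  # ~30k rows
    # After the commit, every new request sees the fresh data set.
```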

0 Answers