
I have web pages that take 10-20 database queries to gather all the required data.

Normally, after a query is sent out, the Django thread/process blocks waiting for the results; only once they come back does execution resume, until it reaches the next query.

Is there any way to issue all the queries asynchronously so that the database server(s) can process them in parallel?

I'm using MySQL but would like to hear about solutions for other databases too. For example, I've heard that PostgreSQL has an asynchronous client library - how would I use it in this case?

Continuation

3 Answers


This very recent blog entry seems to imply that it's not built into either the Django or Rails frameworks. I think it covers the issue well and is quite worth a read, along with the comments.

http://www.eflorenzano.com/blog/post/how-do-we-kick-our-synchronous-addiction/ (broken link)

I think I remember Cal Henderson mentioning this deficiency somewhere in his excellent speech http://www.youtube.com/watch?v=i6Fr65PFqfk

My naive guess is you might be able to hack something together with separate Python libraries, but you would lose a lot of the ORM/template lazy-evaluation machinery Django gives you, to the point where you might as well be using another stack. Then again, if you are only optimizing a few views in a large Django project, it might be fine.
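
The "hack something with separate Python libraries" route can be sketched with the standard library alone: fan each raw SQL query out to its own thread with `ThreadPoolExecutor`, bypassing the ORM. This is only an illustrative sketch - `sqlite3` stands in for MySQL to keep it self-contained, and the helper names are made up:

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

def run_query(db_path, sql, params=()):
    # Each worker thread opens its own connection; DB-API
    # connections are generally not safe to share across threads.
    conn = sqlite3.connect(db_path)
    try:
        return conn.execute(sql, params).fetchall()
    finally:
        conn.close()

def run_queries_in_parallel(db_path, queries, max_workers=10):
    # Submit every query at once; each one blocks only its own
    # worker thread, so the database can work on them in parallel.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(run_query, db_path, sql, params)
                   for sql, params in queries]
        return [f.result() for f in futures]
```

In a view you would collect the 10-20 queries up front, fan them out, and hand the combined results to the template - losing the ORM's lazy evaluation along the way, as noted above.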

michael

I had a similar problem, and I solved it with JavaScript/AJAX.

Just load the template with basic markup, then make several AJAX requests to execute the queries and load the data. You can even show a loading animation, so the user gets a "web 2.0" feel instead of a gloomy page load. Of course, this means several more HTTP requests per page, but it's up to you to decide whether that trade-off is worth it.

Here is how my example looks: http://artiox.lv/en/search?query=test&where_to_search=all (broken link)
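
On the server side, this pattern means a set of small endpoints that each run one query and return JSON for the page to slot into its placeholder. A minimal Django-free sketch of one such payload (the function and field names are invented for illustration; in a Django view this body would be returned via `JsonResponse`):

```python
import json

def search_results_payload(query, rows):
    # Builds the JSON body one AJAX endpoint might return.
    # `rows` is assumed to be (title, url) pairs from a single query.
    payload = {
        "query": query,
        "count": len(rows),
        "results": [{"title": title, "url": url} for title, url in rows],
    }
    return json.dumps(payload)
```

The page then fires one request per data block and fills each section as its response arrives, instead of blocking the whole render on all the queries.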

Silver Light

Try Celery. There's a bit of overhead in having to run an AMQP server, but it might do what you want. I'm not sure about the concurrency of the DB, though. Also, if you want speed from your DB, I'd recommend MongoDB (but you'll need django-nonrel for that).
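
Celery's shape here is `task.delay()` followed by a later `.get()`: fire the slow query off as a task, keep going, and collect the result when you need it. A broker-free sketch of that shape using only the standard library (Celery's real API differs, and `fetch_report` with its return value is purely a stand-in for a slow query):

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for Celery's worker pool; a real setup would use a
# broker (AMQP/Redis) and separate worker processes.
_pool = ThreadPoolExecutor(max_workers=4)

def fetch_report(region):
    # Placeholder for a slow database query dispatched as a task.
    return {"region": region, "total": 42}

def delay(task, *args):
    # Rough analogue of task.delay(): returns a handle immediately
    # instead of blocking; .result() plays the role of .get().
    return _pool.submit(task, *args)
```

With this shape, a view can dispatch several fetches at once and only block when it finally asks each handle for its result.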

yee379
  • 1
    Running an ampq server isn't compulsory. A simple memcache or redis queue works fine. I would *not* personally recomend MongoDB. Aside from the fact that django-norel is largely abandonware (Its still at 1.4. 1.4 was EOLed in 2013, and extended suppoort (security etc) EOLed in 2015. ) , MongoDB really only is faster if you forfeit some degree of ACID compliance by running it as an in-memory store. And in my view thats a terrible thing to do to customer data. – Shayne Jun 22 '17 at 02:36
  • The redis backend is being debated for deprecation. FYI. – DylanYoung Oct 05 '17 at 13:44
  • 1
    @DylanYoung, in 2022 and still rocking – emanuel sanga Feb 20 '22 at 04:02
  • How is Celery relevant to the OP's question? It specifically asks about retrieving data from the DB. Celery can help write to the DB asynchronously; using it to fetch would add a whole lot of complexity. – Jonatan Cloutier Apr 03 '23 at 12:57