1

I am aware of Python's GIL, and that threading in Python isn't as easy as spawning a goroutine in Go. However, it seems to me that Ruby was able to pull it off with Puma and Unicorn to achieve concurrency with multi-threading. My experience is limited to Django Channels' Daphne, and my question is actually two-fold.

  1. Besides Daphne, what other choices of web server are multi-threaded, like Puma and Unicorn in Rails?

  2. From Daphne's documentation, I learned that parallelism is achieved by spawning new processes (workers):

    Because the work of running consumers is decoupled from the work of talking to HTTP, WebSocket and other client connections, you need to run a cluster of “worker servers” to do all the processing. Each server is single-threaded, so it’s recommended you run around one or two per core on each machine; it’s safe to run as many concurrent workers on the same machine as you like, as they don’t open any ports (all they do is talk to the channel backend).

As stated, each worker is single-threaded, so when it makes an I/O call the worker is blocked completely. My question is: why can't Daphne spawn multiple threads for each request? When one thread is blocked by I/O, e.g. database access, the CPU switches to another thread until the first one is unblocked. Similarly, Node.js is single-threaded but does concurrency really well through non-blocking I/O. Why is it difficult to achieve the same feat in Python (besides the fact that it lacks a good event loop)?
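
To make the I/O part concrete, here is a minimal sketch (plain Python, nothing Daphne-specific) of the behaviour I mean: CPython releases the GIL while a thread waits on blocking I/O, so several threads can wait at the same time.

    import threading
    import time

    def fake_io_call(name):
        # time.sleep stands in for a blocking call such as a database query;
        # the GIL is released while the thread waits.
        time.sleep(1)
        print(name, "finished")

    start = time.time()
    threads = [threading.Thread(target=fake_io_call, args=(f"thread-{i}",)) for i in range(5)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Prints roughly 1 second, not 5: the five waits overlap.
    print(f"elapsed: {time.time() - start:.1f}s")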

mofury
  • What you're talking about is async, not threading. It's certainly possible, but the issue is that the internals of Django itself don't currently work that way. The author of Daphne and Channels has a [new proposal](https://www.aeracode.org/2018/06/04/django-async-roadmap/) for changing Django to make this possible. – Daniel Roseman Jun 07 '18 at 07:31

3 Answers

2

Right now, uvicorn is the only alternative to Daphne that supports multiple worker processes and is ready for production use.

$ pip install uvicorn

$ uvicorn avilpage.asgi --workers 4

This starts the server with 4 workers.

Since Daphne and uvicorn use asyncio for multitasking, I guess multi-threading doesn't make sense.
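
To show what I mean by asyncio multitasking, here is a minimal, illustrative ASGI callable (the `avilpage.asgi` module path above is just the example project name): while one request is awaiting I/O, the same single-threaded worker keeps serving other requests.

    import asyncio

    async def application(scope, receive, send):
        # Minimal ASGI app sketch: each request "sleeps" for a second as a
        # stand-in for a slow database or API call. Because the wait is
        # awaited, the event loop serves other requests in the meantime.
        if scope["type"] != "http":
            return
        await asyncio.sleep(1)
        await send({
            "type": "http.response.start",
            "status": 200,
            "headers": [(b"content-type", b"text/plain")],
        })
        await send({"type": "http.response.body", "body": b"done\n"})

Served with `uvicorn avilpage.asgi:application --workers 4`, many overlapping requests each still finish in about a second, without any extra threads.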

Chillar Anand
  • https://www.uvicorn.org/deployment/ - Use the uvicorn workers with gunicorn for production usage. – Babu Sep 18 '18 at 16:35
  • I switched to `uvicorn myproject.asgi:application --port $PORT --bind 0.0.0.0` and now it says "did you mean `--fd`?" What is that? – Ali Husham Jul 01 '21 at 15:09
0

The workers are not single-threaded. Each one opens a thread pool to run all of the database queries and anything you wrap in sync_to_async. Daphne's focus is on async, and the more work you keep in the main thread with asyncio, the faster it goes: you want to eliminate context switching as much as possible so the CPU caches stay fresh. Also, Python's GIL lets only one thread execute Python bytecode at a time, which is why multiple threads give no speed increase for CPU-bound work. You could have 25 threads and it would run as fast as 1 thread: https://gist.github.com/agronick/692d9a7bc41b75449f8f5f7cad93a924
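
As a sketch of that thread-pool pattern (assuming a hypothetical Django app with an `Article` model), blocking ORM calls go through `sync_to_async`, so they run off the event loop while the main thread keeps handling other requests:

    from asgiref.sync import sync_to_async
    from django.http import JsonResponse

    from myapp.models import Article  # hypothetical app/model, for illustration

    def _load_titles():
        # Ordinary synchronous ORM code; executed outside the event loop thread.
        return list(Article.objects.values_list("title", flat=True)[:10])

    async def article_titles(request):
        titles = await sync_to_async(_load_titles)()
        return JsonResponse({"titles": titles})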

kagronick
-1

You can use the following command:

uvicorn myproject.asgi:application --port 8000 --host 0.0.0.0 --workers 4
William
  • Answer needs supporting information Your answer could be improved with additional supporting information. Please [edit] to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers [in the help center](https://stackoverflow.com/help/how-to-answer). – moken Jul 07 '23 at 13:37