I am using `aiohttp` to create an Async/IO web server. However, to my understanding, Async/IO means the server can only run on one processing core. Regular, synchronous servers like `uwsgi`, on the other hand, can fully utilize the computer's computing resources with truly parallel threads and processes. Why, then, is Async/IO new and trendy if it is less parallel than multiprocessing? Can async servers like `aiohttp` be multi-processed?

The whole point of asynchronous programming is to stuff multiple execution threads into one CPU thread. This allows you to process multiple (like, thousands) I/O requests on one CPU core. However, asynchronous code _sucks_ for parallelizing CPU-bound tasks because everything runs in one thread anyway. – ForceBru Sep 16 '18 at 17:34
-
@ForceBru I tried something like this: `from multiprocessing import Pool; pool = Pool(4); pool.map(web.run_app, [app, app, app, app])`. Does this make any sense performance-wise? – J Winnie Sep 16 '18 at 17:57
-
Well, if your server can't handle too many requests when run in one process, then it may be. Otherwise `asyncio` should be fine. And it also won't load your computer too much. – ForceBru Sep 16 '18 at 19:55
-
Re: trendiness -- basically, some of the models for being "more parallel" can, especially if poorly implemented (and historically, they often *were* poorly implemented), have much more overhead than the async approach. Switching between threads involves context-switching from userspace to kernelspace; jumping between two different call-stack contexts in a single process is much lower overhead than switching threads. – Charles Duffy Sep 16 '18 at 23:40
-
...often, an ideal solution is somewhere between the two worlds -- ie. using a thread pool, but then doing lightweight application-level context switches *within* each thread. That said, there are a lot of strong opinions on this topic, and questions where answers are likely to be controversial are a place we try to stay away from here. – Charles Duffy Sep 16 '18 at 23:43
-
Did `pool.map(web.run_app, [app, app, app, app])` actually work? In the least I would expect it to fail because of multiple processes trying to listen on the same port. Also, an aiohttp (and generally asyncio) application expects to be able to share state between tasks, and that won't work when tasks are spawned by multiple processes. – user4815162342 Sep 17 '18 at 09:24
-
Surprisingly it did work. Nothing was different in my application except that it output 4 different KeyboardInterrupt errors on ^C – J Winnie Sep 21 '18 at 01:07
-
Does this answer your question? [Python AIOHTTP.web server multiprocessing load-balancer?](https://stackoverflow.com/questions/59217853/python-aiohttp-web-server-multiprocessing-load-balancer) – xendi Sep 28 '21 at 07:29
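A minimal sketch of the multi-process experiment discussed in the comments above: one independent `aiohttp` process per worker, all bound to the same port. It assumes a platform with `SO_REUSEPORT` support (e.g. Linux), and the handler, port, and worker count are illustrative rather than taken from the question:

```python
# One aiohttp server process per worker, all bound to the same port.
# Requires SO_REUSEPORT (Linux/BSD); handler, port, and worker count
# here are illustrative assumptions.
from multiprocessing import Process

from aiohttp import web

async def handle(request):
    return web.Response(text="hello")

def serve():
    # Build the app inside the child so nothing is shared between processes.
    app = web.Application()
    app.add_routes([web.get("/", handle)])
    web.run_app(app, port=8080, reuse_port=True)

if __name__ == "__main__":
    workers = [Process(target=serve) for _ in range(4)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```

Building the application inside each child process avoids pickling the `Application` object and any expectation of shared state between workers, which is the concern raised in the comments.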
2 Answers
Why, then, is Async/IO new and trendy if it is less parallel than multiprocessing?
The two solve different problems. Asyncio allows writing asynchronous code sans the "callback hell". `await` allows the use of constructs like loops, ifs, try/except, and so on, with automatic task switching at `await` points. This enables servicing a large number of connections without needing to spawn a thread per connection, but with maintainable code that looks as if it were written for blocking connections. Thus asyncio mostly helps with code whose only bottleneck is waiting for external events, such as network IO and timeouts.
Multiprocessing, on the other hand, is about parallelizing execution of CPU-bound code, such as scientific calculations. Since OS threads do not help due to the GIL, multiprocessing spawns separate OS processes and distributes the work among them. This comes at the cost of the processes not being able to easily share data - all communication is done either by serialization through pipes, or with dedicated proxies.
A multi-threaded asyncio-style framework is possible in theory - for example, Rust's tokio is like that - but it would be unlikely to bring performance benefits because Python's GIL prevents utilization of multiple cores. Combining asyncio and multiprocessing can work for asyncio code that doesn't depend on shared state, which asyncio supports through `run_in_executor` and `ProcessPoolExecutor`.
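To illustrate that last point, here is a minimal sketch of offloading CPU-bound work from asyncio code to worker processes via `run_in_executor` and a `ProcessPoolExecutor`; the CPU-bound function, pool size, and input sizes are illustrative assumptions:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def cpu_bound(n):
    # CPU-heavy work that shares no state with the event loop.
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor(max_workers=4) as pool:
        # Each call runs in its own process, sidestepping the GIL,
        # while the event loop stays free to service connections.
        results = await asyncio.gather(
            *(loop.run_in_executor(pool, cpu_bound, 10_000_000) for _ in range(4))
        )
    print(results)

if __name__ == "__main__":
    asyncio.run(main())
```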

Gunicorn can help you:
`gunicorn module:app --bind 0.0.0.0:8080 --worker-class aiohttp.GunicornWebWorker --workers 4`
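For context, `module:app` in that command refers to an `aiohttp.web.Application` instance importable from a Python module. A minimal sketch of such a module (the file name, route, and handler are illustrative assumptions, not from the answer):

```python
# module.py -- a minimal aiohttp application for the gunicorn command above.
from aiohttp import web

async def handle(request):
    return web.Response(text="hello from a gunicorn worker")

app = web.Application()
app.add_routes([web.get("/", handle)])
```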
