Since Daniil already answered in great detail, I'll just add a simple illustration of how IO actually works, plus an alternative design choice.
About how IO works
As Daniil said, asyncio does not provide parallelism; it provides concurrency.
But we can still achieve IO parallelism in Python, because Python doesn't actually do any of the IO work itself (nor does pretty much any user program). The OS does. All Python does in the meantime is nothing at all.
And even the CPU doesn't constantly poll every device to check whether IO is done - each device sends a signal (an interrupt) to the CPU, and only then does the CPU check which device's IO work has finished.
So, from a process's or thread's perspective, IO looks more like this:
"Hey OS, please do this IO work for me. Wake me up when it's done."
Thread 1 goes to sleep
Some time later, the OS punches Thread 1
"Your IO operation is done, take this and get back to work."
The OS does the IO work for you, and it also punches you out of that sleep - that wake-up is the interrupt.
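That "register interest, sleep, get woken up" flow is exposed directly by the standard-library selectors module, which asyncio's default event loop is built on. Here's a rough, minimal sketch of it (example.com is just a placeholder host):

import selectors
import socket

sel = selectors.DefaultSelector()

# plain blocking connect + send, then hand the waiting part over to the OS
sock = socket.create_connection(("example.com", 80))
sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
sock.setblocking(False)

# "Hey OS, tell me when this socket has data for me."
sel.register(sock, selectors.EVENT_READ)

# Blocks doing nothing at all - no polling loop in our code -
# until the OS reports the socket is readable.
sel.select()
print("Woken up, first bytes:", sock.recv(64))

sel.unregister(sock)
sock.close()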
This is why many applications and frameworks (including asyncio) use threading to improve throughput in Python, despite the Global Interpreter Lock (GIL) allowing Python code to run in only one thread at any given time.
That is, even though we're limited in parallel execution, Python's low-level IO code written in C releases the GIL while waiting for the OS to do its IO work, so other threads' Python code can do something more useful in the meantime.
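A rough sketch of that effect (not part of the example below; the URL is just a placeholder): five blocking requests in a thread pool finish in roughly the time of one, because each urlopen() releases the GIL while it waits on the network.

import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def fetch(url):
    # the C-level socket wait inside urlopen releases the GIL
    with urlopen(url) as resp:
        return resp.status

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=5) as pool:
    results = list(pool.map(fetch, ["https://example.com"] * 5))
print(results, f"{time.perf_counter() - start:.2f}s")  # roughly one request's latency, not five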
So, TL;DR: the script itself is not parallel, but the IO can be - all the network requests get sent out (just not at the exact same instant), and then they all wait for the server's responses simultaneously (i.e. doing nothing until the OS interrupt arrives).
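Here's a tiny sketch of that TL;DR, with asyncio.sleep standing in for the network wait: three "requests" are fired one after another, but their waits overlap, so the whole thing takes about 1 second rather than 3.

import asyncio
import time

async def fake_request(n):
    await asyncio.sleep(1)  # pretend the server takes 1 second to answer
    return n

async def main():
    start = time.perf_counter()
    results = await asyncio.gather(*(fake_request(n) for n in range(3)))
    print(results, f"{time.perf_counter() - start:.2f}s")  # [0, 1, 2] ~1.00s

asyncio.run(main())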
An example
Here's a producer-consumer-ish example - well, more like a worker pool, since there's no fixed producer/consumer pair. Servers will usually ban you or cut the connection when there are too many simultaneous connections.
But with this approach we can guarantee there will be at most 3 simultaneous connections, so we won't make the server angry. (A semaphore-based alternative is sketched after the output below.)
server.py - receives a GET request, waits a random amount of time, then responds:
import asyncio
from random import randint

from quart import request, jsonify, Quart

app = Quart("Very named Much app")


@app.get("/json")
async def send_json():
    """
    Sleeps 0~4 seconds before returning response.

    Returns:
        json response
    """
    key = request.args["user"]
    print("Received " + key)

    await asyncio.sleep(randint(0, 4))
    return jsonify({"user": key})


asyncio.run(app.run_task())
client.py:
import asyncio

import httpx


async def request_task(id_, in_queue: asyncio.Queue, out_queue: asyncio.Queue):
    """Get json response data for each user taken from the queue. It's a consumer and also a producer.

    Args:
        id_: task ID
        in_queue: Queue for receiving users to request
        out_queue: Queue for returning data
    """
    print(f"[Req. Task {id_}] Started!")

    # create a client context for each task
    async with httpx.AsyncClient() as client:
        while True:
            user = await in_queue.get()
            print(f"[Req. Task {id_}] Processing user '{user}'")

            data = await client.get("http://127.0.0.1:5000/json?user=" + str(user))

            # do what you want here
            print(f"[Req. Task {id_}] Received {data}")
            await out_queue.put(data)

            # inform the queue that we are done with the item we took
            in_queue.task_done()


async def main():
    """
    Starter code
    """
    # create queues
    in_queue = asyncio.Queue()
    out_queue = asyncio.Queue()

    # create consumer tasks
    pool = [asyncio.create_task(request_task(n, in_queue, out_queue)) for n in range(3)]

    # populate the queue with numbers as user names
    for n in range(30):
        in_queue.put_nowait(n)

    # wait until all enqueued work is complete
    await in_queue.join()

    # cancel tasks
    for task in pool:
        task.cancel()

    # check data
    print(f"[Main task] Processed {out_queue.qsize()} data!")


if __name__ == '__main__':
    asyncio.run(main())
output:
[Req. Task 0] Started!
[Req. Task 0] Processing user '0'
[Req. Task 1] Started!
[Req. Task 1] Processing user '1'
[Req. Task 2] Started!
[Req. Task 2] Processing user '2'
[Req. Task 2] Received <Response [200 ]>
[Req. Task 2] Processing user '3'
[Req. Task 1] Received <Response [200 ]>
[Req. Task 1] Processing user '4'
[Req. Task 2] Received <Response [200 ]>
[Req. Task 2] Processing user '5'
[Req. Task 0] Received <Response [200 ]>
[Req. Task 0] Processing user '6'
...
[Req. Task 2] Received <Response [200 ]>
[Req. Task 2] Processing user '22'
[Req. Task 1] Received <Response [200 ]>
[Req. Task 1] Processing user '23'
[Req. Task 0] Received <Response [200 ]>
[Req. Task 0] Processing user '24'
[Req. Task 1] Received <Response [200 ]>
[Req. Task 1] Processing user '25'
[Req. Task 1] Received <Response [200 ]>
[Req. Task 1] Processing user '26'
[Req. Task 2] Received <Response [200 ]>
[Req. Task 2] Processing user '27'
[Req. Task 0] Received <Response [200 ]>
[Req. Task 0] Processing user '28'
[Req. Task 1] Received <Response [200 ]>
[Req. Task 1] Processing user '29'
[Req. Task 1] Received <Response [200 ]>
[Req. Task 2] Received <Response [200 ]>
[Req. Task 0] Received <Response [200 ]>
[Main task] Processed 30 data!
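If you don't need the worker pool itself, the same "at most 3 connections" cap can also be achieved with asyncio.Semaphore plus asyncio.gather - a minimal sketch against the same server as above:

import asyncio

import httpx


async def fetch(client: httpx.AsyncClient, sem: asyncio.Semaphore, user: int):
    # only 3 coroutines can be inside this block at any moment
    async with sem:
        resp = await client.get("http://127.0.0.1:5000/json?user=" + str(user))
        print(f"Received {resp}")
        return resp


async def main():
    sem = asyncio.Semaphore(3)

    async with httpx.AsyncClient() as client:
        results = await asyncio.gather(*(fetch(client, sem, n) for n in range(30)))

    print(f"[Main task] Processed {len(results)} data!")


if __name__ == '__main__':
    asyncio.run(main())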