
Is there a way to transform a Python 3.5 async for statement into Python 3.4 code?

PEP 0492 says that async for

async for TARGET in ITER:
    BLOCK
else:
    BLOCK2

is equivalent to

iter = (ITER)
iter = type(iter).__aiter__(iter)
running = True
while running:
    try:
        TARGET = await type(iter).__anext__(iter)
    except StopAsyncIteration:
        running = False
    else:
        BLOCK
else:
    BLOCK2

but __aiter__ does not exist in Python 3.4.

Marco Sulla
    If you have a working Python 3.5 code then look at the source of `.__aiter__()` and `.__anext__()` methods (it may be different for different ITER). – jfs Dec 17 '15 at 15:54
  • @OldBunny2800 I believe you are looking for this https://stackoverflow.com/questions/30191556/coroutine-in-python-between-3-4-and-3-5-how-can-i-keep-backwords-compatibility – Tarun Lalwani Feb 07 '18 at 08:16

1 Answer


No, there is not. async/await (and __aiter__, etc.) was introduced in Python 3.5. On Python 3.4 the closest thing is asyncio.gather (if you can run all tasks at once/in parallel and wait until they are all finished) or pushing results into an asyncio.Queue (which is sequential, just like async for). Edit: see the last example for an async for alternative, as described in the question.

Here is an example, along the lines of the Python docs, for asyncio.gather:

import asyncio
import random

@asyncio.coroutine
def task(id):
    print("task: {}".format(id))
    yield from asyncio.sleep(random.uniform(1, 3))
    return id

tasks = [
    task("A"),
    task("B"),
    task("C")
]
loop = asyncio.get_event_loop()
results = loop.run_until_complete(asyncio.gather(*tasks))
loop.close()
print(results)

Output:

task: B
task: A
task: C
['A', 'B', 'C']

Here is one for asyncio.Queue:

import asyncio
import random

@asyncio.coroutine
def produce(queue, n):
    for x in range(n):
        print('producing {}/{}'.format(x, n))
        # todo: do something more useful than sleeping :)
        yield from asyncio.sleep(random.random())
        yield from queue.put(str(x))


@asyncio.coroutine
def consume(queue):
    while True:
        item = yield from queue.get()
        print('consuming {}...'.format(item))
        # todo: do something more useful than sleeping :)
        yield from asyncio.sleep(random.random())
        queue.task_done()


@asyncio.coroutine
def run(n):
    queue = asyncio.Queue()
    # schedule the consumer (asyncio.ensure_future was added in Python 3.4.4;
    # older 3.4 releases expose it as asyncio.async)
    consumer = asyncio.ensure_future(consume(queue))
    # run the producer and wait for completion
    yield from produce(queue, n)
    # wait until the consumer has processed all items
    yield from queue.join()
    # the consumer is still waiting for an item, cancel it
    consumer.cancel()


loop = asyncio.get_event_loop()
loop.run_until_complete(run(10))
loop.close()

Edit: async for alternative as described in the question:

import asyncio
import random

class StopAsyncIteration(Exception):
    """Stand-in for the StopAsyncIteration built-in added in Python 3.5."""

class MyCounter:
    def __init__(self, count):
        self.count = count

    def __aiter__(self):
        return self

    @asyncio.coroutine
    def __anext__(self):
        if not self.count:
            raise StopAsyncIteration

        return (yield from self.do_something())

    @asyncio.coroutine
    def do_something(self):
        yield from asyncio.sleep(random.uniform(0, 1))
        self.count -= 1
        return self.count

@asyncio.coroutine
def getNumbers():
    i = MyCounter(10).__aiter__()
    while True:
        try:
            row = yield from i.__anext__()
        except StopAsyncIteration:
            break
        else:
            print(row)

loop = asyncio.get_event_loop()
loop.run_until_complete(getNumbers())
loop.close()

Note that this can be simplified by removing both __aiter__ and __anext__ and either raising a stop exception within the do_something method itself or returning a sentinel result when done (usually an invalid value such as None, "", -1, etc.).
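
For example, here is a minimal sketch of the sentinel variant, reusing the MyCounter/do_something names from above; None is assumed to be a safe sentinel here because the real values are ints (pick something else if None can be a legitimate result):

import asyncio
import random

class MyCounter:
    def __init__(self, count):
        self.count = count

    @asyncio.coroutine
    def do_something(self):
        # signal the end of iteration with a sentinel instead of an exception
        if not self.count:
            return None
        yield from asyncio.sleep(random.uniform(0, 1))
        self.count -= 1
        return self.count

@asyncio.coroutine
def getNumbers():
    counter = MyCounter(10)
    while True:
        row = yield from counter.do_something()
        if row is None:  # sentinel reached, iteration is over
            break
        print(row)

loop = asyncio.get_event_loop()
loop.run_until_complete(getNumbers())
loop.close()

The exception variant is analogous: raise the custom stop exception from inside do_something and catch it in the caller's loop.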

nitely
  • If an asynchronous function returns an iterable, would you be able to assign it to a variable and use a standard for-in loop to iterate over it? – AAM111 Feb 08 '18 at 14:59
  • Yes, if you have something like `result = await somecoro()` and `somecoro` returns an iterable (i.e: list, tuple, dict, set, etc), then sure, you can iterate it later. The question here was about iterating over an async iterator, one that for example makes a bunch of HTTP requests and yields the content of each of them as soon as one is available, instead of having to wait for all to complete. – nitely Feb 08 '18 at 16:30
    I've added some examples for `asyncio.gather` and `asyncio.Queue`. Of course, if you are on py3.5 an async iterator would be better (as in simpler/more readable) than a queue, at least in most situations I can think of. – nitely Feb 08 '18 at 16:46
    @OldBunny2800 I've thought of another way of doing this and added as an example. It's basically the equivalent code showed in the OP question. – nitely Feb 09 '18 at 03:48
IMHO the third is the simplest and clearest, very good job. Anyway, I haven't quite understood the role of `__aiter__`. In all the examples I found, it always returns `self`. When would you have `__aiter__` return a different object? – Marco Sulla Jun 08 '19 at 17:42
  • Generally when doing class composition and delegating some of the operations to the composed objects. Personally, I've never needed it, though. – nitely Jun 09 '19 at 06:25
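
To illustrate that last point, here is a minimal, hypothetical sketch in the same py3.4 style as the answer (CounterCollection and CounterIterator are made-up names) of __aiter__ returning a different object: the container delegates iteration to a dedicated iterator that holds the traversal state, so every traversal is independent:

import asyncio
import random

class StopAsyncIteration(Exception):
    """Stand-in for the StopAsyncIteration built-in added in Python 3.5."""

class CounterIterator:
    """Holds the iteration state, separate from the container."""
    def __init__(self, count):
        self.count = count

    @asyncio.coroutine
    def __anext__(self):
        if not self.count:
            raise StopAsyncIteration
        yield from asyncio.sleep(random.uniform(0, 1))
        self.count -= 1
        return self.count

class CounterCollection:
    """Container whose __aiter__ delegates to a fresh iterator each time."""
    def __init__(self, count):
        self.count = count

    def __aiter__(self):
        return CounterIterator(self.count)

@asyncio.coroutine
def consume(collection):
    it = collection.__aiter__()
    while True:
        try:
            row = yield from it.__anext__()
        except StopAsyncIteration:
            break
        print(row)

loop = asyncio.get_event_loop()
collection = CounterCollection(3)
# the collection can be traversed twice because each __aiter__ call
# returns an independent iterator object
loop.run_until_complete(consume(collection))
loop.run_until_complete(consume(collection))
loop.close()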