Let's say I have a class with a method that fails from time to time, but after some corrective action it works perfectly again.

A real-life example would be a MySQL query that raises _mysql_exceptions.OperationalError: (2006, 'MySQL server has gone away'), but works fine after the client reconnects.

I've tried to write a decorator for this:

def _auto_reconnect_wrapper(func):
    ''' Tries to reconnect a dead connection
    '''

    def inner(self, *args, _retry=True, **kwargs):
        try:
            return func(self, *args, **kwargs)

        except Mysql.My.OperationalError as e:
            # No retry? Rethrow
            if not _retry:
                raise

            # Handle server connection errors only
            # http://dev.mysql.com/doc/refman/5.0/en/error-messages-client.html
            if (e.code < 2000) or (e.code > 2055):
                raise

            # Reconnect
            self.connection.reconnect()

        # Retry
        return inner(self, *args, _retry=False, **kwargs)
    return inner

class A(object):
    ...

    @_auto_reconnect_wrapper
    def get_data(self):
        sql = '...'
        return self.connection.fetch_rows(sql)

And if the client loses the connection, it just silently reconnects and everybody is happy.

But what if I want to turn get_data() into a generator (and use the yield statement):

    @_auto_reconnect_wrapper
    def get_data(self):
        sql = '...'
        cursor = self.connection.execute(sql)
        for row in cursor:
            yield row

        cursor.close()

Well, the previous example won't work, because the inner function returns a generator right away (so the try/except never catches anything), and it will break on the first next() call.

As I understand it, when Python sees yield inside a function, calling that function just returns a generator immediately (without executing a single statement of its body) and waits for the first next().
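
A tiny standalone example (nothing to do with MySQL) shows this behaviour:

def gen():
    print('running')   # not executed when gen() is called
    yield 1

g = gen()   # no output here, the body hasn't started yet
next(g)     # prints 'running' and returns 1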

I've managed to make it work by replacing:

return func(self, *args, **kwargs)

With:

for row in func(self, *args, **kwargs):
    yield row
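
For completeness, the whole generator-friendly wrapper then looks roughly like this (a sketch assembled from the code above, with the same assumptions about Mysql.My.OperationalError and self.connection):

def _auto_reconnect_wrapper(func):
    ''' Tries to reconnect a dead connection (generator version)
    '''

    def inner(self, *args, _retry=True, **kwargs):
        try:
            for row in func(self, *args, **kwargs):
                yield row

        except Mysql.My.OperationalError as e:
            # No retry? Rethrow
            if not _retry:
                raise

            # Handle server connection errors only
            if (e.code < 2000) or (e.code > 2055):
                raise

            # Reconnect and retry; note this re-runs the query from the start
            self.connection.reconnect()
            for row in inner(self, *args, _retry=False, **kwargs):
                yield row
    return inner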

But I'm curious whether there is a more elegant (more Pythonic) way to do this. Is there a way to make Python run all the code up to the first yield and then wait?

I'm aware of the possibility of just calling return tuple(func(self, *args, **kwargs)), but I want to avoid loading all the records at once.

Vyktor

2 Answers

First, I think the solution you're currently using is fine. When you decorate a generator, the decorator is going to need to at least behave like an iterator over that generator. Doing that by making the decorator a generator, too, is perfectly ok. As x3al pointed out, using yield from func(...) instead of for row in func(...): yield row is a possible optimization.

If you want to avoid actually making the decorator a generator, too, you can do that by using next, which will run until the first yield, and return the first yielded value. You'll need to make the decorator somehow capture and return that first value, in addition to the rest of the values to be yielded by the generator. You could do that with itertools.chain:

import itertools

def _auto_reconnect_wrapper(func):
    ''' Tries to reconnect a dead connection
    '''

    def inner(self, *args, _retry=True, **kwargs):
        gen = func(self, *args, **kwargs)
        try:
            value = next(gen)
            return itertools.chain([value], gen)
        except StopIteration:
            return gen
        except Mysql.My.OperationalError as e:
            ...
            # Retry
            return inner(self, *args, _retry=False, **kwargs)
    return inner
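
To see why the chain trick keeps the iteration lazy, here is a small self-contained demonstration (independent of the MySQL code):

import itertools

def rows():
    for i in range(3):
        print('fetching row', i)   # the side effect makes the laziness visible
        yield i

gen = rows()
first = next(gen)                    # prints 'fetching row 0' only
lazy = itertools.chain([first], gen)

for row in lazy:                     # remaining rows are produced on demand
    print('got', row)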

You could also make the decorator work with both generator and non-generator functions, using inspect to determine whether the decorated function returned a generator:

import inspect
import itertools

def _auto_reconnect_wrapper(func):
    ''' Tries to reconnect a dead connection
    '''

    def inner(self, *args, _retry=True, **kwargs):
        try:
            gen = func(self, *args, **kwargs)
            if inspect.isgenerator(gen):
                value = next(gen)
                return itertools.chain([value], gen)
            else: # Normal function
                return gen
        except StopIteration:
            return gen
        except Mysql.My.OperationalError as e:
            ...
            # Retry
            return inner(self, *args, _retry=False, **kwargs)
    return inner
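
Here's a stripped-down demo of the inspect-based dispatch, with all the MySQL-specific parts removed, just to show that both kinds of methods pass through correctly:

import inspect
import itertools

def _wrapper(func):
    def inner(self, *args, **kwargs):
        result = func(self, *args, **kwargs)
        if inspect.isgenerator(result):
            try:
                first = next(result)            # run up to the first yield
            except StopIteration:
                return result                   # empty generator, nothing to re-attach
            return itertools.chain([first], result)
        return result                           # normal function, return as-is
    return inner

class Demo(object):
    @_wrapper
    def regular(self):
        return [1, 2, 3]

    @_wrapper
    def generator(self):
        yield from (1, 2, 3)

d = Demo()
print(d.regular())           # [1, 2, 3]
print(list(d.generator()))   # [1, 2, 3]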

I would favor the yield/yield from-based solution, unless you have a requirement to decorate regular functions in addition to generators.

dano
  • I like this answer (+1-ed), but how about the case when `next(gen)` raises `StopIteration`? – Vyktor Oct 30 '14 at 16:17
  • @Vyktor I've edited my answer to handle that case. You can just catch that exception and return the generator object, which will no-op if an attempt to iterate over it is made (or raise `StopIteration` again if `next` is called on it). – dano Oct 30 '14 at 16:27
  • Honestly, that's the reason I suggested adding another `yield` before the loop in `get_data`. dano's solution is much clearer, though. – x3al Oct 30 '14 at 16:51
  • @x3al I really don't like the idea of editing the decorated function to make it work properly with the decorator. Because then if you remove the decorator for some reason, `get_data` is now broken. It also just looks bizarre to someone reading `get_data` without also reading the decorator. – dano Oct 30 '14 at 17:04

Is there a way to make Python run all the code up to the first yield and then wait?

Yes, and it's called next(your_generator). Call next() once and the code will run up to the first yield and then wait right there. You can place another yield right before the loop if you don't want to lose the first value.
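
Spelled out, that suggestion means get_data() starts the query, hits a placeholder yield, and the decorator "primes" the generator with one next() call and throws the placeholder value away. A sketch, reusing the question's Mysql.My module and connection object:

def _auto_reconnect_wrapper(func):
    def inner(self, *args, _retry=True, **kwargs):
        try:
            gen = func(self, *args, **kwargs)
            next(gen)    # runs everything up to the placeholder yield; may raise OperationalError
            return gen   # the caller then iterates over the real rows
        except Mysql.My.OperationalError as e:
            ...          # same reconnect-and-retry handling as in the question
            return inner(self, *args, _retry=False, **kwargs)
    return inner

class A(object):
    @_auto_reconnect_wrapper
    def get_data(self):
        sql = '...'
        cursor = self.connection.execute(sql)
        yield            # placeholder: everything above runs during the decorator's next()
        for row in cursor:
            yield row
        cursor.close()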

If you're using Python 3.3+, you can also replace

for row in func(self, *args, **kwargs):
    yield row

with yield from func(self, *args, **kwargs).
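
Inside the wrapper from the question, that becomes (the error handling elided with ... is unchanged):

def _auto_reconnect_wrapper(func):
    def inner(self, *args, _retry=True, **kwargs):
        try:
            yield from func(self, *args, **kwargs)
        except Mysql.My.OperationalError as e:
            ...   # same reconnect handling as in the question
            yield from inner(self, *args, _retry=False, **kwargs)
    return inner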

x3al
  • You should use `next(your_generator)`; `obj.next()` has been removed in Python 3.x – dano Oct 30 '14 at 14:46