
I have a class that should accept an arbitrary function at initialization and use it for string parsing, depending on the context.

I'm using a method defined outside of the class, as discussed here: Python define method outside of class definition?

def my_string_method(self):
    return self.var.strip()

class My_Class():
    def __init__(self, string_method):
        self.var = ' foo '
        self.string_method = string_method

    def use_string_method(self):
        return self.string_method()

instance = My_Class(string_method=my_string_method)
print instance.use_string_method()

I get the error "TypeError: use_string_method() takes exactly 1 argument (0 given)".

Shouldn't the self argument be passed implicitly to use_string_method? Is there a way to define the function so that this happens, or do I need to explicitly pass self to methods defined outside the class, like this:

class My_Class():
    def __init__(self, string_method):
        self.var = ' foo '
        self.string_method = string_method

    def use_string_method(self):
        return self.string_method(self)
– Dan

1 Answer


You will have to wrap the passed-in function in types.MethodType.

From within your __init__:

self.string_method = types.MethodType(string_method, self)

This binds the function to the instance, allowing it to receive the implicit self argument. Make sure you import types at the top of your script.
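Putting it together with the code from the question, a minimal runnable sketch (using Python 3 syntax) could look like this:

```python
import types

def my_string_method(self):
    # plain function; 'self' will be supplied once the function is bound
    return self.var.strip()

class My_Class:
    def __init__(self, string_method):
        self.var = ' foo '
        # bind the function to this instance so calls pass self implicitly
        self.string_method = types.MethodType(string_method, self)

    def use_string_method(self):
        # no explicit self needed: string_method is now a bound method
        return self.string_method()

instance = My_Class(string_method=my_string_method)
print(instance.use_string_method())  # prints: foo
```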

– Brian Schlenker
  • The other option is to wrap it in `functools.partial()` or a lambda that passes self for you. – Kevin May 07 '16 at 16:33
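For reference, the alternative from the comment might look like this (a sketch using `functools.partial`; a lambda such as `lambda: string_method(self)` would work the same way):

```python
import functools

def my_string_method(self):
    return self.var.strip()

class My_Class:
    def __init__(self, string_method):
        self.var = ' foo '
        # partial pre-fills self, so no method-binding machinery is needed
        self.string_method = functools.partial(string_method, self)

    def use_string_method(self):
        return self.string_method()

instance = My_Class(string_method=my_string_method)
print(instance.use_string_method())  # prints: foo
```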