I wrote a custom op in the Python layer, implemented with operators such as mx.nd.op_name. It works fine when every input has the same shape, but it runs out of memory when the input shape varies between calls. The custom op is sketched below. It looks like the memory held by self.output is not freed after the forward function finishes. I tried del self.output, but that didn't help. Can you provide some suggestions?
# forward/backward methods of my mx.operator.CustomOp subclass
def forward(self, is_train, req, in_data, out_data, aux):
    # cache an intermediate NDArray on self for later use
    self.output = mx.nd.op_name(in_data[0])
    # ... other computation ...
    self.assign(out_data[0], req[0], self.output)

def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
    # ... gradient computation ...
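
For completeness, here is a minimal self-contained version that reproduces the pattern. The names MyOp and my_op are placeholders, and mx.nd.relu stands in for the real operator (which I can't post here); the del self.output at the end of backward is the fix I attempted.

import mxnet as mx

class MyOp(mx.operator.CustomOp):
    def forward(self, is_train, req, in_data, out_data, aux):
        # cache an intermediate NDArray on self, as my real op does
        self.output = mx.nd.relu(in_data[0])
        self.assign(out_data[0], req[0], self.output)

    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
        # gradient of relu: pass out_grad through where the input was positive
        grad = out_grad[0] * (in_data[0] > 0)
        self.assign(in_grad[0], req[0], grad)
        del self.output  # attempted fix: drop the cached NDArray

@mx.operator.register("my_op")
class MyOpProp(mx.operator.CustomOpProp):
    def __init__(self):
        super(MyOpProp, self).__init__(need_top_grad=True)

    def list_arguments(self):
        return ['data']

    def list_outputs(self):
        return ['output']

    def infer_shape(self, in_shape):
        # output has the same shape as the single input
        return in_shape, [in_shape[0]], []

    def create_operator(self, ctx, shapes, dtypes):
        return MyOp()

# feeding inputs whose shape changes every call is what triggers the OOM for me
for n in (128, 256, 512):
    x = mx.nd.ones((n, 10))
    x.attach_grad()
    with mx.autograd.record():
        y = mx.nd.Custom(x, op_type='my_op')
    y.backward()
    mx.nd.waitall()  # block until all async work is done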