
I am trying to modify the weights of a convolution. To do this, I create and initialize my own parameters (weight, bias) and convolve the input image with them. But it throws an error because my parameters are not among the symbol's arguments.

How can I add my parameters to the symbol's arguments? Any help would be greatly appreciated.
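
Roughly what I am attempting (a minimal sketch; the names and shapes are illustrative):

import mxnet as mx

data = mx.sym.Variable('data')
# my own parameters, which I create and initialize myself
weight = mx.sym.Variable('my_weight')
bias = mx.sym.Variable('my_bias')
# convolve the input image with my parameters
conv = mx.sym.Convolution(data=data, weight=weight, bias=bias,
                          kernel=(3, 3), num_filter=8)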

  • Hey, could you give more info about the type of error and the code you are currently using? That way I'll be able to help better. Thanks – Chaitanya Bapat Nov 15 '18 at 19:36

1 Answer


If you want to pass arguments to a custom operator, you have to do so via the __init__ method.

Here's a snippet from https://github.com/apache/incubator-mxnet/issues/5580 illustrating what you need:

import mxnet as mx
import numpy as np


class Softmax(mx.operator.CustomOp):
    def __init__(self, xxx, yyy):
        # custom arguments xxx and yyy, forwarded from SoftmaxProp below
        super(Softmax, self).__init__()
        self.xxx = xxx
        self.yyy = yyy

    def forward(self, is_train, req, in_data, out_data, aux):
        x = in_data[0].asnumpy()
        # numerically stable softmax over axis 1
        y = np.exp(x - x.max(axis=1).reshape((x.shape[0], 1)))
        y /= y.sum(axis=1).reshape((x.shape[0], 1))
        print(self.xxx, self.yyy)
        self.assign(out_data[0], req[0], mx.nd.array(y))

    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
        l = in_data[1].asnumpy().ravel().astype(int)
        y = out_data[0].asnumpy()
        # gradient of cross-entropy loss w.r.t. the softmax input
        y[np.arange(l.shape[0]), l] -= 1.0
        self.assign(in_grad[0], req[0], mx.nd.array(y))


@mx.operator.register("softmax")
class SoftmaxProp(mx.operator.CustomOpProp):
    def __init__(self, xxx, yyy):
        super(SoftmaxProp, self).__init__(need_top_grad=False)
        # add parameters; note that custom arguments always arrive
        # here as strings, so cast them if you need numeric values
        self.xxx = xxx
        self.yyy = yyy

    def list_arguments(self):
        # only real input symbols belong here; xxx and yyy are
        # keyword arguments of the operator, not inputs
        return ['data', 'label']

    def list_outputs(self):
        return ['output']

    def infer_shape(self, in_shape):
        data_shape = in_shape[0]
        label_shape = (in_shape[0][0],)
        output_shape = in_shape[0]
        return [data_shape, label_shape], [output_shape], []

    def create_operator(self, ctx, shapes, dtypes):
        return Softmax(xxx=self.xxx, yyy=self.yyy)
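
To use the operator, pass xxx and yyy as keyword arguments to mx.sym.Custom, with op_type set to the registered name. A minimal sketch (the values for xxx and yyy are illustrative; they reach SoftmaxProp.__init__ as strings):

data = mx.sym.Variable('data')
label = mx.sym.Variable('label')
# xxx and yyy are handed to SoftmaxProp.__init__, not added as inputs
net = mx.sym.Custom(data=data, label=label, xxx=1, yyy=2,
                    op_type='softmax')
print(net.list_arguments())  # ['data', 'label']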

Take a look at https://mxnet.incubator.apache.org/faq/new_op.html for full info.

– Vishaal