
When training a deep convolutional neural network with MXNet, what is the simplest way to turn on some amount of dropout in order to reduce overfitting? Is there a way to set a dropout rate without manually implementing dropout in the network architecture?

Pierre-Antoine

1 Answer


Yes, you can set the dropout rate with MXNet's built-in dropout layer. The API is documented here: https://mxnet.incubator.apache.org/api/python/symbol/symbol.html?highlight=dropout#mxnet.symbol.Dropout

The parameter `p` is the dropout rate, i.e. the fraction of activations randomly zeroed out during training.

  • Thanks George for the answer. I found that API while searching too. However, it is not clear to me how we are supposed to use it with an existing implementation of a network such as ResNet or InceptionV3. Could you add an example? – Pierre-Antoine Dec 05 '17 at 13:56
  • 2
    It is just like adding any other layer to the network: `conv1 = mx.sym.Convolution(data=act1, num_filter=int(num_filter*0.25), kernel=(1,1)...) dropout1 = mx.sym.Dropout(data=conv1, p=0.2)` – Guy Dec 07 '17 at 23:42