When training a deep convolutional neural network with MXNet, what is the simplest way to enable some amount of dropout in order to reduce overfitting? Is there a way to set a dropout rate without manually implementing dropout in the network architecture?
Yes, you can set the dropout rate using MXNet's built-in `Dropout` symbol, documented here: https://mxnet.incubator.apache.org/api/python/symbol/symbol.html?highlight=dropout#mxnet.symbol.Dropout
The parameter `p` is the dropout rate, i.e. the fraction of inputs zeroed out during training.

George Jiajie Chen
Thanks George for the answer. I found that API while searching too, but it is not clear to me how we are supposed to use it with an existing implementation of a network like ResNet or InceptionV3. Could you add an example? – Pierre-Antoine Dec 05 '17 at 13:56
It is just like adding any other layer to the network: `conv1 = mx.sym.Convolution(data=act1, num_filter=int(num_filter*0.25), kernel=(1,1), ...)` followed by `dropout1 = mx.sym.Dropout(data=conv1, p=0.2)` – Guy Dec 07 '17 at 23:42