I want to implement logistic regression with dropout regularization, but so far the only working example I have is the following:
import torch
import torch.nn as nn

class logit(nn.Module):
    def __init__(self, input_dim=69, output_dim=1):
        super(logit, self).__init__()
        # Two layers: input (69) -> hidden (69) with dropout -> output (1)
        self.fc1 = nn.Linear(input_dim, input_dim)
        self.fc2 = nn.Linear(input_dim, output_dim)
        self.dp = nn.Dropout(p=0.2)

    # Feed-forward function
    def forward(self, x):
        x = self.fc1(x)
        x = self.dp(x)
        x = torch.sigmoid(self.fc2(x))
        return x
Now the problem with putting dropout between two linear layers is that the model is no longer a plain logistic regression (correct me if I'm wrong). What I would like to do instead is apply dropout at the input level.
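What I have in mind is something like the sketch below (the class name LogitInputDropout and the parameter p are just placeholders of mine; I have not verified this is the right way to do it). It keeps a single linear layer, so the model stays a logistic regression, and applies nn.Dropout directly to the input features before that layer:

import torch
import torch.nn as nn

class LogitInputDropout(nn.Module):
    def __init__(self, input_dim=69, output_dim=1, p=0.2):
        super().__init__()
        # Dropout applied to the raw input features
        self.dp = nn.Dropout(p=p)
        # Single linear layer, so the model is still a logistic regression
        self.fc = nn.Linear(input_dim, output_dim)

    def forward(self, x):
        x = self.dp(x)  # randomly zeroes input features during training only
        return torch.sigmoid(self.fc(x))

As far as I understand, calling model.train() enables the dropout and model.eval() disables it at inference time, but I'm not sure whether dropping input features like this is the correct way to regularize a logistic regression.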