
My task is classification with a neural net. The input dimension is 4, and there are 4 output layers; each output has dimension 4 (4 classes per output).

This is the input data (already normalized):

        [0.7502, 0.1567, 0.1063, 0.8041],
        [0.5052, 0.5634, 0.7159, 0.0273],
        [0.7539, 0.1044, 0.3207, 0.6528],
                         
        ...

        [0.2311, 0.8376, 0.5036, 0.1267],
        [0.2609, 0.7965, 0.6194, 0.0416],
        [0.2588, 0.7995, 0.1704, 0.4476],
        [0.0893, 0.5472, 0.8131, 0.1713],
        [0.3774, 0.3312, 0.6459, 0.3589],
        [0.0340, 0.6632, 0.8019, 0.1103]])
This is the label table for one of the output layers:
    0   45  90  135
0   1   0   0   0
1   1   0   0   0
2   1   0   0   0
3   0   1   0   0
4   1   0   0   0
... ... ... ... ...
103 1   0   0   0
104 1   0   0   0
105 1   0   0   0
106 1   0   0   0
107 1   0   0   0

(0, 45, 90, 135) are the classes. Each output layer predicts a softmax distribution like [0.2271, 0.3658, 0.2374, 0.1697]. (In fact, the dataset has a (0, 45, 90, 135) label set for each of the 4 outputs.)
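To illustrate what I mean by a softmax prediction, a toy sketch with made-up scores for one sample:

```python
import torch
import torch.nn.functional as F

# Toy scores for one sample over the 4 classes (0, 45, 90, 135).
scores = torch.tensor([[0.5, 1.0, 0.6, 0.2]])
probs = F.softmax(scores, dim=1)   # probabilities, each row sums to 1
pred = probs.argmax(dim=1)         # predicted class index
print(probs)
print(pred)  # tensor([1]) -> class 45
```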

This is my neural net:

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        n_n = 10                        # hidden layer width
        self.fc1 = nn.Linear(4, n_n)    # shared hidden layer
        self.fc_y1 = nn.Linear(n_n, 4)  # one head per output, 4 classes each
        self.fc_y2 = nn.Linear(n_n, 4)
        self.fc_y3 = nn.Linear(n_n, 4)
        self.fc_y4 = nn.Linear(n_n, 4)

    def forward(self, x):
        x = self.fc1(x)
        x = F.leaky_relu(x)

        y1 = F.softmax(self.fc_y1(x), dim=1)
        y2 = F.softmax(self.fc_y2(x), dim=1)
        y3 = F.softmax(self.fc_y3(x), dim=1)
        y4 = F.softmax(self.fc_y4(x), dim=1)

        return y1, y2, y3, y4

##### below, loss function and optimizer #####

import torch.optim as optim

criterion = nn.BCELoss()
# (or: criterion = nn.CrossEntropyLoss())
optimizer = optim.Adam(net.parameters(), lr=1e-4)
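For clarity on what each criterion expects, a small sketch on dummy data (my understanding: `nn.CrossEntropyLoss` applies `log_softmax` internally, so it wants raw logits and integer class targets, not softmax outputs and one-hot rows; `nn.BCELoss` wants probabilities in [0, 1] and float targets of the same shape):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Dummy batch: 3 samples, 4 classes.
logits = torch.randn(3, 4)          # raw head output, no softmax
targets = torch.tensor([0, 1, 0])   # class indices

# CrossEntropyLoss: raw logits + integer class targets.
ce = nn.CrossEntropyLoss()
loss_ce = ce(logits, targets)

# BCELoss: probabilities + one-hot float targets of the same shape.
probs = F.softmax(logits, dim=1)
onehot = F.one_hot(targets, num_classes=4).float()
bce = nn.BCELoss()
loss_bce = bce(probs, onehot)
print(loss_ce.item(), loss_bce.item())
```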

My problem is that the outputs (y1, y2, y3, y4) all become

y1 = y2 = y3 = y4 =

tensor([[0.2500, 0.2500, 0.2500, 0.2500],
        [0.2500, 0.2500, 0.2500, 0.2500],
        [0.2500, 0.2500, 0.2500, 0.2500],
        [0.2500, 0.2500, 0.2500, 0.2500],
        [0.2500, 0.2500, 0.2500, 0.2500],
        [0.2500, 0.2500, 0.2500, 0.2500],
        [0.2500, 0.2500, 0.2500, 0.2500],
        [0.2500, 0.2500, 0.2500, 0.2500],
        [0.2500, 0.2500, 0.2500, 0.2500],
        [0.2500, 0.2500, 0.2500, 0.2500]])

for some reason.

What I've tried: tuning the learning rate and adding more hidden layers, but with no improvement.

Do you have any idea how to get this to classify correctly?

Kenta
  • I know that softmax is for the output layer, so is the coding of my neural net wrong? I am trying to make four output layers like: input x → hidden layer → softmax → four different outputs (y1, y2, y3, y4). – Kenta Aug 12 '23 at 07:08
