
I'm trying to build a deep neural network whose backbone works like this:

Each block should have a certain number of convolutional layers (let's say 3). It should also have a linear layer. The input is processed through average pooling, the linear layer, and a non-linear activation function; it is also processed by the convolutional layers. The outputs of the linear layer and the convolutional layers should then be combined (the first value of the linear output with the first convolutional feature map, and so on).
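
My reading of the spec is that one block would look roughly like this (just a sketch: the Block name, the sigmoid, and the channel-wise multiplication are my own assumptions, since the spec only says the i-th value of the linear output is paired with the i-th convolutional channel):

import torch
import torch.nn as nn

class Block(nn.Module):
    def __init__(self, in_channels, out_channels, num_convs=3):
        super().__init__()
        # stack of convolutional layers
        convs = []
        channels = in_channels
        for _ in range(num_convs):
            convs += [nn.Conv2d(channels, out_channels, 3, padding=1), nn.ReLU()]
            channels = out_channels
        self.convs = nn.Sequential(*convs)
        # average pooling -> linear layer -> non-linear activation
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Linear(in_channels, out_channels)
        self.act = nn.Sigmoid()

    def forward(self, x):
        conv_out = self.convs(x)                        # (N, out_channels, H, W)
        w = self.act(self.fc(self.pool(x).flatten(1)))  # (N, out_channels)
        # pair the i-th linear value with the i-th conv channel
        return conv_out * w[:, :, None, None]

# Block(3, 16)(torch.randn(2, 3, 32, 32)).shape -> torch.Size([2, 16, 32, 32])

What I have so far is this: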

import torch.nn as nn

class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        self.conv1 = nn.Conv2d(3, 16, 3, padding=1)
        self.relu1 = nn.ReLU()
        self.pool1 = nn.AvgPool2d(2, 2)
        self.conv2 = nn.Conv2d(16, 32, 3, padding=1)
        self.relu2 = nn.ReLU()
        self.pool2 = nn.AvgPool2d(2, 2)
        self.conv3 = nn.Conv2d(32, 64, 3, padding=1)
        self.relu3 = nn.ReLU()
        self.pool3 = nn.AvgPool2d(2, 2)
        self.conv4 = nn.Conv2d(64, 128, 3, padding=1)
        self.relu4 = nn.ReLU()
        self.pool4 = nn.AvgPool2d(2, 2)
        self.flatten = nn.Flatten()

    def forward(self, x):
        x = self.conv1(x)
        x = self.relu1(x)
        x = self.pool1(x)
        x = self.conv2(x)
        x = self.relu2(x)
        x = self.pool2(x)
        x = self.conv3(x)
        x = self.relu3(x)
        x = self.pool3(x)
        x = self.conv4(x)
        x = self.relu4(x)
        x = self.pool4(x)
        x = self.flatten(x)
        return x
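
For reference, passing a dummy input through the model as it stands just gives a flat feature vector (the 32x32 input size below is only an example):

import torch

model = MyModel()
out = model(torch.randn(1, 3, 32, 32))
print(out.shape)  # torch.Size([1, 512]) -> 128 channels * 2 * 2 after four poolings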

I don't think this fits the specification.

Any help in understanding this would be appreciated.
