Running the following code

from torchvision import models

# pretrained=True is deprecated since torchvision 0.13 in favour of weights=...
dnet121 = models.densenet121(pretrained=True)
print(dnet121)

yields a DenseNet121 description which ends as follows:

[image: tail of the printed model, ending with `(norm5): BatchNorm2d(1024, ...)` inside `features`, followed by `(classifier): Linear(in_features=1024, out_features=1000, bias=True)`]

Based on this, I would appreciate your help with the following:

  • As per the PyTorch BatchNorm2d documentation (https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm2d.html), the BatchNorm2d transform accepts a 4-dim tensor (here with C=1024 channels) and outputs a 4-dim tensor. But, going by the printout above, how does that 4-dim output turn into the 1024 in_features of the Linear layer? Is it due to a Global Average Pooling layer? (See the shape check after this list.)

    As per the official DenseNet paper (https://arxiv.org/pdf/1608.06993.pdf), the classifier does indeed start with a Global Average Pooling (GAP) layer, but why can't we see it here?

  • For the purposes of a task, I would like to replace the GAP layer with an nn.Flatten layer. How can I get code access to the GAP layer, though? (A sketch of the kind of replacement I am after follows below.)
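For reference, here is a minimal shape check of what I suspect happens between the last BatchNorm2d and the classifier; the (1024, 7, 7) feature-map shape is my assumption for a 224x224 input, not something the printout confirms:

import torch
import torch.nn.functional as F

# Dummy feature map: (batch, channels, height, width); the 7x7 spatial size
# is assumed for a 224x224 input image
x = torch.randn(2, 1024, 7, 7)

# Global average pooling would collapse each 7x7 map to a single value ...
pooled = F.adaptive_avg_pool2d(x, (1, 1))  # -> (2, 1024, 1, 1)

# ... and flattening would then yield the (batch, 1024) tensor that
# Linear(in_features=1024, ...) expects
flat = torch.flatten(pooled, 1)            # -> (2, 1024)
print(pooled.shape, flat.shape)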
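And this is a sketch of the kind of replacement I am after, assuming I can wrap the pretrained backbone and control the step between features and classifier myself (the DenseNetNoGAP wrapper name and the enlarged in_features are my own, hypothetical choices):

import torch
import torch.nn as nn
from torchvision import models

class DenseNetNoGAP(nn.Module):
    # Hypothetical wrapper: reuse the pretrained features, but flatten the
    # full feature map instead of global-average-pooling it
    def __init__(self, num_classes=1000):
        super().__init__()
        backbone = models.densenet121(pretrained=True)
        self.features = backbone.features
        self.flatten = nn.Flatten()
        # Flattening a (1024, 7, 7) map gives 1024 * 7 * 7 inputs
        # (again assuming 224x224 images)
        self.classifier = nn.Linear(1024 * 7 * 7, num_classes)

    def forward(self, x):
        out = self.features(x)
        out = torch.relu(out)  # BN-ReLU ordering, as in the paper
        out = self.flatten(out)
        return self.classifier(out)

model = DenseNetNoGAP()
print(model(torch.randn(1, 3, 224, 224)).shape)  # torch.Size([1, 1000])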

john_ny
  • From what Ctrl+F shows me, the DenseNet paper mentions "global average pool" only twice, and it notes: "At the end of the last dense block, a global average pooling is performed and then a softmax classifier is attached." // Also, please copy-paste the entire model summary. // If need be, build your own `DenseNet` directly by subclassing, or reuse `DenseNet` from the torchvision source: https://pytorch.org/vision/main/_modules/torchvision/models/densenet.html – KDecker Dec 12 '22 at 14:40
  • You are right. Also, at the first of those two mentions, in Table 1, the layer is shown at the start of the classification part. In both cases, though, the layer isn't visible in the printed model. No idea why. // Sadly, I cannot paste the entire summary, as my post would exceed the 30,000-character limit, and it doesn't fit into a single image either. // Thank you, I will try that. – john_ny Dec 12 '22 at 14:57

0 Answers