
I'm trying to use transfer learning (fine-tuning) with InceptionV3: removing the last layer, turning training off for all the layers, and adding a single dense layer. When I look at the summary again, I do not see my added layer, and I get an exception.

RuntimeError: You tried to call count_params on dense_7, but the layer isn't built. You can build it manually via: dense_7.build(batch_input_shape).

from keras import applications
pretrained_model = applications.inception_v3.InceptionV3(weights = "imagenet", include_top=False, input_shape = (299, 299, 3))

from keras.layers import Dense
for layer in pretrained_model.layers:
  layer.trainable = False

pretrained_model.layers.pop()

layer = (Dense(2, activation='sigmoid'))
pretrained_model.layers.append(layer)

Looking at the summary again gives the above exception:

pretrained_model.summary()

I wanted to compile and fit the model, but

pretrained_model.compile(optimizer=RMSprop(lr=0.0001), 
              loss = 'sparse_categorical_crossentropy', metrics = ['acc'])

the above line gives this error:

Could not interpret optimizer identifier:

Pranit Kothari
  • You can't use `pop()` on `layers` attribute to modify the architecture. [This](https://stackoverflow.com/a/53312991/2099607) or [this](https://stackoverflow.com/a/52282558/2099607) might be helpful. – today May 19 '19 at 14:51

1 Answer


You are using pop to remove the fully connected (Dense) layer at the end of the network, but this is already accomplished by the argument include_top=False. Also, calling pop() or append() on model.layers only mutates the Python list; it does not rebuild the model's graph, which is why the summary is unchanged and the new layer is never built. So you just need to initialize InceptionV3 with include_top=False and add the final Dense layer via the functional API. In addition, since it's InceptionV3, I suggest adding GlobalAveragePooling2D() after the InceptionV3 output to reduce overfitting. Here is the code:

from keras import applications
from keras.models import Model
from keras.layers import Dense, GlobalAveragePooling2D

# Load InceptionV3 without its fully connected top layers
pretrained_model = applications.inception_v3.InceptionV3(weights="imagenet", include_top=False, input_shape=(299, 299, 3))

x = pretrained_model.output
x = GlobalAveragePooling2D()(x)  # highly recommended

# Use softmax so the 2 class probabilities sum to 1,
# matching the sparse_categorical_crossentropy loss you compile with
predictions = Dense(2, activation='softmax')(x)

# Model() takes the keyword arguments inputs/outputs, not input/output
model = Model(inputs=pretrained_model.input, outputs=predictions)

# Freeze the pretrained layers so only the new Dense layer is trained
for layer in pretrained_model.layers:
  layer.trainable = False

model.summary()

This should give you the desired model to fine-tune.
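As for the "Could not interpret optimizer identifier" error: it usually means RMSprop was never imported, or was imported from a different package (tf.keras) than the one the model was built with (standalone keras). Import the optimizer from the same package as the model. Below is a minimal, self-contained sketch of the compile/fit step on a small stand-in model (the tiny Input/Dense model here is just for illustration, not part of the original post; recent Keras versions spell the argument learning_rate, while the older Keras of this post accepted lr):

import numpy as np
from keras.models import Model
from keras.layers import Input, Dense
from keras.optimizers import RMSprop  # same package as the model, not tf.keras

# Tiny stand-in model so the example runs without downloading InceptionV3
inp = Input(shape=(4,))
out = Dense(2, activation='softmax')(inp)
model = Model(inputs=inp, outputs=out)

# Older Keras: RMSprop(lr=0.0001); newer Keras: learning_rate=0.0001
model.compile(optimizer=RMSprop(learning_rate=0.0001),
              loss='sparse_categorical_crossentropy',
              metrics=['acc'])

# Sanity check on random data
x = np.random.rand(8, 4)
y = np.random.randint(0, 2, size=(8,))
model.fit(x, y, epochs=1, verbose=0)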

Chandan M S