
(I couldn't find a solution among the previously asked questions.) I am using VGG16 to test on my own data. I have 2 classes, and I followed this page to freeze the convolutional layers and train only the top layers. Here is the code:

from keras.applications import VGG16
from keras.models import Sequential, Model
from keras.layers import Flatten, Dense
from keras import optimizers

model = VGG16(include_top=False, classes=2, input_shape=(224,224,3), weights='imagenet')

Then I created top_model, which will sit on top of my VGG16:

top_model=Sequential()
top_model.add(Flatten(input_shape=model.output_shape[1:]))
top_model.add(Dense(4096,activation='relu'))
top_model.add(Dense(4096,activation='relu'))
top_model.add(Dense(2,activation='softmax'))
model = Model(inputs=model.input, outputs=top_model(model.output))

Then I froze some layers and compiled the model:

for layer in model.layers[:19]:
    layer.trainable = False

model.compile(loss='binary_crossentropy',
              optimizer=optimizers.SGD(lr=1e-4, momentum=0.9),
              metrics=['accuracy'])

After some data augmentation, I trained the model and saved the weights like this:

model.fit_generator(trainGenerator,
                    steps_per_epoch=numTraining//batchSize,
                    epochs=numEpoch,
                    validation_data=valGenerator,
                    validation_steps=numValid//batchSize)

model.save_weights('top3_weights.h5') 

After training, the weights are saved, and I changed the second part of my code so I could test the whole model on my data:

model = VGG16(include_top=False,classes=2,input_shape=(224,224,3),weights='imagenet')

top_model=Sequential()
top_model.add(Flatten(input_shape=model.output_shape[1:]))
top_model.add(Dense(4096,activation='relu'))
top_model.add(Dense(4096,activation='relu'))
top_model.add(Dense(2,activation='softmax'))
top_model.load_weights(r'C:\\Users\\Umit Kilic\\Komodomyfiles\\umit\\top3_weights.h5') #(this line is added)
model = Model(inputs=model.input, outputs=top_model(model.output))

Finally, when I tried to print the model summary with:

print(model.summary())

I am getting this error output:

Using TensorFlow backend.
Traceback (most recent call last):
  File "C:\Users\Umit Kilic\Komodomyfiles\umit\test_vgg16.py", line 38, in <module>
    top_model.load_weights(r'C:\\Users\\Umit Kilic\\Komodomyfiles\\umit\\top3_weights.h5')
  File "C:\Users\Umit Kilic\AppData\Local\Programs\Python\Python36\lib\site-packages\keras\engine\network.py", line 1166, in load_weights
f, self.layers, reshape=reshape)
  File "C:\Users\Umit Kilic\AppData\Local\Programs\Python\Python36\lib\site-packages\keras\engine\saving.py", line 1030, in load_weights_from_hdf5_group
str(len(filtered_layers)) + ' layers.')
ValueError: You are trying to load a weight file containing 14 layers into a model with 3 layers.

Any help, please?


1 Answer


model contains the full stacked model, that is, the VGG16 base plus top_model. When you save its weights, the file contains 14 weight-carrying layers (the 13 convolutional layers plus the nested top_model). You cannot load that file into top_model, which has only 3 weighted layers, but you can load it into your reassembled full model:

model = VGG16(include_top=False,classes=2,input_shape=(224,224,3),weights='imagenet')
top_model=Sequential()
top_model.add(Flatten(input_shape=model.output_shape[1:])) 
top_model.add(Dense(4096,activation='relu'))
top_model.add(Dense(4096,activation='relu'))
top_model.add(Dense(2,activation='softmax'))
model = Model(inputs=model.input, outputs=top_model(model.output))

model.load_weights(r'C:\\Users\\Umit Kilic\\Komodomyfiles\\umit\\top3_weights.h5') #(this line is added)
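To see where the counts in the error message come from, here is a minimal sketch of the same layer structure (assuming tf.keras is available; weights=None skips the ImageNet download, and the small input and Dense sizes are made up just to keep the sketch light):

```python
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.models import Model, Sequential

# Small stand-in for the question's setup: VGG16 base without its top.
base = VGG16(include_top=False, input_shape=(32, 32, 3), weights=None)

# The classifier head; it is built when called on the base's output below.
top_model = Sequential([
    Flatten(),
    Dense(64, activation='relu'),
    Dense(64, activation='relu'),
    Dense(2, activation='softmax'),
])

# The nested Sequential appears as a single layer of the stacked model.
full = Model(inputs=base.input, outputs=top_model(base.output))

def weighted(m):
    # load_weights matches only layers that actually carry weights
    return [l for l in m.layers if l.weights]

print(len(weighted(full)))       # 13 conv layers + the nested top_model = 14
print(len(weighted(top_model)))  # the 3 Dense layers
```

So model.save_weights writes 14 weighted layers, and top_model.load_weights expects only 3, which is exactly the mismatch the ValueError reports.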
  • Thanks, it works now. But I don't get the logic. "model" already has all the layers and weights (ImageNet weights) except the last three fully connected layers. So when I train the last three layers, the file I create ('top3_weights.h5') should save just the last three layers' weights. But going by the answer, top3_weights.h5 has to store all layers' weights (ImageNet weights for the conv layers plus the newly trained weights for the last three layers). Which is true for the new weight file? – umitkilic Jan 18 '19 at 14:06
  • The weight file has all of them. If you want to save only the layers you trained, you must call top_model.save_weights and not model.save_weights. Note that even layers that are not trainable have their weights saved. – Diane M Jan 18 '19 at 14:13
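The alternative from the comments, saving only the head's weights so they can be loaded back into a standalone head, can be sketched like this (assuming tf.keras; the shape, layer sizes, and file name are made up for the illustration, and the .weights.h5 suffix is used because newer Keras versions require it):

```python
import os
import tempfile

from tensorflow.keras import Input
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.models import Sequential

def build_head(input_shape):
    # Same architecture must be used for saving and restoring.
    return Sequential([
        Input(shape=input_shape),
        Flatten(),
        Dense(64, activation='relu'),
        Dense(2, activation='softmax'),
    ])

head = build_head((1, 1, 512))
path = os.path.join(tempfile.gettempdir(), 'head.weights.h5')

# Only the head's own layers end up in the file, not the VGG base.
head.save_weights(path)

# A fresh head with the same architecture can load them directly.
restored = build_head((1, 1, 512))
restored.load_weights(path)
```

With this approach the saved file matches top_model layer for layer, so no full stacked model has to be rebuilt just to load the weights.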