
I load the model in this way:

from tensorflow.keras.applications import EfficientNetB7

model = EfficientNetB7(weights='imagenet', include_top=False, input_shape=input_shape)

What I am trying to do is remove the layers at these positions:

model.layers[1] #Rescaling
model.layers[2] #Normalization

What I tried is:

from tensorflow.keras.layers import Dense

del_res = Dense(1, activation='relu', name='remove_rescaling')(model.layers[1].output)
del_nor = Dense(1, activation='relu', name='remove_normalization')(model.layers[2].output)

but both layers are still there in the model.

I even tried:

model.layers.pop(1)
model.layers.pop(2)

But no luck!

Do you have any recommendations?

  • If you're using tf 2.4.1, you can use the `pop` method, but in later versions you can't; discussed [here](https://github.com/keras-team/keras/issues/15542). And here is an [open ticket](https://github.com/keras-team/keras/issues/16081) to address this (WIP). – Innat Jul 21 '22 at 15:30
  • @M.Innat OK, many thanks. However, I read online that `pop` creates a copy of the network's structure and then only prints the model architecture without the popped layer: is that true? Or does `pop` actually modify the model? – Diego Rando Jul 22 '22 at 14:49
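
Since `pop` isn't supported on functional models in recent versions, here is a minimal sketch of an alternative, assuming a TF 2.x where `tf.keras.models.clone_model` accepts a `clone_function`: rebuild the graph, swapping the two preprocessing layers for identity passthroughs and then copying the weights over by layer name. The names `'rescaling'` and `'normalization'` and the example `input_shape` are assumptions; check `model.summary()` for the actual names (some versions add a second `Rescaling` layer).

```python
import tensorflow as tf
from tensorflow.keras.applications import EfficientNetB7

input_shape = (600, 600, 3)  # assumption: substitute your own input_shape
model = EfficientNetB7(weights='imagenet', include_top=False, input_shape=input_shape)

def drop_preprocessing(layer):
    # Swap the preprocessing layers (matched by their default Keras names)
    # for identity passthroughs, so the graph topology stays intact.
    if layer.name in ('rescaling', 'normalization'):
        return tf.keras.layers.Activation('linear', name=layer.name + '_identity')
    # Every other layer is recreated from its own config, keeping its name.
    return layer.__class__.from_config(layer.get_config())

stripped = tf.keras.models.clone_model(model, clone_function=drop_preprocessing)

# clone_model does not copy weights, so transfer them layer by layer by name;
# the identity layers (and the dropped preprocessing layers) are skipped.
for layer in stripped.layers:
    try:
        layer.set_weights(model.get_layer(layer.name).get_weights())
    except ValueError:
        pass  # no matching layer in the original, or nothing to copy
```

Note that `stripped` then expects inputs that you have already rescaled and normalized yourself, since the dropped layers no longer do it for you.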
