
I am trying to make the first 10 layers in this TFHub model non-trainable. I want to freeze those layers so that I can fine-tune the remaining layers. I could not find any example of how to do this. I have seen similar examples with Keras models such as ResNet50, where each layer's `trainable` attribute can be individually set to True or False. I am not able to do this with TFHub models. Any pointer would be appreciated. Thanks.

import tensorflow_hub as tfhub

model_loc = "https://tfhub.dev/google/imagenet/resnet_v1_50/classification/5"
# trainable=True makes every weight in the SavedModel trainable;
# there is no per-layer switch on the KerasLayer itself.
model = tfhub.KerasLayer(
    model_loc,
    input_shape=(224, 224, 3),
    trainable=True)
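One possible workaround (sketched here, not an official TFHub API): a `hub.KerasLayer` does not expose its internal layers as Keras layer objects, but it does expose its weights via `trainable_variables`, and each variable has a `name`. You can therefore run a custom training step that applies gradients only to the variables you want to fine-tune, leaving the rest effectively frozen. The snippet below demonstrates the idea on a small stand-in Keras model; the layer names `early`/`late` and the name-based filter are illustrative assumptions. For the Hub model, you would inspect `v.name` for each of `model.trainable_variables` and filter on the block prefixes you find there instead.

```python
import tensorflow as tf

# Stand-in for the Hub model: "early" plays the role of the layers to
# freeze, "late" the layers to fine-tune.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3,)),
    tf.keras.layers.Dense(4, activation="relu", name="early"),  # to be frozen
    tf.keras.layers.Dense(2, name="late"),                      # to be fine-tuned
])

# Snapshot the weights so we can verify later what actually changed.
early_before = [w.numpy().copy() for w in model.get_layer("early").weights]
late_before = [w.numpy().copy() for w in model.get_layer("late").weights]

# Keep only the variables the optimizer should update. For a
# hub.KerasLayer you would filter model.trainable_variables by v.name.
finetune_vars = [v for v in model.trainable_variables if "late" in v.name]

opt = tf.keras.optimizers.SGD(learning_rate=0.1)
x = tf.random.normal((8, 3))
y = tf.ones((8, 2))

with tf.GradientTape() as tape:
    loss = tf.reduce_mean(tf.square(model(x) - y))

# Gradients are computed and applied only for the chosen subset, so
# "early" stays frozen even though its variables are marked trainable.
grads = tape.gradient(loss, finetune_vars)
opt.apply_gradients(zip(grads, finetune_vars))
```

After this step, the `early` weights are unchanged while the `late` weights have been updated. The trade-off is that you give up `model.fit` for a hand-written training loop, but it works without any access to the SavedModel's internal layer structure.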
  • check this - [tf hub transfer learning](https://www.tensorflow.org/tutorials/images/transfer_learning_with_hub#simple_transfer_learning) – ahmedshahriar May 26 '22 at 07:09
  • 1
    @ahmedshahriar, that page only shows how to freeze or unfreeze the entire model. I am looking to unfreeze only a few layers, not all of them. Please see my OP. – learner May 26 '22 at 22:23
  • 1
    I suggest creating a GitHub issue then; sorry, I can't help you more. – ahmedshahriar May 27 '22 at 09:45

0 Answers