
I am using transfer learning to build a model with three categories. I do not know why I am getting an error about logits and labels. This is my code:

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.preprocessing.image import ImageDataGenerator

baseModel = tf.keras.applications.VGG19(input_shape=(128,128,3), include_top=False, weights='imagenet')
baseModel.trainable = False
labels = ['glass', 'paper', 'plastic']
trainGenerator = ImageDataGenerator(preprocessing_function=tf.keras.applications.vgg19.preprocess_input, rescale=(1/255.0)) \
    .flow_from_directory(directory=trainDir, target_size=(128,128), classes=['glass', 'paper', 'plastic'], batch_size=10)
testGenerator = ImageDataGenerator(preprocessing_function=tf.keras.applications.vgg19.preprocess_input, rescale=(1/255.0)) \
    .flow_from_directory(directory=testDir, target_size=(128,128), classes=['glass', 'paper', 'plastic'], batch_size=10)
validGenerator = ImageDataGenerator(preprocessing_function=tf.keras.applications.vgg19.preprocess_input, rescale=(1/255.0)) \
    .flow_from_directory(directory=validDir, target_size=(128,128), classes=['glass', 'paper', 'plastic'], batch_size=10)
images, label = next(trainGenerator)
model = Sequential()
model.add(Input(shape=(128,128,3)))
model.add(baseModel)
model.compile(optimizer=Adam(learning_rate=0.0001),
              loss='sparse_categorical_crossentropy',
              metrics=['sparse_categorical_accuracy'])
history = model.fit(trainGenerator,
                    epochs=20,
                    shuffle=True,
                    validation_data=validGenerator)

This is the error I am getting:

InvalidArgumentError:  logits and labels must have the same first dimension, got logits shape [160,3] and labels shape [30]
     [[node sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits
 (defined at C:\Users\ugouc\anaconda3\lib\site-packages\keras\backend.py:5114)
]] [Op:__inference_train_function_4832]

Errors may have originated from an input operation.
Input Source operations connected to node sparse_categorical_crossentropy/SparseSoftmaxCrossEntropyWithLogits/SparseSoftmaxCrossEntropyWithLogits:
In[0] sparse_categorical_crossentropy/Reshape_1 (defined at C:\Users\ugouc\anaconda3\lib\site-packages\keras\backend.py:5109)   
In[1] sparse_categorical_crossentropy/Reshape (defined at C:\Users\ugouc\anaconda3\lib\site-packages\keras\backend.py:3561)

When I try to add more layers (e.g. a Flatten layer and a Dense layer with relu), I get an error saying it cannot squeeze a dimension of 3 to 1. Please help.

Uchechi Ugo

1 Answer


Remove the line

model.add(Input(shape=(128,128,3)))

You have already specified the input shape in the VGG19 model code. In the VGG code, change to:

baseModel = tf.keras.applications.VGG19(input_shape=(128,128,3), include_top=False, weights='imagenet', pooling='max')

This makes the output of the base model a vector that can be used as input to a dense layer. Then add the following code after baseModel:

from tensorflow.keras.layers import BatchNormalization, Dense, Dropout
from tensorflow.keras.models import Model
from tensorflow.keras import regularizers
from tensorflow.keras.optimizers import Adamax

x = baseModel.output
x = BatchNormalization(axis=-1, momentum=0.99, epsilon=0.001)(x)
x = Dense(256, kernel_regularizer=regularizers.l2(l=0.016),
          activity_regularizer=regularizers.l1(0.006),
          bias_regularizer=regularizers.l1(0.006), activation='relu')(x)
x = Dropout(rate=.4, seed=123)(x)
output = Dense(3, activation='softmax')(x)  # one unit per class
model = Model(inputs=baseModel.input, outputs=output)
lr = .001  # start with this learning rate
# flow_from_directory defaults to class_mode='categorical' (one-hot labels),
# so use categorical_crossentropy, not sparse_categorical_crossentropy
model.compile(Adamax(learning_rate=lr), loss='categorical_crossentropy', metrics=['accuracy'])
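The effect of pooling='max' can be sketched without TensorFlow: for 128x128 inputs, VGG19's convolutional base produces a (batch, 4, 4, 512) feature map, and global max pooling collapses the 4x4 spatial grid into one 512-long vector per image. A minimal NumPy sketch (the batch size of 10 matches the generators above; the random array stands in for real features):

```python
import numpy as np

# Stand-in for the VGG19 feature map: (batch, height, width, channels).
# 128x128 inputs are downsampled by 2**5 across VGG19's five pooling stages -> 4x4.
features = np.random.rand(10, 4, 4, 512)

# pooling='max' applies global max pooling over the two spatial axes
pooled = features.max(axis=(1, 2))
print(pooled.shape)  # (10, 512) -- one vector per image, ready for a Dense layer
```

With a flat (10, 512) tensor, Dense(3) produces (10, 3) logits, which lines up with one label row per image.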

Also, in your test generator, set shuffle=False.
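For reference, the shapes in the original error can be reproduced with plain arithmetic. Without pooling or a Flatten, a 3-unit Dense layer (which the model presumably ended with when the error was raised) acts per spatial position of the (10, 4, 4, 512) feature map, so the sparse loss sees 10*4*4 = 160 logit rows; meanwhile the one-hot labels of shape (10, 3) get flattened to a vector of 30, hence "logits shape [160,3] and labels shape [30]". A sketch, assuming batch_size=10 as in the generators above:

```python
import numpy as np

batch, classes = 10, 3
spatial = 128 // 2**5  # VGG19 halves 128x128 inputs five times -> 4x4 grid

# Dense(3) applied to the 4-D feature map yields per-position logits
logits = np.zeros((batch, spatial, spatial, classes))
flat_logits_rows = logits.reshape(-1, classes).shape[0]  # 10*4*4 = 160

# flow_from_directory yields one-hot labels (batch, 3);
# sparse_categorical_crossentropy flattens them to a single vector
labels = np.zeros((batch, classes))
flat_labels = labels.reshape(-1).shape[0]  # 10*3 = 30

print(flat_logits_rows, flat_labels)  # 160 30 -- the mismatch from the traceback
```

Both fixes above attack this mismatch: pooling='max' removes the stray spatial dimensions, and categorical_crossentropy matches the one-hot labels the generators actually produce.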

Gerry P