
I know there are similar questions; I have checked them, but they did not solve my problem.

I tried to implement mini-batching on the Fashion-MNIST dataset, so I converted the data from NumPy arrays to a tf.data.Dataset with tf.data.Dataset.from_tensor_slices, but I could not solve a data-shape incompatibility problem. Here is my code:

Loading Data

(train_images, train_labels) , (test_images, test_labels) = fashion_mnist.load_data()

Converting to tf.Dataset:

train_ds = tf.data.Dataset.from_tensor_slices((train_images, train_labels))
test_ds = tf.data.Dataset.from_tensor_slices((test_images, test_labels))

My model

model_1 = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape = [28,28]),
    tf.keras.layers.Dense(50, activation = "relu"),
    tf.keras.layers.Dense(30, activation = "relu"),
    tf.keras.layers.Dense(10, activation = "softmax"),
])

model_1.compile( loss = tf.keras.losses.SparseCategoricalCrossentropy(),
               optimizer = tf.keras.optimizers.Adam(),
               metrics = ["accuracy"])

info = model_1.fit(train_ds,
                  epochs = 10,
                  validation_data = (test_images, test_labels))

But that gives me this error:

ValueError: Input 0 of layer dense_1 is incompatible with the layer: expected axis -1 of input shape to have value 784 but received input with shape [28, 28]

I checked the input shape with the following code, and the output is [28, 28]:

list(train_ds.as_numpy_iterator().next()[0].shape)

How can I solve this problem? I would appreciate any help.

Thanks!

1 Answer


Since you are feeding your model through the tf.data.Dataset API, you should define the batch size on the dataset itself with .batch(). Without it, the dataset yields single unbatched (28, 28) images, and Keras interprets the first axis as the batch dimension, so each image is treated as a batch of 28 rows of 28 pixels. Flatten then passes shape [28, 28] along unchanged instead of producing the 784-element vectors your first Dense layer expects, which is exactly what the error message says.

train_ds = tf.data.Dataset.from_tensor_slices((train_images, train_labels)).batch(256)
test_ds = tf.data.Dataset.from_tensor_slices((test_images, test_labels)).batch(256)

Now you can use both datasets to train your model like:

info = model_1.fit(x=train_ds, epochs = 10, validation_data=test_ds)
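For intuition, .batch(256) simply groups consecutive examples along a new leading axis, so each element the model sees has shape (256, 28, 28). A minimal NumPy sketch of the same slicing (the zero-filled array here is just a hypothetical stand-in for the real images):

```python
import numpy as np

# Hypothetical stand-in for the training images:
# 1000 grayscale 28x28 images (the real Fashion-MNIST train set has 60000).
images = np.zeros((1000, 28, 28), dtype=np.uint8)

batch_size = 256
# Slice the array into consecutive mini-batches, as Dataset.batch(256) does;
# the final batch is smaller when batch_size doesn't divide the dataset evenly.
batches = [images[i:i + batch_size] for i in range(0, len(images), batch_size)]

print(len(batches))        # 4 batches: 256 + 256 + 256 + 232
print(batches[0].shape)    # (256, 28, 28) -- batch axis first, as Keras expects
print(batches[-1].shape)   # (232, 28, 28)
```

With the batch axis in place, Flatten maps (256, 28, 28) to (256, 784), which matches what the Dense layers expect.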
Georgios Livanos