I am currently trying to train a model on a dataset stored as numpy arrays, using
train_dataset=tf.data.Dataset.from_tensor_slices(train_data)
Here, train_data is a numpy array of images without the associated labels. The model I am running was written to work on datasets wrapped as DatasetV1Adapters (MNIST and the pix2pix GAN dataset). I have been looking for documentation on making the required correction for quite a while now (around four weeks), and this approach hasn't solved my problem.
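For reference, a stripped-down version of my setup (the array below is just a placeholder with the same shape and dtype as my real data, which is too large to upload):

import numpy as np
import tensorflow as tf

# stand-in for my real data: 100 RGB images of size 32x32, stored as uint8
train_data = np.zeros((100, 32, 32, 3), dtype=np.uint8)

# from_tensor_slices slices along the first axis, so each element the
# dataset yields is a single image of shape (32, 32, 3)
train_dataset = tf.data.Dataset.from_tensor_slices(train_data)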
For the training process, I was running:
for images in train_dataset:
    # images = np.expand_dims(images, axis=0)
    disc_loss += train_discriminator(images)
Which would give me an error of
ValueError: Input 0 of layer conv2d_2 is incompatible with the layer: expected ndim=4, found ndim=3.
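Printing the shape of what the loop actually receives (using the stand-in array from above) shows what the dataset is yielding:

for images in train_dataset.take(1):
    # each element is a single image; the leading axis of 100 images is gone
    print(images.shape)   # (32, 32, 3)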
So the shape of each element was [32, 32, 3]: the leading dimension of 100 (the number of images) was lost. I then uncommented the line images = np.expand_dims(images, axis=0), which gave me [1, 32, 32, 3] and matched the dimensionality I need. I thought my problem would be solved, but instead I now get the following error:
ValueError: Input 0 of layer conv2d_4 is incompatible with the layer: expected axis -1 of input shape to have value 1 but received input with shape [1, 32, 32, 3]
Which I don't fully understand. It seems like the error is definitely related to the DatasetV1Adapter, as I get the same type of error with various pieces of code. I have tried uploading my dataset to GitHub, but as it is a 10 GB folder, I am unable to actually upload it. Any help would be appreciated.
EDIT: Followed @Sebastian-Sz's advice (kind of). I set the number of channels in the model to three to accommodate RGB instead of grayscale.
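Roughly, the change looked like this (the variable name and filter count are placeholders, since I can't post the full model here):

from tensorflow.keras import layers

# before: the first convolution expected single-channel (grayscale) input
# layers.Conv2D(64, (5, 5), strides=(2, 2), padding='same', input_shape=[32, 32, 1])

# after: the input shape now accepts 3-channel RGB images
first_conv = layers.Conv2D(64, (5, 5), strides=(2, 2), padding='same',
                           input_shape=[32, 32, 3])

Running this updated code gave me: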
TypeError: Value passed to parameter 'input' has DataType uint8 not in list of allowed values: float16, bfloat16, float32, float64
So I added
train_data = np.asarray(train_data, dtype=np.float)
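(With the stand-in array from above, this cast yields float64, since on my NumPy version np.float is just the built-in Python float:)

print(train_data.dtype)   # float64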
Now I get an error saying:
Input 0 of layer dense_5 is incompatible with the layer: expected axis -1 of input shape to have value 6272 but received input with shape [1, 8192]
This one makes no sense to me.