
I am trying to create training, validation, and testing datasets using 'tf.keras.utils.image_dataset_from_directory'. When I create my datasets, I get the following confirmation output:

Found 6198 files belonging to 2 classes.
Found 896 files belonging to 2 classes.
Found 896 files belonging to 2 classes.

But when I try to work with my training dataset, for instance to retrieve the label information with this code:

train_y = np.concatenate([y for x, y in train_ds], axis=0)

I get the following error:

---------------------------------------------------------------------------
InvalidArgumentError                      Traceback (most recent call last)
<ipython-input-26-7b4581e22404> in <module>()
----> 1 test = np.concatenate([y for x, y in train_ds], axis=0)

4 frames
/usr/local/lib/python3.7/dist-packages/tensorflow/python/framework/ops.py in raise_from_not_ok_status(e, name)
   7184 def raise_from_not_ok_status(e, name):
   7185   e.message += (" name: " + name if name is not None else "")
-> 7186   raise core._status_to_exception(e) from None  # pylint: disable=protected-access
   7187 
   7188 

InvalidArgumentError: Input is empty.
     [[{{node decode_image/DecodeImage}}]] [Op:IteratorGetNext]

I also cannot use my training dataset to train my model, as I get a similar error. I am creating the folder directories using the Split Folders API, and I am oversampling my training dataset because of label imbalance. I have tried deleting and remaking my directories, and interestingly sometimes it's the testing dataset that fails, and other times it's the training dataset. At the moment, the validation and testing datasets seem to be working fine. I have no idea what's going wrong.

This is the code I am using to create the datasets:

import os
import pathlib

import numpy as np
import tensorflow as tf

def create_datasets(path, image_size, batch_size, label_mode='int'):
  # Each split lives in its own subdirectory: train/, val/, test/
  train_ds = tf.keras.utils.image_dataset_from_directory(
      os.path.join(path, 'train'),
      labels='inferred',
      label_mode=label_mode,
      seed=123,
      shuffle=True,
      image_size=image_size,
      batch_size=batch_size
  )
  val_ds = tf.keras.utils.image_dataset_from_directory(
      os.path.join(path, 'val'),
      labels='inferred',
      label_mode=label_mode,
      seed=123,
      shuffle=True,
      image_size=image_size,
      batch_size=batch_size
  )
  test_ds = tf.keras.utils.image_dataset_from_directory(
      os.path.join(path, 'test'),
      labels='inferred',
      label_mode=label_mode,
      shuffle=False,  # keep test order stable for evaluation
      image_size=image_size,
      batch_size=batch_size
  )
  return train_ds, val_ds, test_ds

benign_malignant_split_dir = pathlib.Path("/content/gdrive/MyDrive/cw/data_split_benign_malignant")
image_size = (135, 180)
batch_size = 64

train_ds, val_ds, test_ds = create_datasets(path=benign_malignant_split_dir, image_size=image_size, batch_size=batch_size)

Thanks for any insight into this.


1 Answer


Found the issue: one of the images in my folders had become corrupted. Remaking the folder directories fixed it.
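For anyone hitting the same "Input is empty" error from the decode_image node, a quicker alternative to rebuilding the directories is to try decoding every file up front and see which one fails. Below is a minimal sketch (not part of my original code) that walks a dataset root and reports any file tf.io.decode_image cannot parse; the data_root path is just a placeholder for your own directory.

import pathlib

import tensorflow as tf

# Placeholder: point this at your own dataset root.
data_root = pathlib.Path("/content/gdrive/MyDrive/cw/data_split_benign_malignant")

bad_files = []
for image_path in data_root.rglob("*"):
    if image_path.suffix.lower() not in {".jpg", ".jpeg", ".png", ".bmp", ".gif"}:
        continue  # skip non-image files
    data = tf.io.read_file(str(image_path))
    try:
        # Same decoder image_dataset_from_directory uses under the hood;
        # a zero-byte or truncated file raises InvalidArgumentError here.
        tf.io.decode_image(data)
    except tf.errors.InvalidArgumentError:
        bad_files.append(image_path)

print(f"Found {len(bad_files)} undecodable file(s):")
for p in bad_files:
    print(p)

Once the offending files are listed, you can delete or re-export just those images instead of remaking the whole folder tree.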
