
I am still new to this field and trying to learn. I am trying to figure out how to feed an image and its corresponding mask to a U-Net in R.

I found the following example: Keras | Image data preprocessing.

I could follow the steps below and implement them in R until I ran into some trouble:

# we create two instances with the same arguments
data_gen_args = dict(featurewise_center=True,
                     featurewise_std_normalization=True,
                     rotation_range=90,
                     width_shift_range=0.1,
                     height_shift_range=0.1,
                     zoom_range=0.2)
image_datagen = ImageDataGenerator(**data_gen_args)
mask_datagen = ImageDataGenerator(**data_gen_args)
# Provide the same seed and keyword arguments to the fit and flow methods
seed = 1
image_datagen.fit(images, augment=True, seed=seed)
mask_datagen.fit(masks, augment=True, seed=seed)
image_generator = image_datagen.flow_from_directory(
    'data/images',
    class_mode=None,
    seed=seed)
mask_generator = mask_datagen.flow_from_directory(
    'data/masks',
    class_mode=None,
    seed=seed)
# combine generators into one which yields image and masks
train_generator = zip(image_generator, mask_generator)
model.fit(
    train_generator,
    steps_per_epoch=2000,
    epochs=50)
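For context, here is my rough attempt at translating the above to R with the keras package. The `image_data_generator()`, `fit_image_data_generator()`, and `flow_images_from_directory()` calls seem to map over directly; for `zip` I am guessing that a custom R generator function wrapping `generator_next()` is the right approach, but I am not sure:

```r
library(keras)

# Same augmentation arguments for images and masks
datagen_args <- list(
  featurewise_center = TRUE,
  featurewise_std_normalization = TRUE,
  rotation_range = 90,
  width_shift_range = 0.1,
  height_shift_range = 0.1,
  zoom_range = 0.2
)
image_datagen <- do.call(image_data_generator, datagen_args)
mask_datagen  <- do.call(image_data_generator, datagen_args)

# Fit on arrays of images / masks (needed for the featurewise options);
# `images` and `masks` would be 4-D arrays, which is part of my question 1
seed <- 1
fit_image_data_generator(image_datagen, images, augment = TRUE, seed = seed)
fit_image_data_generator(mask_datagen, masks, augment = TRUE, seed = seed)

image_generator <- flow_images_from_directory(
  "data/images", generator = image_datagen,
  class_mode = NULL, seed = seed
)
mask_generator <- flow_images_from_directory(
  "data/masks", generator = mask_datagen,
  class_mode = NULL, seed = seed
)

# My guess at a zip replacement: a function that pulls one batch
# from each stream and returns list(inputs, targets)
train_generator <- function() {
  list(generator_next(image_generator), generator_next(mask_generator))
}

model %>% fit_generator(train_generator, steps_per_epoch = 2000, epochs = 50)
```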

I have two questions:

  1. I have to fit the datagen on the images / masks to be able to use the featurewise transformations (see "Transforming images and masks together" in the Keras example). My question is: what is an efficient way to load multiple images and convert them to arrays? `keras::image_load() %>% keras::image_to_array()` loads a single image. Or should I write a for loop?
  2. Is there a function in R similar to `zip` in Python? Or in what format should I combine the image and corresponding mask generators to be able to fit the model?
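For question 1, the closest I have come is mapping over the file list and stacking the results with the `abind` package (the `target_size` of 256x256 here is just a placeholder; the files would need a common size to stack):

```r
library(keras)
library(abind)

# Collect every image file in the folder
paths <- list.files("data/images", full.names = TRUE)

# Load each one and convert to an array, resized to a common size
arrays <- lapply(paths, function(p) {
  image_load(p, target_size = c(256, 256)) %>% image_to_array()
})

# Stack along a new first dimension -> (n_images, height, width, channels)
images <- abind(arrays, along = 0)
```

Is something like this reasonable, or is there a built-in helper I am missing?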

Thank you for your help.

Shayan Shafiq
Selsheikh
