
I am trying to create a CNN on Google Colab, but I am getting an error when I run it. The error I am getting is InvalidArgumentError: Graph execution error:

Everything is fine up until the last line. Is there a reason why I am getting this error?

from google.colab import drive
drive.mount('/content/drive')

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.models import Sequential
from tensorflow.keras.preprocessing.image import ImageDataGenerator

from sklearn.metrics import classification_report
from sklearn.metrics import confusion_matrix

from tensorflow.keras.callbacks import EarlyStopping
from tensorflow.keras import layers

import matplotlib.pyplot as plt
import numpy
import os

DIRECTORY = "/content/drive/MyDrive/TennisImages"
CLASS_MODE = "categorical"
COLOR_MODE = "rgb"
BATCH_SIZE = 32

training_data_generator = ImageDataGenerator(rescale=1.0/255, zoom_range=0.1, width_shift_range=0.05, height_shift_range=0.05, horizontal_flip = True)
validation_data_generator = ImageDataGenerator()

training_iterator = training_data_generator.flow_from_directory(DIRECTORY, class_mode = "categorical", color_mode = "rgb", batch_size = BATCH_SIZE)
training_iterator.next()
validation_iterator = validation_data_generator.flow_from_directory(DIRECTORY,class_mode='categorical', color_mode='rgb',batch_size=BATCH_SIZE)

def design_model(training_data):
    # sequential model
    model = Sequential()
    # add input layer with rgb image shape
    model.add(tf.keras.Input(shape=(414, 896, 3)))

    model.add(layers.Conv2D(filters = 32, kernel_size = (5,5), activation = 'relu'))
    model.add(layers.MaxPooling2D(2,2))

    model.add(layers.Conv2D(filters = 64, kernel_size = (3,3), activation = 'relu'))
    model.add(layers.MaxPooling2D(2,2))

    model.add(layers.Flatten())
    model.add(layers.Dense(64, activation = 'relu'))

    model.add(layers.Dense(2, activation = 'softmax'))
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=.001), loss=tf.keras.losses.CategoricalCrossentropy(), metrics=[tf.keras.metrics.CategoricalAccuracy(),tf.keras.metrics.AUC()],)
    # summarize model
    model.summary()
    return model

model = design_model(training_iterator)


history = model.fit(training_iterator, steps_per_epoch = 1, epochs = 5, validation_data = validation_iterator, validation_steps = 1)

How can I edit my code so that I can fix this error? Thank you. The error message is:

InvalidArgumentError                      Traceback (most recent call last)
<ipython-input-19-fcce5cac1884> in <module>
      2 
      3 
----> 4 history = model.fit(training_iterator, steps_per_epoch = 1, epochs = 5, validation_data = validation_iterator, validation_steps = 1)

1 frames
/usr/local/lib/python3.7/dist-packages/tensorflow/python/eager/execute.py in quick_execute(op_name, num_outputs, inputs, attrs, ctx, name)
     53     ctx.ensure_initialized()
     54     tensors = pywrap_tfe.TFE_Py_Execute(ctx._handle, device_name, op_name,
---> 55                                         inputs, attrs, num_outputs)
     56   except core._NotOkStatusException as e:
     57     if name is not None:

InvalidArgumentError: Graph execution error:

Detected at node 'sequential_3/flatten_3/Reshape' defined at (most recent call last):
    File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
      "__main__", mod_spec)
    File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
      exec(code, run_globals)
    File "/usr/local/lib/python3.7/dist-packages/ipykernel_launcher.py", line 16, in <module>
      app.launch_new_instance()
    File "/usr/local/lib/python3.7/dist-packages/traitlets/config/application.py", line 846, in launch_instance
      app.start()
    File "/usr/local/lib/python3.7/dist-packages/ipykernel/kernelapp.py", line 612, in start
      self.io_loop.start()
    File "/usr/local/lib/python3.7/dist-packages/tornado/platform/asyncio.py", line 132, in start
      self.asyncio_loop.run_forever()
    File "/usr/lib/python3.7/asyncio/base_events.py", line 541, in run_forever
      self._run_once()
    File "/usr/lib/python3.7/asyncio/base_events.py", line 1786, in _run_once
      handle._run()
    File "/usr/lib/python3.7/asyncio/events.py", line 88, in _run
      self._context.run(self._callback, *self._args)
    File "/usr/local/lib/python3.7/dist-packages/tornado/ioloop.py", line 758, in _run_callback
      ret = callback()
    File "/usr/local/lib/python3.7/dist-packages/tornado/stack_context.py", line 300, in null_wrapper
      return fn(*args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/tornado/gen.py", line 1233, in inner
      self.run()
    File "/usr/local/lib/python3.7/dist-packages/tornado/gen.py", line 1147, in run
      yielded = self.gen.send(value)
    File "/usr/local/lib/python3.7/dist-packages/ipykernel/kernelbase.py", line 365, in process_one
      yield gen.maybe_future(dispatch(*args))
    File "/usr/local/lib/python3.7/dist-packages/tornado/gen.py", line 326, in wrapper
      yielded = next(result)
    File "/usr/local/lib/python3.7/dist-packages/ipykernel/kernelbase.py", line 268, in dispatch_shell
      yield gen.maybe_future(handler(stream, idents, msg))
    File "/usr/local/lib/python3.7/dist-packages/tornado/gen.py", line 326, in wrapper
      yielded = next(result)
    File "/usr/local/lib/python3.7/dist-packages/ipykernel/kernelbase.py", line 545, in execute_request
      user_expressions, allow_stdin,
    File "/usr/local/lib/python3.7/dist-packages/tornado/gen.py", line 326, in wrapper
      yielded = next(result)
    File "/usr/local/lib/python3.7/dist-packages/ipykernel/ipkernel.py", line 306, in do_execute
      res = shell.run_cell(code, store_history=store_history, silent=silent)
    File "/usr/local/lib/python3.7/dist-packages/ipykernel/zmqshell.py", line 536, in run_cell
      return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/IPython/core/interactiveshell.py", line 2855, in run_cell
      raw_cell, store_history, silent, shell_futures)
    File "/usr/local/lib/python3.7/dist-packages/IPython/core/interactiveshell.py", line 2881, in _run_cell
      return runner(coro)
    File "/usr/local/lib/python3.7/dist-packages/IPython/core/async_helpers.py", line 68, in _pseudo_sync_runner
      coro.send(None)
    File "/usr/local/lib/python3.7/dist-packages/IPython/core/interactiveshell.py", line 3058, in run_cell_async
      interactivity=interactivity, compiler=compiler, result=result)
    File "/usr/local/lib/python3.7/dist-packages/IPython/core/interactiveshell.py", line 3249, in run_ast_nodes
      if (await self.run_code(code, result,  async_=asy)):
    File "/usr/local/lib/python3.7/dist-packages/IPython/core/interactiveshell.py", line 3326, in run_code
      exec(code_obj, self.user_global_ns, self.user_ns)
    File "<ipython-input-19-fcce5cac1884>", line 4, in <module>
      history = model.fit(training_iterator, steps_per_epoch = 1, epochs = 5, validation_data = validation_iterator, validation_steps = 1)
    File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 64, in error_handler
      return fn(*args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/training.py", line 1384, in fit
      tmp_logs = self.train_function(iterator)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/training.py", line 1021, in train_function
      return step_function(self, iterator)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/training.py", line 1010, in step_function
      outputs = model.distribute_strategy.run(run_step, args=(data,))
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/training.py", line 1000, in run_step
      outputs = model.train_step(data)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/training.py", line 859, in train_step
      y_pred = self(x, training=True)
    File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 64, in error_handler
      return fn(*args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py", line 1096, in __call__
      outputs = call_fn(inputs, *args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 92, in error_handler
      return fn(*args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/sequential.py", line 374, in call
      return super(Sequential, self).call(inputs, training=training, mask=mask)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/functional.py", line 452, in call
      inputs, training=training, mask=mask)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/functional.py", line 589, in _run_internal_graph
      outputs = node.layer(*args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 64, in error_handler
      return fn(*args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/keras/engine/base_layer.py", line 1096, in __call__
      outputs = call_fn(inputs, *args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/keras/utils/traceback_utils.py", line 92, in error_handler
      return fn(*args, **kwargs)
    File "/usr/local/lib/python3.7/dist-packages/keras/layers/core/flatten.py", line 96, in call
      return tf.reshape(inputs, flattened_shape)
Node: 'sequential_3/flatten_3/Reshape'
Input to reshape is a tensor with 984064 values, but the requested shape requires a multiple of 1435008
     [[{{node sequential_3/flatten_3/Reshape}}]] [Op:__inference_train_function_4377]
Justin
    Does this answer your question? [Invalid Argument Error / Graph Execution Error](https://stackoverflow.com/questions/71153492/invalid-argument-error-graph-execution-error) – nathan liang Sep 24 '22 at 20:24
  • Always include the complete error message, without it we cannot help you. – Dr. Snoopy Sep 24 '22 at 20:32
  • `Input to reshape is a tensor with 984064 values, but the requested shape requires a multiple of 1435008` seems pretty clear. – takendarkk Sep 24 '22 at 20:43
  • Maybe you should add `target_size=(414, 896)` to `flow_from_directory` – AndrzejO Sep 24 '22 at 21:21

1 Answer


When you provide a dataset to the fit method, it does not check the image sizes. By default, flow_from_directory resizes every image to (256, 256), which does not match the (414, 896, 3) input shape you declared. The convolutional and pooling layers accept any spatial size, so the mismatch only surfaces at the Flatten layer, which expects exactly 1435008 values per image based on the preceding layers and the input shape. That is why it fails at the Flatten layer. You can resolve this by providing images of the correct size: add target_size=(414, 896) to flow_from_directory:

training_iterator = training_data_generator.flow_from_directory(DIRECTORY, target_size=(414, 896), class_mode = "categorical", color_mode = "rgb", batch_size = BATCH_SIZE)
training_iterator.next()
validation_iterator = validation_data_generator.flow_from_directory(DIRECTORY, target_size=(414, 896), class_mode='categorical', color_mode='rgb',batch_size=BATCH_SIZE)
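As an optional sanity check (a minimal sketch, assuming the same directory and batch size as above), you can pull one batch and confirm its shape now matches the model's declared input of (414, 896, 3):

# Optional sanity check: a batch should now have shape
# (batch_size, 414, 896, 3), matching tf.keras.Input(shape=(414, 896, 3)).
images, labels = next(training_iterator)
print(images.shape)  # e.g. (32, 414, 896, 3); the last batch may be smaller
print(labels.shape)  # (batch_size, num_classes), e.g. (32, 2) for two classes

If the printed shape matches the input layer, the Flatten layer will receive the expected number of values and model.fit should no longer raise the reshape error.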
AndrzejO