I am trying to use the Dataset API with Keras, specifically the third option in the action plan mentioned here. I also assumed that the third option was already implemented, based on the second comment by @fchollet here.
But when I tried to implement it, I got the following error:
When feeding symbolic tensors to a model, we expect the tensors to have a static batch size. Got tensor with shape:
(None, 32, 64, 64, 3)
I used the following strategy to fit the model:
import tensorflow as tf
from keras.optimizers import SGD

training_filenames = [.....]
dataset = tf.data.TFRecordDataset(training_filenames)
dataset = dataset.map(_parse_function_all)  # Parse the records into tensors.
dataset = dataset.batch(20)
iterator = dataset.make_initializable_iterator()
videos, labels = iterator.get_next()

model = create_base_network(input_shape=(32, 64, 64, 3))
# output dimension will be (None, 10) for the model above
sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd)
model.fit(videos, labels, epochs=10, steps_per_epoch=1000)
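If I read the error correctly, Keras is rejecting the tensors because the batch dimension is None rather than static. I suspect the batching step would need to produce a fully defined shape, something like the following (drop_remainder only exists in TF 1.10+, so this is an assumption about my version, not something I have confirmed fixes it):

# Drop the final partial batch so every batch has exactly 20 elements
# and the batch dimension becomes static (20 instead of None).
dataset = dataset.batch(20, drop_remainder=True)

# On older TF versions, the equivalent (as far as I know) is:
# dataset = dataset.apply(tf.contrib.data.batch_and_drop_remainder(20))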
I can work around the problem by using fit_generator. I found the solution here and applied @Dat-Nguyen's approach (roughly the sketch shown below). But then I wasn't able to access the validation dataset within a custom callback, for example to compute an AUC metric. So I need to use fit instead of fit_generator, but first I need to get rid of this error.
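For context, the fit_generator workaround I applied looks roughly like this (a minimal sketch from memory; dataset_to_generator is my own helper name, and reusing the Keras session to pull batches is my assumption, not code from the linked answer):

import tensorflow as tf
from keras import backend as K

def dataset_to_generator(dataset):
    # Wrap the tf.data iterator in a plain Python generator that
    # fit_generator can consume; each step runs the session and
    # yields a (videos, labels) tuple of numpy arrays.
    iterator = dataset.repeat().make_one_shot_iterator()
    next_batch = iterator.get_next()
    sess = K.get_session()
    while True:
        yield sess.run(next_batch)

model.fit_generator(dataset_to_generator(dataset),
                    steps_per_epoch=1000, epochs=10)

The callback problem is that, as far as I can tell, self.validation_data is only populated when validation data is passed to fit as arrays; with fit_generator it stays empty, so a callback like the one below has nothing to read (sketch only; AucCallback and the sklearn-based metric are my own illustration of what I want to compute):

from sklearn.metrics import roc_auc_score
from keras.callbacks import Callback

class AucCallback(Callback):
    def on_epoch_end(self, epoch, logs=None):
        # self.validation_data holds the validation arrays when
        # model.fit(..., validation_data=...) is used.
        x_val, y_val = self.validation_data[0], self.validation_data[1]
        y_pred = self.model.predict(x_val)
        print('val AUC: %.4f' % roc_auc_score(y_val, y_pred))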
So can anyone tell me why I got this error? Is the third option for fitting the model working now in Keras, or does it still have issues?