
I can iterate one by one through all the training examples (which is painfully slow) and find the training examples that don't get predicted successfully.

I can 'very quickly' batch evaluate the same examples; however, I can only see the loss & accuracy (not the failing predictions), even with verbose=1.

Is there a way to have the batch evaluate emit info for the mispredicted items?

This is for a seq2seq problem.

```python
def decode_sequence(input_seq):
    # Encode the input as state vectors.
    states_value = encoder_model.predict(input_seq)

    # Generate empty target sequence of length 1.
    target_seq = np.zeros((1, 1, num_decoder_tokens))
    # Populate the first character of target sequence with the start character.
    target_seq[0, 0, target_token_index["\t"]] = 1.0

    # Sampling loop for a batch of sequences
    # (to simplify, here we assume a batch of size 1).
    stop_condition = False
    decoded_sentence = ""
    while not stop_condition:
        output_tokens, h, c = decoder_model.predict([target_seq] + states_value)

        # Sample a token
        sampled_token_index = np.argmax(output_tokens[0, -1, :])
        sampled_char = reverse_target_char_index[sampled_token_index]
        decoded_sentence += sampled_char

        # Exit condition: either hit max length
        # or find stop character.
        if sampled_char == "\n" or len(decoded_sentence) > max_decoder_seq_length:
            stop_condition = True

        # Update the target sequence (of length 1).
        target_seq = np.zeros((1, 1, num_decoder_tokens))
        target_seq[0, 0, sampled_token_index] = 1.0

        # Update states
        states_value = [h, c]
    return decoded_sentence
```
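For context, the slow per-example check described above might look like the sketch below. The strings are made up for illustration; in practice `decoded` would come from calling `decode_sequence` on each input sequence, and `targets` would be the ground-truth target texts:

```python
# Hypothetical decoded outputs, e.g. [decode_sequence(seq) for seq in input_seqs]
decoded = ["hello\n", "wrld\n", "bye\n"]
# Hypothetical ground-truth target strings
targets = ["hello\n", "world\n", "bye\n"]

# Collect (index, predicted, expected) for every mismatch.
failures = [(i, d, t) for i, (d, t) in enumerate(zip(decoded, targets)) if d != t]
```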

Thanks in advance

soAcct

1 Answer


You can use tf.keras.Model.predict to predict on a whole batch. Then you just need to compare the predicted values with the true values using tf.math.equal.
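A minimal sketch of that idea, using dummy arrays in place of a real model's output (`probs` stands in for the result of `model.predict(x_batch)`; the `==` comparison below is what `tf.math.equal` would compute on tensors):

```python
import numpy as np

# Stand-in for model.predict(x_batch): shape (n_examples, num_classes).
probs = np.array([[0.1, 0.9],
                  [0.8, 0.2],
                  [0.3, 0.7]])
# Stand-in for the true integer labels.
y_true = np.array([1, 1, 1])

# Take the highest-probability class per example.
y_pred = np.argmax(probs, axis=-1)

# Elementwise comparison; tf.math.equal(y_pred, y_true) is the TF equivalent.
matches = y_pred == y_true

# Indices of the mispredicted examples.
failing_indices = np.where(~matches)[0]
```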

Note: this answer was posted before the author mentioned that this is a seq2seq-related problem.

dinhanhx
  • Thanks for the reply... I forgot to mention this is for a seq2seq model, so the output is a series of predictions, and each sequence may be a different length. Can you think of a good way to do this in the context of seq2seq? – soAcct Feb 18 '21 at 15:23
  • In this case, because I don't have experience with seq2seq models, I can't help you deeply. – dinhanhx Feb 18 '21 at 23:02