
I'm trying to export a tensorflow model like

  feature_spec = { 'words': tf.FixedLenSequenceFeature([], tf.int64, allow_missing=True) }

  def serving_input_receiver_fn():
      """Build the serving inputs."""
      serialized_tf_example = tf.placeholder(dtype=tf.string,
                                             shape=[1],
                                             name='input_example_tensor')
      features = tf.parse_example(serialized_tf_example, feature_spec)
      receiver_tensors = {'words': serialized_tf_example}
      return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

  export_dir = classifier.export_savedmodel(export_dir_base=args.job_dir,
                                            serving_input_receiver_fn=serving_input_receiver_fn)

but I'm receiving this error

  Cannot infer num from shape (1, ?, 128, 128)

I don't know where the ? comes from; I'm guessing it comes from tf.parse_example. Any ideas on what I'm doing wrong here?

andresbravog

1 Answer


I don't know the full reason, but this code seems to work:

  def serving_input_receiver_fn():
      # Use a FixedLenFeature with a fully defined shape so parse_example
      # yields a tensor whose dimensions are all statically known.
      feature_spec = { "words": tf.FixedLenFeature(dtype=tf.int64, shape=[4]) }
      return tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)()
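My guess at why this helps: tf.FixedLenSequenceFeature([], allow_missing=True) parses to a tensor of shape [batch, None], and that unknown sequence dimension is the ? in the error; ops downstream that need a static size (e.g. tf.unstack) can't infer a count from it. A FixedLenFeature with a fully defined shape pins every dimension. A minimal sketch comparing the two specs (written against the tf.compat.v1 graph-mode API; the shape=[4] is carried over from the answer and is an assumption about your data):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # graph mode, as in TF 1.x

serialized = tf.compat.v1.placeholder(tf.string, shape=[1])

# Variable-length spec: the sequence dimension is unknown at graph-build time,
# so the parsed tensor has shape [1, None] -- the None shows up as ? in errors.
var_spec = {"words": tf.io.FixedLenSequenceFeature([], tf.int64, allow_missing=True)}
var_words = tf.compat.v1.parse_example(serialized, var_spec)["words"]

# Fixed-length spec: every dimension is statically known, shape [1, 4].
fixed_spec = {"words": tf.io.FixedLenFeature(shape=[4], dtype=tf.int64)}
fixed_words = tf.compat.v1.parse_example(serialized, fixed_spec)["words"]

print(var_words.shape.as_list())    # second dimension is None (unknown)
print(fixed_words.shape.as_list())  # fully defined
```

If your "words" sequences really do vary in length, a fixed shape means padding or truncating each example to that length before serialization.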
andresbravog