I am adapting the following code to my own needs: https://github.com/tensorflow/models/blob/master/slim/datasets/download_and_convert_flowers.py

I need to zero-pad and resize the images to 299x299 (the Inception V3 input size).

I am doing this by adding a few lines of code, changing the original

        image_data = tf.gfile.FastGFile(filenames[i], 'r').read()
        height, width = image_reader.read_image_dims(sess, image_data)

        class_name = os.path.basename(os.path.dirname(filenames[i]))
        class_id = class_names_to_ids[class_name]

        example = dataset_utils.image_to_tfexample(
            image_data, 'jpg', height, width, class_id)

with this

        image_data = tf.gfile.FastGFile(filenames[i], 'r').read()
        height, width = image_reader.read_image_dims(sess, image_data)

        image_decoded = tf.image.decode_jpeg(image_data, channels=None, ratio=None, fancy_upscaling=None, try_recover_truncated=None, acceptable_fraction=None, name=None)

        M=max(width,height)

        image_decoded = tf.image.pad_to_bounding_box(image_decoded, int(math.floor((M-height)/2)), int(math.floor((M-width)/2)), M, M)

        image_decoded = tf.expand_dims(image_decoded, 0)

        image_decoded = tf.image.resize_bilinear(image_decoded, [299, 299], align_corners=None, name=None)

        image_decoded = tf.squeeze(image_decoded)

        image_decoded = tf.bitcast(image_decoded, tf.uint8)

        image_data = tf.image.encode_jpeg(image_decoded)

        class_name = os.path.basename(os.path.dirname(filenames[i]))
        class_id = class_names_to_ids[class_name]

        example = dataset_utils.image_to_tfexample(image_data, b'jpg', height, width, class_id)
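As a sanity check on the padding arithmetic above (not part of the conversion script), the centering offsets can be computed in plain Python; `center_pad_offsets` is a hypothetical helper:

```python
# Sketch of the zero-padding arithmetic used above (hypothetical helper,
# not part of the conversion script): the image is centered inside an
# M x M square, where M = max(height, width).
def center_pad_offsets(height, width):
    m = max(height, width)
    offset_height = (m - height) // 2  # zero rows added above the image
    offset_width = (m - width) // 2    # zero columns added to the left
    return offset_height, offset_width, m
```

For example, a 200x300 (height x width) image is centered in a 300x300 square with 50 zero rows on top, matching the offsets passed to `tf.image.pad_to_bounding_box`.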

I get the following error:

  File "convert_dataset.py", line 236, in <module>
    tf.app.run()
  File "/home/franco/tensorflow/local/lib/python2.7/site-packages/tensorflow/python/platform/app.py", line 44, in run
    _sys.exit(main(_sys.argv[:1] + flags_passthrough))
  File "convert_dataset.py", line 233, in main
    run(FLAGS.dataset_dir)
  File "convert_dataset.py", line 217, in run
    dataset_dir)
  File "convert_dataset.py", line 165, in _convert_dataset
    example = dataset_utils.image_to_tfexample(image_data, b'jpg', height, width, class_id)
  File "/home/franco/Desktop/dataset_originario/dataset/dataset_utils.py", line 58, in image_to_tfexample
    'image/encoded': bytes_feature(image_data),
  File "/home/franco/Desktop/dataset_originario/dataset/dataset_utils.py", line 53, in bytes_feature
    return tf.train.Feature(bytes_list=tf.train.BytesList(value=[values]))
  File "/home/franco/tensorflow/lib/python2.7/site-packages/google/protobuf/internal/python_message.py", line 508, in init
    copy.extend(field_value)
  File "/home/franco/tensorflow/lib/python2.7/site-packages/google/protobuf/internal/containers.py", line 275, in extend
    new_values = [self._type_checker.CheckValue(elem) for elem in elem_seq_iter]
  File "/home/franco/tensorflow/lib/python2.7/site-packages/google/protobuf/internal/type_checkers.py", line 109, in CheckValue
    raise TypeError(message)
TypeError: <tf.Tensor 'EncodeJpeg:0' shape=() dtype=string> has type <class 'tensorflow.python.framework.ops.Tensor'>, but expected one of: ((<type 'str'>,),)

I have only found this open issue: https://github.com/tensorflow/models/issues/726

Maybe there is something else wrong in my code.

user1188243

1 Answer

I added .eval() to this step:

    image_data = tf.image.encode_jpeg(image_decoded)

so that it becomes:

    image_data = tf.image.encode_jpeg(image_decoded).eval()

.eval() runs the encode op in the default session and returns the JPEG data as a Python string, which is what bytes_feature expects.
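The reason the un-evaluated Tensor is rejected is that tf.train.BytesList is a protobuf field that accepts only raw Python str/bytes values. A minimal sketch of that type check, with a hypothetical `check_bytes_value` standing in for protobuf's internal checker (no TensorFlow needed):

```python
# Sketch of the type check that raised the TypeError in the question:
# protobuf's BytesList accepts only raw bytes/str, never a Tensor object.
# `check_bytes_value` is a hypothetical stand-in for the library's checker.
def check_bytes_value(value):
    if not isinstance(value, (bytes, str)):
        raise TypeError(
            '%r has type %s, but expected bytes or str'
            % (value, type(value)))
    return value

# Evaluated JPEG bytes pass through unchanged; a graph Tensor
# (or any other object) raises TypeError, as in the traceback.
```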

Your GitHub link is dead, so I could not check whether your dataset_utils.image_to_tfexample follows the same pattern as the code below, but from the arguments it looks like it does.

    def bytes_feature(values):
        """Returns a TF-Feature of bytes.

        Args:
          values: A string.

        Returns:
          A TF-Feature.
        """
        return tf.train.Feature(bytes_list=tf.train.BytesList(value=[values]))

    def image_to_tfexample(image_data, image_format, height, width, class_id):
        feature = {
            'image/encoded': bytes_feature(image_data.eval()),
            'image/format': bytes_feature(image_format),
            'image/class/label': int64_feature(class_id),
            'image/height': int64_feature(height),
            'image/width': int64_feature(width),
        }
        return tf.train.Example(features=tf.train.Features(feature=feature))
ricoms