I want to run inference with an fp32 model in fp16 to verify the half-precision results. After loading the checkpoint, the parameters can be converted to float16, but how do I feed these fp16 parameters into a session?
reader = tf.train.NewCheckpointReader(model_file)
var_to_map = reader.get_variable_to_dtype_map()
for key in var_to_map:
    tsr = reader.get_tensor(key)        # a numpy ndarray, not a tf.Tensor
    val_f16 = tsr.astype(np.float16)    # cast in numpy; tf.cast would add a new graph op per variable
    # sess.restore() ???  (no such method -- how do I load val_f16 into the session?)
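One approach (a sketch, not the only way): build the inference graph with `dtype=tf.float16`, then push the casted checkpoint values into the session via assign ops instead of `Saver.restore`, which would fail on the fp32-to-fp16 dtype mismatch. The two-variable model below is a hypothetical stand-in for your real network, and `var_values` stands for the `{name: fp16 array}` dict built in the loop above.

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF1-style graph/session API

tf.disable_eager_execution()

# Hypothetical fp16 inference graph; in practice, rebuild your real model
# with dtype=tf.float16 so variable names match the checkpoint.
x = tf.placeholder(tf.float16, [None, 4], name="x")
w = tf.get_variable("w", [4, 2], dtype=tf.float16)
b = tf.get_variable("b", [2], dtype=tf.float16)
y = tf.matmul(x, w) + b

# Stand-in for the values read from the checkpoint and cast to fp16:
var_values = {
    "w": np.random.randn(4, 2).astype(np.float16),
    "b": np.zeros(2, dtype=np.float16),
}

with tf.Session() as sess:
    # Assign each variable's fp16 value directly, bypassing Saver.restore.
    for v in tf.global_variables():
        name = v.op.name  # variable name without the ":0" suffix
        ph = tf.placeholder(tf.float16, v.shape)
        sess.run(tf.assign(v, ph), feed_dict={ph: var_values[name]})
    out = sess.run(y, feed_dict={x: np.ones([1, 4], np.float16)})
```

Building one `tf.assign` per variable inside the loop is fine for a one-off load; for repeated loads, create the assign ops once before the session loop so the graph does not grow.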