I used tf.keras to build a fully-connected ANN, "my_model". I'm now trying to minimize the function f(x) = my_model.predict(x) - 0.5 + g(x)
using the Adam optimizer from TensorFlow. I tried the code below:
import numpy as np
import tensorflow as tf

# x is the variable to optimize; my_model and g(x) are defined earlier
x = tf.get_variable('x', initializer=np.array([1.5, 2.6]))
f = my_model.predict(x) - 0.5 + g(x)
optimizer = tf.train.AdamOptimizer(learning_rate=.001).minimize(f)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(50):
        print(sess.run([x, f]))
        sess.run(optimizer)
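In case it matters, my_model is an ordinary fully-connected tf.keras model built and trained beforehand, roughly along these lines (a simplified stand-in; the layer sizes and input shape here are only illustrative):

import tensorflow as tf

# simplified stand-in for the real fully-connected network,
# which is trained on my data before the optimization step above
my_model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation='relu', input_shape=(2,)),
    tf.keras.layers.Dense(1)
])
my_model.compile(optimizer='adam', loss='mse')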
However, I'm getting the following error when my_model.predict(x) is executed:
If your data is in the form of symbolic tensors, you should specify the steps argument (instead of the batch_size argument)
I understand what the error means, but I'm unable to figure out how to make my_model.predict(x) work in the presence of symbolic tensors. If my_model.predict(x) is removed from the function f(x), the code runs without any error; a stripped-down version that works is shown below.
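For reference, this is roughly the variant without the model call that optimizes fine (here g(x) is just a hypothetical stand-in quadratic; my real g is more involved):

import numpy as np
import tensorflow as tf

# hypothetical stand-in for my real g(x); any differentiable TF expression works
def g(x):
    return tf.reduce_sum(tf.square(x))

x = tf.get_variable('x', initializer=np.array([1.5, 2.6]))
f = -0.5 + g(x)  # same objective, with the my_model.predict(x) term dropped
optimizer = tf.train.AdamOptimizer(learning_rate=.001).minimize(f)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(50):
        print(sess.run([x, f]))
        sess.run(optimizer)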
I checked the following link, where TensorFlow optimizers are used to minimize an arbitrary function, but I think my problem lies in the use of the underlying Keras model.predict() function. I appreciate any help. Thanks in advance!