
I am trying to implement a visualization of optimization algorithms in TensorFlow.

Therefore I started with Beale's function:

    f(x, y) = (1.5 - x + x*y)^2 + (2.25 - x + x*y^2)^2 + (2.625 - x + x*y^3)^2

The global minimum is at:

    f(3, 0.5) = 0

A plot of Beale's function looks like this: [plot of Beale's function]
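As a quick sanity check (plain Python, independent of TensorFlow), the function and its global minimum can be evaluated directly:

```python
# Beale's function, written out directly
def beale(x, y):
    return ((1.5 - x + x * y) ** 2
            + (2.25 - x + x * y ** 2) ** 2
            + (2.625 - x + x * y ** 3) ** 2)

print(beale(3.0, 0.5))  # 0.0 -- the global minimum
print(beale(3.0, 4.0))  # a very large value at the chosen start point
```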

I would like to start the optimization at the point (x=3.0, y=4.0).

How do I implement this in TensorFlow with optimization algorithms?

My first try looks like this

import tensorflow as tf

# Beale's function
x = tf.Variable(3.0, trainable=True)
y = tf.Variable(4.0, trainable=True)
f = tf.add_n([tf.square(tf.add(tf.subtract(1.5, x), tf.multiply(x, y))),
              tf.square(tf.add(tf.subtract(2.25, x), tf.multiply(x, tf.square(y)))),
              tf.square(tf.add(tf.subtract(2.625, x), tf.multiply(x, tf.pow(y, 3))))])

Y = [3, 0.5]  # location of the global minimum (currently unused)

loss = f
opt = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

for i in range(100):
    print(sess.run([x, y, loss]))
    sess.run(opt)

Obviously this doesn't work. I guess I have to define a correct loss, but how? To clarify: my problem is that I don't understand how TensorFlow works, and I don't know much Python (I'm coming from Java, C, C++, Delphi, ...). My question is not about how this works or what the best optimization methods are; it's only about how to implement this correctly.

Spenhouet
  • I think you need to change your definition of function `f` to `f = tf.add_n([tf.square(tf.add(tf.subtract(1.5, x), tf.multiply(x, y))), tf.square(tf.add(tf.subtract(2.25, x), tf.multiply(x, tf.square(y)))), tf.square(tf.add(tf.subtract(2.625, x), tf.multiply(x, tf.pow(y, 3))))])` – Vladimir Bystricky Jul 22 '17 at 06:53
  • Yes, sorry I did see this too but forgot to update my question. – Spenhouet Jul 22 '17 at 10:55

1 Answer


I already figured it out. The problem is that the gradients explode, so I clip them to the range [-4.5, 4.5] (the usual search domain of Beale's function) so that x and y don't diverge to infinity.

This solution works:

import tensorflow as tf

# Beale's function
x = tf.Variable(3.0, trainable=True)
y = tf.Variable(4.0, trainable=True)
f = tf.add_n([tf.square(tf.add(tf.subtract(1.5, x), tf.multiply(x, y))),
              tf.square(tf.add(tf.subtract(2.25, x), tf.multiply(x, tf.square(y)))),
              tf.square(tf.add(tf.subtract(2.625, x), tf.multiply(x, tf.pow(y, 3))))])

opt = tf.train.GradientDescentOptimizer(0.01)
grads_and_vars = opt.compute_gradients(f, [x, y])
clipped_grads_and_vars = [(tf.clip_by_value(g, -4.5, 4.5), v) for g, v in grads_and_vars]

train = opt.apply_gradients(clipped_grads_and_vars)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

for i in range(100):
    print(sess.run([x, y]))
    sess.run(train)
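For intuition, the same scheme (gradient descent with per-component gradient clipping on Beale's function) can also be sketched in plain Python without TensorFlow. The gradients below are derived by hand, and the learning rate and iteration count are illustrative choices, not tuned values:

```python
# Pure-Python sketch of gradient descent with clipped gradients
# on Beale's function (same idea as the TensorFlow code above).

def beale(x, y):
    return ((1.5 - x + x * y) ** 2
            + (2.25 - x + x * y ** 2) ** 2
            + (2.625 - x + x * y ** 3) ** 2)

def beale_grad(x, y):
    # t1, t2, t3 are the three residual terms before squaring
    t1 = 1.5 - x + x * y
    t2 = 2.25 - x + x * y ** 2
    t3 = 2.625 - x + x * y ** 3
    dx = 2 * t1 * (y - 1) + 2 * t2 * (y ** 2 - 1) + 2 * t3 * (y ** 3 - 1)
    dy = 2 * t1 * x + 2 * t2 * (2 * x * y) + 2 * t3 * (3 * x * y ** 2)
    return dx, dy

def clip(g, lo=-4.5, hi=4.5):
    # per-component clipping, like tf.clip_by_value above
    return max(lo, min(hi, g))

x, y = 3.0, 4.0           # same start point as above
lr = 0.01
start_loss = beale(x, y)
for _ in range(10000):
    dx, dy = beale_grad(x, y)
    x -= lr * clip(dx)
    y -= lr * clip(dy)

print(x, y, beale(x, y))  # the loss ends up far below start_loss
```

As with the TensorFlow version, this is not guaranteed to reach the global minimum at (3, 0.5); the clipping only keeps the iterates from blowing up.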

If someone knows whether it's possible to add multiple neurons / layers to this code, please feel free to write an answer.

Spenhouet
  • Your solution is still far from a global optimization method. – Jonas Adler Jul 22 '17 at 10:40
  • Could you be more specific? What is still wrong and what can I improve? If you have a better solution, feel free to share :) – Spenhouet Jul 22 '17 at 10:56
  • Well your original question was "find global minimum of a function", which is a well studied (and very hard) problem in optimization, see e.g. [wikipedia](https://en.wikipedia.org/wiki/Global_optimization). It is well known that gradient descent *does not* (in general) find the global minimum, so you would need to totally change your method to e.g. simulated annealing or basin hopping. – Jonas Adler Jul 22 '17 at 11:00
  • Ah, you misunderstood my question. I know all of that and those techniques very well; that was not the point of my question. Even the parameters are not optimized at all (the solution above doesn't even come near the global minimum). My problem is that I don't understand TensorFlow and Python and couldn't figure out how to write this correctly. So you don't have any improvement for the implementation itself? – Spenhouet Jul 22 '17 at 11:07
  • Nope, not really, but I would change the title of this question since "global minimum" has a well defined meaning. – Jonas Adler Jul 22 '17 at 11:16
  • I see where you are coming from. Sure, I changed the title. :) – Spenhouet Jul 22 '17 at 11:18
  • How do you print out the values of the function in the loop? – David293836 May 27 '18 at 18:04
  • @user293836 Could you clarify your question? – Spenhouet May 28 '18 at 07:15
  • At the end of the script, it has `print(sess.run([x, y]))`, which prints out the values of x and y. I also want to see the value of the function f. I am asking how to print f. – David293836 May 28 '18 at 17:50