I have an optimization problem that involves minimizing a function whose gradient I know, but whose actual value at any given point is unknown.
I'd like to optimize the function using BFGS, but every BFGS implementation I've found seems to require knowledge of the objective's value, especially in the line search step. I've looked at both a Python (SciPy) implementation and a C++ implementation of BFGS.
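Here's what I mean, as a minimal sketch (`grad_h` and `x0` are hypothetical stand-ins for my actual gradient routine and starting point): SciPy's `minimize` makes `fun` a required argument, and the Wolfe line search inside BFGS evaluates it at every trial step, so passing a dummy objective doesn't get me anywhere.

```python
import numpy as np
from scipy.optimize import minimize

# Stand-in gradient: I can evaluate grad h(x), but have no h(x) to go with it.
def grad_h(x):
    return 2.0 * x  # pretend gradient, purely for illustration

x0 = np.array([3.0, -1.0])

# `fun` is required, and the line search calls it at every trial step.
# With a constant dummy objective the sufficient-decrease (Armijo)
# condition can never hold, so the line search just fails.
res = minimize(lambda x: 0.0, x0, jac=grad_h, method='BFGS')
print(res.message)
```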
Obviously I can use gradient descent, but I'd prefer not to reinvent the wheel here.
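(For concreteness, the wheel I'd rather not reinvent is roughly this fixed-step loop; the step size and tolerance are hand-tuned placeholders:)

```python
import numpy as np

def gradient_descent(grad, x0, step=1e-2, tol=1e-8, max_iter=100_000):
    """Plain fixed-step gradient descent: needs only the gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is ~zero
            break
        x = x - step * g             # step against the gradient
    return x
```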
Any ideas?
Some more detail: I want to minimize h, but I'm not given h directly. What I'm given is h = f(g), along with an explicit formula for g(x). f basically transforms the gradients of g in a tricky geometric way that is not too difficult to calculate, but impossible to integrate. So it's pretty straightforward to calculate the gradient of h(x), but hard to get explicit values for h(x).
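As a simplified scalar analogue of that structure (`g`, `df`, and the resulting `grad_h` below are hypothetical stand-ins, not my actual formulas): if f' is something like exp(-t^2), which has no elementary antiderivative, then the chain rule gives me the gradient of h = f(g(x)) exactly, even though I can't write h itself down.

```python
import numpy as np

# Hypothetical stand-ins illustrating the structure of the problem.
def g(x):                  # explicit formula: known
    return float(np.dot(x, x))

def grad_g(x):             # its gradient: also known
    return 2.0 * x

def df(t):                 # f' is cheap to evaluate, but f itself
    return np.exp(-t * t)  # has no elementary antiderivative

def grad_h(x):
    # Chain rule: exact and cheap to compute...
    return df(g(x)) * grad_g(x)

# ...but there's no explicit h(x) = f(g(x)) I can hand to a line search,
# which is exactly my problem.
```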