
I have a function

function [output1, output2] = func(v1,v2,v3,v4,v5,v6,v7,v8,v9,v10)

that I want to discretize. I am going to be performing an optimization that involves this function, and I think the optimization's efficiency would benefit from discretizing the function once and then doing spline interpolation on the tabulated data instead of evaluating the continuous function every time. Essentially I would want a 10-D double array for each of output1 and output2, indexed by the varying values of v1, v2, ..., v10.

With infinite time and memory I would do the following:

n_pts = 100;

v1 = linspace(v1_min, v1_max, n_pts);
...
v10 = linspace(v10_min, v10_max, n_pts);

[v1g, v2g, ..., v10g] = ndgrid(v1, v2, ..., v10);

[output1, output2] = arrayfun(@func, v1g, v2g, ..., v10g);

Time and memory (needed to execute ndgrid and arrayfun) obviously do not allow for this. Can anyone think of a work-around, or is this problem of discretizing a function of 10 variables totally intractable?
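
For scale, here is a quick back-of-the-envelope estimate of what the full tabulation would cost (the 1 ms per call is just an assumed figure):

n_pts   = 100;
n_vars  = 10;
n_evals = n_pts^n_vars;                       % 1e20 function evaluations
bytes_per_output = 8 * n_evals;               % ~8e20 bytes (~800 exabytes) per output array
years_at_1ms = n_evals * 1e-3 / 86400 / 365   % ~3e9 years at an assumed 1 ms per evaluation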

user178831
  • So, you want to run your function `1e20` times? If each call takes a millisecond, you'll be running this for *thousands* of years. The reason optimization is a huge field in mathematics is to avoid doing things like this... – Ander Biguri Mar 16 '16 at 17:24
  • I'm basically trying to conceive of a way to collect data on this function (for later interpolation) without needing a ridiculous number of data points. – user178831 Mar 16 '16 at 17:35
  • The wrong thing is the approach to start with. – Ander Biguri Mar 16 '16 at 17:36
  • So essentially the function has way too many variables and there's no way to discretize it? – user178831 Mar 16 '16 at 17:43
  • You are using the wrong term here; you are not "discretizing". You are trying to evaluate the function 1e20 times. If you want to optimize it, that is a HUGE field in mathematics, where *no one* solves it by just plugging in values and seeing which one gives the minimum. If that were possible, an incredible number of currently unsolved problems would already be solved. – Ander Biguri Mar 16 '16 at 17:46
  • My optimization isn't to minimize func. I have an optimization problem in which func appears. One problem with func is that it is not twice continuously differentiable. I figured I could replace func (where it appears in the optimization problem) with a lookup table that can be interpolated with splines to guarantee smoothness properties of func and make the time the optimization spends evaluating func much shorter. Does this make sense? – user178831 Mar 16 '16 at 18:06
  • A function doesn't need to be twice continuously differentiable to be optimized. There are lots of algorithms for these cases. Still, I am going to be bold: **spending thousands of years of computation on ANYTHING doesn't make sense, no** – Ander Biguri Mar 16 '16 at 18:10
  • Understood. Maybe the questions I should be asking are 1) would this strategy make sense if func were a function of 2 variables? If yes, 2) is there any way of extending the strategy to a function of 10 variables, or does this extension totally change the way the problem has to be addressed? (See the 2-variable sketch after these comments.) Thanks for your replies, btw. – user178831 Mar 16 '16 at 18:14
  • Indeed! You need to take a step back and look at how to solve your optimization problem, instead of trying this approach. – Ander Biguri Mar 16 '16 at 18:19
  • Consider accepting the answer if it helped – Ander Biguri Mar 16 '16 at 18:26
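
To make the 2-variable case from this exchange concrete: the tabulate-then-spline strategy is entirely feasible in low dimensions. Here is a minimal sketch using MATLAB's griddedInterpolant, where func2 and its bounds are hypothetical stand-ins rather than anything from the question:

func2 = @(x, y) abs(x) .* sin(y);   % hypothetical stand-in; not twice differentiable at x = 0

n_pts = 100;
x = linspace(-1, 1, n_pts);
y = linspace(-pi, pi, n_pts);
[xg, yg] = ndgrid(x, y);
tbl = func2(xg, yg);                % 100^2 = 1e4 evaluations: cheap

F = griddedInterpolant({x, y}, tbl, 'spline');   % smooth spline surrogate
F(0.3, 1.2)                         % fast approximate evaluation of func2(0.3, 1.2)

The same construction works in principle for 10 variables, but the table grows as n_pts^10, which is exactly the 1e20-point wall discussed above.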

1 Answer


You are on a totally wrong path. Even assuming you had infinite memory, the last line would call your function 100^10 = 1e20 times, which would require an enormous amount of time. No reasonable optimisation strategy would call your function that many times; that is exactly why all those sophisticated strategies were developed.

You may, however, use your strategy to pre-compute computation-intensive subterms of your function. Replacing a very costly term that involves only three variables with a 100^3 lookup table might improve performance significantly without using too much memory.
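
As a minimal sketch of that suggestion, assuming func contains a costly subterm that depends only on v1, v2 and v3 (expensive_term and the [0, 1] bounds below are hypothetical placeholders):

expensive_term = @(a, b, c) besselj(0, a.*b) .* exp(-c.^2);   % hypothetical costly subterm

n_pts = 100;
v1 = linspace(0, 1, n_pts);
v2 = linspace(0, 1, n_pts);
v3 = linspace(0, 1, n_pts);
[g1, g2, g3] = ndgrid(v1, v2, v3);
tbl = expensive_term(g1, g2, g3);   % 100^3 = 1e6 evaluations, only ~8 MB of doubles

T = griddedInterpolant({v1, v2, v3}, tbl, 'spline');
% inside func, call T(a, b, c) in place of expensive_term(a, b, c)

As a side benefit, the spline surrogate is smooth, which speaks to the differentiability concern raised in the comments.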

Daniel
  • Just to be clear, my point is to pre-compute this function so that the optimization only needs the lookup table (and can interpolate it) instead of the continuous function. So yes, point taken, I will see if I can use this strategy on parts of the function. – user178831 Mar 16 '16 at 17:38
  • What is the point if creating the lookup table takes more time than your search strategy would spend evaluating the function? – Daniel Mar 16 '16 at 17:46
  • One purpose is to be able to use spline interpolation to guarantee a certain level of smoothness, but the other purpose (the main one) was to make the optimization time shorter. I would rather create a lookup table once (even taking a reasonably long time) and then be able to run the optimization multiple times (since it would be quicker). But of course, I need to actually be able to create the lookup table in a reasonable amount of time. I guess this isn't feasible. – user178831 Mar 16 '16 at 17:49
  • *was to make the optimization time shorter*, well yeah, the optimization time will be shorter if you take thousands of years to compute possible solutions. @user178831 – Ander Biguri Mar 16 '16 at 17:50