I am looking for a good way to find the optimal input parameter for a Linux program using R.
It takes about 20 seconds to run each time.
You put in a single integer as input and get a single decimal number out. My goal is to get this output as close to 4.5 as possible, although sometimes that is not achievable. I would also like to keep the input as low as possible. The input can vary from 30 to 10,000.
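For concreteness, here is a minimal sketch of how the program could be wrapped as an R function (`./myprogram` is a stand-in for the real binary, and the penalty weight is an arbitrary choice to express the preference for lower inputs):

```r
# Run the external program with a single integer input and parse the
# single number it prints to stdout. "./myprogram" is a placeholder.
evaluate <- function(x) {
  out <- system2("./myprogram", args = as.character(x), stdout = TRUE)
  as.numeric(out)
}

# Objective: distance from the target 4.5, plus a small (arbitrary)
# penalty on the input so that lower inputs win ties.
objective <- function(x, target = 4.5, penalty = 1e-4) {
  abs(evaluate(x) - target) + penalty * x
}
```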
I have done some research on the subject and experimented with toy datasets, but I am unsure how best to proceed. I have tried some simple if/for loops that exhaustively step through the inputs (or enough of them until I get reasonably close), but this seems very crude: I just incrementally increased/decreased the input and measured the effect on the output. This did tell me that the relationship between the input and output is not linear. It was also more complicated than it first seemed, with many peaks and troughs.
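Roughly, my loop looks like the sketch below; the step size and stopping tolerance are arbitrary choices, and `evaluate()` is the wrapper from above. Since each run takes about 20 seconds, even this coarse grid is slow:

```r
# Crude incremental search over the input range, roughly what I have
# been doing. At ~20 s per run, a step of 50 is already ~200 runs.
best_x <- NA
best_gap <- Inf
for (x in seq(30, 10000, by = 50)) {
  gap <- abs(evaluate(x) - 4.5)
  if (gap < best_gap) {
    best_x <- x
    best_gap <- gap
  }
  if (gap < 0.01) break  # stop once reasonably close; keeps the input low
}
```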
The lpSolve and ompr packages seem like the right ones, but I am unsure how to use them, as the theory involved is well beyond me!