I am new to the community. I have two-dimensional data (x and y). Each data point of y can be modeled with an equation, for example: y = d·ln(1 + exp(x − a)/(b·c)). I know the values of a and c. To fit a curve to the data, I assigned initial values of 1 to both b and d, which generates the following curve.
I know that if I repeatedly increase b by some step (say b = b + 0.05) and decrease d while watching the graph, I will eventually match the data points with some error. But that is an iterative, manual approach: adjust b every time and check the fitting error. Is there an optimization or fitting technique that minimizes the error between ydata and y = d·ln(1 + exp(x − a)/(b·c)) and returns the parameter values that generate the curve with the smallest possible error? Do you know any techniques for this kind of problem? Thanks.
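For what it's worth, this is exactly what nonlinear least-squares fitting does. A minimal sketch in Python using `scipy.optimize.curve_fit`, where the values of `a` and `c` and the synthetic data are placeholders standing in for the real constants and measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Known constants (placeholder values -- substitute your actual a and c)
a, c = 2.0, 1.5

def model(x, b, d):
    # y = d * ln(1 + exp(x - a) / (b * c))
    return d * np.log(1.0 + np.exp(x - a) / (b * c))

# Synthetic data standing in for the measured (x, y) points,
# generated with "true" parameters b = 3.0, d = 0.8 plus noise
rng = np.random.default_rng(0)
x_data = np.linspace(0.0, 10.0, 50)
y_data = model(x_data, 3.0, 0.8) + rng.normal(0.0, 0.01, x_data.size)

# Least-squares fit, starting from the same initial guess b = d = 1
(b_fit, d_fit), covariance = curve_fit(model, x_data, y_data, p0=[1.0, 1.0])
print(b_fit, d_fit)
```

`curve_fit` uses the Levenberg-Marquardt algorithm by default, which automates the "adjust parameters, check the error, repeat" loop by minimizing the sum of squared residuals between `y_data` and the model.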