I am fitting a curve using scipy.optimize.curve_fit. From what I can tell, the fit is performed by minimizing the sum of the squared residuals of f(xdata, *popt) - ydata, whereas I want to minimize the sum of the squared relative residuals, (f(xdata, *popt) - ydata)/ydata, since the order of magnitude of my ydata varies a lot. How can I optimize using the relative deviation? I do not necessarily need to use the curve_fit function; any Python function that achieves this is fine.
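
For concreteness, here is a minimal sketch of what I am doing now; the model f and the synthetic data are just placeholders for my real ones:

```python
import numpy as np
from scipy.optimize import curve_fit

def f(x, a, b):
    # placeholder model; my real model is different
    return a * np.exp(b * x)

xdata = np.linspace(0, 4, 20)
# synthetic data spanning a few orders of magnitude
ydata = f(xdata, 1.5, 1.2) * (1 + 0.05 * np.random.default_rng(0).normal(size=xdata.size))

# what curve_fit does by default: minimize sum((f(xdata, *popt) - ydata)**2)
popt, pcov = curve_fit(f, xdata, ydata)

# what I want instead: minimize sum(((f(xdata, *popt) - ydata) / ydata)**2)
```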
PS: I am aware of the alternative approach of converting the ydata into log space and fitting the resulting data, but I do not want to take that approach.
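
For reference, this is the log-space workaround I mean (reusing the placeholder f, xdata, and ydata from the sketch above), which I would like to avoid:

```python
def log_f(x, a, b):
    # log of the placeholder model: log(a * exp(b * x)) = log(a) + b * x
    return np.log(a) + b * x

# fit log(ydata) instead of ydata -- this flattens the magnitude spread,
# but it is the approach I would rather not take
popt_log, pcov_log = curve_fit(log_f, xdata, np.log(ydata))
```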