I am trying to fit a power law of the form a*x**b + c to some data points, using curve_fit from scipy.optimize.
Here's the MWE:
import numpy as np
from scipy.optimize import curve_fit
import matplotlib.pyplot as plt

def func_powerlaw(x, m, c, c0):
    return c0 + x**m * c

x = np.array([1.05, 1.0, 0.95, 0.9, 0.85, 0.8, 0.75, 0.7, 0.65, 0.6, 0.55])
y = np.array([1.26, 1.24, 1.2, 1.17, 1.1, 1.01, 0.95, 0.84, 0.75, 0.71, 0.63])
dy = np.array([0.078] * 11)  # uniform uncertainty on every point

fig, a1 = plt.subplots(ncols=1, figsize=(10, 10))
a1.errorbar(x, y, yerr=dy, ls='', marker='o')

popt, pcov = curve_fit(func_powerlaw, x, y, sigma=dy, p0=[0.3, 1, 1],
                       bounds=[(0.1, -2, -2), (0.9, 10, 2)],
                       absolute_sigma=False, maxfev=10000, method='trf')
perr = np.sqrt(np.diag(pcov))  # 1-sigma errors from the covariance diagonal

xp = np.linspace(x[0], x[-1], 100)
a1.plot(xp, func_powerlaw(xp, *popt), lw=3, zorder=1, c='b')
print(popt, perr)
Output: [0.35609897 3.24929422 -2.] [0.47034928 3.9030258 3.90965249]
For all three parameters, the errors are larger than the estimated values themselves. Judging from experience, this cannot be right, since the fitted line matches the data points very well.
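In case it helps diagnose the problem, here is how I look at the off-diagonal structure of pcov as a correlation matrix (a quick sketch reusing pcov and perr from the code above):

# Normalize the covariance matrix into a correlation matrix
corr = pcov / np.outer(perr, perr)
print(np.round(corr, 3))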
Even if I drop the bounds and/or the initial guess, the fitted values change, but the errors remain just as large.
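Concretely, the unbounded variant I tried looks roughly like this (sigma kept, everything else left to the defaults, which select the 'lm' method):

# Same fit without bounds or p0; curve_fit then starts from all-ones
popt2, pcov2 = curve_fit(func_powerlaw, x, y, sigma=dy, maxfev=10000)
perr2 = np.sqrt(np.diag(pcov2))
print(popt2, perr2)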
The only constraint I actually need is 0.1 <= m <= 0.9.
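If I understand the bounds format correctly, that minimal constraint would look something like this (a sketch; np.inf should leave c and c0 unconstrained):

# Constrain only m, leaving c and c0 free
popt3, pcov3 = curve_fit(func_powerlaw, x, y, sigma=dy, p0=[0.3, 1, 1],
                         bounds=([0.1, -np.inf, -np.inf], [0.9, np.inf, np.inf]))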
What am I doing wrong?
Any help is greatly appreciated!