I am trying to fit a 1D Gaussian using ODR (scipy.odr) in Python, but I keep getting incorrect fit results.
For simplicity, assume I have a set of 19 data points. This is the data I want to fit:
import numpy as np
import matplotlib.pyplot as plt
from scipy.odr import ODR, Model, Data
x = np.arange(0,19,1)
y = np.array([5.64998480e+09, 3.03653479e+10, 2.18927521e+11, 6.22541771e+11,
1.24917901e+12, 2.05145638e+12, 2.92904416e+12, 3.74656109e+12,
4.36310058e+12, 4.66564452e+12, 4.59701326e+12, 4.17028923e+12,
3.46549578e+12, 2.60950760e+12, 1.74504950e+12, 9.97650569e+11,
4.49637554e+11, 1.27693929e+11, 6.10512095e+09])
def func(beta, data):
    # beta = (height, center_x, width); data holds the x values
    x = data
    height, center_x, width = beta
    return height * np.exp(-(((center_x - x) / width) ** 2 / 2))
data = Data([x], y)
model = Model(func)
odr = ODR(data, model, [1e12, 10, 2])  # initial guess for height, center_x, width
res = odr.run()
plt.figure()
plt.plot(x, y)                        # raw data
plt.plot(x, func(res.beta, x), '-o')  # fitted Gaussian
plt.show()
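In case it helps with diagnosing, this is how the fit result can be inspected (res.beta, res.stopreason and res.pprint() are standard attributes/methods of the scipy.odr Output object returned by run()):

print(res.beta)        # fitted height, center_x, width
print(res.stopreason)  # reason the optimisation stopped
res.pprint()           # full fit summary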
What is wrong with my code?
Thanks!