I am trying to minimize a function with scipy.optimize.minimize, but the estimate of sigma is way off. Any help would be greatly appreciated.
Here's my code so far:
import numpy as np
from numpy import sqrt, exp, log, pi
from scipy.optimize import minimize
np.set_printoptions(linewidth=999999)
# Process parameters
beta=1.5
sigma=0.01
theta=0.7
T=50
N=1000
dt = T/N
#n=1 ## number of simulations
M = np.zeros((N))
#for k in range(n):
# Iterate to compute the steps of the Brownian motion.
for i in range(N):
    M[i] = (theta + (M[i-1]-theta)*exp(-beta*dt)) + sigma*np.random.normal(0, sqrt((1-exp(-2*beta*dt))/2*beta))
M[0] = 0.7 ## initial value
#print(M)
def mle(params):
    beta = params[0]
    theta = params[1]
    sigma = params[2]
    alpha = exp(-beta*dt)
    eta = (sigma**2)*(1-exp(-2*beta*dt))/(2*beta)
    LL = -(-N/2*log(2*pi) - N*log(sqrt(eta)) - (np.sum((M[i]-M[i-1]*alpha - theta*(1-alpha))**2))/(2*eta))
    return LL
initParams = [1, 1, 1]
#params1 = np.array([1,1,1])
res = minimize(mle, initParams, method='nelder-mead')
print(res.x)
What I am getting is this:
[ 1.84035906e+00 7.41336913e-01 2.00523821e-23]
As you can see, the estimate of sigma is way off. I don't understand why this is happening. My intuition is that there is some instability in the optimization and that sigma just collapses towards zero. Would putting some bounds on sigma help?
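What I had in mind was something like the following, switching to a solver that accepts bounds (L-BFGS-B) and keeping sigma strictly positive, though I'm not sure this is the right approach and the lower bound of 1e-6 is just a guess on my part:

bounds = [(None, None), (None, None), (1e-6, None)]  # no bounds on beta/theta, keep sigma > 0
res = minimize(mle, initParams, method='L-BFGS-B', bounds=bounds)
print(res.x)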
Thanks