I wanted to print the execution time of evaluating the system of equations at every t + dt, where dt is 0.01 s:
import numpy as np
from scipy.integrate import odeint
import matplotlib.pyplot as plt
import time
from datetime import timedelta

# function that returns dz/dt
def model(z, t):
    # time.clock() was deprecated in Python 3.3 and removed in 3.8;
    # time.perf_counter() is the wall-clock replacement
    start_time = time.perf_counter()
    dxdt = z[1]
    dydt = (-1.2*(z[0] + z[1]) + 0.2314*z[0] + 0.6918*z[1]
            - 0.6245*abs(z[0])*z[1] + 0.0095*abs(z[1])*z[1]
            + 0.0214*z[0]**3)
    dzdt = [dxdt, dydt]
    # print the wall-clock time spent in this evaluation
    elapsed_time_secs = time.perf_counter() - start_time
    msg = "Execution took: %s secs (Wall clock time)" % timedelta(seconds=round(elapsed_time_secs, 5))
    print(msg)
    return dzdt

# initial condition
z0 = [0, 0]

# time points
t = np.linspace(0, 10, num=1000)

# solve ODE
z = odeint(model, z0, t)
Since my time intervals are 0.01 (10/1000), I would expect 1000 lines of output, but for some reason far fewer lines get printed, and the count varies with the initial conditions: for [0, 0] roughly 10 lines are printed, while for [1, 2] it is somewhere around 400. I don't understand why this is happening.
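To check how often odeint actually calls the right-hand side, here is a minimal sketch assuming the same model and time grid as above; it uses full_output=True, which also exposes the solver's own cumulative evaluation count in infodict['nfe']. The names counted_model and n_calls are introduced here just for this test:

import numpy as np
from scipy.integrate import odeint

n_calls = 0  # incremented on every right-hand-side evaluation

def counted_model(z, t):
    global n_calls
    n_calls += 1
    dxdt = z[1]
    dydt = (-1.2*(z[0] + z[1]) + 0.2314*z[0] + 0.6918*z[1]
            - 0.6245*abs(z[0])*z[1] + 0.0095*abs(z[1])*z[1]
            + 0.0214*z[0]**3)
    return [dxdt, dydt]

t = np.linspace(0, 10, num=1000)
z, info = odeint(counted_model, [1, 2], t, full_output=True)

print("counted calls to counted_model:", n_calls)
print("solver's cumulative evaluation count:", info['nfe'][-1])

If I read these counts correctly, odeint (lsoda) evaluates the function at its own adaptively chosen internal steps and then interpolates to the 1000 requested output points, which would explain why the number of printed lines is not 1000 and changes with the initial conditions.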