I'm simulating a Brownian motion in MATLAB, but I'm getting a strange outcome: the variance of the increments of the Brownian motion grows over time when it should stay constant. First I construct a Brownian motion model,
brown_drift = @(t,X) 0;
brown_vol = @(t,X) .2;
brown_sys = bm(brown_drift, brown_vol);
Then I interpolate 1000 trials over the times 0:1:10 (timestep 1, horizon 10), with every trial starting from zero at time 0; the output is an NTimes-by-NVars-by-NTrials array (here 11-by-1-by-1000), which I squeeze down to 11-by-1000:
inter_brown = interpolate(brown_sys, 0:1:10, zeros(1,1,1000), 'Times', [0]);
inter_brown = squeeze(inter_brown);
The increments of a Brownian motion should be independent and identically distributed, so if I build a matrix of the increments and take the variance of each one across trials, they should all be roughly the same and equal to the volatility parameter squared times the timestep.
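Concretely, plugging in my numbers, every entry should come out around

0.2^2 * 1    % sigma^2 * dt = 0.04 for each one-step increment

So I compute the increments and take the variance of each one across trials: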
inc = inter_brown(2:end,:)-inter_brown(1:end-1,:);
var(inc')
ans = 0.0374 0.1184 0.2071 0.2736 0.3516 0.4190 0.5511 0.5891 0.6767 0.7647
However, this clearly doesn't match the simple theory, which says the variance should be 0.2^2 = 0.04 for every increment. Instead, each successive increment seems to add about 2*0.2^2 to the variance of the one before it. I can't figure out why this happens, especially since the simulation does seem to satisfy the other parts of the theory, e.g. the variance of the process itself at a given time. Is there something obvious I'm missing here?
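For reference, this is the kind of check I mean for the levels, and it does come out close to the theoretical sigma^2 * t:

var(inter_brown, 0, 2)'    % variance across trials at each time; theory says 0.04*(0:10)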