I'm running a Monte Carlo simulation for a Simulink model with a MATLAB script that looks more or less like this:
model = 'modelName';
load_system(model)
for ii = 1 : numberOfMC
    % Some set_param calls...
    % Some values are set
    simOut = sim(model);
    results{ii, 1} = simOut;   % store this run's output
    % etc.
end
close_system(model, 0);
As the number of Monte Carlo trials increases, the time of a single simulation increases as well, roughly like n^2.
Is there a simple explanation for that, and is there a solution to make the total runtime linear?
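For reference, this is roughly how the per-run time can be measured to see the growth (a minimal sketch using tic/toc; elapsedTimes is just an illustrative name):

elapsedTimes = zeros(numberOfMC, 1);
for ii = 1 : numberOfMC
    % Some set_param calls, as above
    tStart = tic;
    sim(model);
    elapsedTimes(ii) = toc(tStart);   % wall-clock time of this single run
end
plot(elapsedTimes)                    % shows how the per-run time grows with ii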
Thank you!
EDIT:
When I split my simulation into 6 batches and run them in series, the sum of the simulation times is far less than when I run the entire simulation in one shot.
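This is roughly how I split the run into batches (a minimal sketch; the 6-batch split matches what I describe above, but the batchResults name and the closing/reloading of the model between batches are just how I set it up here):

numberOfBatches = 6;                          % the 6 batches mentioned above
runsPerBatch    = ceil(numberOfMC / numberOfBatches);
batchResults    = cell(numberOfMC, 1);        % illustrative container for all runs

for b = 1 : numberOfBatches
    load_system(model);
    firstRun = (b - 1) * runsPerBatch + 1;
    lastRun  = min(b * runsPerBatch, numberOfMC);
    for ii = firstRun : lastRun
        % Some set_param calls, as in the original loop
        simOut = sim(model);
        batchResults{ii, 1} = simOut;
    end
    close_system(model, 0);                   % assumption: close and reload the model between batches
end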