I am not sure if this is the right place for this question, but here it goes.

I am working on a computational physics model in C++. I inherited a large code that I am tidying up. The field is quantum electronics, so there are lots of integrals and expressions, but my question concerns optimization. Initially the code needed 4 hours to finish the simulation; after removing several bottlenecks I got this down to 30 minutes.
What I still do not like is the fact that the data in the code (mostly arrays of type `double`) mix `std::vector<double>`, `std::valarray<double>` and Armadillo's `arma::vec`. Which of these is the best choice, and should I even try to make the array type uniform?
I think the reason previous PhD students used `valarray` is its friendliness: the operators `*`, `/`, `+` and `-` are overloaded element-wise, and there is the occasional online claim that `valarray`s are faster than vectors (there is a big online debate about this).
Armadillo was brought in because of its similarity to MATLAB syntax and its extremely well-documented library; `std::vector<T>` is kept where `T` is a class rather than `double`.
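The Armadillo equivalent is similarly compact (again a made-up sketch, assuming the library is installed and linked; `%` is Armadillo's element-wise multiplication):

```cpp
#include <armadillo>

int main() {
    arma::vec E = {1.0, 2.0, 3.0};
    arma::vec chi = {0.5, 0.5, 0.5};

    // % is element-wise multiplication in Armadillo:
    arma::vec P = chi % E + 0.1 * (E % E);
    return 0;
}
```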
I personally prefer Armadillo because its logic is very simple and it feels natural. `valarray`s annoy me sometimes because they have few member functions, although iterating over them is quite easy. Vectors are the most annoying: the arithmetic operators are not overloaded for `std::vector<double>`, so if I chose them I would probably have to write my own array-of-`double` class, which I do not really want to do.
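To illustrate, with plain vectors even a simple element-wise product needs a loop or an algorithm call (a minimal sketch):

```cpp
#include <algorithm>
#include <functional>
#include <vector>

int main() {
    std::vector<double> E{1.0, 2.0, 3.0};
    std::vector<double> chi{0.5, 0.5, 0.5};
    std::vector<double> P(E.size());

    // No operator overloads, so the element-wise product must be spelled out:
    std::transform(chi.begin(), chi.end(), E.begin(), P.begin(),
                   std::multiplies<double>{});
    return 0;
}
```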
Keep in mind that I have no idea how these arrays behave in memory; I only started C++ last year and have a long way to go. I am planning to extend this code by two more dimensions (currently it is 1D), and the execution time will certainly grow.
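One thing that draws me towards Armadillo for that extension is that it already ships 2D and 3D containers (a tiny sketch; the sizes are placeholders):

```cpp
#include <armadillo>

int main() {
    // Armadillo's ready-made 2D and 3D containers:
    arma::mat  field2d(100, 100, arma::fill::zeros);
    arma::cube field3d(100, 100, 100, arma::fill::zeros);
    return 0;
}
```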
I am therefore seeking advice on which array type I should continue with, and on whether I should make the code uniform, i.e. force all arrays of `double` to be of the same type (`std::valarray`, `std::vector` or `arma::vec`).