People have asked similar questions before, but none of them has a satisfactory answer. I'm trying to solve the Lindblad master equation, and the matrices I need to exponentiate are of size 10000 x 10000. The problem is the matrix exponentiation, which consumes a huge amount of RAM.

MATLAB's expm() and SciPy's expm() take around 20 s and 80 s respectively for a 1000 x 1000 matrix. The MATLAB code is shown below.
pd = makedist('Normal');    % standard normal distribution
N = 1000;
r = random(pd, [N, N]);     % random N x N matrix
t0 = tic;
r = expm(r);                % dense matrix exponential
t_total = toc(t0);          % elapsed time in seconds
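For the Python timing, a minimal SciPy sketch equivalent to the MATLAB snippet above (assuming scipy.linalg.expm and a standard-normal random matrix, like the MATLAB version):

import time
import numpy as np
from scipy.linalg import expm

N = 1000
rng = np.random.default_rng()
r = rng.standard_normal((N, N))      # random N x N matrix, standard-normal entries

t0 = time.perf_counter()
r = expm(r)                          # dense Pade-based matrix exponential
t_total = time.perf_counter() - t0
print(f"expm on a {N} x {N} matrix took {t_total:.1f} s")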
The problem comes when I try to do the same for a 10000 x 10000 matrix. Whenever I call expm(), the RAM usage keeps growing until it takes all the RAM and swap on my PC (128 GB of RAM, 64-core CPU), and the same thing happens in both MATLAB and SciPy. I don't understand what is taking so much RAM, how I can run expm() efficiently, or whether it is even possible at all. Even if I could do it efficiently in some other language, that would be really helpful!
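For scale, a single dense 10000 x 10000 matrix of doubles is 10000^2 * 8 bytes ≈ 0.8 GB (roughly 1.6 GB if complex), so I would have expected the handful of working copies that expm() needs internally to stay far below 128 GB. A minimal sketch of the failing case, assuming the same random-matrix setup as the timing snippet above:

import numpy as np
from scipy.linalg import expm

N = 10_000
rng = np.random.default_rng()
r = rng.standard_normal((N, N))      # ~0.8 GB as float64
r = expm(r)                          # this is the step where RAM and swap get exhausted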