I have already read these questions: "memory error in numpy svd", "Applying SVD throws a Memory Error instantaneously?", and a number of other numpy.linalg.svd questions.
I need to run SVD on very large matrices for work. I currently have an 8 GB machine, and on certain matrices it crashes: it consumes all the system's memory and makes the computer grind to a halt.
I need to analyze the SVD results so that I can learn about the cluster model. How can I predict when it will crash and when it will work? Or, alternatively, how much memory is needed for it to run properly?
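For reference, here is my own rough estimate so far. This is only a sketch: the helper name and the matrix dimensions are made up for illustration, it assumes float64, and it only counts the SVD outputs (U, s, Vt), not the extra LAPACK workspace that numpy.linalg.svd allocates internally.

    import numpy as np

    def svd_output_bytes(m, n, full_matrices=True, itemsize=8):
        """Rough lower bound on memory for the SVD outputs (U, s, Vt)
        of an m x n matrix; LAPACK workspace is extra on top of this."""
        k = min(m, n)
        if full_matrices:
            u, vt = m * m, n * n   # full U (m x m) and Vt (n x n)
        else:
            u, vt = m * k, k * n   # thin U (m x k) and Vt (k x n)
        return (u + k + vt) * itemsize

    # example: a hypothetical 50000 x 10000 float64 matrix
    m, n = 50_000, 10_000
    print(f"input matrix:        {m * n * 8 / 1e9:.1f} GB")
    print(f"full_matrices=True:  {svd_output_bytes(m, n, True) / 1e9:.1f} GB")
    print(f"full_matrices=False: {svd_output_bytes(m, n, False) / 1e9:.1f} GB")

If this estimate is roughly right, then on my 8 GB machine a tall matrix like the one above would already blow past available memory with the default full_matrices=True (the full m x m U alone is ~20 GB), while the thin variant would be borderline. Is this a reasonable way to predict the crashes, or am I missing something?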